Hacker News

I use AI/ML for ideas today. I love the simple input/output of the chat style; it will win for most things, just as keyword search won for search.

I use it for rewriting content, generating writing ideas, simplifying text (legal/verbose text -- simplifying terms is a killer feature, really), and gathering context. Even though trust in the output is limited, it is helpful.

I love the art / computer vision side of AI/ML, though I prefer to do that with tools on my own machine rather than rely on a very closed dataset or company. That is harder with AI/ML because of the storage/processing needed.

I hate black boxes and magic I don't have access to, though I am a big fan of stable, unchanging, atomic input/output APIs, as long as I have access to the flow. The chat input/output is so simple it will win because it will never really have a breaking change. Until commercial AI/ML GPTs are more open, they can't be trusted not to be a Trojan horse or a trap. What happens when the service goes away, the model changes, or the terms change?
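To make the stability point concrete, here is a minimal sketch (hypothetical names, plain Python, no particular vendor's API) of why the chat contract is so hard to break: the entire interface is "text in, text out", so the model behind it can be swapped without changing the schema.

```python
from dataclasses import dataclass

# Hypothetical chat contract: the whole public surface is one string in,
# one string out. Any model behind it can change without a breaking change.
@dataclass(frozen=True)
class ChatRequest:
    prompt: str

@dataclass(frozen=True)
class ChatResponse:
    reply: str

def chat(req: ChatRequest) -> ChatResponse:
    # Stand-in for whatever model serves the endpoint; callers never see it.
    return ChatResponse(reply=f"echo: {req.prompt}")

print(chat(ChatRequest("hello")).reply)  # echo: hello
```

Of course, the flip side of that argument is the one above: the schema never breaks, but the behavior behind it can change silently, which is exactly the black-box problem.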

As far as companies/commercial offerings go, Google seems to be the most open, and Google Brain really started this whole thing with transformers.

Transformers, the T in GPT, were invented at Google by the Google Brain team [1][2]. They made this round of progress possible.

> Transformers were introduced in 2017 by a team at Google Brain and are increasingly the model of choice for NLP problems, replacing RNN models such as long short-term memory (LSTM). The additional training parallelization allows training on larger datasets. This led to the development of pretrained systems such as BERT (Bidirectional Encoder Representations from Transformers) and GPT (Generative Pre-trained Transformer), which were trained with large language datasets, such as the Wikipedia Corpus and Common Crawl, and can be fine-tuned for specific tasks.
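The core mechanism behind the transformer the quote describes, scaled dot-product attention, can be sketched in a few lines. This is a minimal illustration in plain NumPy (not any particular library's implementation): every position attends to every other position in parallel, which is what removes the sequential bottleneck of RNNs/LSTMs and enables the training parallelization the quote mentions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy sketch of attention from 'Attention Is All You Need'.

    Q, K, V: (seq_len, d_k) arrays. Each output row is a weighted sum
    of the value rows, with weights from a softmax over Q·K similarities.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy example: 3 tokens with 4-dimensional embeddings, self-attending
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

A real transformer stacks many of these (multi-head, with learned projections, feed-forward layers, and positional encodings), but the parallel-over-positions structure is the key idea.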

Google also gave the public TensorFlow [3] and DeepDream [4], which really started the intense excitement around AI/ML. I got super interested when the AI art / computer vision side started to come up. The GANs for style transfer and Stable Diffusion produce output that is intriguing, almost euphoric.

In terms of GPT/chat, Bard, or some iteration of it, will most likely win long term, though I wish it were just called Google Brain. Bard is a horrible name.

ChatGPT basically used AI tech created at Google Brain, transformers, to build ClosedGPT. For that reason it is NopeGPT. ChatGPT is really just datasets, which no one knows; they could be swapped at any time to run some misinformation, then swapped back the next day. This is data blackboxing and gaslighting at the utmost level. Not only that, it is largely funded by private sources, and that could include authoritarian money. Again, black boxes create distrust.

Microsoft is trusting OpenAI, and that is a risk. Maybe their goal is embrace, extend, extinguish here, but next to Google and Apple, Microsoft may be a bit behind on this. GitHub Copilot is great, though. Microsoft usually comes along later and makes an accessible version, and the AI/ML offerings on Azure are already solid. AI/ML is suited to large datasets, so cloud companies will benefit the most; it is also very, very costly, which unfortunately keeps it in BigCo or wealthy-only arenas for a while.

Google Brain and other Google tech are already way more open than "Open"AI.

ChatGPT/OpenAI just front-ran the commercial side, but long term they aren't really innovating the way Google is. They look like a leader because of the marketing/pump, but they are a follower.

[1] https://en.wikipedia.org/wiki/Google_Brain

[2] https://en.wikipedia.org/wiki/Transformer_(machine_learning_...

[3] https://en.wikipedia.org/wiki/TensorFlow

[4] https://en.wikipedia.org/wiki/DeepDream
