Hacker News

Do you think this current AI hype machine came from something other than transformers?


No, but it is weird to use "modern" here. "Modern" suggests a longer timeframe; I would say deep NNs in machine learning are modern AI. It's just not true that everything not transformer is outdated, but it is "kinda" true that everything not deep NN is outdated.


The phrasing kinda makes sense to me.

Consider "modern" to mean NN/connectionist vs GOFAI AI attempts like CYC or SOAR.

I guess it depends on how you define "AI", and whether you accept the media's labelling of anything ML-related as AI.

To me, LLMs are the first thing deserving to be called AI; other NNs like CNNs are better just called ML, since there is no intelligence there.


But that's been the case for the last 60 years. Whatever came out in the last 10 years is the first thing deserving to be called AI, and everything else is just basic computer science algorithms that every practitioner should know. ELIZA was AI in 1966; now it's just string substitution. Prolog was AI in 1972; now it's logic programming. Beam search and A* were AI in the 1970s; now they're just search algorithms. Expert systems were AI in the 1980s; now they're just rules engines. Handwriting recognition, voice recognition, and speech synthesis were AI in the 90s; now they're just multilayer perceptrons aka artificial neural nets. SVMs were AI in the 00s; now they're just support vector machines. CNNs and LSTMs were AI in the 10s; now they're just CNNs and LSTMs.

https://en.wikipedia.org/wiki/AI_effect


Yeah, but for a while it seemed we'd gotten over that, and in the "modern era" people were just talking about ML. Nobody in 2012, as best I can recall, was referring to AlexNet as "AI", but then (when did it start?) at some point the media started calling everything AI, and eventually the ML community capitulated and started calling it that too - maybe because the VCs wanted to invest in sexy AI, not ML.


> Consider "modern" to mean NN/connectionist vs GOFAI AI attempts like CYC or SOAR.

Well, this is what I'm trying to say too!


Consider "modern" to mean NN/connectionist vs GOFAI AI attempts like CYC or SOAR.

I dunno. The earliest research into what we now call "neural networks" dates back to at least the 1950s (Frank Rosenblatt and the Perceptron) and arguably the 1940s (Warren McCulloch and Walter Pitts and the TLU "neuron"). And depending on how generously one interprets certain things, arguments have been made that the history of neural network research predates the invention of the digital computer altogether, or even ubiquitous electrical power (e.g., the late 1800s). On that latter point, I believe it was Jürgen Schmidhuber who advanced the argument in an interview I saw a while back; as best I can recall, he was referring to a certain line of mathematical research from that era.

In the end, defining "modern" is probably not something we're ever going to reach consensus on, but I really think your proposal misses the mark by a small touch.


Sure, the history of NNs goes back a while, but nobody was attempting to build AI out of perceptrons (single layer), which were famously criticized as not being able to even implement an XOR function.
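That limitation is easy to demonstrate concretely. A minimal sketch in plain Python (the weights are illustrative choices of mine, not from any paper): a brute-force grid search finds no single-layer solution for XOR, while one hidden layer solves it with hand-picked weights.

```python
# XOR is not linearly separable, so no single-layer perceptron
# step(w1*x1 + w2*x2 + b) computes it (Minsky & Papert). A two-layer
# net does: hidden units compute OR and NAND; the output ANDs them.
import itertools

def step(z):
    return 1 if z > 0 else 0

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Single layer: search a coarse grid of weights and bias.
grid = [v / 2 for v in range(-8, 9)]  # -4.0 .. 4.0 in steps of 0.5
single_layer_works = any(
    all(step(w1 * x1 + w2 * x2 + b) == y for (x1, x2), y in XOR.items())
    for w1, w2, b in itertools.product(grid, repeat=3)
)
print(single_layer_works)  # False

# Two layers with hand-picked weights.
def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # OR
    h2 = step(1.5 - x1 - x2)    # NAND
    return step(h1 + h2 - 1.5)  # AND

print([xor_net(*x) for x in XOR])  # [0, 1, 1, 0]
```

A hidden layer is exactly what backprop later made trainable, rather than hand-wired as here.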

The modern era of NNs started with being able to train multilayer neural nets using backprop, but the ability to train NNs large enough to actually be useful for complex tasks can arguably be dated to the 2012 ImageNet competition, when Geoff Hinton's team repurposed GPUs to train AlexNet.

But AlexNet was just a CNN, a classifier, which IMO is better considered ML, not AI; so if we're looking for the first AI in this post-GOFAI world of NN-based experimentation, then it seems we have to give the nod to transformer-based LLMs.


> It's just not true that everything not transformer is outdated.

I'm not an expert; do you have some links about this? I.e., a neural net architecture that outperforms a transformer model of the same size.


ConvNets Match Vision Transformers at Scale (Oct 2023)

https://arxiv.org/abs/2310.16764


Well, transformers still have some plausible competition in NLP, but besides that, there are other fields of AI where convnets or RNNs still make a lot of sense.


Current AI hype comes from GPT

Transformer+Scaling+$$$ triggered the current hype


No, but "modern AI" and "this current AI hype machine" are not equivalent phrases.



