I don't think anyone seriously believes AI will disappear without a trace. At the very least, LLMs will remain the state of the art in high-level language processing (editing, translation, chat interfaces, etc.).
The real problem is the massive over-promises of transforming every industry, replacing most human labor, and eventually reaching super-intelligence based on current models.
I hope we can agree that these are all wholly unattainable, even from a purely technological perspective. However, we are investing as if there were no tomorrow without these outcomes, building massive data centers filled with "GPUs" that, contrary to investor copium, will quickly become obsolete and are increasingly useless for general-purpose data center applications (Blackwell Ultra has NO FP64 hardware, for crying out loud...).
We can agree that the bubble deflating, one way or another, is the best outcome long term. That said, the longer we fuel these delusions, the worse the fallout will be when it finally does. And what I fear is that one day, a bubble (perhaps this one, perhaps another) will grow so large that it wipes out globalized free-market trade as we know it.
Bubbles bursting aren't bad unless you were overinvested in the bubble. Consider that you'll be wiping your ass with DIMMs once this one bursts; I can always put more memory to good use.
> Bubbles bursting aren't bad unless you were overinvested in the bubble.
That's what I am trying to say: every big technology player, every industry, every government is all in on AI. That means you and I are along for the ride, whether we like it or not.
> Consider that you'll be wiping your ass with DIMMs once this one bursts; I can always put more memory to good use.
Except you can't, because DRAM makers have almost entirely pivoted from making (G)DDR chips to making HBM instead. HBM must be co-integrated at the interposer level and 3D-stacked, resulting in terrible yields. That makes it extremely pricey and impossible to package separately (no DIMMs).
So when I say the world is all in on this, I mean it. With every passing minute, there is less and less we can salvage once this is over; for consumer DRAM, it's already too late.
Games tend to avoid FP64 compute as Nvidia has always gimped it in consumer GPUs, so you are somewhat lucky there. "Lucky" as in, you get to enjoy the broken-ass, glitchy FP32 physics that we've all grown to love so much.
However, if you actually need the much higher precision of FP64 for scientific computing (like most non-AI data center users do) and extremely slow emulation is not an option, consider yourself fucked.
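To make the FP32-vs-FP64 gap concrete, here's a quick NumPy sketch (my own illustration, nothing GPU-specific): FP32 has a 24-bit significand, so beyond 2^24 it can't even represent consecutive integers, which is exactly why physics sims glitch out far from the world origin. FP64's 53-bit significand shrugs the same addition off.

```python
import numpy as np

# FP32 has a 24-bit significand: at and above 2**24, adjacent
# representable values are 2 apart, so adding 1 is rounded away.
a = np.float32(16_777_216.0)          # 2**24
print(a + np.float32(1.0) == a)       # True: the +1 vanishes in FP32

# FP64 has a 53-bit significand (~15-16 significant decimal digits),
# so the same addition is exact.
c = np.float64(16_777_216.0)
print(c + 1.0 == c)                   # False: FP64 keeps the +1
```

Run a rigid-body sim with FP32 coordinates a few kilometers from the origin and that rounding is what you're watching jitter on screen.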