
Isn't training most of the cost? If so, the current models could have a very long lifetime even if new models are never trained again. They'll gradually go out of date, but for many purposes they'll still be useful, and if they can pull new info from the web they may stay relevant for decades. Everything only halts if running the chatbots themselves isn't cost effective, and my understanding is that inference is relatively cheap compared to training. Even now, older models are still being used.

Also, performance optimizations seem likely to reduce the need for data center build-out and bring costs down. It seems too soon to say where this is all going. Who even knows whether GPUs will improve dramatically or whether something else (more AI-optimized processor architectures) will replace them? It's true that right now it looks like a bubble, but the future is still very much in flux, and the value of the models already created may not disappear overnight.
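To make the sunk-cost/amortization point concrete, here's a toy back-of-the-envelope sketch in Python. Every number in it (training cost, per-query cost, revenue, usage) is invented purely for illustration, not a real figure for any actual model or provider:

    # Toy sunk-cost/amortization sketch. All numbers are hypothetical placeholders.
    TRAINING_COST = 100e6            # one-time training spend, USD (assumed)
    INFERENCE_COST_PER_1K = 0.50     # marginal serving cost per 1k queries, USD (assumed)
    REVENUE_PER_1K = 2.00            # revenue per 1k queries, USD (assumed)
    QUERIES_PER_DAY = 50e6           # sustained daily usage (assumed)

    # Whether it makes sense to keep serving an already-trained model depends
    # only on the marginal numbers; the training spend is sunk.
    daily_margin = (REVENUE_PER_1K - INFERENCE_COST_PER_1K) * QUERIES_PER_DAY / 1000
    print(f"daily serving margin: ${daily_margin:,.0f}")
    print(f"keep serving? {daily_margin > 0}")

    # The training cost only matters for how long payback takes, not for
    # whether to keep running the model.
    if daily_margin > 0:
        print(f"days to amortize training: {TRAINING_COST / daily_margin:,.0f}")

With these made-up inputs the serving margin is positive, so the model stays worth running no matter how large the original training bill was; the training cost only affects the payback horizon.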

