Hacker News

If you are Microsoft, a hyperscaler with almost unlimited cash that can ignore making a profit on its API/models, it's pretty easy to undercut all the other companies and offer it very cheaply just to gain an advantage in the future.


What the cost-cutting measures suggest is that AI like this could soon run on consumer hardware. That, combined with genuinely open-source language models, could be huge. OpenAI won't allow it for obvious reasons, but this confirms that the optimizations are there, and that's exciting news on its own.


I mean, Meta's new LLaMA model runs on a single A100 in its 13B-parameter variant (which reportedly performs similarly to GPT-3 65B).
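For what it's worth, the "fits on a single A100" claim checks out with back-of-the-envelope math. A rough sketch (illustrative byte counts only, ignoring activation and KV-cache overhead):

```python
def vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed just to hold the model weights, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# fp16 weights: 2 bytes per parameter
print(round(vram_gb(13, 2), 1))   # ~24.2 GiB -> fits in an A100's 40 GiB
print(round(vram_gb(65, 2), 1))   # ~121.1 GiB -> needs multiple GPUs

# 4-bit quantized weights: 0.5 bytes per parameter
print(round(vram_gb(13, 0.5), 1))  # ~6.1 GiB -> plausible on consumer cards
```

Which is also why the quantization work matters for the consumer-hardware angle: at 4 bits per weight, a 13B model's weights alone drop into the range of an ordinary gaming GPU.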


"Performs" on paper until they give a demo.



