
There are dozens and maybe hundreds of different approaches that could theoretically get around the limitations of GPT4 that merely haven't been trained at scale yet. There is absolutely no lack of ideas in this space, including potentially revolutionary ones, but they take time and money to prove out.


I'm sure there are lots of ideas, but that doesn't mean they're any good or will necessarily take AI to the next level.

It's going to take time to figure out what works and what doesn't.

There's a reason Sam Altman is saying they're not training GPT-5, and it's not because they think GPT-4 is good enough.



