There are many hopes, and even claims, that LLMs could become AGI with just a little extra intelligence. There are also many claims that they have both a model of the real world and a system for rational logic and planning. It's useful to test the current state of the art on such a simple, fixed real-world task.


There's the rub, I suppose. I don't think an LLM can achieve AGI on its own. But I bet it could with the help of a Turing machine.


