
> Now with LLMs, I think one of the great leaps is the idea that it’s no longer necessary to be “pedantic” when giving computers instructions

Yes, but the reverse also holds: they can't be pedantic or follow explicit instructions. That's the other side of the coin that isn't being presented.

They can present the right elements of the story in the right places, but they can't perform it.
