Hacker News

I don't get the "magic" people are seeing. It makes sense.

>LLMs have somehow learned to fill in the blanks

It's not "somehow": they have read a ton of books, documents, etc., and can make enough links between cheese and refrigerator to follow that back and know that a refrigerator needs to be opened.
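A toy sketch of the point, under loud assumptions: this is not how an LLM works internally (no neural network, no attention, just raw co-occurrence counts over a four-line made-up corpus), but it shows that plain association statistics mined from text are already enough to "fill in the blank" with refrigerator rather than bicycle:

```python
from collections import Counter
import re

# Tiny made-up corpus standing in for "a ton of books, documents, etc."
corpus = [
    "she put the cheese back in the refrigerator",
    "he opened the refrigerator and took out the cheese",
    "the refrigerator door was open and the cheese was gone",
    "milk and cheese belong in the refrigerator",
]

def cooccurrence(lines, window=4):
    """Count how often each ordered word pair appears within `window` words."""
    counts = Counter()
    for line in lines:
        words = re.findall(r"[a-z]+", line.lower())
        for i, w in enumerate(words):
            for j in range(i + 1, min(i + 1 + window, len(words))):
                counts[(w, words[j])] += 1
                counts[(words[j], w)] += 1
    return counts

def fill_blank(counts, context_word, candidates):
    """Pick the candidate most strongly associated with the context word."""
    return max(candidates, key=lambda c: counts[(context_word, c)])

counts = cooccurrence(corpus)
# "the cheese goes in the ___" -- association counts alone pick it out
print(fill_blank(counts, "cheese", ["refrigerator", "bicycle", "volcano"]))
# → refrigerator
```

Real models do this at vastly greater scale and with learned representations instead of literal counts, which is where the follow-that-back step (refrigerators get opened) comes from.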

I have seen a lot of very clever AI examples using the latest tools, but I haven't seen anything that seems difficult to deconstruct.


