
The use of the term "hallucination" for LLMs is very deceptive, as it implies that there is a "mind".

In ordinary terms, "hallucinations" by a machine would simply be described as the machine being useless, or not fit for purpose.

For example, if a simple calculator (or even a person) returned the value "5" for 2+2=, you wouldn't describe it as "hallucinating" the answer.



"Hallucination" happened because we got AI images before AI text, but "confabulation" is a better term.



