Hacker News
ml-anon | 33 days ago | on: Over fifty new hallucinations in ICLR 2026 submiss...
No it’s not. It’s made up bullshit that arises for reasons that literally no one can formalize or reliably prevent. This is the exact opposite of specific.
crazygringo | 33 days ago
Just because we can't reliably *prevent* them doesn't mean they're not an easily recognizable and meaningful category of error for us to talk about.