Hacker News
jibal | 6 months ago | on: What happens when people don't understand how AI w...
Chain of thought is what LLMs report to be their internal process, but they have no access to their internal process ... their reports are confabulation, and a study by Anthropic showed how far they are from the actual internal processes.