
That's just plain wrong.

For scaffolding to be "safe", the scaffolding basically has to know exactly what the LLM is being used for, and outsmart it at every turn when it misbehaves. That's impractical-to-impossible. Some tasks need access for legitimate reasons - just as some human tasks need a hammer - and the same access can always be abused for illegitimate ones.
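To make the point concrete, here's a minimal sketch of a scaffold that gates an LLM's tool calls with an allowlist. All names (`ALLOWED_TOOLS`, `run_tool`) are hypothetical, not from any real framework; the point is that the scaffold sees only the call, never the intent:

```python
# Hypothetical tool-gating scaffold. Names are illustrative only.
ALLOWED_TOOLS = {"read_file", "write_file"}

def run_tool(name, args, files):
    """Execute an allowlisted tool against an in-memory 'filesystem' (a dict)."""
    if name not in ALLOWED_TOOLS:
        raise PermissionError(f"tool {name!r} blocked")
    if name == "read_file":
        return files[args["path"]]
    if name == "write_file":
        # A legitimate edit and a destructive overwrite are the same call;
        # the scaffold has no way to tell them apart.
        files[args["path"]] = args["content"]
        return "ok"

files = {"config.yaml": "debug: false"}
run_tool("write_file", {"path": "config.yaml", "content": "debug: true"}, files)
```

The allowlist blocks tools outside the set, but any tool it permits for a legitimate workflow is equally available for an illegitimate one; distinguishing the two would require the scaffold to understand the task better than the model does.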

It's like trying to engineer a hammer that can't be used to bludgeon someone to death. Good fucking luck.


