I was just reading the code: it looks like, with minor tweaks to utils.c, this should run nicely with local models via Ollama or LM Studio. That should be safe enough.
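As a rough sketch of what I mean (untested, and the real constant names in utils.c will differ): both Ollama and LM Studio expose an OpenAI-compatible API on localhost, so the tweak is mostly pointing the base URL and model name at the local server.

    /* Hypothetical sketch -- the actual utils.c will use different names.
     * Ollama serves an OpenAI-compatible API at http://localhost:11434/v1
     * and LM Studio at http://localhost:1234/v1 by default. */
    #include <stdio.h>

    #define API_BASE_URL "http://localhost:11434/v1"  /* was https://api.openai.com/v1 */
    #define MODEL_NAME   "llama3.1"                   /* whatever model you've pulled locally */

    /* Build the chat-completions endpoint; the Authorization header can
     * stay as-is, since local servers simply ignore the API key. */
    static void build_endpoint(char *buf, size_t len) {
        snprintf(buf, len, "%s/chat/completions", API_BASE_URL);
    }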

Off topic, sorry, but to me the real security nightmare is the new ‘AI web browsers’ - I can’t imagine using one of those because of prompt injection attacks.


A local model will be just as happy to provide a shell command that trashes your local disks as any remote one.
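To make that concrete (purely hypothetical code, not from this project): the moment a tool hands a completion to the shell, where that completion came from stops mattering.

    /* Hypothetical illustration of the risk, not taken from this project. */
    #include <stdlib.h>

    void run_model_suggestion(const char *cmd) {
        /* cmd could be "rm -rf ~" whether it came from a local Ollama model
         * or a hosted API; nothing here inspects it before it hits the shell. */
        system(cmd);
    }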
