
Think of it as the early years of UNIX and the PC. Running inference and tools locally and offline opens doors to new industries. We might not even need the client/server paradigm locally. An LLM is just a probabilistic library we can call.
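
As a concrete sketch of that last point: with something like llama-cpp-python you can load a local model file and call it like any ordinary in-process library, with no server or network hop. This is only an illustration under that assumption; the model path below is a placeholder, not a specific recommendation.

    # Minimal sketch of "LLM as a library": in-process inference via llama-cpp-python.
    from llama_cpp import Llama

    # Placeholder path -- any local GGUF model would do.
    llm = Llama(model_path="./models/local-model.Q4_K_M.gguf")

    # No client/server round-trip: just a function call returning a probabilistic completion.
    result = llm("Summarize the UNIX philosophy in one sentence.", max_tokens=64)
    print(result["choices"][0]["text"])

The point is the calling convention, not the specific library: inference becomes a synchronous function your program links against, the way it would call into libc or SQLite.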




