
ollama is dead easy to use. It's a single binary that downloads models on demand, the way Docker downloads images.

  pacman -S ollama
  ollama serve
  ollama run llama2:13b 'insert prompt'
https://ollama.ai/
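Beyond the shell commands above, `ollama serve` also exposes an HTTP API (by default on localhost:11434) that streams newline-delimited JSON. A minimal sketch of a client for it, with the stream simulated so it runs without a live server; the endpoint path and field names follow ollama's documented API, but treat the details as assumptions:

```python
import json

# Sketch of a client for ollama's /api/generate endpoint (default port
# 11434). The server streams NDJSON chunks, each with a "response" field
# and a "done" flag on the final chunk.

def build_generate_request(model, prompt):
    """Build the JSON body for POST /api/generate."""
    return json.dumps({"model": model, "prompt": prompt})

def collect_stream(lines):
    """Join the 'response' fields of streamed NDJSON chunks into one string."""
    out = []
    for line in lines:
        chunk = json.loads(line)
        out.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(out)

# Simulated stream, shaped like what the server emits line by line:
stream = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": true}',
]
print(collect_stream(stream))  # → Hello, world
```

In a real client you would POST the body from `build_generate_request` and feed the response lines to `collect_stream` as they arrive.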


ollama wraps llama.cpp in a Docker container, correct? Besides that, it seems like a Go server for chat?


The ollama shell is really nice.



