ollama

Install the latest/stable channel of ollama.

On Ubuntu 16.04 or later, make sure snap support is enabled in your desktop store.


Install using the command line

sudo snap install ollama

Don't have snapd? Get set up for snaps.
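Once the snap is installed, you can check the CLI and try a model from the terminal. A minimal sketch — the model name below (llama3.2) is only an example; which models are available depends on the Ollama library at the time you run it:

```shell
# Confirm the CLI is installed and on your PATH
ollama --version

# Download a model and start an interactive chat with it
# (model name is an example; pick any model from the Ollama library)
ollama run llama3.2

# List the models currently downloaded to this machine
ollama list
```

The first `ollama run` for a given model downloads its weights, so expect the initial invocation to take longer than later ones.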


Ollama – Local AI Model Runner

Get up and running with large language models, locally. Run, manage, and switch between a wide range of open-source LLMs directly on your local machine. Ollama provides fast, offline inference with a simple CLI and API, ensuring your data never leaves the device. Ideal for developers, researchers, and anyone who wants powerful AI without cloud dependencies.
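Besides the CLI, Ollama exposes a local HTTP API. A minimal sketch of querying it with curl, assuming the service is listening on its default port (11434) and that a model such as llama3.2 has already been pulled:

```shell
# Send a one-shot prompt to a locally running model via Ollama's HTTP API.
# "stream": false returns a single JSON object instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Because the API is bound to localhost, requests and responses stay on your machine.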

Details for ollama

License
  • MIT

Last updated
  • Yesterday - latest/stable
  • 21 November 2025 - latest/edge




Install ollama on your Linux distribution

Choose your Linux distribution to get detailed installation instructions. If yours is not shown, see the installing snapd documentation for more details.
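On distributions where snapd is not preinstalled, the typical sequence is to install snapd from the system package manager and then install the snap. A sketch for Debian/Ubuntu-family systems — other distributions package snapd differently, so consult the snapd documentation for your distro:

```shell
# Install snapd from the distribution's package manager
# (Debian/Ubuntu-family example; other distros use their own tools)
sudo apt update
sudo apt install snapd

# Then install the ollama snap
sudo snap install ollama
```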


Where people are using ollama

[Chart: users by distribution (log scale). Distributions shown: Ubuntu 24.04, Ubuntu 22.04, Ubuntu 20.04, Ubuntu 25.10, Ubuntu 25.04, Zorin OS 17, Pop!_OS 22.04, Debian 12, Ubuntu 18.04, Ubuntu 24.10, Zorin OS 18, Debian 13, Fedora 43, Linux Mint 22.2, Linux Mint 22.1, Manjaro, KDE Neon 24.04, Fedora 42, Kali Linux 2025.4, Linux Mint 21.3.]