Install the latest/stable release of ollama

On Ubuntu 16.04 or later, make sure snap support is enabled in your desktop store.


Install using the command line

sudo snap install ollama

Don't have snapd? Get set up for snaps.

Details for ollama

License

  • MIT

Last updated

  • 8 December 2025 - latest/stable
  • 6 December 2025 - latest/edge


Get up and running with large language models, locally.

Ollama – Local AI Model Runner

Run, manage, and switch between a wide range of open‑source LLMs directly on your local machine. Ollama provides fast, offline inference with a simple CLI and API, ensuring your data never leaves the device. Ideal for developers, researchers, and anyone who wants powerful AI without cloud dependencies.
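Because the snap exposes Ollama's local HTTP API as well as its CLI, a small script can talk to a running instance without any third-party dependencies. As a minimal sketch (assuming the default endpoint `http://localhost:11434` and an example model name, `llama3.2`, that you have already pulled with `ollama pull`):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str,
             host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama daemon and
    return the model's response text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama daemon and a pulled model):
# print(generate("llama3.2", "Why is the sky blue?"))
```

Since everything runs against localhost, no data leaves the machine, which is the point of running models locally.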


Install ollama on your Linux distribution

Choose your Linux distribution for detailed installation instructions. If yours is not shown, see the documentation on installing snapd.


Where people are using ollama

Users by distribution (log scale; chart not reproduced). Distributions reported include: Ubuntu 24.04, Ubuntu 22.04, Ubuntu 20.04, Ubuntu 25.10, Ubuntu 25.04, Zorin OS 17, Pop!_OS 22.04, Debian 12, Ubuntu 18.04, Zorin OS 18, Fedora 43, Ubuntu 24.10, Debian 13, Linux Mint 22.2, and Manjaro.