ollama

Install the latest/stable channel of ollama

On Ubuntu 16.04 or later, make sure snap support is enabled in your Desktop store.


Install using the command line

sudo snap install ollama

Don't have snapd? Get set up for snaps.
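Before running the install command, you can check whether snapd is already present. A minimal sketch (the apt hint assumes a Debian/Ubuntu-like system):

```shell
# Sketch: verify snapd is available before installing the ollama snap.
# The "sudo apt install snapd" hint is an assumption for Debian/Ubuntu-like
# systems; other distributions use their own package managers.
if command -v snap >/dev/null 2>&1; then
  snap_status="snapd found: $(snap version | head -n 1)"
else
  snap_status="snapd not found; install it first (e.g. sudo apt install snapd)"
fi
echo "$snap_status"
```

If snapd is found, `sudo snap install ollama` should work as shown above.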


Ollama – Local AI Model Runner

Get up and running with large language models, locally.

Run, manage, and switch between a wide range of open-source LLMs directly on your local machine. Ollama provides fast, offline inference with a simple CLI and API, ensuring your data never leaves the device. Ideal for developers, researchers, and anyone who wants powerful AI without cloud dependencies.
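The API mentioned above is served locally, by default at port 11434 with a /api/generate endpoint. A minimal sketch of calling it from Python; the model name "llama3.2" is an assumption here, substitute any model you have pulled with `ollama pull`:

```python
import json
import urllib.request

# Request payload for Ollama's local /api/generate endpoint.
# "llama3.2" is an example model name (an assumption); use any
# model you have already pulled with `ollama pull`.
payload = {
    "model": "llama3.2",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the ollama snap is installed and the server is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

Because inference runs entirely on localhost, no prompt or response data leaves the machine.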

Details for ollama

License
  • MIT

Last updated
  • 10 September 2025 - latest/stable
  • 16 September 2025 - latest/edge



Install ollama on your Linux distribution

Choose your Linux distribution to get detailed installation instructions. If yours is not shown, see the installing snapd documentation for more details.