Install the latest/stable release of Ollama from the Snap Store. Requires Ubuntu 16.04 or later (or another snap-enabled distribution); make sure snap support is enabled in your desktop store.
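The install step above comes down to a single command; a minimal sketch, assuming the package is published in the Snap Store under the name `ollama`:

```shell
# Install the Ollama snap from the default (stable) channel
sudo snap install ollama

# Verify the installation
ollama --version
```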
Ollama – Local AI Model Runner
Run, manage, and switch between a wide range of open‑source LLMs directly on your local machine. Ollama provides fast, offline inference with a simple CLI and API, ensuring your data never leaves the device. Ideal for developers, researchers, and anyone who wants powerful AI without cloud dependencies.
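To illustrate the CLI and API mentioned above, a typical session might look like the following. The model name `llama3` is an example and the API port 11434 is Ollama's documented default; check `ollama --help` and the upstream docs for your installed version:

```shell
# Download a model, then run it interactively from the CLI
ollama pull llama3
ollama run llama3 "Explain what a snap package is."

# List the models available locally
ollama list

# Query the local HTTP API (served on port 11434 by default)
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

Because inference runs entirely on the local machine, these commands work offline once a model has been pulled.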