Run Llama 2, Code Llama, and other models. Customize and create your own.
## Configuration
`open-webui` accepts the following configuration keys & values, which you can modify with `sudo snap set open-webui <key>=<value>` (see the example after this list):

- `data-dir` (default: `$SNAP_COMMON`, i.e. `/var/snap/open-webui/common/data`)
- `enable-signup` (default: `true`)
- `host` (default: `0.0.0.0`)
- `ollama-api-base-url` (default: `http://localhost:11434/api`)
- `openai-api-base-url` (default: `https://api.openai.com/v1`)
- `openai-api-key` (default: `""`)
- `port` (default: `8080`)
- `secret-key` (generated at install time)
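For example, to read or change individual settings (the key names come from the list above; the values shown are only illustrations, not recommendations):

```bash
# Read the current listening port
sudo snap get open-webui port

# Move the web UI to a different port (3000 is an arbitrary example value)
sudo snap set open-webui port=3000

# Point open-webui at an ollama instance on another host
# (replace the hostname with your own)
sudo snap set open-webui ollama-api-base-url=http://my-ollama-host:11434/api
```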
## open-webui and ollama

`open-webui` works with [ollama](https://ollama.com) out of the box, as long as `ollama` is installed. The simplest way to install `ollama` with settings that will work with `open-webui` is:

`sudo snap install ollama --channel=beta`
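Once both snaps are installed, a quick sanity check (assuming `ollama` is listening on its default port, 11434) is to query the same ollama API endpoint that `open-webui` uses:

```bash
# List the models ollama currently has available; a JSON response here
# means open-webui's default ollama-api-base-url should work unchanged
curl http://localhost:11434/api/tags
```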
## Features

- Runs the ollama server (`ollama serve`) automatically at install time.
- Configuration options, which apply to both the `ollama` command line tool as well as the systemd service (see the example after this list):
  - `sudo snap set ollama host=...` to set `OLLAMA_HOST` (see https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network)
  - `sudo snap set ollama models=...` to set `OLLAMA_MODELS` (see https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-set-them-to-a-different-location)
  - `sudo snap set ollama origins=...` to set `OLLAMA_ORIGINS` with default value `"*://localhost"` (see https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama)
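For instance, to expose ollama on the network, relocate the model store, and allow an extra web origin (the host, directory, and origin values below are placeholders, not recommendations):

```bash
# Listen on all interfaces instead of localhost only (sets OLLAMA_HOST)
sudo snap set ollama host=0.0.0.0:11434

# Store downloaded models on a larger disk (sets OLLAMA_MODELS)
sudo snap set ollama models=/mnt/big-disk/ollama-models

# Allow an additional web origin to call the API (sets OLLAMA_ORIGINS)
sudo snap set ollama origins="*://localhost,https://webui.example.com"
```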