Get up and running with large language models, locally.
Run Llama 2, Code Llama, and other models. Customize and create your own.
## Configuration
`open-webui` accepts the following configuration keys & values (which you can modify with `sudo snap set open-webui <key>=<value>`):
- `data-dir` (default: `$SNAP_COMMON`, i.e. `/var/snap/open-webui/common/data`)
- `enable-signup` (default: `true`)
- `host` (default: `0.0.0.0`)
- `ollama-api-base-url` (default: `http://localhost:11434/api`)
- `openai-api-base-url` (default: `https://api.openai.com/v1`)
- `openai-api-key` (default: `""`)
- `port` (default: `8080`)
- `secret-key` (generated at install time)
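As an illustration, the keys above are read and written with the standard `snap get` / `snap set` commands. A minimal sketch (the key names and defaults come from the list above; whether a restart is strictly required after a change is an assumption here):

```shell
# Bind the web UI to a different port (the default is 8080)
sudo snap set open-webui port=3000

# Point open-webui at an ollama instance running on another host
sudo snap set open-webui ollama-api-base-url=http://192.168.1.50:11434/api

# Read back the current value of a key
sudo snap get open-webui port

# Restart the service so the new configuration takes effect
# (assumption: the service does not watch for config changes on its own)
sudo snap restart open-webui
```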
## open-webui and ollama
`open-webui` works with [ollama](https://ollama.com) out of the box, as long as `ollama` is installed. The simplest way to install `ollama` with settings that will work with `open-webui` is:

```shell
sudo snap install ollama --channel=beta
```
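Once both snaps are installed, you can sanity-check that `open-webui` will be able to reach `ollama` at the default `ollama-api-base-url` shown above. A minimal check, assuming `ollama` is listening on its default port (`/api/version` is a standard ollama endpoint that returns the server version as JSON):

```shell
# The default ollama-api-base-url is http://localhost:11434/api
curl http://localhost:11434/api/version
```

If this returns a JSON response rather than a connection error, the web UI should be able to talk to ollama without further configuration.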
## Features
- Starts the systemd service (`ollama serve`) automatically at install time.
- Offers configuration keys (respected by both the `ollama` command-line tool and the systemd service):