# Installing Ollama and OpenWebUI

## 1. Prepare the directory structure

All container data lives under `/opt/docker`, keeping system and data cleanly separated:

```bash
sudo mkdir -p /opt/docker/ollama
sudo mkdir -p /opt/docker/open-webui
```

Optionally, set ownership and permissions:

```bash
sudo chown -R kiadmin:docker /opt/docker
```

---

## 2. Docker Compose file

Path: `/opt/docker/docker-compose.yml`

```yaml
version: '3.8'

services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    restart: always
    runtime: nvidia
    volumes:
      - /opt/docker/ollama:/root/.ollama

  openwebui:
    image: ghcr.io/open-webui/open-webui:ollama
    container_name: open-webui
    restart: always
    runtime: nvidia
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434
    volumes:
      - /opt/docker/open-webui:/app/backend/data
      # Custom CSS overrides (optional)
      # - /home/pleibling/docker/ai/custom.css:/app/build/static/custom.css:ro
    depends_on:
      - ollama
```

---
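Note that `runtime: nvidia` requires the NVIDIA Container Toolkit and a Docker daemon configured with the `nvidia` runtime. On Compose v2+, GPU access can alternatively be requested per service with a device reservation; a sketch (not used in this environment, shown only as an alternative):

```yaml
# Alternative to `runtime: nvidia` on Compose v2+: per-service GPU reservation
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          count: all
          capabilities: [gpu]
```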

## 3. Start the containers

```bash
cd /opt/docker
docker compose up -d
```

Check the status:

```bash
docker ps
```

- Ollama API → `http://<Server-IP>:11434`
- OpenWebUI → `http://<Server-IP>:3000`
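Once both containers are up, reachability can be checked with a small helper; a minimal sketch (the `ollama_up` function name and the default host are assumptions):

```shell
# Returns 0 if the Ollama API answers on the given host (default: localhost).
# /api/tags lists the locally installed models and is a cheap liveness probe.
ollama_up() {
  curl -fsS "http://${1:-localhost}:11434/api/tags" >/dev/null
}

# Usage: ollama_up <Server-IP> && echo "Ollama is reachable"
```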

---

## 4. Install models

Ollama downloads models on demand.  
A model is installed, for example, with:

```bash
docker exec -it ollama ollama pull gpt-oss:20b
```

**LLMs used in this environment:**

- **GPT-OSS:20b** → main model (large, very capable)
- **Llama 3.1:12b**
- **Mistral:7b**
- **Gemma3:12b**
- **DeepSeek-R1:8b**
- **Qwen3:14b**

Optionally, you can also define separate presets/workspaces for each model in OpenWebUI.
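Several models can be pulled in one go with a small loop. The sketch below only prints the pull commands so the tags can be reviewed first (the model list is illustrative; verify the exact tags against the Ollama model library, then pipe the output to `sh` to execute):

```shell
# Print the pull command for each model (review, then pipe to sh to run).
# The tag list here is an assumption; adjust it to the models listed above.
for model in gpt-oss:20b mistral:7b gemma3:12b deepseek-r1:8b qwen3:14b; do
  echo "docker exec -it ollama ollama pull $model"
done
```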

---

## 5. Updating the containers

```bash
cd /opt/docker
docker compose pull
docker compose up -d
```

---

## 6. Backup note

Since all persistent data lives in `/opt/docker/ollama` and `/opt/docker/open-webui`, a backup of these two folders is sufficient.
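A minimal backup helper could look like this (a sketch: the `backup_stack` name and the target path are assumptions, and stopping the containers first keeps the archive consistent):

```shell
# Archive both data folders into one dated tarball.
# Call with the data root and a backup target directory, e.g.:
#   docker compose stop && backup_stack /opt/docker /var/backups && docker compose start
backup_stack() {
  local data_dir=$1 backup_dir=$2
  tar czf "$backup_dir/ollama-stack-$(date +%F).tar.gz" \
      -C "$data_dir" ollama open-webui
}
```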

---