# Open WebUI
Open WebUI is a comprehensive, open-source web interface for AI models. It features a user-friendly design similar to ChatGPT and can connect to local or hosted LLM backends.
## Dependencies
None required, but works best with a local LLM backend like **vLLM** deployed on your cluster.
## Configuration
Key settings configured through your instance's `config.yaml`:
- **domain** - Where the UI will be accessible (default: `chat.{your-cloud-domain}`)
- **vllmApiUrl** - URL of your LLM backend (default: connects to vLLM on the cluster)
- **enableSignup** - Whether to allow new account creation (default: `false`)
- **storage** - Persistent volume size (default: `10Gi`)
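Putting those settings together, a minimal `config.yaml` fragment might look like the sketch below. The key nesting, the top-level `open-webui` section name, and the in-cluster vLLM URL are assumptions for illustration; `example.com` stands in for your cloud domain.

```yaml
# Illustrative sketch only — adjust key nesting to match your instance's schema.
open-webui:
  domain: chat.example.com            # where the UI is served (default: chat.{your-cloud-domain})
  vllmApiUrl: http://vllm:8000        # assumed in-cluster vLLM service URL
  enableSignup: false                 # keep new-account creation disabled (the default)
  storage: 10Gi                       # persistent volume size
```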
## Access
After deployment, Open WebUI will be available at:
- `https://chat.{your-cloud-domain}`
## First-Time Setup
1. Deploy a local LLM backend (e.g., vLLM) if you haven't already
2. Add and deploy the app:
```bash
wild app add open-webui
wild app deploy open-webui
```
3. Create your account and start chatting with your local AI models