Add README files for various applications: Decidim, Discourse, Example Admin, Example App, Ghost, Immich, Keila, Lemmy, Listmonk, Loomio, Matrix, Memcached, MySQL, Open WebUI, OpenProject, PostgreSQL, Redis, and vLLM
open-webui/README.md (new file, 33 lines)
# Open WebUI
Open WebUI is a comprehensive, open-source web interface for AI models. It features a user-friendly design similar to ChatGPT and can connect to local or hosted LLM backends.
## Dependencies
None required, but works best with a local LLM backend like **vLLM** deployed on your cluster.
## Configuration
Key settings configured through your instance's `config.yaml`:
- **domain** - Where the UI will be accessible (default: `chat.{your-cloud-domain}`)
- **vllmApiUrl** - URL of your LLM backend (default: connects to vLLM on the cluster)
- **enableSignup** - Whether to allow new account creation (default: `false`)
- **storage** - Persistent volume size (default: `10Gi`)
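
In `config.yaml`, these settings might look like the following sketch. The key names come from the list above; the domain and the in-cluster vLLM URL are illustrative placeholders, not values from this repository:

```yaml
# Illustrative values only; adjust for your cluster.
domain: chat.example.com            # where the UI will be served
vllmApiUrl: http://vllm:8000/v1     # hypothetical in-cluster vLLM endpoint
enableSignup: false                 # disallow new account creation
storage: 10Gi                       # persistent volume size
```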
## Access
After deployment, Open WebUI will be available at:
- `https://chat.{your-cloud-domain}`
## First-Time Setup
1. Deploy a local LLM backend (e.g., vLLM) if you haven't already
2. Add and deploy the app:

   ```bash
   wild app add open-webui
   wild app deploy open-webui
   ```
3. Create your account and start chatting with your local AI models