Base URL constants from the `feat: litellm yaml` change:

```typescript
export const APP_NAME = 'Open WebUI';
export const WEBUI_BASE_URL = dev ? `http://${location.hostname}:8080` : ``;
export const WEBUI_API_BASE_URL = `${WEBUI_BASE_URL}/api/v1`;
export const LITELLM_API_BASE_URL = `${WEBUI_BASE_URL}/...
```
Open WebUI (Formerly Ollama WebUI) 👋

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. For more information, be sure to check out our Open WebUI Documentation.
Key Features of Open WebUI

🛠️ Native Python Function Calling: Introducing native Python function calling within Open WebUI. We’ve also included a built-in code editor to seamlessly develop and integrate function code within the 'Tools' workspace. With this, you can significantly enhance your LLM’s capabilities by ...
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

In case you want to update your local Docker installation to the latest version, you can do it with Watchtower.
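A minimal one-shot Watchtower invocation for this setup might look like the sketch below; it assumes the container keeps the `open-webui` name used in the command above.

```bash
# One-off update: Watchtower pulls the latest image for the named container,
# recreates it with the same settings, and then exits (--run-once).
# Assumes the container is named "open-webui", as in the docker run command above.
docker run --rm \
  --volume /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower \
  --run-once open-webui
```

Because of --run-once, Watchtower exits once the update finishes instead of staying resident, and the recreated open-webui container keeps its --restart always policy.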