Open WebUI: User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
```
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

Keeping Your Docker Installation Up-to-Date

In case you want to update your local Docker installation to ...
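For the update step itself, the project's docs point to Watchtower, which pulls the latest image and recreates the container in one shot. A minimal sketch, assuming the container is named open-webui as in the command above:

```bash
# One-off update of the running open-webui container via Watchtower.
# Watchtower needs access to the Docker socket to pull the new image
# and recreate the container with its original flags.
docker run --rm \
  --volume /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower --run-once open-webui
```

After Watchtower exits, docker ps should show the container recreated from the freshly pulled ghcr.io/open-webui/open-webui:main image.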
🛠️ Native Python Function Calling: Introducing native Python function calling within Open WebUI. We’ve also included a built-in code editor to seamlessly develop and integrate function code within the 'Tools' workspace. With this, you can significantly enhance your LLM’s capabilities by ...
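As a rough illustration, here is a minimal sketch of a tool file, assuming the Tools-class convention described in the Open WebUI docs; the method name and its body are purely illustrative:

```python
from datetime import datetime


class Tools:
    def get_current_time(self) -> str:
        """
        Return the current server time as an HH:MM:SS string.
        """
        # The type hints and docstring are what describe this function
        # to the model; once the tool is enabled, the LLM can invoke it
        # natively during a chat turn.
        return datetime.now().strftime("%H:%M:%S")
```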
Open WebUI: Server Connection Error If you're experiencing connection issues, it's often due to the WebUI Docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container. Use the --network=host flag in your docker command to...
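If host networking isn't an option, an alternative the Open WebUI docs suggest is to map host.docker.internal to the host gateway so the container can reach Ollama on the host's port 11434; a sketch:

```bash
# Publish the UI on port 3000 and let the container resolve
# host.docker.internal to the Docker host, where Ollama is listening.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```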
Open WebUI (Formerly Ollama WebUI) 👋 Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. For more information, be sure to check out our Open Web...
🐳 Docker Launch Issue: Resolved the problem preventing Open-WebUI from launching correctly when using Docker.

Changed

🔍 Enhanced Search Prompts: Improved the search query generation prompts for better accuracy and user interaction, enhancing the overall search experience. ...