Describe the bug: I use an nginx proxy for the api_base "https://api.openai.com" on my server ("http://myserverip:port") and get an error message. My server can connect to https://api.openai.com, and I set apiBaseUrl this way on https://github.com/transit...
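The pattern in the report above is to keep the API paths identical and swap only the base URL from the official endpoint to the self-hosted proxy. A minimal sketch of that idea (the `build_endpoint` helper and the example port are illustrative assumptions, not code from the issue):

```python
DEFAULT_BASE = "https://api.openai.com"

def build_endpoint(path: str, base_url: str = DEFAULT_BASE) -> str:
    """Join an API path onto a (possibly proxied) base URL.

    Normalizes slashes so "http://host/" + "/v1/..." does not
    produce a double slash.
    """
    return base_url.rstrip("/") + "/" + path.lstrip("/")

# Same path, two bases: the official API vs. a self-hosted reverse proxy.
print(build_endpoint("/v1/chat/completions"))
# -> https://api.openai.com/v1/chat/completions
print(build_endpoint("/v1/chat/completions", "http://myserverip:8080"))
# -> http://myserverip:8080/v1/chat/completions
```

If the proxy forwards requests unchanged, only this base needs to differ between the two deployments; headers, paths, and request bodies stay the same.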
cache.ai_setting.now_ai_chat_proxy[1] = new_port
else:
    if cache.ai_setting.ai_chat_setting[cid] < option_len - 1:
        cache.ai_setting.ai_chat_setting[cid] += 1
@@ -664,6 +711,12 @@ def test_ai(self):
    if now_key_type == "OPENAI_API_KEY": ...
Hello guys, I am working with this Helm chart to deploy open-webui. AFAIK, we can add many connections in the Admin settings; those connections can be any OpenAI-API-compatible endpoint, a LiteLLM proxy, or pipelines. That said, we have OPENAI_API_BASE_URLS (add many) and OPENAI_API_BASE_URL, which ad...
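For the plural variable mentioned above, open-webui documents OPENAI_API_BASE_URLS as a single semicolon-separated string. A hedged sketch of how such a value could be split into a list of endpoints (`parse_base_urls` is a hypothetical helper, not open-webui's actual code):

```python
def parse_base_urls(env_value: str) -> list[str]:
    """Split a ";"-separated OPENAI_API_BASE_URLS-style value into URLs.

    Strips surrounding whitespace and trailing slashes, and drops
    empty entries left by stray separators.
    """
    return [u.strip().rstrip("/") for u in env_value.split(";") if u.strip()]

# One official endpoint plus a LiteLLM proxy, as in the discussion above.
print(parse_base_urls("https://api.openai.com/v1;http://litellm:4000/v1"))
```

Each resulting URL can then be registered as its own connection, which matches the "add many" behavior described for the Admin settings.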
# Provide proxy for OpenAI API. e.g. http://127.0.0.1:7890
https_proxy=$HTTPS_PROXY
# Custom base url for OpenAI API. default: https://api.openai.com
openai_api_base_url=$OPENAI_API_BASE_URL
# Custom base url for OpenAI API. default: https://generativelanguage.googleapis.com
api_ba...
Class | Type: None. Feature Request: Could you add BASE_URL (optional)? Default: https://api.openai.com. Examples: http://your-openai-proxy.com. Overrides the OpenAI API request base URL.
In my company we use a dedicated proxy server for OpenAI API requests, so my API key is shorter than the length the validation allows. This fix allows any length of OCO_OPENAI_API_KEY if the OCO_OPENAI_BASE_PATH key is set to a non-default value. fix(config.ts): pass config object to configValidator...
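The fix described above can be sketched as a validator that only enforces the official key format when the base path is the default endpoint. This is a Python illustration of the idea under stated assumptions (the function name, the default base path, and the exact format check are hypothetical, not the PR's actual TypeScript code):

```python
DEFAULT_BASE_PATH = "https://api.openai.com/v1"

def validate_openai_key(api_key: str, base_path: str = DEFAULT_BASE_PATH) -> bool:
    """Accept any non-empty key for custom base paths; otherwise require
    the official-looking "sk-..." format."""
    if not api_key:
        return False
    # Dedicated proxies may issue keys of arbitrary length, so skip the
    # format/length check when the base path is not the default endpoint.
    if base_path != DEFAULT_BASE_PATH:
        return True
    return api_key.startswith("sk-") and len(api_key) > 20
```

With the default base path, a short non-`sk-` key is rejected; with a proxy base path, the same key passes, which mirrors the behavior the PR describes.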