Is max_tokens = max_input_tokens + max_output_tokens?

```python
import litellm
print(litellm.get_model_info('openrouter/qwen/qwen-2.5-coder-32b-instruct'))
```

Gives:

```
{'key': 'openrouter/qwen/qwen-2.5-coder-32b-instruct', 'max_tokens': 33792, 'max_input_tokens': 33792, 'max_output_tokens': ...
```
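Judging from that output, apparently not: `max_tokens` (33792) equals `max_input_tokens`, not the sum of the input and output limits, so `max_tokens` appears to be a legacy field rather than a combined total. A minimal sketch to check this for any model; only `litellm.get_model_info` is real litellm API, the comparison logic is illustrative:

```python
import litellm

def summarize_limits(model: str) -> None:
    """Print the token-limit fields litellm knows for a model."""
    info = litellm.get_model_info(model)
    max_tokens = info.get("max_tokens")
    max_in = info.get("max_input_tokens")
    max_out = info.get("max_output_tokens")
    print(f"max_tokens={max_tokens}, max_input_tokens={max_in}, max_output_tokens={max_out}")
    # For the qwen coder entry above, max_tokens == max_input_tokens,
    # i.e. max_tokens is NOT max_input_tokens + max_output_tokens.
    if max_in is not None and max_out is not None:
        print("in + out =", max_in + max_out, "| equal to max_tokens?", max_tokens == max_in + max_out)

summarize_limits("openrouter/qwen/qwen-2.5-coder-32b-instruct")
```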
Pull Request Title: Introduce max_output_tokens Field for OpenAI Models
https://platform.deepseek.com/api-docs/news/news0725/#4-8k-max_tokens-betarelease-longer-possibilities
Description: This commit...
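The PR description is truncated, but a settings entry like the following (field names and values are hypothetical, shaped after the discussion below, not the project's actual schema) illustrates what a custom model definition with an explicit `max_output_tokens` might look like:

```python
# Hypothetical custom model definitions in settings.
CUSTOM_OPENAI_MODELS = {
    "deepseek-chat": {
        "max_input_tokens": 32768,
        # Only set when the provider documents a separate output cap,
        # e.g. DeepSeek's 8k max_tokens beta linked above.
        "max_output_tokens": 8192,
    },
    "gpt-4o": {
        "max_input_tokens": 128000,
        # No max_output_tokens here -> max_tokens is omitted from requests
        # (see the change described below).
    },
}
```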
Other OpenAI models (4o, 3.5, turbo, etc.) were unaffected. With this change we will only include `max_tokens` on the request if the OpenAI custom definition from settings includes `max_output_tokens`, rather than on every request. I believe the root cause of this is that GPT-4 (and only GPT-4) has...
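A minimal sketch of that conditional inclusion; the helper and parameter names are assumptions, not the PR's actual code:

```python
from typing import Any, Dict, Optional

def build_request_params(
    model: str,
    custom_definition: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
    """Attach max_tokens only when the custom model definition declares
    an explicit max_output_tokens; otherwise let the API use its default."""
    params: Dict[str, Any] = {"model": model}
    if custom_definition and "max_output_tokens" in custom_definition:
        params["max_tokens"] = custom_definition["max_output_tokens"]
    return params

# With an explicit output cap the request carries max_tokens ...
print(build_request_params("gpt-4", {"max_output_tokens": 8192}))
# ... without one, max_tokens is omitted entirely.
print(build_request_params("gpt-4o"))
```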
For the case where no value is provided for `max_output_tokens` via either config.toml or an env var for OH: the new base default is 4096 tokens for reliable LLMs. `max_output_tokens` has precedence (see litellm code), but if not present, the `max_tokens` value is used instead (if present in the model's info)...
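That precedence chain (explicit config, then the model's `max_output_tokens`, then its `max_tokens`, then the 4096 default) can be sketched as follows; `litellm.get_model_info` is real, but the resolver itself is an illustration, not OH's actual code:

```python
from typing import Optional

import litellm

DEFAULT_MAX_OUTPUT_TOKENS = 4096  # new base default when nothing else is set

def resolve_max_output_tokens(model: str, configured: Optional[int] = None) -> int:
    """Resolve the output-token cap: explicit config wins, then the model's
    max_output_tokens, then its max_tokens, then the base default."""
    if configured is not None:  # from config.toml or an env var
        return configured
    try:
        info = litellm.get_model_info(model)
    except Exception:  # model unknown to litellm: fall back to the default
        return DEFAULT_MAX_OUTPUT_TOKENS
    for key in ("max_output_tokens", "max_tokens"):
        value = info.get(key)
        if value:
            return value
    return DEFAULT_MAX_OUTPUT_TOKENS
```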
Add Max output tokens (maxOutputTokens) to Chat models #180
Merged: kevin-lee merged 1 commit into main from task/171/add-max-output-token, Sep 17, 2024 (+91 −16, 2 files changed).
kevin-lee commented Sep 17, 2024: Close #171 - Add Max output tokens (max...
chore: make max_output_tokens configurable (9b68354) (646d3de)

Coverage diff for main #2683: Coverage 58.23% → 58.24%; Files 169 → 169; Lines 15304 → 15307 (+3); Hits 8913 → 8916 (+3); Misses 6391 → 6391.
Baidu Qianfan model: support stop, system, and maxOutputTokens params.
loganhu and others added 3 commits, June 5, 2024:
- add qianfan model support for stop/system/maxOutputTokens (58238e6)
- Merge branch 'loganhu' (c891cd1)
- Merge branch 'main' into main (9e83187, Verified)
langchain4j approved ...