Update dependency litellm to v1.60.8 #2028
Merged
This PR contains the following updates:
| Package | Change |
| --- | --- |
| litellm | `1.59.7` -> `1.60.8` |
Release Notes
BerriAI/litellm (litellm)
v1.60.8
What's Changed
- `/cache/ping` + add timeout value and elapsed time on azure + http calls by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8377
- `/bedrock/invoke` support for all Anthropic models by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8383

Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8
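PR 8383 above adds `/bedrock/invoke` support for all Anthropic models. A minimal sketch of calling an Anthropic model through the invoke route (the model id is an illustrative assumption; AWS credentials are assumed to be configured in the environment):

```python
import litellm

# Force Bedrock's *invoke* API route (rather than converse) by prefixing
# the model with "bedrock/invoke/". The model id below is illustrative.
response = litellm.completion(
    model="bedrock/invoke/anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{"role": "user", "content": "Hello from the invoke route"}],
)
print(response.choices[0].message.content)
```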
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.6
Compare Source
What's Changed
- `choices=[]` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8339
- `choices=[]` on llm responses by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8342
Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6
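Both entries above concern responses arriving with `choices=[]`. A caller-side guard (a defensive-handling sketch, not the fix shipped in those PRs; the model name is illustrative) could look like:

```python
import litellm

response = litellm.completion(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": "hi"}],
)

# Defensive check: a provider can return a response with an empty choices
# list, which would make response.choices[0] raise IndexError.
if not response.choices:
    raise RuntimeError("LLM returned choices=[]; retry or fail over.")
print(response.choices[0].message.content)
```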
v1.60.5
Compare Source
What's Changed
- `BaseLLMHTTPHandler` class by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8290
Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5
v1.60.4
Compare Source
What's Changed
- `bedrock/nova` models + add util `litellm.supports_tool_choice` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8264
- `role` based access to proxy by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8260

Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4
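PR 8264 adds the `litellm.supports_tool_choice` util referenced above. A usage sketch (the Nova model id is an illustrative assumption):

```python
import litellm

model = "bedrock/amazon.nova-lite-v1:0"  # illustrative model id

# Only attach the `tool_choice` param when the target model supports it.
if litellm.supports_tool_choice(model=model):
    print(f"{model} accepts the tool_choice param")
else:
    print(f"{model} does not accept tool_choice; omit the param")
```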
v1.60.2
Compare Source
What's Changed
- `sso_user_id` to LiteLLM_UserTable by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8167
- `/vertex_ai/` was not detected as llm_api_route on pass through but `vertex-ai` was by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8186
- `mode` as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8224
Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2
v1.60.0
What's Changed
Important Changes between v1.50.xx and v1.60.0
`def async_log_stream_event` and `def log_stream_event` are no longer supported for `CustomLogger`s (https://docs.litellm.ai/docs/observability/custom_callback). If you want to log stream events, use `def async_log_success_event` and `def log_success_event` for logging success stream events; a migration sketch follows the Known Issues note below.
Known Issues
🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB
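A migration sketch for the `CustomLogger` change above, following the pattern in https://docs.litellm.ai/docs/observability/custom_callback (the handler body is illustrative):

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger

class MyLogger(CustomLogger):
    # Sync success hook: fires for completed calls, including streamed ones.
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("success:", kwargs.get("model"), "took", end_time - start_time)

    # Async success hook: use this instead of the removed async_log_stream_event.
    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("async success:", kwargs.get("model"))

litellm.callbacks = [MyLogger()]
```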
- `bedrock` models + show `end_user` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8118
- key `Team.team_alias === "Default Team"` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8122
- `LoggingCallbackManager` to append callbacks and ensure no duplicate callbacks are added by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8112
- `litellm.disable_no_log_param` param by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8134
- `litellm.turn_off_message_logging=True` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8156
Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0
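Two of the flags above are module-level settings. A minimal sketch of switching them on (the comments state assumed semantics based on the entries above):

```python
import litellm

# Redact message and response content from logging callbacks.
litellm.turn_off_message_logging = True

# Assumed semantics: ignore the per-request no-log param so calls are still logged.
litellm.disable_no_log_param = True
```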
v1.59.10
Compare Source
What's Changed
- `model` param by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8105
- `bedrock/converse_like/<model>` route by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8102

Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10
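PR 8102 above adds the `bedrock/converse_like/<model>` route. A call sketch (the `<model>` placeholder is kept from the entry; `api_base` and `api_key` are hypothetical):

```python
import litellm

# Route a request through the Bedrock "converse-like" handler against a
# custom endpoint. Endpoint and key below are hypothetical placeholders.
response = litellm.completion(
    model="bedrock/converse_like/<model>",
    messages=[{"role": "user", "content": "hello"}],
    api_base="https://example.com/bedrock-compatible-endpoint",
    api_key="sk-...",
)
print(response.choices[0].message.content)
```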
v1.59.9
Compare Source
What's Changed
- `metadata` param preview support + new `x-litellm-timeout` request header by @krrishdholakia in https://github.com/BerriAI/litellm/pull/8047
Full Changelog: BerriAI/litellm@v1.59.8...v1.59.9
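PR 8047 above adds the `x-litellm-timeout` request header. A sketch of sending it to a LiteLLM proxy (proxy URL, key, model name, and timeout value are all assumptions):

```python
import requests

# x-litellm-timeout sets a per-request timeout, in seconds, on the proxy side.
resp = requests.post(
    "http://localhost:4000/v1/chat/completions",  # hypothetical proxy URL
    headers={
        "Authorization": "Bearer sk-...",  # hypothetical proxy key
        "x-litellm-timeout": "30",
    },
    json={
        "model": "gpt-4o",  # illustrative model name
        "messages": [{"role": "user", "content": "hi"}],
    },
)
print(resp.status_code, resp.json())
```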
v1.59.8
Compare Source
What's Changed
- `LANGFUSE_FLUSH_INTERVAL` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/8007

Full Changelog: BerriAI/litellm@v1.59.7...v1.59.8
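`LANGFUSE_FLUSH_INTERVAL` referenced above is an environment variable. A sketch of setting it before initializing litellm (the value and flush semantics are assumptions):

```python
import os

# Assumed semantics: flush buffered Langfuse log batches every N seconds.
os.environ["LANGFUSE_FLUSH_INTERVAL"] = "10"
```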
Configuration
📅 Schedule: Branch creation - "every weekend" in timezone US/Eastern, Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.