
Update dependency litellm to v1.60.8 #2028


Merged

merged 1 commit into main from renovate/litellm-1.x on Feb 9, 2025

Conversation

renovate[bot] (Contributor) commented Feb 8, 2025

This PR contains the following updates:

| Package | Change |
| --- | --- |
| litellm | `1.59.7` -> `1.60.8` |
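To try the bump locally before merging, a minimal sketch (this assumes the project installs litellm from PyPI with pip; adjust to however this repo actually manages its dependencies):

```shell
# Install the exact version this PR pins and confirm which version resolved.
pip install "litellm==1.60.8"
python -c 'import importlib.metadata as m; print(m.version("litellm"))'
```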

Release Notes

BerriAI/litellm (litellm)

v1.60.8

What's Changed

Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.8
```
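Once the container is up, a quick liveness probe confirms the proxy is answering (the `/health/liveliness` path is the one documented for the LiteLLM proxy; adjust the port if you change the `-p` mapping):

```shell
# Probe the proxy's liveness endpoint on the published port.
curl -s http://localhost:4000/health/liveliness
```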
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 170.0 | 189.56173781509457 | 6.206468643400922 | 0.0 | 1855 | 0 | 149.30551800000558 | 3488.08786699999 |
| Aggregated | Passed ✅ | 170.0 | 189.56173781509457 | 6.206468643400922 | 0.0 | 1855 | 0 | 149.30551800000558 | 3488.08786699999 |

v1.60.6

What's Changed

New Contributors

Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.6
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 217.05167674521235 | 6.288425886864887 | 0.0 | 1880 | 0 | 164.17646499996863 | 2306.284880000021 |
| Aggregated | Passed ✅ | 200.0 | 217.05167674521235 | 6.288425886864887 | 0.0 | 1880 | 0 | 164.17646499996863 | 2306.284880000021 |

v1.60.5

What's Changed

New Contributors

Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.5
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 251.44053604962153 | 6.19421782055854 | 0.0 | 1854 | 0 | 167.35073600000305 | 4496.06190000003 |
| Aggregated | Passed ✅ | 210.0 | 251.44053604962153 | 6.19421782055854 | 0.0 | 1854 | 0 | 167.35073600000305 | 4496.06190000003 |

v1.60.4

What's Changed

Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.4
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 243.98647747354212 | 6.187158959524932 | 0.0033407985742575225 | 1852 | 1 | 94.81396500007122 | 3976.009301999966 |
| Aggregated | Passed ✅ | 210.0 | 243.98647747354212 | 6.187158959524932 | 0.0033407985742575225 | 1852 | 1 | 94.81396500007122 | 3976.009301999966 |

v1.60.2

What's Changed

New Contributors

Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.2
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 170.0 | 187.78487681207412 | 6.365583292626693 | 0.0 | 1905 | 0 | 135.5453470000043 | 3644.0179759999864 |
| Aggregated | Passed ✅ | 170.0 | 187.78487681207412 | 6.365583292626693 | 0.0 | 1905 | 0 | 135.5453470000043 | 3644.0179759999864 |

v1.60.0

What's Changed

Important Changes between v1.50.xx and v1.60.0

Known Issues

🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB

New Contributors

Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.0
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 240.0 | 281.07272626532927 | 6.158354312051399 | 0.0 | 1843 | 0 | 215.79772499995897 | 3928.489000000013 |
| Aggregated | Passed ✅ | 240.0 | 281.07272626532927 | 6.158354312051399 | 0.0 | 1843 | 0 | 215.79772499995897 | 3928.489000000013 |

v1.59.10

What's Changed

Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.10
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 239.24647793068146 | 6.21745665443628 | 0.00334092243655899 | 1861 | 1 | 73.25327600000264 | 3903.3159660000083 |
| Aggregated | Passed ✅ | 210.0 | 239.24647793068146 | 6.21745665443628 | 0.00334092243655899 | 1861 | 1 | 73.25327600000264 | 3903.3159660000083 |

v1.59.9

What's Changed

New Contributors

Full Changelog: BerriAI/litellm@v1.59.8...v1.59.9

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.9
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 270.0 | 301.01550717582927 | 6.14169679840119 | 0.0 | 1837 | 0 | 234.85362500002793 | 3027.238808999982 |
| Aggregated | Failed ❌ | 270.0 | 301.01550717582927 | 6.14169679840119 | 0.0 | 1837 | 0 | 234.85362500002793 | 3027.238808999982 |

v1.59.8

What's Changed

Full Changelog: BerriAI/litellm@v1.59.7...v1.59.8

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.59.8
```
Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 280.0 | 325.48398318207154 | 6.003526201462839 | 0.0 | 1796 | 0 | 234.56590200004257 | 3690.442290999954 |
| Aggregated | Failed ❌ | 280.0 | 325.48398318207154 | 6.003526201462839 | 0.0 | 1796 | 0 | 234.56590200004257 | 3690.442290999954 |

Configuration

📅 Schedule: Branch creation - "every weekend" in timezone US/Eastern, Automerge - At any time (no schedule defined).

🚦 Automerge: Enabled.

Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.
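For context, a minimal renovate.json sketch that would produce the schedule and automerge behavior described above (illustrative only; the repository's actual Renovate configuration may differ):

```json
{
  "extends": ["config:recommended"],
  "timezone": "US/Eastern",
  "schedule": ["every weekend"],
  "packageRules": [
    {
      "matchPackageNames": ["litellm"],
      "automerge": true
    }
  ]
}
```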


- [ ] If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

renovate bot force-pushed the renovate/litellm-1.x branch 2 times, most recently from a47116d to 8514ac7 on February 8, 2025 at 22:44
renovate bot force-pushed the renovate/litellm-1.x branch from 8514ac7 to 30602c2 on February 9, 2025 at 02:13
renovate bot merged commit 5617b50 into main on Feb 9, 2025 (11 checks passed)
renovate bot deleted the renovate/litellm-1.x branch on February 9, 2025 at 06:26
odlbot mentioned this pull request on Feb 10, 2025