
Conversation

@luis5tb (Contributor) commented Oct 3, 2025

This patch ensures that if max tokens is not defined, it is set to None instead of 0 when calling openai_chat_completion. This way, providers (such as Gemini) that cannot handle max_tokens = 0 will not fail.

Issue: #3666
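
A minimal sketch of the idea, assuming the adapter builds its request parameters from a `sampling_params` object with an optional `max_tokens` field; the actual field names and call site in llama-stack may differ:

```python
from typing import Optional

def resolve_max_tokens(max_tokens: Optional[int]) -> Optional[int]:
    """Map an unset or zero max_tokens to None before calling the provider.

    Providers such as Gemini reject max_tokens=0 with a BadRequestError,
    so "no limit" must be expressed as None rather than 0.
    """
    return max_tokens if max_tokens else None

# Hypothetical call site (names are illustrative, not the actual API):
# params["max_tokens"] = resolve_max_tokens(sampling_params.max_tokens)
# response = await openai_chat_completion(**params)
```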

meta-cla bot added the CLA Signed label on Oct 3, 2025
luis5tb changed the title from "Fix BadRequestError due to unvalid max_tokens" to "fix: Avoid BadRequestError due to invalid max_tokens" on Oct 3, 2025
@mattf (Collaborator) left a comment

hold on this, pending Discord discussion

luis5tb force-pushed the max_tokens branch 3 times, most recently from 99d2730 to 43fb189 on October 3, 2025 16:05
This patch ensures that if max tokens is not defined, it is set to None.
This avoids failures with some providers, which have no protection
against it being set to 0.

Issue: llamastack#3666