BadRequestError due to move to OpenAI Chat completions API #3666

@luis5tb

Description

System Info

Running with latest llamastack main branch

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

The move to the OpenAI chat completions API (7e48cc4) broke requests because a wrong (non-positive) max_output_tokens is now being sent. The request fails with the following BadRequestError:
openai.BadRequestError: Error code: 400 - [{'error': {'code': 400, 'message': '* GenerateContentRequest.generation_config.max_output_tokens: max_output_tokens must be positive.\n', 'status': 'INVALID_ARGUMENT'}}]

Error logs

openai.BadRequestError: Error code: 400 - [{'error': {'code': 400, 'message': '* GenerateContentRequest.generation_config.max_output_tokens: max_output_tokens must be positive.\n', 'status': 'INVALID_ARGUMENT'}}]

Expected behavior

If max tokens is not specified in the llamastack config, it should simply pass None and allow the request to go through, as it did before the referenced commit.
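
A minimal sketch of the guard described above, assuming the provider builds its request kwargs from a configured max-tokens value; the function and parameter names here are illustrative, not Llama Stack's actual API:

```python
def resolve_max_tokens(configured_max_tokens):
    """Map an unset or non-positive max-tokens setting to None so the
    OpenAI chat completions request omits the field and lets the
    backend apply its own default, instead of sending 0."""
    if configured_max_tokens is None or configured_max_tokens <= 0:
        return None
    return configured_max_tokens


def build_request_kwargs(model, messages, configured_max_tokens=None):
    """Assemble kwargs for client.chat.completions.create(), dropping
    max_tokens entirely when it is unset or non-positive."""
    kwargs = {"model": model, "messages": messages}
    max_tokens = resolve_max_tokens(configured_max_tokens)
    if max_tokens is not None:
        kwargs["max_tokens"] = max_tokens
    return kwargs
```

With this guard, an unset or zero value never reaches the backend as max_output_tokens, which avoids the INVALID_ARGUMENT rejection above while still forwarding any explicitly configured positive limit.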
