400 BadRequestError with Gemini provider – Invalid JSON payload ("metadata" and "store" fields) #443


Closed
Mubashar-Bashir opened this issue Apr 5, 2025 · 15 comments
Labels
bug Something isn't working

Comments

@Mubashar-Bashir

Please read this first

  • Have you read the custom model provider docs, including the 'Common issues' section? ✅ Yes
  • Have you searched for related issues? ✅ Yes

Describe the question

I'm using the OpenAI Agents SDK with a custom model provider (Google Gemini via https://generativelanguage.googleapis.com/v1beta/openai/) and encountering a BadRequestError.

The Gemini API rejects the request with this error:

BadRequestError: Error code: 400 - [{'error': {'code': 400, 'message': 'Invalid JSON payload received. Unknown name "metadata": Cannot find field.\nInvalid JSON payload received. Unknown name "store": Cannot find field.'}}]

It appears that the Agents SDK is sending metadata and store fields automatically, but the Gemini API doesn't support these.
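As a hypothetical illustration (not the SDK's actual code): the failure mode is that optional fields serialized as `null` still reach Gemini's OpenAI-compatible endpoint, which rejects unknown keys outright. A small filter like this sketch would keep them out of the payload:

```python
def drop_null_fields(payload: dict) -> dict:
    """Drop keys whose value is None so strict OpenAI-compatible
    endpoints (like Gemini's) never see unsupported fields."""
    return {k: v for k, v in payload.items() if v is not None}

# Example request with the two fields Gemini rejects
request = {
    "model": "gemini-1.5-flash",
    "messages": [],
    "metadata": None,
    "store": None,
}
cleaned = drop_null_fields(request)
# "metadata" and "store" are gone before the request is serialized
```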


Debug information

  • Agents SDK version: v0.0.3
  • Python version: 3.11
  • Custom model provider: Google Gemini (Flash 1.5)/(Flash 2.0)
  • Request Type: chat.completions.create

Repro steps

Here’s a minimal script to reproduce the issue:

import os

import nest_asyncio
from openai import AsyncOpenAI
from agents import Agent, Runner, OpenAIChatCompletionsModel
from agents.run import RunConfig

nest_asyncio.apply()

# Read the Gemini API key from the environment
google_api_key = os.environ["GOOGLE_API_KEY"]

# External async client pointed at Gemini's OpenAI-compatible endpoint
client = AsyncOpenAI(
    api_key=google_api_key,
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

# Wrap client in SDK model
model = OpenAIChatCompletionsModel(
    model="gemini-1.5-flash",  # "gemini-2.0-flash"  
    openai_client=client,
)

config = RunConfig(
    model=model,
    model_provider=client,
    tracing_disabled=True
)

agent = Agent(name="Assistant", instructions="You are a helpful assistant")

# Run agent synchronously
result = Runner.run_sync(agent, "Write a haiku about recursion in programming.")
print(result.final_output)

Questions

Please clarify:

Is there a way to prevent the SDK from sending unsupported fields like metadata or store?

Will future versions of the SDK provide better integration options for non-OpenAI endpoints?
Mubashar-Bashir added the bug Something isn't working label Apr 5, 2025
@hamzaadil56

Yes, I'm also hitting the same bug. OpenAI has updated its library, and the change causes errors when a Gemini client is provided.

I've worked with version 0.0.4, where everything was fine. But when I started a new project with the latest version, it fails with the Gemini provider.

@Qureshihasaan

Yes, I'm facing the same JSON payload issue. I've tried different things but can't debug it; maybe a new package or library was added in the latest version.

@shamoon-ahmed

I'm having the same BadRequestError. How do I resolve it?

@DevHammad0

DevHammad0 commented Apr 6, 2025

I tested older versions (0.0.7, 0.0.6, 0.0.5, and 0.0.4) and they work fine without the metadata and store payload errors.

So it seems that version 0.0.8 is introducing this issue, possibly by including unsupported fields like metadata and store in the request payload, which are not supported by the Gemini API via the OpenAI-compatible endpoint (https://generativelanguage.googleapis.com/v1beta/openai/).
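Until a release with a fix is out, pinning one of the versions reported working above (for example in a requirements.txt) sidesteps the error. The package name `openai-agents` and the choice of 0.0.7 here are based on the versions mentioned in this thread:

```
openai-agents==0.0.7
```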

@shamoon-ahmed

Yessir thanks a lot! I just tried the 0.0.4 version and it worked fine! Hope they resolve this issue in the latest versions as well

@Mrkhan9914626

same issue

@jhammarstedt

+1 on this

@exiao

exiao commented Apr 7, 2025

same issue with 0.0.8, breaking change. #431

@drewy-openai , @rm-openai

@BasedLukas

same issue

@RahulVerma989

I'm also facing the same issue with the Groq and Google OpenAI-compatible APIs. I think it can easily be fixed by not passing metadata when it is null.

[screenshot of the proposed change]

@rm-openai
Collaborator

Sorry about that, fixing now.

rm-openai added a commit that referenced this issue Apr 7, 2025
Summary: See #443. Causes issues with Gemini.

Test Plan: Tests.
rm-openai added two more commits that referenced this issue Apr 7, 2025 (same summary and test plan).
@RahulVerma989

For the metadata property issue, I have added two review comments, in openai_chatcompletions.py and openai_responses.py.

In short, the change is:

current:
metadata=model_settings.metadata

change to:
metadata=self._non_null_or_not_given(model_settings.metadata)
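For context, the helper referenced above maps None to the OpenAI SDK's NOT_GIVEN sentinel, which tells the client to omit the field from the serialized request entirely rather than send it as JSON null. A self-contained sketch of the pattern, using a stand-in sentinel instead of importing the openai package:

```python
class _NotGiven:
    """Stand-in for openai.NOT_GIVEN: a sentinel meaning 'omit this field'."""

    def __repr__(self) -> str:
        return "NOT_GIVEN"


NOT_GIVEN = _NotGiven()


def non_null_or_not_given(value):
    # None would be serialized as JSON null and rejected by strict
    # OpenAI-compatible endpoints; returning the sentinel instead makes
    # the client leave the field out of the request body entirely.
    return value if value is not None else NOT_GIVEN
```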

rm-openai added a commit that referenced this issue Apr 7, 2025
Summary: See #443. Causes issues with Gemini.

Test Plan: Tests. Also tested with Gemini to ensure it works.
@rm-openai
Collaborator

The fix in #455 is merged; I'll cut a new release once #456 merges.

@rm-openai
Collaborator

v0.0.9 should be out in a few mins. Let me know if you have any issues with it.

@rm-openai
Collaborator

Closing. Please create a new issue for any follow ups!

Lightblues pushed a commit to Lightblues/openai-agents-python that referenced this issue Apr 13, 2025
Summary: See openai#443. Causes issues with Gemini.

Test Plan: Tests. Also tested with Gemini to ensure it works.