Responses endpoint - return JSON does not match spec #3040

@RichNasz

Description

System Info

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

Right now I'm getting an error processing data coming back from the endpoint: LLS does not appear to return a complete Response object per the OpenAI platform documentation, which causes a JSON decoding error. The missing "tool_choice" return value is what is breaking my code now, but other fields are not being returned either.

Here is the curl for the request I made:

curl http://localhost:8321/v1/openai/v1/responses -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bea................er " \
  -d '{
    "model": "ollama/llama3.2:3b",
    "stream": false,
    "input": "Say hello!"
  }'

Here is the response:

{
  "created_at": 1754053317,
  "error": null,
  "id": "resp-c488d856-c083-47c1-9a67-e852ccc61e15",
  "model": "ollama/llama3.2:3b",
  "object": "response",
  "output": [
    {
      "content": [
        {
          "text": "Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat for a bit?",
          "type": "output_text",
          "annotations": []
        }
      ],
      "role": "assistant",
      "type": "message",
      "id": "msg_10a12839-ea2e-4739-bf79-90ec8dce0c5f",
      "status": "completed"
    }
  ],
  "parallel_tool_calls": false,
  "previous_response_id": null,
  "status": "completed",
  "temperature": null,
  "text": { "format": { "type": "text" } },
  "top_p": null,
  "truncation": null,
  "user": null
}
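To make the gap concrete, here is a minimal sketch that compares the top-level keys of the payload above against a few fields the OpenAI Responses object is documented to carry. The `expected_fields` list is illustrative, not exhaustive, and the response dict is an abbreviated copy of the payload shown above:

```python
# Abbreviated copy of the top-level keys in the response shown above.
response_body = {
    "created_at": 1754053317,
    "error": None,
    "id": "resp-c488d856-c083-47c1-9a67-e852ccc61e15",
    "model": "ollama/llama3.2:3b",
    "object": "response",
    "output": [],
    "parallel_tool_calls": False,
    "previous_response_id": None,
    "status": "completed",
    "temperature": None,
    "text": {"format": {"type": "text"}},
    "top_p": None,
    "truncation": None,
    "user": None,
}

# A few fields documented on the OpenAI Responses object
# (illustrative subset, not the full spec).
expected_fields = ["tool_choice", "tools", "instructions", "metadata"]

# Any strict decoder that requires these fields will fail on this payload.
missing = [f for f in expected_fields if f not in response_body]
print(missing)
```

A strict (non-lenient) JSON decoder that models the spec faithfully will reject the payload as soon as it hits the first of these missing keys, which matches the error described below.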

Error logs

The error is raised by the JSON decoder because fields it requires, and their associated values, are missing from the response.

Expected behavior

An LLS response from the /v1/openai/v1/responses endpoint should return data compliant with the OpenAI spec.
