
Description
Describe the issue
When editing with Copilot using the o1 (preview) model, the request fails with the following error:

litellm.UnsupportedParamsError: openai does not support parameters: {'response_format': {'type': 'json_object'}}, for model=o1. To drop these, set `litellm.drop_params=True` or for proxy:
`litellm_settings:
drop_params: true`
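
The error message itself points at litellm's `drop_params` setting as a possible workaround. A minimal sketch of what that looks like when calling litellm directly (the model name, message, and parameters below are illustrative, not the extension's actual call):

```python
import litellm

# Workaround suggested by the error text: silently drop parameters that the
# target model does not support (here, response_format for o1) instead of
# raising UnsupportedParamsError. Assumes an OpenAI API key is configured.
litellm.drop_params = True

response = litellm.completion(
    model="o1",
    messages=[{"role": "user", "content": "Hello"}],
    response_format={"type": "json_object"},  # would otherwise trigger the error
)
print(response.choices[0].message.content)
```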
Steps to Reproduce
Reference a file in Edit with Copilot and select the o1 (preview) model.
Operating System
MacOS (Arm)
IDE and Version
Version: 1.96.2
Extension and Version
1.253.0
Provider
GitHub Copilot
Model
o1 (preview)
Logs
No response
Additional Context
No response