
Add support for models via LiteLLM proxy server #116

@haroon0x

Description

Is your feature request related to a problem? Please describe.
I'd like to add support for OpenAI models using the LiteLLM simple proxy. This would expand adk-java's model compatibility to cover OpenAI-compatible endpoints exposed through a standardized proxy interface.

Describe the solution you'd like
The plan is to use Java's built-in HTTP client to call the proxy's OpenAI-compatible endpoint and integrate the response into the existing ADK pipeline.
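
A minimal sketch of what that call might look like with `java.net.http.HttpClient`, assuming a LiteLLM proxy running locally on its default port (4000) and exposing the OpenAI-compatible `/chat/completions` route. The class and method names below are hypothetical and not part of adk-java; this is just to illustrate the shape of the integration:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/** Hypothetical sketch: calling a LiteLLM proxy's OpenAI-compatible endpoint from Java. */
public class LiteLlmProxyClient {

  private final HttpClient http = HttpClient.newHttpClient();
  private final String baseUrl; // e.g. "http://localhost:4000" where the proxy is assumed to run
  private final String apiKey;  // key configured on the LiteLLM proxy, if any

  public LiteLlmProxyClient(String baseUrl, String apiKey) {
    this.baseUrl = baseUrl;
    this.apiKey = apiKey;
  }

  /** Sends a single-turn chat completion request and returns the raw JSON response body. */
  public String chatCompletion(String model, String userMessage) throws Exception {
    // Note: no JSON escaping of userMessage here; a real implementation would build
    // the payload with a JSON library rather than string formatting.
    String body = """
        {"model": "%s", "messages": [{"role": "user", "content": "%s"}]}
        """.formatted(model, userMessage);

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(baseUrl + "/chat/completions"))
        .header("Content-Type", "application/json")
        .header("Authorization", "Bearer " + apiKey)
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();

    HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
    return response.body(); // JSON in the OpenAI chat-completions format
  }
}
```

The returned JSON (OpenAI chat-completions format) would then need to be mapped into the ADK's existing response types so the rest of the pipeline stays unchanged.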

Metadata

Assignees

No one assigned

    Labels

    waiting on reporter: Waiting for a reaction by the reporter. Failing that, maintainers will eventually close it as stale.

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
