Status: Closed
Labels: good first issue, help wanted, integration (Native builtins)
Description
LiteLLM provides a hub to connect to many different LLM models, exposed through a proxy server.
We want to support talking to it as a separate API type, in addition to Ollama, OpenAI, Gemini, and Anthropic.
Lines 10 to 16 in 800ceb2:

```rust
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Gemini,
    Anthropic,
}
```
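The change would start by adding a new variant to this enum. A minimal sketch, assuming the variant is named `LiteLlm` (the actual name is up to the implementer):

```rust
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Gemini,
    Anthropic,
    LiteLlm, // new: the OpenAI-compatible LiteLLM proxy
}
```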
For a similar example, see PR #395, which added Anthropic support.
Note: the LiteLLM proxy provides an OpenAI-compatible API, so it should be OK to reuse most of the OpenAI implementation where possible.
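As a rough sketch of that reuse (the names `OpenAiClient`, `with_base_url`, and `new_client` below are illustrative stand-ins, not the repo's actual API), the LiteLLM variant could simply route through the OpenAI client, with the proxy's address as the base URL:

```rust
// Illustrative sketch only: these types and functions are hypothetical
// stand-ins for the repo's real client abstractions.
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Gemini,
    Anthropic,
    LiteLlm,
}

pub struct OpenAiClient {
    pub base_url: String, // OpenAI-compatible endpoint this client talks to
}

impl OpenAiClient {
    pub fn with_base_url(base_url: String) -> Self {
        OpenAiClient { base_url }
    }
}

// Hypothetical factory: pick a client for the requested API type.
pub fn new_client(api_type: LlmApiType, address: Option<String>) -> OpenAiClient {
    match api_type {
        // LiteLLM's proxy speaks the OpenAI wire protocol, so reuse the
        // OpenAI client and point it at the proxy's address.
        LlmApiType::LiteLlm => OpenAiClient::with_base_url(
            address.unwrap_or_else(|| "http://localhost:4000".to_string()),
        ),
        // Plain OpenAI keeps the official endpoint; the other variants have
        // their own clients in the real codebase and are elided here.
        _ => OpenAiClient::with_base_url("https://api.openai.com/v1".to_string()),
    }
}
```

By default the LiteLLM proxy listens on port 4000, so `http://localhost:4000` is a reasonable fallback when no address is configured.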
❤️ Contributors, please refer to the 📙Contributing Guide.
Unless the PR can be sent immediately (e.g. it's just a few lines of code), we recommend leaving a comment on the issue, such as "I'm working on it" or "Can I work on this issue?", to avoid duplicated work. Our Discord server is always open and friendly.