[FEATURE] Support LiteLLM (Proxy) as LLM #627

@badmonster0

Description

LiteLLM provides a hub for connecting to many different LLM models through a single proxy server.
We want to support talking to it as a separate API type, in addition to Ollama, OpenAI, Gemini, and Anthropic.

cocoindex/src/llm/mod.rs

Lines 10 to 16 in 800ceb2

#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Gemini,
    Anthropic,
}
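
A minimal sketch of the change, assuming the new variant is named LiteLlm (the exact name is an implementer's choice):

use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum LlmApiType {
    Ollama,
    OpenAi,
    Gemini,
    Anthropic,
    // Hypothetical new variant for the LiteLLM proxy.
    LiteLlm,
}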

As an example, see the similar PR that added Anthropic support: #395

Note: the LiteLLM proxy exposes an OpenAI-compatible API, so most of the existing OpenAI implementation should be reusable.
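
Concretely, the new variant could route to the existing OpenAI client pointed at the proxy's base URL. Below is a minimal sketch under stated assumptions: the trait and client names (LlmClient, OpenAiClient::with_base_url) are placeholders rather than the actual cocoindex types, and the default address assumes LiteLLM's commonly used local port 4000.

// Placeholder types for illustration only; the real cocoindex traits and
// client structs will differ.
trait LlmClient {}

struct OpenAiClient {
    base_url: String,
}

impl OpenAiClient {
    fn with_base_url(base_url: String) -> Self {
        Self { base_url }
    }
}

impl LlmClient for OpenAiClient {}

enum LlmApiType {
    OpenAi,
    LiteLlm,
    // ... other variants (Ollama, Gemini, Anthropic) elided in this sketch.
}

// Because the LiteLLM proxy speaks the OpenAI API, the OpenAI client can be
// reused and simply pointed at the proxy's address
// (the LiteLLM proxy commonly listens on http://localhost:4000).
fn client_for(api_type: LlmApiType, address: Option<String>) -> Box<dyn LlmClient> {
    match api_type {
        LlmApiType::LiteLlm => Box::new(OpenAiClient::with_base_url(
            address.unwrap_or_else(|| "http://localhost:4000".to_string()),
        )),
        LlmApiType::OpenAi => Box::new(OpenAiClient::with_base_url(
            "https://api.openai.com/v1".to_string(),
        )),
    }
}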


❤️ Contributors, please refer to the 📙Contributing Guide.
Unless the PR can be sent immediately (e.g. it is just a few lines of code), we recommend leaving a comment on the issue, such as "I'm working on it" or "Can I work on this issue?", to avoid duplicated work. Our Discord server is always open and friendly.
