Allowing LocalLLMs #134
Yeah, it can be good.
This should do it:

```bash
git clone https://github.com/kortix-ai/suna.git
```

Ollama speaks an OpenAI-compatible API on port 11434, so you only need to set two variables in backend/.env (and leave OPENAI_API_KEY as any non-empty string).
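The comment doesn't spell out which two variables. A minimal sketch, assuming Suna routes model calls through LiteLLM (whose `ollama/` model prefix and `OLLAMA_API_BASE` variable are standard LiteLLM conventions); the model id and exact variable names are assumptions, not taken from the thread:

```env
# backend/.env — illustrative values only
MODEL_TO_USE=ollama/llama3                         # ollama/ prefix selects LiteLLM's Ollama provider; model id is an example
OLLAMA_API_BASE=http://host.docker.internal:11434  # where the Ollama server is reachable from the backend container
OPENAI_API_KEY=placeholder                         # any non-empty string, per the note above
```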
Define the stack under services: in suna-stack.yml, then run everything:

```bash
docker compose -f suna-stack.yml up -d
```
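Only the services: line of the compose file survived above. As a sketch of what the Ollama part of suna-stack.yml might look like (the image tag, port mapping, and volume name are assumptions; the Suna services would sit alongside it):

```yaml
# suna-stack.yml — illustrative Ollama service only
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"        # expose the OpenAI-compatible API mentioned above
    volumes:
      - ollama:/root/.ollama # persist downloaded model weights across restarts

volumes:
  ollama:
```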
I tried with another hosted API, like the one from SiliconFlow, and I get this error:

```
services.llm.LLMRetryError: Failed to make API call after 3 attempts. Last error: litellm.AuthenticationError: AnthropicException - b'{"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}'
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
2025-04-28 04:04:49,665 - WARNING - Error on attempt 1/3: litellm.AuthenticationError: AnthropicException - b'{"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}'
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
2025-04-28 04:04:55,927 - WARNING - Error on attempt 2/3: litellm.AuthenticationError: AnthropicException - b'{"type":"error","error":{"type":"authentication_error","message":"invalid x-api-key"}}'
```
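A likely cause, though the thread doesn't confirm it: the configured model name still resolves to an Anthropic model, so LiteLLM sends the SiliconFlow key as Anthropic's x-api-key header, which Anthropic rejects. SiliconFlow serves an OpenAI-compatible endpoint, so one hypothetical fix in backend/.env would be to force LiteLLM's OpenAI-compatible provider explicitly (the model id and variable names here are assumptions):

```env
# backend/.env — hypothetical SiliconFlow routing via LiteLLM's openai/ provider prefix
MODEL_TO_USE=openai/deepseek-ai/DeepSeek-V3    # example model id; openai/ prefix avoids the Anthropic route
OPENAI_API_BASE=https://api.siliconflow.cn/v1  # SiliconFlow's OpenAI-compatible base URL
OPENAI_API_KEY=sk-...                          # your SiliconFlow API key (placeholder shown)
```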
I'd love the ability to use local LLMs instead of having to use OpenAI or Anthropic. Are there any plans to add support for locally hosted LLMs?