(local) Ollama as AI backend #13
Comments
Updated readme to include instructions for local Ollama:

```yaml
openrouter:
  api_key: api-key
  model: gemma3:1b
  base_url: http://localhost:11434/v1
```

But it's not tested with it - might need some prompt tuning.
Hi, I just tried with my local installation, but I get 404 errors. I am not sure if the Ollama API is the same as the OpenAI API.
Same issue. I can confirm that the endpoint is actually working well:
It worked after removing the trailing slash from the URL in the config file (golang/go#69063). However, when using Ollama's OpenAI-compatible API there is no way to set the prompt input limit, which is fixed at 2048 tokens, and that's way too little. The best way would probably be to use the Ollama Go package (https://pkg.go.dev/github.com/ollama/ollama/api).
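To illustrate what that would buy: a minimal sketch using the github.com/ollama/ollama/api package, where the native Options map can raise num_ctx past the 2048-token default. This is an assumption about how an integration could look, not tmuxai's actual code; the model name and context size are placeholders.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/ollama/ollama/api"
)

func main() {
	// Reads OLLAMA_HOST, falling back to http://127.0.0.1:11434.
	client, err := api.ClientFromEnvironment()
	if err != nil {
		log.Fatal(err)
	}

	req := &api.ChatRequest{
		Model: "gemma3:1b", // placeholder model
		Messages: []api.Message{
			{Role: "user", Content: "Hello"},
		},
		// The native API exposes runtime options that the
		// OpenAI-compatible endpoint does not, including num_ctx.
		Options: map[string]any{
			"num_ctx": 8192, // raise the 2048-token default (assumed value)
		},
	}

	// Responses stream; the callback runs once per chunk.
	err = client.Chat(context.Background(), req, func(resp api.ChatResponse) error {
		fmt.Print(resp.Message.Content)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```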
Nice find! |
I may be mistaken, but isn't it because tmuxai uses GET?
It does not use GET. But for some reason Go's http library transforms the request to GET when there is a double slash in the URL. So make sure the URL in your config ends without a slash.
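The likely mechanism (my reading of golang/go#69063, not something verified in this thread): a double slash makes the server respond with a 301 redirect to the normalized path, and Go's net/http client switches POST to GET when following a 301. A self-contained sketch reproducing that behavior against a local test server:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"strings"
)

func main() {
	// Test server: 301-redirect any path containing "//" to the
	// cleaned path, the way some servers normalize URLs.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if strings.Contains(r.URL.Path, "//") {
			http.Redirect(w, r, strings.ReplaceAll(r.URL.Path, "//", "/"), http.StatusMovedPermanently)
			return
		}
		fmt.Println("server saw method:", r.Method) // prints GET, not POST
	}))
	defer srv.Close()

	// A trailing slash in base_url plus "/v1/..." yields a double slash.
	resp, err := http.Post(srv.URL+"//v1/chat/completions", "application/json", strings.NewReader(`{}`))
	if err != nil {
		panic(err)
	}
	resp.Body.Close()
}
```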
Double slash issue should be fixed; just released a new version.
There is also this issue with Gemma; can you try other models?
Do you use
I don't get it! Ollama has no API key. And even if I try some random API keys it doesn't work. Do you have some tip? ollama:
I mean that you need to change your
Thank you, I tested so many things that I stopped using my brain. Now it works :)
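For anyone landing here later, piecing the thread together (the `openrouter:` section name from the readme snippet above, a throwaway key since Ollama appears to ignore it, and a base_url without a trailing slash), the working config presumably looks something like this sketch:

```yaml
openrouter:
  # Ollama doesn't check API keys, so any placeholder value should do
  api_key: dummy
  model: gemma3:1b
  base_url: http://localhost:11434/v1  # no trailing slash
```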
**Is your feature request related to a problem? Please describe.**

**Describe the solution you'd like**
An option to connect to a local Ollama instead of using OpenRouter.

**Describe alternatives you've considered**
-

**Additional context**
https://github.com/ollama/ollama/blob/main/docs/api.md
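The link above documents Ollama's native API; the config in this thread instead targets Ollama's OpenAI-compatible layer. As a hedged sketch, the request that config boils down to is roughly the following (the model name is a placeholder):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// OpenAI-compatible chat endpoint; note the base URL carries no
	// trailing slash, so the joined path has a single slash.
	url := "http://localhost:11434/v1/chat/completions"
	body := []byte(`{
		"model": "gemma3:1b",
		"messages": [{"role": "user", "content": "Hello"}]
	}`)

	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
```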