Describe the issue
Hello,
Before asking, I believe I have read all the threads about this Nextcloud application, here and elsewhere.
I use the latest version of Nextcloud AIO on a QNAP TS-464 NAS.
For the AI integration, I use the OpenAI connector application with a paid Mistral AI account.
Because my current installation of the Context Chat Backend does not seem to work ("Failed request (500): Embedding Request Error: Error: the embedding server is not responding"), I wonder what this embedding server is. I have seen the related configuration in config.yaml, but do I need an external application (server) to use it? Is it an Ollama or LocalAI instance, or a server internal to the application itself?
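To make the question concrete, this is the kind of quick check I could run against the embedding server if I knew where it listens; the URL and payload below are only guesses on my part (a minimal sketch, not taken from the real config.yaml), since the actual host/port is exactly what I am unsure about:

```python
# Sketch: probe an embedding HTTP endpoint to see whether it responds at all.
# The URL and request body are hypothetical placeholders, not values from the
# Context Chat Backend configuration.
import json
import urllib.error
import urllib.request

EMBEDDING_URL = "http://localhost:8080/v1/embeddings"  # hypothetical endpoint

payload = json.dumps({"input": "hello world"}).encode("utf-8")
request = urllib.request.Request(
    EMBEDDING_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print("Embedding server responded with status:", response.status)
        print(response.read()[:200])
except urllib.error.URLError as exc:
    print("Embedding server is not responding:", exc)
```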
Thanks in advance for your answers.
Setup Details (please complete the following information):
- Nextcloud Version: 31.0.2
- AppAPI Version: 5.0.2
- Context Chat PHP Version: php8.3
- Context Chat Backend Version: 4.2.0
- Nextcloud deployment method: Docker AIO
- Context Chat Backend deployment method: one-click