
Endpoint disabled for this model by API configuration 500 #575

@mike-niemand

Description


LocalAI version:
1.18.0

Environment, CPU architecture, OS, and Version:
Linux 78b4ecbb1b9f 5.10.16.3-microsoft-standard-WSL2 #1 SMP Fri Apr 2 22:23:49 UTC 2021 x86_64 GNU/Linux

Describe the bug

I am using the langchain-chroma example.

I have LocalAI running via Docker Compose (the Fiber server is up and serving), and I am trying to build the vector store for it. The script creates the /db directory and sends the embeddings to the LocalAI instance, which I can see receives them. LocalAI then returns a 500: 'endpoint disabled for this model by API configuration'. A rough sketch of the step is below.
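
For reference, roughly what the storage step does (a sketch, not the exact example script; the file name and URLs are illustrative):

```python
# Rough sketch of the storage step from the langchain-chroma example
# (illustrative, not the exact script). Point the openai client at LocalAI,
# then build the Chroma store, which triggers the embedding requests in the logs.
import os

from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import Chroma

os.environ["OPENAI_API_BASE"] = "http://127.0.0.1:8080/v1"  # LocalAI instance
os.environ["OPENAI_API_KEY"] = "sk-"                        # dummy key

# Load and split a sample document (file name is illustrative).
documents = TextLoader("state_of_the_union.txt").load()
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(documents)

# OpenAIEmbeddings defaults to text-embedding-ada-002, matching the path in the logs.
embeddings = OpenAIEmbeddings()

# This call sends the embedding requests that LocalAI answers with the 500.
db = Chroma.from_documents(chunks, embeddings, persist_directory="db")
db.persist()
```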

Expected behavior
Expected the storage step to complete successfully.

Logs
DEBUG:openai:message='OpenAI API response' path=http://127.0.0.1:8080/v1/engines/text-embedding-ada-002/embeddings processing_ms=None request_id=None response_code=500
INFO:openai:error_code=500 error_message='endpoint disabled for this model by API configuration' error_param=None error_type= message='OpenAI API error received' stream_error=False
WARNING:langchain.embeddings.openai:Retrying langchain.embeddings.openai.embed_with_retry.._embed_with_retry in 4.0 seconds as it raised APIError: endpoint disabled for this model by API configuration {"error":{"code":500,"message":"endpoint disabled for this model by API configuration","type":""}} 500 {'error': {'code': 500, 'message': 'endpoint disabled for this model by API configuration', 'type': ''}} {'Date': 'Mon, 12 Jun 2023 11:42:50 GMT', 'Content-Type': 'application/json', 'Content-Length': '98'}.
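
For context, the "API configuration" in the error message seems to refer to the model's YAML config on the LocalAI side; my understanding (an assumption on my part, based on the example's models directory) is that the embeddings endpoint has to be enabled there along these lines:

```yaml
# Illustrative model config, not my exact setup; field values are assumptions.
name: text-embedding-ada-002
backend: bert-embeddings
embeddings: true        # presumably required, otherwise the embeddings endpoint stays disabled
parameters:
  model: bert
```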

Labels: bug
