Description
LocalAI version: 2.26
Image: localai/localai:latest-aio-gpu-nvidia-cuda-12
Environment, CPU architecture, OS, and Version:
Linux havenstore 4.4.302+ #72806 SMP Thu Sep 5 13:45:09 CST 2024 x86_64 GNU/Linux synology_broadwellnk_3622xs+
Describe the bug
After setting up Docker Compose for GPU usage, with the GPU-enabled image and the required deploy section in docker-compose.yml (a minimal sketch of my setup follows the warning below), the running LocalAI container is unable to make use of my GPU and prints the following warning:
WARNING:
localai-api-1 | /sys/class/drm does not exist on this system (likely the host system is a
localai-api-1 | virtual machine or container with no graphics). Therefore,
localai-api-1 | GPUInfo.GraphicsCards will be an empty array.
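For reference, this is a minimal sketch of the relevant parts of my docker-compose.yml. The service name and model volume path are illustrative; the deploy block uses the standard Compose device-reservation syntax for NVIDIA GPUs:

```yaml
services:
  api:
    image: localai/localai:latest-aio-gpu-nvidia-cuda-12
    ports:
      - "8080:8080"
    volumes:
      - ./models:/build/models   # illustrative host path for downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```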
To Reproduce
docker-compose up
Using the web UI, open a chat and ask any question.
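The same request can also be reproduced without the web UI, for example with curl against LocalAI's OpenAI-compatible endpoint. Port 8080 is the default, and "gpt-4" is the model alias the AIO images preconfigure; substitute whichever model name the chat UI lists:

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hello, are you running on the GPU?"}]
      }'
```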
Expected behavior
The container should be able to use the GPU, which is visible via the nvidia-smi command.
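For clarity, these are the commands I use to check GPU visibility (the container name is taken from the compose log prefix above):

```bash
# On the Synology host:
nvidia-smi

# From inside the running LocalAI container:
docker exec -it localai-api-1 nvidia-smi
```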
Logs
Additional context
The resource monitor in Synology DSM shows that the GPU is not used at all during chat sessions with LocalAI.
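If it helps with triage, these are additional checks I can run and report back. The warning above appears to come from the ghw system-information library probing /sys/class/drm, which is typically absent inside containers; whether that alone explains the GPU not being used is an assumption on my part:

```bash
# Is the NVIDIA runtime registered with Docker on the Synology host?
docker info | grep -i runtime

# Are the NVIDIA device nodes and the DRM class present inside the container?
docker exec localai-api-1 sh -c 'ls /dev/nvidia* /sys/class/drm' 2>&1

# Are the CUDA driver libraries visible inside the container?
docker exec localai-api-1 sh -c 'ldconfig -p | grep -i libcuda' 2>&1
```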