
Bug: server-cuda Docker image failing as of 2 days ago - "error while loading shared libraries: libgomp.so.1: cannot open shared object file" #7774


Closed
adamreis opened this issue Jun 5, 2024 · 2 comments · Fixed by #7780
Labels
bug-unconfirmed medium severity Used to report medium severity bugs in llama.cpp (e.g. Malfunctioning Features but still useable)

Comments


adamreis commented Jun 5, 2024

What happened?

System: Ubuntu 22.04, CUDA 12.5, Driver 555.42.02, Ryzen 7950X3D, RTX 4090

ghcr.io/ggerganov/llama.cpp:server-cuda--b1-a10cda5 is the last image that works as expected for me. server-cuda--b1-a5735e4 (published 2 days ago) and every later image fail with the following error:

docker compose up llama-cpp
[+] Running 1/0
⠋ Container quill-ops-llama-cpp-1 Recreated 0.0s
Attaching to llama-cpp-1
llama-cpp-1 | /server: error while loading shared libraries: libgomp.so.1: cannot open shared object file: No such file or directory
llama-cpp-1 exited with code 127
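Exit code 127 here comes from the dynamic loader rather than llama.cpp itself: it is the same status a shell reports when a command cannot be found or executed, and `ld.so` uses it when a required shared library is missing. A minimal sketch of that convention (the path below is deliberately nonexistent):

```shell
# Executing a missing (or unloadable) binary yields exit status 127;
# the container's /server hits the same status when libgomp.so.1 is absent.
sh -c '/nonexistent/binary'
echo "exit: $?"   # prints "exit: 127"
```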

Name and Version

ghcr.io/ggerganov/llama.cpp:server-cuda--b1-a5735e4; Ubuntu 22.04, CUDA 12.5, Driver 555.42.02, Ryzen 7950X3D, RTX 4090, libgomp installed on the host, NVIDIA Docker runtime

What operating system are you seeing the problem on?

Linux

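Until a rebuilt image is published, one possible stopgap is to layer the missing OpenMP runtime on top of the affected image. A sketch, assuming the image is Debian/Ubuntu-based with apt available (the tag is just the one from the report):

```dockerfile
# Hypothetical workaround image: install libgomp1, which ships libgomp.so.1.
FROM ghcr.io/ggerganov/llama.cpp:server-cuda--b1-a5735e4
RUN apt-get update \
 && apt-get install -y --no-install-recommends libgomp1 \
 && rm -rf /var/lib/apt/lists/*
```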
adamreis added the bug-unconfirmed and medium severity labels Jun 5, 2024

mblunt commented Jun 5, 2024

I am having the same issue.

slaren linked a pull request Jun 5, 2024 that will close this issue

adamreis commented Jun 6, 2024

Thanks @slaren @ggerganov 🙏🏻
