When running the Sakura-32B-Galgame-Kaggle-llama.cpp.ipynb notebook on Kaggle with T4×2 GPUs, the following exception is thrown immediately after "INFO loading model ..." appears:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/opt/conda/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/kaggle/working/Sakura-13B-Galgame/server.py", line 108, in <module>
    state.init_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/state.py", line 24, in init_model
    sakura_model = SakuraModel(*args, **kwargs)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 144, in __init__
    (tokenizer, model) = load_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 109, in load_model
    from infers.llama import LlamaCpp
  File "/kaggle/working/Sakura-13B-Galgame/infers/llama.py", line 5, in <module>
    from llama_cpp import Llama
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)
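The error suggests the prebuilt `libllama.so` shipped with the installed llama-cpp-python wheel was compiled against a newer glibc (≥ 2.32) than the one on the Kaggle image. A minimal sketch for checking the glibc version of the running system before importing `llama_cpp` (this uses `gnu_get_libc_version`, a standard glibc function; the rebuild command in the comment is an assumption about how llama-cpp-python is typically reinstalled from source, not something confirmed by this issue):

```python
import ctypes

# Ask the C runtime directly which glibc version it is.
libc = ctypes.CDLL("libc.so.6")
libc.gnu_get_libc_version.restype = ctypes.c_char_p
version = libc.gnu_get_libc_version().decode()
print("system glibc:", version)

# The prebuilt libllama.so in the traceback needs GLIBC_2.32.
major, minor = (int(x) for x in version.split(".")[:2])
if (major, minor) < (2, 32):
    # Hypothetical workaround: rebuild llama-cpp-python against the
    # local glibc instead of using the prebuilt wheel, e.g.:
    #   pip install --force-reinstall --no-binary llama-cpp-python llama-cpp-python
    print("glibc is older than 2.32; the prebuilt wheel will not load")
```

If the printed version is below 2.32, reinstalling llama-cpp-python so that it compiles on the Kaggle image itself should avoid the `GLIBC_2.32' not found` failure.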