Error when running Sakura-32B-Galgame-Kaggle-llama.cpp.ipynb #11


Closed
gmgorag opened this issue Aug 10, 2024 · 2 comments

Comments


gmgorag commented Aug 10, 2024

When running the Sakura-32B-Galgame-Kaggle-llama.cpp.ipynb script on Kaggle with a T4×2 instance, the following exception is thrown immediately after "INFO loading model ...":

  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 75, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
  File "/opt/conda/lib/python3.10/ctypes/__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/kaggle/working/Sakura-13B-Galgame/server.py", line 108, in <module>
    state.init_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/state.py", line 24, in init_model
    sakura_model = SakuraModel(*args, **kwargs)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 144, in __init__
    (tokenizer, model) = load_model(cfg)
  File "/kaggle/working/Sakura-13B-Galgame/utils/model.py", line 109, in load_model
    from infers.llama import LlamaCpp
  File "/kaggle/working/Sakura-13B-Galgame/infers/llama.py", line 5, in <module>
    from llama_cpp import Llama
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 88, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/opt/conda/lib/python3.10/site-packages/llama_cpp/llama_cpp.py", line 77, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so': /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /opt/conda/lib/python3.10/site-packages/llama_cpp/lib/libllama.so)
Isotr0py (Owner) commented Aug 10, 2024

This is probably because llama-cpp-python updated its CI build environment to ubuntu-latest:

Kaggle's Ubuntu 20.04 ships a fairly old glibc, which only supports up to GLIBC_2.31.

I'll build a wheel later that works in the Kaggle environment.
(It may take a few days; I'm busy next week and may not have time.)
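For reference, a quick way to confirm which glibc the host actually provides (a minimal sketch, not part of the original thread; the Kaggle version number below is taken from the comment above):

```python
import platform

# Report the C library name and version the Python interpreter is linked
# against. On Kaggle's Ubuntu 20.04 images this would show glibc 2.31,
# which is why a libllama.so built against GLIBC_2.32 fails to dlopen.
libc_name, libc_version = platform.libc_ver()
print(libc_name, libc_version)
```

If the printed version is lower than the `GLIBC_x.xx` named in the `OSError`, the wheel was built on a newer distro than the host, and a wheel built against an older glibc (or built locally) is needed.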

Isotr0py (Owner)

I've updated the index-url for llama-cpp-python; it should run now.

gmgorag closed this as completed Aug 11, 2024