
error loading model: unexpectedly reached end of file #476


Closed
jaymon0703 opened this issue Jul 13, 2023 · 3 comments

@jaymon0703

Any ideas why I may be getting this error when the models directory has the model?

[screenshot of the error output]

@jaymon0703 (Author)

Please see #399 for environment and context

@jaymon0703 closed this as not planned on Jul 13, 2023

Azeirah commented Jul 26, 2023

I don't understand. I ran the commands from the last post in the referenced issue and I'm still getting "error loading model: unexpectedly reached end of file".

My model exists at the specified path. I built with cuBLAS, FORCE_CMAKE=1


jaymon0703 commented Aug 1, 2023

Hi @Azeirah - try using just !CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install --upgrade --force-reinstall llama-cpp-python --no-cache-dir --user and make sure you are using a compatible model. This one works for me: https://huggingface.co/TheBloke/LLaMa-7B-GGML/resolve/main/llama-7b.ggmlv3.q4_0.bin
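
For reference, here is a minimal sketch of loading that GGML model with llama-cpp-python once the cuBLAS build succeeds. The model path and n_gpu_layers value are assumptions; adjust them for your own setup.

```python
from llama_cpp import Llama

# Minimal sketch: load a GGML-format model with llama-cpp-python.
# The path below is an assumption -- point it at wherever you saved
# llama-7b.ggmlv3.q4_0.bin. n_gpu_layers only has an effect if the
# package was built with cuBLAS support.
llm = Llama(
    model_path="./models/llama-7b.ggmlv3.q4_0.bin",
    n_gpu_layers=32,  # layers to offload to the GPU; reduce if you run out of VRAM
)

output = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(output["choices"][0]["text"])
```

If the file still fails with "unexpectedly reached end of file", the download may be truncated or in an incompatible format; re-downloading the model and checking its size against the Hugging Face listing is a reasonable first check.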

antoine-lizee pushed a commit to antoine-lizee/llama-cpp-python that referenced this issue Oct 30, 2023