Great catch, but this is actually a problem with llama.cpp itself, which handles the command line parsing and model file loading. That is, if you go into llama.cpp/ (or clone that repo independently and build it) and run ./llava with nonexistent filenames, the same segfault occurs. I want to refrain from reaching inside of llama.cpp, but this would be a good issue to file with them directly.
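For reference, here is a minimal sketch of the kind of pre-flight guard a wrapper could add before handing paths to llama.cpp's loader, so a missing file produces a readable error instead of a segfault. This is not the actual llama.cpp or llava code; the structure and names are illustrative only.

```cpp
// Hypothetical pre-flight check: verify each model path exists before
// calling into the real loader. Illustrative sketch, not llama.cpp code.
#include <cstdio>
#include <cstdlib>
#include <filesystem>

static void require_file(const char *path) {
    if (!std::filesystem::is_regular_file(path)) {
        std::fprintf(stderr, "error: model file does not exist: %s\n", path);
        std::exit(1);
    }
}

int main(int argc, char **argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s <model.gguf> [mmproj.gguf]\n", argv[0]);
        return 1;
    }
    for (int i = 1; i < argc; i++) {
        require_file(argv[i]);  // fail fast with a clear message
    }
    // ... only now would the paths be passed to the actual model loader ...
    std::printf("all model files found\n");
    return 0;
}
```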
These files do not exist:
models/*

I get this error.

What I kinda expected:
"sorry this file doesn't exist"

Thanks in advance.