Bug: llama-cli templating does buf.resize(-1) if the model's template is not supported, causing crash #8149
Labels
bug-unconfirmed
high severity
What happened?
common.cpp's `llama_chat_apply_template` says:

https://github.com/ggerganov/llama.cpp/blob/ac146628e47451c531a3c7e62e6a973a2bb467a0/common/common.cpp#L2630-L2637

`res` can be -1 (e.g. in the case that the model's Jinja template is not matched by any pattern in `llama_chat_apply_template_internal`). When cast to `size_t` it becomes significantly bigger than `buf.size()`, which leads to `buf.resize(-1)`. This is followed by a crash of `llama-cli` on my machine.

`llama-server` seems to fall back to `chatml` if no pattern matches the model's Jinja template:

https://github.com/ggerganov/llama.cpp/blob/ac146628e47451c531a3c7e62e6a973a2bb467a0/examples/server/server.cpp#L2600-L2605
Perhaps the same needs to be done for `llama-cli`, and perhaps common.cpp's `llama_chat_apply_template` should be more defensive when `llama_chat_apply_template_internal` returns -1, rather than trying to resize `buf` to -1.

Name and Version
What operating system are you seeing the problem on?
Linux
Relevant log output