
Commit 2aba0cb

hazelnutcloud authored and ggerganov committed
server : fix EOS token detection with disabled cache (ggml-org#5938)
1 parent d3c6811 · commit 2aba0cb

File tree

1 file changed (+1, −1)
examples/server/server.cpp

Lines changed: 1 addition & 1 deletion
```diff
@@ -1123,7 +1123,7 @@ struct server_context {
             });
         }

-        if (!slot.cache_tokens.empty() && result.tok == llama_token_eos(model)) {
+        if (result.tok == llama_token_eos(model)) {
            slot.stopped_eos    = true;
            slot.has_next_token = false;

```
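For context, here is a minimal standalone sketch of why the old guard broke EOS detection when the KV cache is disabled: `slot.cache_tokens` stays empty in that mode, so the `!slot.cache_tokens.empty()` condition could never be true and the end-of-sequence token was ignored. The `slot_state` struct, `TOKEN_EOS` constant, and `should_stop_*` helpers below are simplified stand-ins, not the actual server.cpp code.

```cpp
#include <cstdio>
#include <vector>

// Simplified stand-in for the server slot state (hypothetical names).
struct slot_state {
    std::vector<int> cache_tokens;   // stays empty when the cache is disabled
    bool stopped_eos    = false;
    bool has_next_token = true;
};

constexpr int TOKEN_EOS = 2;         // placeholder for llama_token_eos(model)

// Old check: never fires when cache_tokens is empty (cache disabled).
bool should_stop_old(const slot_state & slot, int tok) {
    return !slot.cache_tokens.empty() && tok == TOKEN_EOS;
}

// New check, as in this commit: look only at the sampled token.
bool should_stop_new(const slot_state & /*slot*/, int tok) {
    return tok == TOKEN_EOS;
}

int main() {
    slot_state slot;                 // cache disabled -> cache_tokens empty
    const int sampled = TOKEN_EOS;   // the model just produced EOS

    std::printf("old check stops: %d\n", should_stop_old(slot, sampled)); // 0
    std::printf("new check stops: %d\n", should_stop_new(slot, sampled)); // 1

    if (should_stop_new(slot, sampled)) {
        slot.stopped_eos    = true;
        slot.has_next_token = false;
    }
    return 0;
}
```

Dropping the cache check keeps the stop condition tied to the sampled token itself, which is the only signal that matters for detecting EOS regardless of how the cache is configured.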

0 commit comments