llama-chat : fix multiple system messages for gemma, orion #14246

Merged: 1 commit into ggml-org:master on Jun 18, 2025

Conversation

@ngxson (Collaborator) commented Jun 17, 2025

Fix #14151


Important: this won't work with the jinja template path, because jinja templates only allow 0 or 1 system messages, and only at the beginning of the conversation:

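To illustrate the idea behind the fix, here is a minimal, hypothetical sketch of how multiple system messages can be collapsed into a single leading system message before a template that only accepts one (such as gemma's) is applied. The `chat_msg` struct and `merge_system_messages` function are simplifications for illustration, not the actual code in `llama-chat.cpp`, which operates on `llama_chat_message` entries inside the per-template formatting logic.

```cpp
#include <string>
#include <vector>

// Simplified stand-in for llama.cpp's chat message type (hypothetical).
struct chat_msg {
    std::string role;
    std::string content;
};

// Concatenate every "system" message into one leading system message,
// preserving the relative order of all other turns. Templates that only
// support a single system message can then be applied safely.
std::vector<chat_msg> merge_system_messages(const std::vector<chat_msg> &msgs) {
    std::string sys;
    std::vector<chat_msg> rest;
    for (const auto &m : msgs) {
        if (m.role == "system") {
            // Join multiple system messages with a newline separator
            // (separator choice is an assumption of this sketch).
            if (!sys.empty()) {
                sys += "\n";
            }
            sys += m.content;
        } else {
            rest.push_back(m);
        }
    }
    std::vector<chat_msg> out;
    if (!sys.empty()) {
        out.push_back({"system", sys});
    }
    out.insert(out.end(), rest.begin(), rest.end());
    return out;
}
```

With this normalization, a conversation like `[system, user, system, assistant]` becomes `[system(merged), user, assistant]`, which any single-system-message template can render.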

@ngxson ngxson requested a review from ggerganov June 17, 2025 15:21
@ngxson ngxson merged commit 9540255 into ggml-org:master Jun 18, 2025
47 checks passed
Successfully merging this pull request may close these issues.

Misc. bug: llama-server builds possibly erroneous prompt for gemma 3