
Conversation

@ddh0 (Contributor) commented Sep 15, 2025

In `llama-perplexity`, when using `--kl-divergence`, the KL divergence statistics output mistakenly displays the 99th percentile twice. This change fixes that by displaying the 90th percentile instead, as was presumably the original intent.
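
For context, here is a minimal sketch of the kind of bug being fixed, assuming the statistics are printed by querying a percentile helper over the sorted per-token KL divergence values. The `percentile` helper, the sample data, and the output labels are illustrative assumptions, not the exact code in llama.cpp:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Illustrative percentile helper: linear interpolation over sorted values.
static double percentile(const std::vector<float> & sorted_vals, double p) {
    if (sorted_vals.empty()) {
        return 0.0;
    }
    const double idx  = p * (sorted_vals.size() - 1);
    const size_t lo   = (size_t) idx;
    const size_t hi   = std::min(lo + 1, sorted_vals.size() - 1);
    const double frac = idx - lo;
    return (1.0 - frac) * sorted_vals[lo] + frac * sorted_vals[hi];
}

int main() {
    // Hypothetical per-token KL divergence values, already sorted ascending.
    std::vector<float> kld_values = {0.001f, 0.002f, 0.004f, 0.010f, 0.050f, 0.300f};

    printf("99.0%% KLD: %10.6f\n", percentile(kld_values, 0.99));
    // Before the fix, the next line also queried the 99th percentile,
    // so the same value was printed twice:
    //   printf("90.0%% KLD: %10.6f\n", percentile(kld_values, 0.99));
    // After the fix, it queries the 90th percentile as the label says:
    printf("90.0%% KLD: %10.6f\n", percentile(kld_values, 0.90));
    return 0;
}
```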
@JohannesGaessler merged commit a68f31e into ggml-org:master on Sep 15, 2025
47 of 48 checks passed
@ddh0 deleted the fix-kld-percentile branch on October 8, 2025