
Tokenizer SPM fixes for phi-3 and llama-spm (bugfix) #7425


Merged
jaime-m-p merged 5 commits into ggml-org:master from tokenizer-spm-fixes on May 21, 2024

Conversation

jaime-m-p (Collaborator)

Bug fix for #7375: rtrim is needed after the pre-inserted BOS. More details here.
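
As a rough illustration of the idea, here is a minimal Python sketch (not the actual llama.cpp code; `BOS_ID`, `rtrim_after_special`, and `tokenize_spm` are hypothetical names): when a BOS token is pre-inserted, the whitespace immediately to its right is trimmed before the usual SPM handling of the text.

```python
# Illustrative sketch only -- not the llama.cpp implementation.
BOS_ID = 1  # hypothetical BOS token id

def rtrim_after_special(text: str) -> str:
    """Drop the whitespace sitting to the right of a pre-inserted special token."""
    return text.lstrip(" \t\n\r")

def tokenize_spm(text: str, add_bos: bool) -> list[int]:
    tokens: list[int] = []
    if add_bos:
        tokens.append(BOS_ID)
        # The fix: the fragment following the pre-inserted BOS loses its
        # leading whitespace before the normal SPM processing runs.
        text = rtrim_after_special(text)
    # ... a real tokenizer would now escape spaces to "▁" and run the
    #     SPM merges over `text`; omitted here.
    return tokens
```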

The github-actions bot added the testing (Everything test related) and python (python script changes) labels on May 21, 2024.
github-actions bot (Contributor) commented on May 21, 2024

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 531 iterations 🚀

Expand details for performance related PR only
  • Concurrent users: 8, duration: 10m
  • HTTP request : avg=8827.26ms p(95)=23186.6ms fails=, finish reason: stop=468 truncated=63
  • Prompt processing (pp): avg=104.01tk/s p(95)=435.03tk/s
  • Token generation (tg): avg=32.11tk/s p(95)=46.81tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=tokenizer-spm-fixes commit=0594bfdd5e22e5df6ea44583088ca08609a70e00

prompt_tokens_seconds
[chart: llamacpp:prompt_tokens_seconds, bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 531 iterations]
predicted_tokens_seconds
[chart: llamacpp:predicted_tokens_seconds, bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 531 iterations]

kv_cache_usage_ratio
[chart: llamacpp:kv_cache_usage_ratio, bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 531 iterations]
requests_processing
[chart: llamacpp:requests_processing, bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 531 iterations]

ggerganov (Member)

I reverted the changes that I made to the server tests - these should no longer be necessary

@jaime-m-p jaime-m-p merged commit d7e852c into ggml-org:master May 21, 2024
77 checks passed
Nexesenex pushed a commit to Nexesenex/croco.cpp that referenced this pull request May 21, 2024
* Update brute force test: add_special
* Update brute force test: default values for add_bos_token and add_eos_token
* Enable rtrim when pre-inserting BOS

Co-authored-by: Georgi Gerganov <[email protected]>
* Revert "server : fix test regexes"
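
The "brute force test" referenced in the commit message above compares llama.cpp's tokenizer output against a reference tokenizer over many random inputs, including the add_special path that pre-inserts BOS. A rough Python sketch of that kind of test follows; it assumes a caller-supplied `tokenize_impl` callable standing in for a llama.cpp binding, and the names and details are illustrative rather than the actual test code in the repository.

```python
# Illustrative brute-force tokenizer comparison (sketch, not the real test).
import random
import string

from transformers import AutoTokenizer  # reference tokenizer

def brute_force_compare(model_dir: str, tokenize_impl, iterations: int = 1000) -> None:
    """Compare tokenize_impl(text, add_special) -- e.g. a binding into
    llama.cpp's tokenizer, supplied by the caller -- against the HF reference."""
    ref = AutoTokenizer.from_pretrained(model_dir)
    rng = random.Random(42)
    alphabet = string.printable + "▁áé😀"
    for add_special in (False, True):  # exercises the pre-inserted BOS path
        for _ in range(iterations):
            n = rng.randint(0, 64)
            text = "".join(rng.choice(alphabet) for _ in range(n))
            expected = ref.encode(text, add_special_tokens=add_special)
            actual = tokenize_impl(text, add_special)
            assert actual == expected, f"mismatch for {text!r}: {actual} != {expected}"
```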
mofosyne (Collaborator)

On the main branch this caused a failure under 'CI / macOS-latest-cmake-arm64 (push) Failing after 2m' for this test:

  • 20 - test-backend-ops

ggerganov (Member)

It's not related - some of the CPY tests occasionally produce numerical differences that slightly exceed the epsilon threshold:

20:   CPY(type_src=f32,type_dst=q5_1,ne=[256,4,4,4]): [CPY] NMSE = 0.000000730 > 0.000000100 FAIL

The difference is due to the different rounding modes used by the CPU and GPU: #4698
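
For context, the failing line above is a normalized mean squared error (NMSE) check: the backend's output is compared against the reference and the test fails when the NMSE exceeds a small epsilon (0.000000100 in the log). Below is a minimal Python sketch of such a check; the formula is the common definition of NMSE and an assumption here, not necessarily byte-for-byte what the C++ test computes.

```python
# Sketch of an NMSE pass/fail check (illustrative; the real check is in C++).
import numpy as np

def nmse(reference: np.ndarray, result: np.ndarray) -> float:
    # Normalized mean squared error: sum((a - b)^2) / sum(a^2).
    num = float(np.sum((reference - result) ** 2))
    den = float(np.sum(reference ** 2))
    return num / den if den > 0.0 else num

EPS = 1e-7  # the 0.000000100 threshold quoted in the log above

def check(reference, result) -> bool:
    return nmse(np.asarray(reference, dtype=np.float64),
                np.asarray(result, dtype=np.float64)) <= EPS
```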

teleprint-me pushed a commit to teleprint-me/llama.cpp that referenced this pull request May 23, 2024
* Update brute force test: add_special
* Update brute force test: default values for add_bos_token and add_eos_token
* Enable rtrim when pre-inserting BOS

Co-authored-by: Georgi Gerganov <[email protected]>
* Revert "server : fix test regexes"
Labels: examples, python (python script changes), server, testing (Everything test related)
Projects: None yet
3 participants