
ggml : tag ggml_tensor::backend as deprecated #7290


Merged: 1 commit merged into master from sl/deprecate-tensor-backend on May 15, 2024

Conversation

slaren (Member) commented May 14, 2024

ggml_tensor::backend will be removed in the future. Instead, backends should use the buffer type to determine where a tensor's data is stored. Applications can use ggml_backend_buft_is_host as an optimization to determine whether a tensor's data is accessible from the CPU side.
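For reference, a minimal sketch of the recommended check (not code from this PR), assuming the ggml-backend.h API available at the time of this change and that the tensor's buffer has already been assigned by the allocator; the helper name is hypothetical:

```c
#include "ggml.h"
#include "ggml-backend.h"

// Hypothetical helper: instead of reading the deprecated ggml_tensor::backend
// field, query the type of the buffer that holds the tensor's data.
static bool tensor_data_is_cpu_accessible(const struct ggml_tensor * t) {
    if (t->buffer == NULL) {
        // not yet allocated in a backend buffer; the caller decides how to handle this
        return false;
    }
    // ggml_backend_buft_is_host() returns true when the buffer type stores its
    // data in host (CPU-addressable) memory
    return ggml_backend_buft_is_host(ggml_backend_buffer_get_type(t->buffer));
}
```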

This also removes some dead code from llava.cpp that used this field, though I am not sure what the intent was there.

@slaren slaren force-pushed the sl/deprecate-tensor-backend branch from 9f4ee41 to 1aa55e1 on May 15, 2024 00:01
@mofosyne mofosyne added the refactoring and Review Complexity : Medium (generally requires more time to grok but manageable by beginner to medium expertise level) labels on May 15, 2024
Contributor

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 541 iterations 🚀

Details (for performance-related PRs only):
  • Concurrent users: 8, duration: 10m
  • HTTP request : avg=8649.79ms p(95)=21536.05ms fails=, finish reason: stop=487 truncated=54
  • Prompt processing (pp): avg=103.39tk/s p(95)=508.2tk/s
  • Token generation (tg): avg=31.5tk/s p(95)=45.25tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=sl/deprecate-tensor-backend commit=1aa55e14524e17031561e5a5e3416787603f7b75

prompt_tokens_seconds
[chart: llamacpp:prompt_tokens_seconds, llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 541 iterations]
predicted_tokens_seconds
[chart: llamacpp:predicted_tokens_seconds, llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 541 iterations]

Details

kv_cache_usage_ratio
[chart: llamacpp:kv_cache_usage_ratio, llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 541 iterations]
requests_processing
[chart: llamacpp:requests_processing, llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 541 iterations]

@slaren slaren merged commit 344f912 into master May 15, 2024
68 checks passed
@slaren slaren deleted the sl/deprecate-tensor-backend branch May 15, 2024 13:08
teleprint-me pushed a commit to teleprint-me/llama.cpp that referenced this pull request May 17, 2024
Labels: refactoring, Review Complexity : Medium
3 participants