Closed
Labels
bug-unconfirmed, low severity (used to report low severity bugs in llama.cpp, e.g. cosmetic issues, non-critical UI glitches), stale
Description
Note: This is only one data point of CI failure, but it would be important to keep track of this behavior over the next few commits.
What happened?
Noticed that CI reported a failure in test 20 - test-backend-ops. It would be good to identify the cause of this issue and potential ways to fix it. The failure in test #20 in test-backend-ops looked like the line below, which doesn't explain much to me, but hopefully it makes sense to someone else here:
[CPY] NMSE = 0.000003149 > 0.000000100
It looks interesting, however.
Name and Version
between commit 504f0c3 and 0e8d8bf
What operating system are you seeing the problem on?
Other? (Please let us know in description)
Relevant log output
OK
CPY(type_src=f32,type_dst=q4_1,ne=[256,4,4,4]): ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
[CPY] NMSE = 0.000003149 > 0.000000100 ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
FAIL
CPY(type_src=f32,type_dst=q5_0,ne=[256,4,4,4]): ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
ggml_backend_cuda_graph_compute: disabling CUDA graphs due to GPU architecture
OK