
Misc. bug: Unsupported op "CPY" / SIGABRT on Apple CPU #13112

Closed
@stephen-ebenezar

Description

Name and Version

version: 5171 (2cca6c0)
built with Apple clang version 16.0.0 (clang-1600.0.26.6) for arm64-apple-darwin24.4.0

Operating systems

Mac (M4 Pro / 48 GB)

Which llama.cpp modules do you know to be affected?

llama-cli

Command line

bin/llama-cli --model ./DeepSeek-R1-GGUF/DeepSeek-R1-UD-IQ1_S/DeepSeek-R1-UD-IQ1_S-00001-of-00003.gguf --cache-type-k q4_0 --threads 1 -no-cnv --prio 2 --temp 0.6 --ctx-size 8 --seed 3407 --prompt "<|User|>The first six words of a Tale of Two Cities<|Assistant|>"

Problem description & steps to reproduce

Please find the full log here:
deepseek_llama_cpp_sigabrt.log

Possibly related issues/PRs:
Closed issue #12715
Closed PR #11987

First Bad Commit

Unknown; each run takes a long time on CPU, so I have not bisected.

Relevant log output

Last few lines of uploaded log:

The first six words of a Tale of Two Cities<think>
Okay, so the user is asking about the first six words of "A Tale of
/Users/stephenebenezar/dev/llama.cpp/ggml/src/ggml-blas/ggml-blas.cpp:250: ggml_backend_blas_graph_compute: unsupported op CPY
Process finished with exit code 134 (interrupted by signal 6:SIGABRT)

Metadata

Assignees: no one assigned

Labels: bug (Something isn't working)
