flux crashes with latest ggml #553

Open
@jeffbolznv

Description

I'm using ggml @ c8bd0fe and sd @ dcf91f9 with the Vulkan backend, trying to run a flux model using this command line:

sd --diffusion-model  flux1-dev-Q2_K.gguf --vae ae.safetensors --clip_l clip_l.safetensors --t5xxl t5xxl_fp16.safetensors -p "a lovely cat holding a sign says 'flux.cpp'" --cfg-scale 1.0 --sampling-method euler -v

I get a divide-by-zero crash in ggml_row_size because GGML_TYPE_Q4_0_4_8 is no longer supported after ggml-org/llama.cpp#10446. Is there a way to repack this, or do I need to use a different model or something? I'm generally just trying to run anything using flux to look at the performance.
