Unsurprisingly, the new Mixtral-8x7B, and more specifically Mixtral-8x7b-instruct-v0.1.Q4_K_M.gguf, does not work. As other users have reported, loading it fails with the error `create_tensor: tensor 'blk.0.ffn_gate.weight' not found`. I understand the model just came out and support will take some time; I'm only raising this to put it on the radar, since I haven't seen it discussed here yet. If support gets added in the next update, I'd be happy :D
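For anyone hitting the same message, here is a minimal sketch of what I assume is going on (this is not llama.cpp's actual loader code, and the per-expert tensor names below are illustrative): the Mixtral GGUF appears to store the MoE feed-forward weights under different names than the dense `blk.0.ffn_gate.weight` that an older build looks up, so the strict by-name lookup fails and the load aborts.

```cpp
// Simplified sketch of a strict by-name tensor lookup failing.
// Assumption: the Mixtral GGUF uses per-expert FFN tensor names, so a loader
// without MoE support that asks for "blk.0.ffn_gate.weight" finds nothing.
#include <cstdio>
#include <map>
#include <stdexcept>
#include <string>

struct tensor { size_t n_bytes = 0; };

// Tensors hypothetically present in the Mixtral GGUF (names are illustrative).
static const std::map<std::string, tensor> model_tensors = {
    {"blk.0.ffn_gate.0.weight", {1}},   // expert 0
    {"blk.0.ffn_gate.1.weight", {1}},   // expert 1
    // ... further experts
};

// Strict lookup: any missing name is a hard error, like the reported message.
static const tensor & create_tensor(const std::string & name) {
    auto it = model_tensors.find(name);
    if (it == model_tensors.end()) {
        throw std::runtime_error("create_tensor: tensor '" + name + "' not found");
    }
    return it->second;
}

int main() {
    try {
        create_tensor("blk.0.ffn_gate.weight"); // dense name an older build expects
    } catch (const std::exception & e) {
        std::fprintf(stderr, "%s\n", e.what()); // reproduces the reported error text
        return 1;
    }
    return 0;
}
```

If that assumption is right, the fix is simply a llama.cpp build that knows about the Mixtral/MoE tensor layout, not anything wrong with the quantized file itself.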