Closed
Labels
bug (Something isn't working)
Description
Describe the bug
When training LoRAs, the saved weights sometimes end up with duplicated unet.unet keys, as in this case: https://huggingface.co/sayakpaul/new-lora-check-v15/blob/main/pytorch_lora_weights.safetensors5/tree/main
unet.unet.up_blocks.1.attentions.2.transformer_blocks.1.attn2.processor.to_v_lora.up.weight
While loading this format is supported in diffusers, such keys are probably undesirable; it may be better to deprecate them and ensure all keys carry a single unet prefix.
(Internal Slack thread: https://huggingface.slack.com/archives/C03UQJENJTV/p1701261840645189)
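For illustration, a minimal sketch of the kind of key normalization being suggested; the function name and the remapping loop are hypothetical and not part of the diffusers API, and it assumes the weights are stored in a safetensors file:

```python
# Hypothetical sketch: collapse duplicated "unet.unet." prefixes in a saved
# LoRA state dict so every key carries a single "unet." prefix.
from safetensors.torch import load_file, save_file


def normalize_lora_keys(path_in: str, path_out: str) -> None:
    state_dict = load_file(path_in)
    remapped = {}
    for key, tensor in state_dict.items():
        # e.g. "unet.unet.up_blocks...." -> "unet.up_blocks...."
        while key.startswith("unet.unet."):
            key = key[len("unet."):]
        remapped[key] = tensor
    save_file(remapped, path_out)
```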
Reproduction
Training a LoRA with our example notebook/script yields the unet.unet keys: https://huggingface.co/LinoyTsaban/corgy_dog_LoRA
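To confirm the duplicated prefix in the trained checkpoint, a quick inspection sketch (it assumes the weights file in that repo is named pytorch_lora_weights.safetensors):

```python
# Download the LoRA weights and list keys with a duplicated "unet.unet." prefix.
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

path = hf_hub_download(
    repo_id="LinoyTsaban/corgy_dog_LoRA",
    filename="pytorch_lora_weights.safetensors",
)
state_dict = load_file(path)
duplicated = [k for k in state_dict if k.startswith("unet.unet.")]
print(f"{len(duplicated)} keys with a duplicated prefix")
print(duplicated[:3])
```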
Logs
No response
System Info
diffusers==0.23.1
Who can help?