Do not save unet.unet keys when training LoRAs #5977

@apolinario

Description

Describe the bug

When training LoRAs, the saved checkpoints sometimes end up with unet.unet keys, as in this case: https://huggingface.co/sayakpaul/new-lora-check-v15/blob/main/pytorch_lora_weights.safetensors

unet.unet.up_blocks.1.attentions.2.transformer_blocks.1.attn2.processor.to_v_lora.up.weight

While loading this format is supported in diffusers, it is probably undesirable to have such keys; it may be better to deprecate this format and make sure all keys carry a single unet prefix.
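As a rough illustration of what "a single unet prefix" means in practice, here is a minimal sketch that rewrites a saved LoRA state dict so that any duplicated unet.unet. prefix is collapsed to unet. (the file name comes from the example above; the remapping itself is only illustrative, not the fix diffusers would ship):

```python
# Minimal sketch: collapse a duplicated "unet.unet." prefix into "unet."
# Assumes a local pytorch_lora_weights.safetensors file; output name is hypothetical.
from safetensors.torch import load_file, save_file

state_dict = load_file("pytorch_lora_weights.safetensors")

normalized = {}
for key, tensor in state_dict.items():
    if key.startswith("unet.unet."):
        # Keep a single "unet." prefix for every LoRA key.
        key = "unet." + key[len("unet.unet."):]
    normalized[key] = tensor

save_file(normalized, "pytorch_lora_weights_fixed.safetensors")
```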

(Internal Slack thread: https://huggingface.slack.com/archives/C03UQJENJTV/p1701261840645189)

Reproduction

Training a LoRA with our example notebook/script yields checkpoints with the unet.unet prefix, e.g. https://huggingface.co/LinoyTsaban/corgy_dog_LoRA
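To confirm the duplicated prefix on the published checkpoint, a quick check like the following can be used (a hedged sketch; the repo id comes from the link above, and the weights file name is assumed to be pytorch_lora_weights.safetensors as in the other repo):

```python
# Sketch: count keys with a duplicated "unet.unet." prefix in the uploaded LoRA.
from huggingface_hub import hf_hub_download
from safetensors import safe_open

path = hf_hub_download(
    repo_id="LinoyTsaban/corgy_dog_LoRA",
    filename="pytorch_lora_weights.safetensors",  # assumed file name
)

with safe_open(path, framework="pt") as f:
    duplicated = [k for k in f.keys() if k.startswith("unet.unet.")]

print(f"{len(duplicated)} keys carry a duplicated 'unet.unet.' prefix")
```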

Logs

No response

System Info

diffusers==0.23.1

Who can help?

@sayakpaul
