
backwards compatibility breaking change from sharded checkpoints #8443

@bghira


Describe the bug

Since #7830, SDXL checkpoints are sharded automatically on save, which did not happen before. Checkpoints saved this way cannot be loaded by earlier versions of diffusers.

Reproduction

I think all one has to do is load an SDXL UNet with from_pretrained() and then call save_pretrained(); the output is sharded by default. A minimal sketch is shown below.
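
A minimal sketch of that repro, assuming the stabilityai/stable-diffusion-xl-base-1.0 checkpoint and a local output path:

```python
from diffusers import UNet2DConditionModel

# The SDXL UNet is roughly 10 GB in fp32, which appears to cross the
# default shard threshold introduced in #7830.
unet = UNet2DConditionModel.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", subfolder="unet"
)

# On main this now writes two shards plus an index file instead of a single
# diffusion_pytorch_model.safetensors, as shown in the logs below.
unet.save_pretrained("checkpoint-400/unet")
```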

Logs

```
$ ls /home/user/training/lite-models/checkpoint-400/unet/
config.json  diffusion_pytorch_model-00001-of-00002.safetensors  diffusion_pytorch_model-00002-of-00002.safetensors  diffusion_pytorch_model.safetensors.index.json
```
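
For anyone hitting this before a fix lands, a possible mitigation, assuming save_pretrained still accepts the max_shard_size argument that #7830 introduced, is to raise the threshold so a single file is written:

```python
# Assumed workaround: raise the shard threshold above the UNet's size so
# save_pretrained() emits a single diffusion_pytorch_model.safetensors
# that older diffusers releases can still load.
unet.save_pretrained("checkpoint-400/unet", max_shard_size="20GB")
```

This only papers over the incompatibility for newly saved checkpoints; shards that were already written still need the newer loader.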

System Info

diffusers main git branch

Who can help?

@sayakpaul as discussed
