Describe the bug
shift_factor is missing in the training code:
https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/train_dreambooth_lora_sd3.py#L1617,
but it is used in the inference code:
https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion_3/pipeline_stable_diffusion_3.py#L893
Is it reasonable that when training SD3 we do not need to normalize the latents using vae.config.shift_factor and vae.config.scaling_factor?
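For context, a minimal sketch of the shift/scale round trip I would expect, assuming the usual encode/decode convention (the checkpoint name and dummy batch below are placeholders for illustration, not taken from the training script):

```python
import torch
from diffusers import AutoencoderKL

# Placeholder VAE load; the training script loads it from the base SD3 checkpoint.
vae = AutoencoderKL.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers", subfolder="vae"
)

# Dummy batch standing in for preprocessed training images.
pixel_values = torch.randn(1, 3, 512, 512)

# Training side: encode, then shift and scale the latents before adding noise.
latents = vae.encode(pixel_values).latent_dist.sample()
latents = (latents - vae.config.shift_factor) * vae.config.scaling_factor

# Inference side (what pipeline_stable_diffusion_3.py does before decoding):
# undo the scaling and add the shift back.
latents = (latents / vae.config.scaling_factor) + vae.config.shift_factor
images = vae.decode(latents).sample
```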
Thanks!
Reproduction
None
Logs
No response
System Info
None
Who can help?
@sayakpaul