Commit 4e620f3
[Training] Add datasets version of LCM LoRA SDXL (huggingface#5778)
* add: script to train lcm lora for sdxl with 🤗 datasets
* suit up the args.
* remove comments.
* fix num_update_steps
* fix batch unmarshalling
* fix num_update_steps_per_epoch
* fix: dataloading.
* fix microconditions.
* unconditional predictions debug
* fix batch size.
* no need to use use_auth_token
* Apply suggestions from code review
Co-authored-by: Suraj Patil <[email protected]>
* make vae encoding batch size an arg
* final serialization in kohya
* style
* state dict rejigging
* feat: no separate teacher unet.
* debug
* fix state dict serialization
* debug
* debug
* debug
* remove prints.
* remove kohya utility and make style
* fix serialization
* fix
* add test
* add peft dependency.
* add: peft
* remove peft
* autocast device determination from accelerator
* autocast
* reduce lora rank.
* remove unneeded space
* Apply suggestions from code review
Co-authored-by: Suraj Patil <[email protected]>
* style
* remove prompt dropout.
* also save in native diffusers ckpt format.
* debug
* debug
* debug
* better formation of the null embeddings.
* remove space.
* autocast fixes.
* autocast fix.
* hacky
* remove lora_sayak
* Apply suggestions from code review
Co-authored-by: Younes Belkada <[email protected]>
* style
* make log validation leaner.
* move back enabled in.
* fix: log_validation call.
* add: checkpointing tests
* taking my chances to see if disabling autocasting has any effect?
* start debugging
* name
* name
* name
* more debug
* more debug
* index
* remove index.
* print length
* print length
* print length
* move unet.train() after add_adapter()
* disable some prints.
* enable_adapters() manually.
* remove prints.
* some changes.
* fix params_to_optimize
* more fixes
* debug
* debug
* remove print
* disable grad for certain contexts.
* Add support for IPAdapterFull (huggingface#5911)
* Add support for IPAdapterFull
Co-authored-by: Patrick von Platen <[email protected]>
---------
Co-authored-by: YiYi Xu <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
* Fix a bug in `add_noise` function (huggingface#6085)
* fix
* copies
---------
Co-authored-by: yiyixuxu <[email protected]>
* [Advanced Diffusion Script] Add Widget default text (huggingface#6100)
add widget
* [Advanced Training Script] Fix pipe example (huggingface#6106)
* IP-Adapter for StableDiffusionControlNetImg2ImgPipeline (huggingface#5901)
* adapter for StableDiffusionControlNetImg2ImgPipeline
* fix-copies
* fix-copies
---------
Co-authored-by: Sayak Paul <[email protected]>
* IP adapter support for most pipelines (huggingface#5900)
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py
* update tests
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py
* support ip-adapter in src/diffusers/pipelines/latent_consistency_models/pipeline_latent_consistency_text2img.py
* support ip-adapter in src/diffusers/pipelines/latent_consistency_models/pipeline_latent_consistency_img2img.py
* support ip-adapter in src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py
* revert changes to sd_attend_and_excite and sd_upscale
* make style
* fix broken tests
* update ip-adapter implementation to latest
* apply suggestions from review
---------
Co-authored-by: YiYi Xu <[email protected]>
Co-authored-by: Sayak Paul <[email protected]>
* fix: lora_alpha
* make vae casting conditional.
* param upcasting
* propagate comments from huggingface#6145
Co-authored-by: dg845 <[email protected]>
* [Peft] fix saving / loading when unet is not "unet" (huggingface#6046)
* [Peft] fix saving / loading when unet is not "unet"
* Update src/diffusers/loaders/lora.py
Co-authored-by: Sayak Paul <[email protected]>
* undo stablediffusion-xl changes
* use unet_name to get unet for lora helpers
* use unet_name
---------
Co-authored-by: Sayak Paul <[email protected]>
* [Wuerstchen] fix fp16 training and correct lora args (huggingface#6245)
fix fp16 training
Co-authored-by: Sayak Paul <[email protected]>
* [docs] fix: animatediff docs (huggingface#6339)
fix: animatediff docs
* add: note about the new script in readme_sdxl.
* Revert "[Peft] fix saving / loading when unet is not "unet" (huggingface#6046)"
This reverts commit 4c7e983.
* Revert "[Wuerstchen] fix fp16 training and correct lora args (huggingface#6245)"
This reverts commit 0bb9cf0.
* Revert "[docs] fix: animatediff docs (huggingface#6339)"
This reverts commit 11659a6.
* remove tokenize_prompt().
* assistive comments around enable_adapters() and disable_adapters().
---------
Co-authored-by: Suraj Patil <[email protected]>
Co-authored-by: Younes Belkada <[email protected]>
Co-authored-by: Fabio Rigano <[email protected]>
Co-authored-by: YiYi Xu <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
Co-authored-by: yiyixuxu <[email protected]>
Co-authored-by: apolinário <[email protected]>
Co-authored-by: Charchit Sharma <[email protected]>
Co-authored-by: Aryan V S <[email protected]>
Co-authored-by: dg845 <[email protected]>
Co-authored-by: Kashif Rasul <[email protected]>

[Training] Add datasets version of LCM LoRA SDXL (huggingface#5778)
1 parent 8348712 · commit 4e620f3
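Much of the log above revolves around the LoRA adapter attached to the SDXL UNet ("reduce lora rank.", "fix: lora_alpha"). As a rough illustration of what the rank and alpha knobs control, here is a minimal plain-Python sketch of a LoRA-style low-rank update; the shapes and values are toy sizes chosen for the demo, not the script's real tensors:

```python
# Illustrative LoRA sketch: instead of training a full d_out x d_in weight
# update, the adapter trains two small matrices B (d_out x r) and A (r x d_in)
# and adds (alpha / r) * B @ A to the frozen base weight W.
# All sizes below are toy values, not the SDXL UNet's real dimensions.

d_in, d_out, rank, alpha = 8, 8, 2, 4.0

def matmul(X, Y):
    """Plain-Python matrix product over lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

W = [[1.0 if i == j else 0.0 for j in range(d_in)] for i in range(d_out)]  # frozen base
A = [[0.5] * d_in for _ in range(rank)]   # trainable down-projection
B = [[0.0] * rank for _ in range(d_out)]  # trainable up-projection, zero-initialised

def adapted(W, B, A):
    """Effective weight after adaptation: W + (alpha / rank) * B @ A."""
    delta = matmul(B, A)
    return [[w + (alpha / rank) * d for w, d in zip(wr, dr)]
            for wr, dr in zip(W, delta)]

# zero-initialised B means the adapter starts as an exact no-op
assert adapted(W, B, A) == W

# parameter savings: rank * (d_in + d_out) trainable vs d_in * d_out for a full update
print(rank * (d_in + d_out), "trainable vs", d_in * d_out)  # 32 trainable vs 64
```

Lowering `rank` (as in the "reduce lora rank." commit) shrinks the trainable parameter count linearly, at the cost of a lower-capacity update.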
File tree
- examples
  - advanced_diffusion_training
  - consistency_distillation

4 files changed: +1507 −1 lines changed
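Several commits in the log deal with re-keying the LoRA state dict between serialization formats ("state dict rejigging", "final serialization in kohya", "remove kohya utility and make style"). The general pattern behind such conversions is a prefix rewrite over the dict keys; the sketch below uses invented key names for illustration and does not reproduce the real diffusers or kohya naming conventions:

```python
# Hypothetical sketch of LoRA state-dict re-keying between two naming schemes.
# The concrete prefixes and key names below are invented for illustration;
# the actual diffusers/kohya conventions differ in detail.

def rekey(state_dict, old_prefix, new_prefix):
    """Return a copy of state_dict with old_prefix swapped for new_prefix."""
    out = {}
    for key, value in state_dict.items():
        if key.startswith(old_prefix):
            key = new_prefix + key[len(old_prefix):]
        out[key] = value
    return out

sd = {"unet.down.0.lora_A.weight": 1, "unet.down.0.lora_B.weight": 2}
print(rekey(sd, "unet.", "lora_unet_"))
# {'lora_unet_down.0.lora_A.weight': 1, 'lora_unet_down.0.lora_B.weight': 2}
```

Real converters also have to handle per-key details (dots vs underscores, alpha tensors), which is why the serialization commits above went through several debug rounds.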
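Several commits in the message above concern the unconditional path of classifier-free guidance ("unconditional predictions debug", "better formation of the null embeddings."): the script runs a forward pass with a null (empty-prompt) embedding alongside the conditional one and extrapolates between them. The combination itself is a simple weighted formula; the values below are made-up stand-ins for noise-prediction tensors:

```python
# Classifier-free guidance sketch: combine a prediction made with the null
# (empty-prompt) embedding and one made with the text embedding.
# The numbers are illustrative placeholders, not real model outputs.

def cfg(uncond, cond, w):
    """Extrapolate from the unconditional toward the conditional prediction."""
    return [u + w * (c - u) for u, c in zip(uncond, cond)]

noise_pred_uncond = [0.25, -0.5, 1.0]  # forward pass with null embeddings
noise_pred_cond = [0.5, 0.0, -0.25]    # forward pass with prompt embeddings

guided = cfg(noise_pred_uncond, noise_pred_cond, w=7.5)
print(guided)  # [2.125, 3.25, -8.375]

# w = 1 recovers the conditional prediction; w = 0 the unconditional one
assert cfg(noise_pred_uncond, noise_pred_cond, 1.0) == noise_pred_cond
assert cfg(noise_pred_uncond, noise_pred_cond, 0.0) == noise_pred_uncond
```

Forming the null embedding once and reusing it across the batch (rather than re-encoding an empty prompt every step) is the kind of cleanup the "better formation of the null embeddings." commit suggests.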