[Community Pipelines] Accelerate inference of Stable Diffusion XL (SDXL) by IPEX on CPU (#6683)
* add stable_diffusion_xl_ipex community pipeline
* make style for code quality check
* update docs as suggested
---------
Co-authored-by: Patrick von Platen <[email protected]>
examples/community/README.md (106 additions, 0 deletions)
@@ -63,6 +63,7 @@ If a community doesn't work as expected, please open an issue and ping the author
 | IP Adapter FaceID Stable Diffusion | Stable Diffusion Pipeline that supports IP Adapter Face ID |[IP Adapter Face ID](#ip-adapter-face-id)| - |[Fabio Rigano](https://github.com/fabiorigano)|
 | InstantID Pipeline | Stable Diffusion XL Pipeline that supports InstantID |[InstantID Pipeline](#instantid-pipeline)|[](https://huggingface.co/spaces/InstantX/InstantID)|[Haofan Wang](https://github.com/haofanwang)|
 | UFOGen Scheduler | Scheduler for UFOGen Model (compatible with Stable Diffusion pipelines) |[UFOGen Scheduler](#ufogen-scheduler)| - |[dg845](https://github.com/dg845)|
+| Stable Diffusion XL IPEX Pipeline | Accelerate Stable Diffusion XL inference pipeline with BF16/FP32 precision on Intel Xeon CPUs with [IPEX](https://github.com/intel/intel-extension-for-pytorch) |[Stable Diffusion XL on IPEX](#stable-diffusion-xl-on-ipex)| - |[Dan Li](https://github.com/ustcuna/)|

 To load a custom pipeline you just need to pass the `custom_pipeline` argument to `DiffusionPipeline`, as one of the files in `diffusers/examples/community`. Feel free to send a PR with your own pipelines, we will merge them quickly.
@@ -1707,6 +1708,111 @@ print("Latency of StableDiffusionPipeline--fp32",latency)
```
### Stable Diffusion XL on IPEX
This diffusion pipeline aims to accelerate the inference of Stable Diffusion XL on Intel Xeon CPUs with BF16/FP32 precision using [IPEX](https://github.com/intel/intel-extension-for-pytorch).

**Note:** Each PyTorch release has a corresponding release of IPEX, and the two must be installed as a matching pair. It is recommended to install PyTorch/IPEX 2.0 to get the best performance.
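The pairing rule above can be sketched as a small check; the helper name is hypothetical, and the rule it encodes is simply that the major.minor components of the two version strings must agree:

```python
def ipex_matches_torch(torch_version: str, ipex_version: str) -> bool:
    """Hypothetical helper: IPEX releases pair with PyTorch by major.minor,
    e.g. torch 2.0.x goes with intel_extension_for_pytorch 2.0.x."""
    def major_minor(version: str) -> list:
        # Drop local build suffixes such as "+cpu", keep the first two parts.
        return version.split("+")[0].split(".")[:2]
    return major_minor(torch_version) == major_minor(ipex_version)

print(ipex_matches_torch("2.0.1", "2.0.100+cpu"))       # True: both are 2.0
print(ipex_matches_torch("1.13.1+cpu", "2.0.100+cpu"))  # False: 1.13 vs 2.0
```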
2. After pipeline initialization, `prepare_for_ipex()` should be called to enable IPEX acceleration. The supported inference datatypes are Float32 and BFloat16.
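As a minimal sketch of that flow, assuming the SDXL base checkpoint and mirroring the argument style of the existing Stable Diffusion IPEX community pipeline (the exact `prepare_for_ipex()` signature should be verified against the pipeline source):

```python
import torch
from diffusers import DiffusionPipeline

prompt = "a photo of an astronaut riding a horse on mars"

# Load the community pipeline; the base checkpoint here is an assumption.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    custom_pipeline="stable_diffusion_xl_ipex",
    torch_dtype=torch.bfloat16,
)

# Trace and optimize the model with IPEX for BFloat16 inference at a fixed
# size. The argument order mirrors the SD IPEX pipeline and may differ here.
pipe.prepare_for_ipex(prompt, dtype=torch.bfloat16, height=1024, width=1024)

# height/width must match the values used in prepare_for_ipex().
image = pipe(prompt, num_inference_steps=20, height=1024, width=1024).images[0]
```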
**Note:** The `height` and `width` passed to `prepare_for_ipex()` must match the `height` and `width` used when running inference with the prepared pipeline.
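The constraint in the note above can be sketched as a small guard; the class and method names below are illustrative stand-ins, not part of the pipeline API:

```python
class PreparedPipeline:
    """Illustrative stand-in: records the (height, width) used at prepare
    time and rejects inference calls with a different shape, mirroring the
    prepare_for_ipex() contract (IPEX optimizes for a fixed input shape)."""

    def __init__(self):
        self._prepared_shape = None

    def prepare(self, height: int, width: int) -> None:
        self._prepared_shape = (height, width)

    def __call__(self, height: int, width: int) -> str:
        if self._prepared_shape != (height, width):
            raise ValueError(
                f"pipeline prepared for {self._prepared_shape}, "
                f"got {(height, width)}"
            )
        return "image"

pipe = PreparedPipeline()
pipe.prepare(height=1024, width=1024)
print(pipe(height=1024, width=1024))  # "image": shapes match
```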