
Conversation

@lawrence-cj
Contributor

What does this PR do?

Adapts the PixArtAlphaPipeline to the PixArt-LCM model. Cc: @sayakpaul
Adds a retrieve_timesteps() helper for LCMScheduler usage.
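The retrieve_timesteps() helper mentioned above can be sketched as follows. This is a simplified, self-contained version for illustration (the ToyScheduler stand-in and its spacing rule are hypothetical, not the real LCMScheduler): the helper calls the scheduler's set_timesteps() and, when the caller supplies custom timesteps, first checks via introspection whether the scheduler accepts them.

```python
import inspect

def retrieve_timesteps(scheduler, num_inference_steps=None, timesteps=None, **kwargs):
    """Set the scheduler's timesteps and return (timesteps, num_inference_steps).

    Custom `timesteps` are only allowed when the scheduler's
    `set_timesteps` signature accepts a `timesteps` argument.
    """
    if timesteps is not None:
        accepts = "timesteps" in inspect.signature(scheduler.set_timesteps).parameters
        if not accepts:
            raise ValueError(
                f"{scheduler.__class__.__name__}.set_timesteps does not accept custom timesteps"
            )
        scheduler.set_timesteps(timesteps=timesteps, **kwargs)
    else:
        scheduler.set_timesteps(num_inference_steps, **kwargs)
    timesteps = scheduler.timesteps
    return timesteps, len(timesteps)

# Toy stand-in for a scheduler (hypothetical, for illustration only).
class ToyScheduler:
    def set_timesteps(self, num_inference_steps=None, timesteps=None):
        if timesteps is not None:
            self.timesteps = list(timesteps)
        else:
            # Evenly spaced steps over a 1000-step training schedule.
            step = 1000 // num_inference_steps
            self.timesteps = [999 - i * step for i in range(num_inference_steps)]

ts, n = retrieve_timesteps(ToyScheduler(), num_inference_steps=4)
# n == 4, ts == [999, 749, 499, 249]
```

The signature check is what lets the same pipeline code serve both schedulers that support handpicked timesteps (like LCMScheduler's few-step schedules) and those that only take a step count.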

Member

@sayakpaul sayakpaul left a comment


Superb! Let's add a test too? @patrickvonplaten WDYT?

Would be very interesting to see some results with timing info.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@sayakpaul
Member

@lawrence-cj let's maybe add some results in here?

Also, it would be great to mention this in the PixArt docs and add a code snippet.

@lawrence-cj
Contributor Author

@sayakpaul No problem, I will get it done ASAP.

@patrickvonplaten
Contributor

@sayakpaul feel free to merge once you're happy with it

@patrickvonplaten
Contributor

Great job @lawrence-cj !

@sayakpaul
Member

So, we are waiting for the docs here.

@lawrence-cj
Contributor Author

The model card is updated: https://huggingface.co/PixArt-alpha/PixArt-LCM-XL-2-1024-MS
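The snippet requested earlier in the thread would look roughly like the following. This is a sketch, not a copy of the model card: it assumes the diffusers API at the time of this PR, a CUDA device, and enough memory for the fp16 checkpoint; the prompt and output filename are illustrative. It is not runnable without downloading the model, so treat it as documentation only.

```python
import torch
from diffusers import PixArtAlphaPipeline

# Load the LCM-distilled PixArt checkpoint referenced above.
pipe = PixArtAlphaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-LCM-XL-2-1024-MS", torch_dtype=torch.float16
)
pipe.to("cuda")

prompt = "A small cactus with a happy face in the Sahara desert."  # illustrative
# LCM checkpoints need only a few steps and no classifier-free guidance,
# hence the low num_inference_steps and guidance_scale=0.0.
image = pipe(prompt, num_inference_steps=4, guidance_scale=0.0).images[0]
image.save("pixart_lcm.png")
```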

@sayakpaul sayakpaul merged commit 4520e12 into huggingface:main Dec 2, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
* adapt PixArtAlphaPipeline for pixart-lcm model

* remove original_inference_steps from __call__

---------

Co-authored-by: Sayak Paul <[email protected]>
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
* adapt PixArtAlphaPipeline for pixart-lcm model

* remove original_inference_steps from __call__

---------

Co-authored-by: Sayak Paul <[email protected]>