
Conversation

@stevhliu stevhliu commented May 1, 2024

This PR combines "Distilled Stable Diffusion inference" with the "Speed up inference" doc. It also:

  • updates the table to include only "speed" results (adding tf32 and combined results) and drops the "memory"-related ones, such as channels-last and traced UNet (those are kept in the "Reduce memory usage" doc)
  • I don't have a Titan RTX, so my results were obtained on a Colab A100
  • removes the code snippets for timing the inference run, to keep things simpler and let users copy and run the code directly
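As a minimal sketch of the kind of "speed" setting the updated table benchmarks, TF32 can be enabled in PyTorch before running inference. The `allow_tf32` flags are standard PyTorch APIs; the timing shown in the doc itself is omitted here per the PR's goal of keeping snippets copy-and-run simple.

```python
import torch

# Allow TensorFloat-32 on Ampere+ GPUs: matmuls and cuDNN convolutions
# run faster at slightly reduced precision. These flags are safe to set
# even on machines without a CUDA device; they only take effect on GPU.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

print(torch.backends.cuda.matmul.allow_tf32)  # True once enabled
```

A pipeline loaded afterwards (e.g. with `diffusers.DiffusionPipeline.from_pretrained`) picks up these settings automatically, since they are process-wide PyTorch flags.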

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@stevhliu stevhliu requested review from sayakpaul and yiyixuxu May 1, 2024 21:13

@sayakpaul sayakpaul left a comment


Clean 🔮


@yiyixuxu yiyixuxu left a comment


Very nice, thank you!

@stevhliu stevhliu merged commit 0d23645 into huggingface:main May 6, 2024
@stevhliu stevhliu deleted the distill-sd branch May 6, 2024 22:07
lawrence-cj pushed a commit to lawrence-cj/diffusers that referenced this pull request May 8, 2024
sayakpaul pushed a commit that referenced this pull request Dec 23, 2024