unet time embedding activation function #3048

Conversation

williamberman (Contributor) commented Apr 11, 2023

See PR comments

Comment on lines +656 to +680
if self.time_embed_act is not None:
emb = self.time_embed_act(emb)

williamberman (Contributor, Author):

optional activation of time embeddings, applied once at the beginning of the unet
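
(For context, a minimal sketch of where this hook sits relative to the time-embedding MLP. The class body, layer sizes, the embed_time method name, and the "silu"-only dispatch are simplifications for illustration, not the actual diffusers implementation:)

    import torch.nn as nn

    class UNet2DConditionModelSketch(nn.Module):
        def __init__(self, time_embedding_act_fn=None, embed_dim=1280):
            super().__init__()
            # Paraphrased stand-in for diffusers' TimestepEmbedding projection.
            self.time_embedding = nn.Sequential(
                nn.Linear(320, embed_dim), nn.SiLU(), nn.Linear(embed_dim, embed_dim)
            )
            # New in this PR: an optional activation applied once to the time
            # embedding, before it is handed to the down/mid/up blocks.
            self.time_embed_act = nn.SiLU() if time_embedding_act_fn == "silu" else None

        def embed_time(self, t_emb):
            emb = self.time_embedding(t_emb)
            if self.time_embed_act is not None:
                emb = self.time_embed_act(emb)
            return emb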

Member:

Out of curiosity.

Is it being used in the private fork?

williamberman (Contributor, Author):

yessir!

HuggingFaceDocBuilderDev commented Apr 11, 2023

The documentation is not available anymore as the PR was closed or merged.

williamberman force-pushed the unet_2d_conditional_time_embed_activation branch from 55412f3 to df4eb1b on April 11, 2023 01:45
Comment on lines 270 to 275
if act_fn == "swish":
self.time_embed_act = lambda x: F.silu(x)
elif act_fn == "mish":
self.time_embed_act = nn.Mish()
elif act_fn == "silu":
self.time_embed_act = nn.SiLU()
Member:

Can't we do?

if act_fn in ["swish", "silu"]:
    self.time_embed_act = nn.SiLU()

williamberman (Contributor, Author):

Yes, I would hope we could :) This is how it's done in a few other places in the code base, so I'd like to leave it this way for now and do a follow-up that includes a refactor of all the dispatches to the different activation functions.

patrickvonplaten (Contributor) left a comment:

Ok for me! BTW, I'm totally fine with creating a get_act_fn function and an activations file as we do in transformers here: https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py
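
(For illustration, one way such a helper could look, modeled on transformers' activations.py. The function name get_act_fn and the mapping below are assumptions drawn from this thread, not the final diffusers API:)

    import torch.nn as nn

    # Single dispatch table so call sites stop repeating if/elif chains.
    ACT2CLS = {
        "swish": nn.SiLU,  # swish and silu name the same function
        "silu": nn.SiLU,
        "mish": nn.Mish,
        "gelu": nn.GELU,
        "relu": nn.ReLU,
    }

    def get_act_fn(act_fn: str) -> nn.Module:
        # Return a fresh activation module for a config string.
        if act_fn not in ACT2CLS:
            raise ValueError(f"Unsupported activation function: {act_fn}")
        return ACT2CLS[act_fn]()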

williamberman force-pushed the unet_2d_conditional_time_embed_activation branch from 9ae2f30 to 0660fe0 on April 11, 2023 17:51
williamberman force-pushed the unet_2d_conditional_time_embed_activation branch from f4a5a17 to e309542 on April 11, 2023 21:44
williamberman merged commit 2d52e81 into huggingface:main on Apr 11, 2023
The following forks pushed commits referencing this pull request, each carrying the same message:

* unet time embedding activation function
* typo act_fn -> time_embedding_act_fn
* flatten conditional

w4ffl35 pushed a commit to w4ffl35/diffusers on Apr 14, 2023
dg845 pushed a commit to dg845/diffusers on May 6, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers on Dec 25, 2023
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers on Apr 26, 2024