
Conversation

pcuenca (Member) commented Nov 5, 2022

Flipping sin to cos was assumed in the previous implementation, but since 0b61cea the default is the opposite.

Fixes #1145.
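For context, flip_sin_to_cos controls the ordering of the sin and cos halves of the sinusoidal timestep embedding. A minimal hand-rolled sketch of the idea (illustrative names, not the exact diffusers helper):

```python
import jax.numpy as jnp

def sinusoidal_embeddings(timesteps, dim, flip_sin_to_cos=False):
    # geometrically spaced frequencies from 1 down to ~1/10000
    half = dim // 2
    freqs = jnp.exp(-jnp.log(10000.0) * jnp.arange(half) / half)
    args = timesteps[:, None].astype(jnp.float32) * freqs[None, :]
    sin, cos = jnp.sin(args), jnp.cos(args)
    # the flag only changes which half comes first; a checkpoint trained
    # with one ordering breaks silently if inference uses the other
    return jnp.concatenate([cos, sin] if flip_sin_to_cos else [sin, cos], axis=-1)
```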

HuggingFaceDocBuilderDev commented Nov 5, 2022

The documentation is not available anymore as the PR was closed or merged.

pcuenca (Member, Author) commented Nov 5, 2022

I'm running the slow tests locally.

kashif (Contributor) commented Nov 5, 2022

Ah damn, so sorry, my bad! I believe you are right that this is the reason. I don't think this function is called by itself... To be sure, shall we set the function's default to True?

pcuenca (Member, Author) commented Nov 5, 2022

> Ah damn, so sorry, my bad! I believe you are right that this is the reason. I don't think this function is called by itself... To be sure, shall we set the function's default to True?

No worries! These issues should be easier to catch going forward with the new TPU tests.

Regarding making it the default: I'm not sure, because the default is False in the PyTorch version, and it may be better to keep them consistent. Let's see what the others think :)
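For reference, a quick sketch of what the PyTorch-side default looks like; this assumes diffusers' get_timestep_embedding helper and its flip_sin_to_cos keyword, so treat the exact signature as an assumption:

```python
import torch
from diffusers.models.embeddings import get_timestep_embedding

t = torch.tensor([1.0, 2.0, 3.0])
default = get_timestep_embedding(t, embedding_dim=8)                        # sin | cos
flipped = get_timestep_embedding(t, embedding_dim=8, flip_sin_to_cos=True)  # cos | sin
# flipping only swaps the two halves of the embedding
assert torch.allclose(default[:, :4], flipped[:, 4:])
assert torch.allclose(default[:, 4:], flipped[:, :4])
```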

patrickvonplaten (Contributor) commented Nov 5, 2022
> > Ah damn, so sorry, my bad! I believe you are right that this is the reason. I don't think this function is called by itself... To be sure, shall we set the function's default to True?
>
> No worries! These issues should be easier to catch going forward with the new TPU tests.
>
> Regarding making it the default: I'm not sure, because the default is False in the PyTorch version, and it may be better to keep them consistent. Let's see what the others think :)

Yes, we should indeed mirror the PyTorch behavior here.

kashif (Contributor) commented Nov 5, 2022

In that case, I can send a PR to add an option to FlaxUNet2DConditionModel to set flip_sin_to_cos to True, which is the default in the PyTorch version?
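A rough sketch of how such an option could surface on the Flax side (TimeProjSketch and every name apart from flip_sin_to_cos are hypothetical, not the diffusers API):

```python
import flax.linen as nn
import jax.numpy as jnp

class TimeProjSketch(nn.Module):
    """Hypothetical time-projection module with the flag as an attribute."""
    embedding_dim: int = 320      # first block width in the SD UNet config
    flip_sin_to_cos: bool = True  # would mirror the PyTorch default

    def __call__(self, timesteps):
        half = self.embedding_dim // 2
        freqs = jnp.exp(-jnp.log(10000.0) * jnp.arange(half) / half)
        args = timesteps[:, None].astype(jnp.float32) * freqs[None, :]
        sin, cos = jnp.sin(args), jnp.cos(args)
        return jnp.concatenate([cos, sin] if self.flip_sin_to_cos else [sin, cos], axis=-1)

# usage: the module holds no parameters, so apply with an empty variables dict
# emb = TimeProjSketch().apply({}, jnp.array([0, 999]))
```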

patrickvonplaten (Contributor) commented Nov 5, 2022
Ah sorry, I didn't do a thorough check - will take a look now!

@patrickvonplaten patrickvonplaten merged commit 08a6dc8 into main Nov 5, 2022
@patrickvonplaten patrickvonplaten deleted the fix-flax-time-embeddings branch November 5, 2022 21:17
patrickvonplaten (Contributor) commented Nov 5, 2022
Actually, let's maybe leave the PR as is for now since it solves the problem. In general, it's nicer to mirror the PyTorch API - but we need to be sure that it doesn't break anything, and we also shouldn't add features just for the sake of having the same API as PyTorch. So if we set the default to True, we need to make sure that it's set to False for Stable Diffusion.

More generally though, I think it's better not to add new function arguments just for the sake of making the Flax API similar to PyTorch. E.g., a PR like https://github.com/huggingface/diffusers/pull/1081/files only really makes sense if it enables new functionality for Flax/JAX, which I don't think it really did there. Given that we have very low usage in Flax, let's not spend too much time on giving the API more features than it needs :-)

I did a really bad job at reviewing #1081, so taking full responsibility here :-)

Will do a patch release now - thanks for the quick fix @pcuenca and @kashif!

patrickvonplaten pushed a commit that referenced this pull request Nov 5, 2022

Flip sin to cos in t embeddings.
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023

Flip sin to cos in t embeddings.
Successfully merging this pull request may close these issues: [Flax] 🚨 0.7.0 not working 🚨 (#1145).