
Conversation

@sidthekidder (Contributor) commented Sep 6, 2022

Added blackbox tests for AttentionBlock and SpatialTransformer - #201. A question: since the BasicTransformerBlock, CrossAttention and FeedForward classes are all used within SpatialTransformer, is it required to test them separately? Or is testing SpatialTransformer enough, since any change in them would cause the SpatialTransformer tests to fail?
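
For context, here's roughly the shape of the tests added (the import path matches diffusers at the time, but the constructor arguments are illustrative values, not the exact diff):

import torch
from diffusers.models.attention import AttentionBlock

def test_attention_block_default_shape():
    torch.manual_seed(0)
    # channels / num_head_channels are assumed values for illustration
    block = AttentionBlock(channels=32, num_head_channels=8).eval()
    sample = torch.randn(1, 32, 64, 64)
    with torch.no_grad():
        output = block(sample)
    # self-attention over spatial positions preserves the input shape
    assert output.shape == sample.shape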

@HuggingFaceDocBuilderDev commented Sep 6, 2022

The documentation is not available anymore as the PR was closed or merged.

@sidthekidder force-pushed the attention-testing branch 2 times, most recently from af273b5 to 0a68a1f on Sep 6, 2022 07:26
@patil-suraj (Contributor) left a comment

Thanks a lot for working on this! Left some comments. Let me know if you need any help with it :)

@sidthekidder (Contributor, Author) commented

Thanks for the helpful comments @patil-suraj! Updated to address them.

@sidthekidder (Contributor, Author) commented

Hmm, it seems the dropout test is failing, maybe because nn.Dropout is probabilistic, though it was passing consistently for me locally.

@sidthekidder (Contributor, Author) commented

@patil-suraj I removed the test that was failing (the nn.Dropout one, which gave different results on my local machine and the CI server). Any other suggestions?

@sidthekidder (Contributor, Author) commented

@patil-suraj does this PR look ok?

@patil-suraj (Contributor) left a comment

Just left a couple of nits, then it should be good to merge.

@patil-suraj (Contributor) commented

> @patil-suraj I removed the test that was failing (the nn.Dropout one, which gave different results on my local machine and the CI server). Any other suggestions?

Aah, I see. We should actually put the blocks in eval mode. This will disable the dropout.
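
A quick self-contained illustration: nn.Dropout is stochastic in train mode and the identity in eval mode.

import torch
import torch.nn as nn

torch.manual_seed(0)
drop = nn.Dropout(p=0.3)
x = torch.ones(4)

drop.train()    # training mode: elements are zeroed at random
print(drop(x))  # stochastic; surviving elements are scaled by 1 / (1 - 0.3)

drop.eval()     # eval mode: dropout becomes the identity
print(drop(x))  # tensor([1., 1., 1., 1.]) -- deterministic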

@sidthekidder (Contributor, Author) commented

Thanks for the eval tip, that worked! Added a test with dropout=0.3 as well.
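
Roughly what the new test looks like (a sketch, not the exact diff; the constructor arguments are assumed values for illustration):

import torch
from diffusers.models.attention import SpatialTransformer

def test_spatial_transformer_dropout():
    torch.manual_seed(0)
    # in_channels / n_heads / d_head are assumed values for illustration
    block = SpatialTransformer(in_channels=32, n_heads=2, d_head=16, dropout=0.3).eval()
    sample = torch.randn(1, 32, 64, 64)
    with torch.no_grad():
        output = block(sample)
    # eval() disables the 0.3 dropout, so the forward pass is deterministic
    assert output.shape == sample.shape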

@patil-suraj (Contributor) left a comment

Thanks a lot for working on this!

@patil-suraj merged commit f73ca90 into huggingface:main on Sep 16, 2022
@patrickvonplaten (Contributor) left a comment

Very nice!


def test_spatial_transformer_context_dim(self):
    torch.manual_seed(0)
    if torch.cuda.is_available():

(nit) I don't think this is needed anymore in newer PyTorch versions.
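
For reference (assuming recent PyTorch behavior, worth double-checking): torch.manual_seed already seeds the CUDA generators too, so the explicit guard can be dropped:

import torch

# torch.manual_seed seeds the CPU generator and, on CUDA builds,
# also seeds all CUDA devices (it calls torch.cuda.manual_seed_all internally)
torch.manual_seed(0)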

PhaneeshB pushed a commit to nod-ai/diffusers that referenced this pull request Mar 1, 2023
* Add --local_tank_cache flag and update requirements.

* Update requirements-importer.txt