[Tests] Test attention.py #368
Conversation
The documentation is not available anymore as the PR was closed or merged.
Force-pushed from af273b5 to 0a68a1f
patil-suraj left a comment:
Thanks a lot for working on this! Left some comments. Let me know if you need any help with it :)
Thanks for the helpful comments @patil-suraj! Updated to address them.
Hmm, it seems the dropout test is failing, maybe because nn.Dropout is probabilistic, though it was passing consistently for me locally.
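For context, a minimal sketch (not from the PR) of why hard-coded expected values for a dropout layer are brittle: in train mode nn.Dropout samples a random mask, so the exact output depends on the RNG state and environment.

```python
import torch
from torch import nn

torch.manual_seed(0)
drop = nn.Dropout(p=0.5)
x = torch.ones(2, 4)

# In train mode (the default) each call samples a fresh random mask,
# so repeated calls, or runs in different environments, can produce
# different tensors even for the same input.
print(drop(x))
print(drop(x))
```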
Force-pushed from 60a74a7 to 9b2a3ca
@patil-suraj I removed a test that was failing (the nn.Dropout one, which had different results on my local machine and the CI server). Any other suggestions?
@patil-suraj does this PR look ok?
patil-suraj left a comment:
Just left a couple of nits, then it should be good for merge.
Aah, I see. We should actually put the blocks in eval mode.
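A minimal sketch (not from the PR) of why eval mode fixes the flakiness: nn.Dropout becomes a no-op when the module is in eval mode, so the forward pass is deterministic.

```python
import torch
from torch import nn

drop = nn.Dropout(p=0.5)
x = torch.ones(2, 4)

drop.eval()  # in eval mode dropout is the identity, so no randomness remains
assert torch.equal(drop(x), x)
```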
Force-pushed from 892d095 to 701a47d
Thanks for the eval tip, that worked! Added a test with dropout as well.
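A sketch of what a dropout test in eval mode could look like; the import path and the constructor arguments (in_channels, n_heads, d_head, dropout) are assumptions about the diffusers API at the time, not code taken from this PR.

```python
import unittest

import torch
from diffusers.models.attention import SpatialTransformer  # import path assumed

class SpatialTransformerDropoutTests(unittest.TestCase):
    def test_spatial_transformer_dropout(self):
        torch.manual_seed(0)
        # eval() disables the dropout layers, so the forward pass is deterministic
        block = SpatialTransformer(in_channels=32, n_heads=2, d_head=16, dropout=0.3).eval()

        sample = torch.randn(1, 32, 16, 16)
        with torch.no_grad():
            output = block(sample)

        # blackbox check: the transformer block preserves the input shape
        self.assertEqual(output.shape, sample.shape)

if __name__ == "__main__":
    unittest.main()
```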
Force-pushed from 701a47d to 20e198a
patil-suraj left a comment:
Thanks a lot for working on this!
patrickvonplaten left a comment:
Very nice!
Review comment on these lines:

```python
    def test_spatial_transformer_context_dim(self):
        torch.manual_seed(0)
        if torch.cuda.is_available():
```
(nit) I don't think this is needed anymore in newer PyTorch versions.
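As I read the nit (my interpretation, not a quote from the reviewers): in recent PyTorch versions torch.manual_seed already seeds all CUDA devices, so the guarded CUDA seeding can simply be dropped.

```python
import torch

# torch.manual_seed seeds the CPU and all CUDA devices in recent PyTorch,
# so the guarded call below is redundant:
torch.manual_seed(0)

# if torch.cuda.is_available():
#     torch.cuda.manual_seed_all(0)  # no longer needed alongside torch.manual_seed
```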
Added blackbox tests for AttentionBlock and SpatialTransformer - #201. A question: since the BasicTransformerBlock, CrossAttention, and FeedForward classes are all used within SpatialTransformerBlock, do they need to be tested separately? Or is testing SpatialTransformer enough, since any change in them would cause the SpatialTransformerBlockTests to fail?
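To illustrate the blackbox style referred to above, here is a sketch of one such test; the AttentionBlock constructor arguments and the import path are assumptions, not the exact code from this PR.

```python
import unittest

import torch
from diffusers.models.attention import AttentionBlock  # import path assumed

class AttentionBlockTests(unittest.TestCase):
    def test_attention_block_default(self):
        torch.manual_seed(0)
        attention_block = AttentionBlock(channels=32).eval()

        sample = torch.randn(1, 32, 16, 16)
        with torch.no_grad():
            output = attention_block(sample)

        # blackbox check: the residual attention block preserves the input shape
        self.assertEqual(output.shape, sample.shape)

if __name__ == "__main__":
    unittest.main()
```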