
Conversation

oke-aditya (Contributor):
Closes #7103

@NicolasHug (Member) left a comment:

Thanks a lot for the PR @oke-aditya !
Some minor suggestions below, LMK what you think.

@@ -124,6 +124,7 @@ def shifted_window_attention_3d(
dropout: float = 0.0,
qkv_bias: Optional[Tensor] = None,
proj_bias: Optional[Tensor] = None,
training: bool = False,
@NicolasHug (Member) commented on this line:

Should we set the default to True, since that's the default of both torch.nn.functional.dropout and shifted_window_attention?

@oke-aditya (Contributor, Author) replied:

Well caught, this was a mistake: it should have been True.
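For context on why the default matters: torch.nn.functional.dropout only zeroes and rescales elements when training=True; with training=False it simply passes the input through. A defaulted training=False would therefore silently disable dropout in the 3D attention path. A minimal sketch of that behavior:

```python
import torch
import torch.nn.functional as F

x = torch.ones(4, 4)

# training=False: dropout is a no-op, the input passes through unchanged.
out_eval = F.dropout(x, p=0.5, training=False)
assert torch.equal(out_eval, x)

# training=True (the functional default): each element is zeroed with
# probability p, and survivors are scaled by 1/(1-p) to keep the
# expected value unchanged.
torch.manual_seed(0)
out_train = F.dropout(x, p=0.5, training=True)
# Every entry is either 0.0 (dropped) or 2.0 (= 1 / (1 - 0.5)).
assert set(out_train.unique().tolist()) <= {0.0, 2.0}
```

Defaulting the new parameter to True keeps shifted_window_attention_3d consistent with both the functional API and the existing 2D shifted_window_attention.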

@NicolasHug (Member) left a comment:

Thanks a lot @oke-aditya , LGTM!

@NicolasHug merged commit 8f198e5 into pytorch:main on Feb 16, 2023
facebook-github-bot pushed a commit that referenced this pull request Mar 28, 2023
Summary: Co-authored-by: Nicolas Hug <[email protected]>

Reviewed By: vmoens

Differential Revision: D44416635

fbshipit-source-id: 8f90f59beb98e36ae98a33a8e1b2abd2d07a430b
Successfully merging this pull request may close this issue: Dropout in SwinTransformer unable to set training=False
3 participants