Rename ConvNormActivation to Conv2dNormActivation #5440
Conversation
💊 CI failures summary and remediations (as of commit 79f6c6d; more details on the Dr. CI page):
1 failure not recognized by patterns.
            FutureWarning,
        )
        super().__init__(*args, **kwargs)
1 more new line needed.
    def __init__(self, *args, **kwargs):
        warnings.warn(
            "The ConvNormActivation class are deprecated since 0.13 and will be removed in 0.15. "
            "Use torchvision.ops.misc.ConvNormActivation instead.",
"Use torchvision.ops.misc.ConvNormActivation instead.", | |
"Use torchvision.ops.misc.Conv2dNormActivation instead.", |
Can we store layers on line 105 in an nn.ModuleList? I think that's what PyTorch recommends for storing a list of layers.
Also, smallest nit: there is a typo in the spelling of convolution on line 79.
@oke-aditya Thanks for the comments. I wish GitHub allowed comments on unmodified lines; it would help reviews so much.
I think it's fine to extend directly from Sequential here. It's an idiom we use often at TorchVision.
Great spot!
We test it as part of multiple models that use it. There is a point to be made that each op could be tested separately, but this is quite common with many layers (for example …).
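To illustrate the Sequential-vs-ModuleList point above with generic layers (a sketch, not the TorchVision code): subclassing nn.Sequential both registers the submodules and inherits forward(), whereas nn.ModuleList only registers them and still needs a hand-written forward loop.

import torch
from torch import nn


class BlockAsSequential(nn.Sequential):
    """Subclass of nn.Sequential: forward() is inherited, layers run in order."""

    def __init__(self, in_channels: int, out_channels: int) -> None:
        super().__init__(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )


class BlockAsModuleList(nn.Module):
    """nn.ModuleList registers the submodules, but forward() must be written by hand."""

    def __init__(self, in_channels: int, out_channels: int) -> None:
        super().__init__()
        self.layers = nn.ModuleList([
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x


x = torch.randn(2, 3, 8, 8)
assert BlockAsSequential(3, 16)(x).shape == BlockAsModuleList(3, 16)(x).shape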
Thank you @datumbox and @oke-aditya for the comments. I will address those. But before that, and taking into account the discussion in #5445, I am wondering if I should instead reformulate this as making ConvNormActivation a generic base class. Then both Conv2dNormActivation and Conv3dNormActivation could extend it. In this version we would have the disadvantage of making it possible to pass incorrect layer combinations to the base class.
@jdsgomes Sounds good to me. I wouldn't worry too much about trying to detect all the possible incorrect combos that the user might pass to the base class in that case. I think we should see how the implementation looks and then chat again.
Yeah, we can do this. Also, just name the base class ConvNormActivation.
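A rough sketch of the base-class design being discussed, assuming the base takes the conv and norm layer types as parameters and the dimension-specific subclasses pin them; the parameter names and defaults here are illustrative, not the final torchvision API.

from typing import Callable, Optional

import torch
from torch import nn


class ConvNormActivation(nn.Sequential):
    """Generic conv -> norm -> activation block: a sketch of the proposed base class."""

    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: int = 3,
        stride: int = 1,
        conv_layer: Callable[..., nn.Module] = nn.Conv2d,
        norm_layer: Optional[Callable[..., nn.Module]] = nn.BatchNorm2d,
        activation_layer: Optional[Callable[..., nn.Module]] = nn.ReLU,
    ) -> None:
        padding = (kernel_size - 1) // 2
        layers = [conv_layer(in_channels, out_channels, kernel_size, stride, padding, bias=norm_layer is None)]
        if norm_layer is not None:
            layers.append(norm_layer(out_channels))
        if activation_layer is not None:
            layers.append(activation_layer(inplace=True))
        super().__init__(*layers)


class Conv2dNormActivation(ConvNormActivation):
    """2d-specific subclass: pins the conv and norm layers to a consistent pair."""

    def __init__(self, in_channels: int, out_channels: int, **kwargs) -> None:
        # A real implementation would still let callers override norm_layer / activation_layer.
        super().__init__(in_channels, out_channels, conv_layer=nn.Conv2d, norm_layer=nn.BatchNorm2d, **kwargs)


class Conv3dNormActivation(ConvNormActivation):
    """3d-specific subclass."""

    def __init__(self, in_channels: int, out_channels: int, **kwargs) -> None:
        super().__init__(in_channels, out_channels, conv_layer=nn.Conv3d, norm_layer=nn.BatchNorm3d, **kwargs)


block2d = Conv2dNormActivation(3, 16)
block3d = Conv3dNormActivation(3, 16)
print(block2d(torch.randn(2, 3, 8, 8)).shape)     # torch.Size([2, 16, 8, 8])
print(block3d(torch.randn(2, 3, 4, 8, 8)).shape)  # torch.Size([2, 16, 4, 8, 8])

Keeping 2d-friendly defaults in the base is what opens the door to the "incorrect combos" concern above (for example conv_layer=nn.Conv3d combined with norm_layer=nn.BatchNorm2d), which the subclasses avoid by fixing both consistently.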
Looks like @datumbox and I jinxed by commenting at the same time. 👀 I can try doing this, but it will take a couple of days.
@oke-aditya the timeline works, so feel free to give it a shot. I will cancel this PR and suggest new PRs instead.
Finally, if your priorities change, just let me know and I can work on it.
I will give this a go in a couple of days. 😄
Addresses #5430
Renames ConvNormActivation to Conv2dNormActivation and adds a deprecation warning in ConvNormActivation.
cc @datumbox @oke-aditya
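From a user's perspective the rename is a one-line migration. A hedged example, assuming Conv2dNormActivation is exported from torchvision.ops as in torchvision >= 0.13:

# Old spelling: would emit a FutureWarning once this deprecation lands.
# from torchvision.ops.misc import ConvNormActivation
# block = ConvNormActivation(3, 16, kernel_size=3)

# New spelling:
from torchvision.ops import Conv2dNormActivation

block = Conv2dNormActivation(3, 16, kernel_size=3)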