
[bugfix] Check if norm_layer has not-None weight before init #6082


Merged
merged 1 commit into pytorch:main on May 25, 2022

Conversation

YosuaMichael (Contributor) commented:

I tried to resolve the bug reported in #6074.

@datumbox I may not have the full context here; let me know what you think.
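
For context, a minimal runnable sketch of the failure mode and the kind of guard this fix adds (the `affine=False` norm layer mirrors the setup reported in #6074; the actual patch applies the same check inside torchvision's ResNet init loop):

```python
import torch.nn as nn

# A norm layer without learnable affine parameters: its .weight is None.
bn = nn.BatchNorm2d(8, affine=False)

# Before the fix, zero_init_residual effectively did this and crashed,
# because bn.weight is None:
# nn.init.constant_(bn.weight, 0)

# The fix: only zero-initialize when the norm layer actually has a weight.
if bn.weight is not None:
    nn.init.constant_(bn.weight, 0)
```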

FloCF commented May 24, 2022

@YosuaMichael: Your bugfix looks reasonable to me. Small remark: it might also be good to emit a warning when weight is None, so it is clear to the user that zero_init_residual will be skipped because norm_layer has no weights.
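
A hypothetical sketch of that suggestion (not part of the merged change), assuming a plain `warnings.warn` at the point where the initialization is skipped:

```python
import warnings

import torch.nn as nn

bn = nn.BatchNorm2d(8, affine=False)  # norm layer with no learnable weight

if bn.weight is not None:
    nn.init.constant_(bn.weight, 0)
else:
    # Make the skip visible instead of silently doing nothing.
    warnings.warn(
        "Skipping zero_init_residual: the provided norm_layer has no weight.",
        UserWarning,
    )
```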

datumbox (Contributor) left a comment:

LGTM, thanks @YosuaMichael.

@FloCF Thanks for reporting the bug and for the recommendations. The implementation generally expects to receive a norm_layer compatible with the original architecture (BN, SyncBN, FrozenBN, etc.). I'm fine with fixing the breakage, though.
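
As a usage note, a norm_layer like the one below is the kind of input that previously broke zero_init_residual and should be tolerated after this fix (a sketch assuming torchvision's `resnet18` forwards these keyword arguments to `ResNet`):

```python
from functools import partial

import torch.nn as nn
from torchvision.models import resnet18

# A BN-compatible norm layer without affine weights. Combined with
# zero_init_residual=True, this used to raise during weight initialization.
model = resnet18(
    zero_init_residual=True,
    norm_layer=partial(nn.BatchNorm2d, affine=False),
)
```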

@datumbox datumbox merged commit 0f971f6 into pytorch:main May 25, 2022
YosuaMichael added a commit to YosuaMichael/vision that referenced this pull request May 25, 2022
facebook-github-bot pushed a commit that referenced this pull request Jun 1, 2022
…#6082)

Reviewed By: NicolasHug

Differential Revision: D36760938

fbshipit-source-id: 366bbd3fa225431dd5a0871019bb0e70d348ee7c