
Commit d28ec48

YosuaMichael authored and facebook-github-bot committed
[fbsync] Fix bug by checking if norm_layer weight is None before init (#6082)
Reviewed By: NicolasHug
Differential Revision: D36760938
fbshipit-source-id: 366bbd3fa225431dd5a0871019bb0e70d348ee7c
1 parent: 9364c8e

1 file changed: +2 −2 lines


torchvision/models/resnet.py

@@ -217,9 +217,9 @@ def __init__(
         # This improves the model by 0.2~0.3% according to https://arxiv.org/abs/1706.02677
         if zero_init_residual:
             for m in self.modules():
-                if isinstance(m, Bottleneck):
+                if isinstance(m, Bottleneck) and m.bn3.weight is not None:
                     nn.init.constant_(m.bn3.weight, 0)  # type: ignore[arg-type]
-                elif isinstance(m, BasicBlock):
+                elif isinstance(m, BasicBlock) and m.bn2.weight is not None:
                     nn.init.constant_(m.bn2.weight, 0)  # type: ignore[arg-type]

     def _make_layer(
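
For context (an explanatory note, not part of the commit): PyTorch norm layers built with affine=False carry no learnable scale or shift, so their weight attribute is None, and passing that None to nn.init.constant_ raises an error. The new is-not-None guards simply skip such layers during the zero-init of the last batch norm in each residual block. A minimal standalone sketch of the behavior the guard handles:

import torch.nn as nn

# A norm layer without affine parameters: .weight and .bias are None.
bn = nn.BatchNorm2d(64, affine=False)
print(bn.weight)  # None

# Unguarded init, as in the pre-fix code, would fail here:
# nn.init.constant_(bn.weight, 0)  # raises AttributeError: None is not a tensor

# The guarded pattern from this commit skips parameter-less norm layers:
if bn.weight is not None:
    nn.init.constant_(bn.weight, 0)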
