In many situations, we need more flexible ways to instantiate the models in torchvision.models.
For example, when fine-tuning a ResNet-50 classification model on a dataset with 10 classes, we would like to call torchvision.models.resnet50(pretrained=True, num_classes=10), but this is not supported: in the current implementation, num_classes must be 1000 when pretrained=True. To support this case, we should allow partially copying the pretrained weights (i.e., loading all weights except those of the final FC layer).
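One way the partial copy could work is to filter the pretrained state_dict down to the entries whose shapes match the new model before loading with strict=False. The sketch below is only illustrative: TinyNet is a hypothetical stand-in for a real torchvision model so the example runs without downloading weights.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for e.g. torchvision.models.resnet50;
# a real implementation would filter the downloaded pretrained state_dict.
class TinyNet(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.fc = nn.Linear(8, num_classes)

pretrained = TinyNet(num_classes=1000)   # pretend these are pretrained weights
model = TinyNet(num_classes=10)          # target model with a different head

# Keep only entries whose shapes match; this drops the mismatched fc.* tensors.
target_state = model.state_dict()
filtered = {k: v for k, v in pretrained.state_dict().items()
            if k in target_state and v.shape == target_state[k].shape}
result = model.load_state_dict(filtered, strict=False)

print(sorted(result.missing_keys))  # the fc.* weights were not loaded
```

With strict=False, load_state_dict reports the skipped keys instead of raising, so the caller can verify that only the classifier head was left uninitialized.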
Another example: sometimes we need to instantiate a model from torchvision.models as a backbone, in which case the FC layer is no longer needed. This case should also be supported, for instance with an additional argument such as include_top=False in Keras.
Pitch
A possible solution is to modify the model construction code. At least two features should be added:
Support for loading partial weights: when num_classes != 1000, all weights except those of the last FC layer can still be loaded.
Support for a backbone mode (FC layer removed): when an argument (such as include_top) is set to False, the last layer (the FC layer) is removed.
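The backbone mode could look like the following sketch. The include_top name is borrowed from Keras and is not an existing torchvision argument; TinyNet is a hypothetical minimal model used so the example runs without downloads.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of an `include_top` flag (name borrowed from Keras).
class TinyNet(nn.Module):
    def __init__(self, num_classes=1000, include_top=True):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.include_top = include_top
        # Only build the classifier head when the "top" is requested.
        self.fc = nn.Linear(8, num_classes) if include_top else None

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.fc(x) if self.include_top else x

backbone = TinyNet(include_top=False)
feats = backbone(torch.randn(2, 3, 32, 32))
print(feats.shape)  # raw feature vectors, no FC layer applied
```

Skipping the construction of the head entirely (rather than replacing it with nn.Identity) also avoids allocating the large FC weight matrix when it is never used.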
We can apply these modifications to many of the basic models in torchvision.models.