In `self.classifier`, the correct order of layers should be fc6-relu-dropout-fc7-relu-dropout-fc8; that is, each dropout layer should come after the fc and relu layers, not before them.
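For reference, a minimal sketch of the proposed ordering (the `make_classifier` helper is hypothetical; the 256*6*6 input and 4096 hidden sizes are assumed from the standard AlexNet head):

```python
import torch
import torch.nn as nn

def make_classifier(num_classes: int = 1000) -> nn.Sequential:
    # Dropout placed AFTER each fc+relu pair:
    # fc6-relu-dropout-fc7-relu-dropout-fc8
    return nn.Sequential(
        nn.Linear(256 * 6 * 6, 4096),   # fc6
        nn.ReLU(inplace=True),
        nn.Dropout(p=0.5),
        nn.Linear(4096, 4096),          # fc7
        nn.ReLU(inplace=True),
        nn.Dropout(p=0.5),
        nn.Linear(4096, num_classes),   # fc8
    )

# quick shape check
x = torch.randn(2, 256 * 6 * 6)
print(make_classifier()(x).shape)  # torch.Size([2, 1000])
```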
This does not matter at inference time, since dropout is a no-op in eval mode. However, when fine-tuning the AlexNet model on other datasets, the incorrect layer sequence can result in an accuracy drop of 2 percent or more.
Can I make a PR to fix it? The pre-trained model URL needs to be updated too.