Refactor & enable JIT tests in all models and add warnings if skipped #3033
Conversation
@fmassa Unfortunately the tests did not fail, which means that the JIT scripts did not run. This can be confirmed by checking any of the run logs for the following warnings:
Any ideas on how we can turn on the flag referenced below? (Lines 34 to 35 in b2c0b3d)
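The pattern discussed here is to emit a loud warning whenever the TorchScript check is skipped, so silent skips become visible in the CI logs. Below is a minimal sketch of that idea; the names `check_jit_scriptable` and the use of the `PYTORCH_TEST_WITH_SLOW` environment variable are illustrative assumptions, not the actual torchvision test helpers.

```python
# Illustrative sketch: warn instead of silently skipping a JIT check.
# `check_jit_scriptable` is a hypothetical helper; `PYTORCH_TEST_WITH_SLOW`
# is assumed to be the flag gating the slow tests.
import os
import warnings

TEST_WITH_SLOW = os.environ.get("PYTORCH_TEST_WITH_SLOW", "0") == "1"


def check_jit_scriptable(model_name):
    """Run the TorchScript check if slow tests are enabled, else warn."""
    if not TEST_WITH_SLOW:
        warnings.warn(
            f"Skipping TorchScript check for {model_name}; "
            "set PYTORCH_TEST_WITH_SLOW=1 to enable it.",
            UserWarning,
        )
        return False
    # torch.jit.script(model) and output comparison would run here.
    return True
```

With the warning in place, a grep of the CI logs immediately reveals whether the scripting checks actually ran.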
Thanks a lot for looking into this, we should get this fixed quickly. I think we should try to switch the flag to always run the slow tests, and see whether CI times out or not.
Just note that we have a separate test checking that detection models run on torchscript, so we should be safe there; see Lines 213 to 215 in a884cb7.
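The flag-switching idea above can be sketched with a standard `unittest` skip guard: slow TorchScript tests are skipped unless an environment flag enables them, and the skip reason shows up in the test report. This is a minimal illustration, not the actual torchvision code; `PYTORCH_TEST_WITH_SLOW` is assumed to be the gating variable.

```python
# Illustrative sketch: gate slow TorchScript tests behind an env flag.
import os
import unittest

RUN_SLOW = os.environ.get("PYTORCH_TEST_WITH_SLOW", "0") == "1"


class ModelTester(unittest.TestCase):
    @unittest.skipUnless(
        RUN_SLOW, "slow TorchScript test; set PYTORCH_TEST_WITH_SLOW=1"
    )
    def test_scriptability(self):
        # torch.jit.script(model) would be exercised here.
        self.assertTrue(True)
```

Flipping the flag in the CI configuration then runs the full scripting checks on every commit, which is exactly what exposes timeouts (or hidden failures) early.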
Thanks!
…pytorch#3033)
* Enable jit tests in all models and add warning if checkModule() tests are skipped.
* Turning on JIT tests on CI.
* Fixing broken unit-tests.
* Refactoring and cleaning up duplicate code.
Fixes #3028
Initial commits of this PR should have caused the following models to fail:
The failures were related to non-JIT problems in the tests, but we could not see them because the tests were not running.
This PR fixes the problems on the above tests, refactors the unit-tests and re-enables JIT tests for all models.