More cleanup for prototype transforms #6500
Merged
Conversation
vfdev-5 reviewed Aug 26, 2022

datumbox reviewed Aug 26, 2022
Some thoughts below:
Conflicts:
- torchvision/prototype/transforms/_misc.py

Conflicts:
- torchvision/prototype/transforms/_auto_augment.py
- torchvision/prototype/transforms/_color.py
- torchvision/prototype/transforms/_geometry.py
- torchvision/prototype/transforms/_meta.py
- torchvision/prototype/transforms/_misc.py
- torchvision/prototype/transforms/_utils.py
datumbox approved these changes Aug 26, 2022
A few questions but LGTM.
You need to patch the tests as well. See this.
facebook-github-bot pushed a commit that referenced this pull request on Aug 30, 2022
Summary:
- add aliases for hflip and vflip
- reduce imports from torchvision.transforms in torchvision.prototype.transforms
- add aliases for to_pil_image and pil_to_tensor
- deprecate to_tensor
- add some FIXME cleanup comments
- address reviews
- add dimension getters
- undeprecate PILToTensor and ToPILImage
- address review
- fix test

Reviewed By: NicolasHug

Differential Revision: D39131018

fbshipit-source-id: a1fc5d9dfd1cd587f273674716105f59a01e6cf0
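As a rough illustration of the alias-and-deprecate pattern the commit summary describes (this is a hypothetical sketch, not the actual torchvision.prototype.transforms source; the real module layout and names may differ), a functional alias can simply re-export the existing kernel under the new name, while a deprecated entry point warns before delegating to the legacy kernel:

```python
# Hypothetical sketch of the alias-and-deprecate pattern; not the actual
# torchvision.prototype.transforms implementation.
import warnings

import torch
from torchvision.transforms import functional as _F

# Aliases: expose the existing kernels under the new functional names.
horizontal_flip = _F.hflip
vertical_flip = _F.vflip
to_pil_image = _F.to_pil_image
pil_to_tensor = _F.pil_to_tensor


def to_tensor(pic) -> torch.Tensor:
    """Deprecated: warn and delegate to the legacy kernel."""
    warnings.warn(
        "to_tensor() is deprecated; consider pil_to_tensor() followed by "
        "convert_image_dtype() instead."
    )
    return _F.to_tensor(pic)
```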
Addresses #6486 (comment).

- Regarding the ConvertImageDtype transform, I kept the name and exposed convert_image_dtype as is. This also aligns well with other conversion ops like convert_bounding_box_format.
- Reduced imports from torchvision.transforms inside the torchvision.prototype.transforms module. There are 3 places left that I don't think we can or should get rid of:
  - torchvision.prototype.transforms.functional.*: We have quite a few kernels that are just imported from the old API.
  - torchvision.prototype.transforms._deprecated: We opted to call the legacy kernels instead of reproducing the behavior with the new ones.
  - torchvision.prototype.transforms._utils: We use a number of utilities of the old transforms in the new ones. I moved them to this module to have them in one central place and avoid having every other module import them directly.
- Added FIXME comments for minor stuff that I found during cleaning up. Maybe we can address them as part of this PR.

Todo:
- get_channels_dim() (+ alias for the old)
- get_spatial_dims() (+ alias for the old)
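The names in the Todo suggest small helpers that read the channel count and spatial size off an image. Below is a minimal sketch, assuming channels-first tensors; the real prototype getters also dispatch on PIL images and datapoint types, and their final names and signatures may differ.

```python
# Minimal sketch of the dimension getters listed in the Todo; assumes a
# channels-first (..., C, H, W) tensor layout. Names are taken from the
# Todo above and are not necessarily the shipped API.
from typing import Tuple

import torch


def get_channels_dim(image: torch.Tensor) -> int:
    # Number of channels of a channels-first image tensor.
    if image.ndim < 3:
        raise ValueError(f"Expected at least 3 dimensions, got {image.ndim}")
    return image.shape[-3]


def get_spatial_dims(image: torch.Tensor) -> Tuple[int, int]:
    # (height, width) of a channels-first image tensor.
    if image.ndim < 2:
        raise ValueError(f"Expected at least 2 dimensions, got {image.ndim}")
    return image.shape[-2], image.shape[-1]


# Example usage:
# img = torch.rand(3, 224, 224)
# get_channels_dim(img)   # -> 3
# get_spatial_dims(img)   # -> (224, 224)
```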