add prototype transforms that use the prototype dispatchers #5418

Merged
merged 26 commits into pytorch:main on Feb 16, 2022

Conversation

@pmeier (Collaborator) commented on Feb 14, 2022

This PR re-adds the transformations removed in #5407, now built on top of the dispatchers that were introduced in that PR. Since we have the high-level functional API (the dispatchers), most transforms only need to subclass the Transform base class and set the _DISPATCHER attribute. If a transformation takes additional parameters, __init__ and get_params have to be overridden. If the parameters are constant, one can also subclass ConstantParamTransform to make this even more concise.
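
To make this concrete, here is a minimal sketch of what such subclasses could look like, based purely on the description above. The dispatcher names (F.horizontal_flip, F.rotate) and the exact get_params convention are assumptions for illustration, not the actual PR code.

from torchvision.prototype import transforms
from torchvision.prototype.transforms import functional as F


class HorizontalFlip(transforms.Transform):
    # No extra parameters: pointing at the dispatcher is all that is needed.
    _DISPATCHER = F.horizontal_flip


class Rotate(transforms.Transform):
    # Extra parameters: __init__ and get_params are overridden.
    _DISPATCHER = F.rotate

    def __init__(self, degrees: float) -> None:
        super().__init__()
        self.degrees = degrees

    def get_params(self, sample):
        # The parameters returned here are forwarded to the dispatcher.
        return dict(degrees=self.degrees)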

Conflicts:
	torchvision/prototype/transforms/__init__.py
@facebook-github-bot commented on Feb 14, 2022

💊 CI failures summary and remediations

As of commit 7799069 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

@datumbox (Contributor) left a comment

Some early thoughts. Let's simplify further as discussed offline and I'll review again:

@datumbox (Contributor) left a comment

A few more comments:

@@ -0,0 +1,184 @@
import itertools
@pmeier (Collaborator, Author)

These tests are just a quick and dirty implementation to check that the transforms don't error when called with the supported inputs.
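
In other words, they are smoke tests. A hypothetical shape of such a test is sketched below; the transform and the inputs are placeholders for illustration, while the actual test file parametrizes over the real prototype transforms and their supported input types.

import itertools

import pytest
import torch

from torchvision.prototype import transforms

# Placeholder transform and inputs, for illustration only.
TRANSFORMS = [transforms.HorizontalFlip()]
INPUTS = [torch.rand(3, 32, 32), torch.rand(2, 3, 32, 32)]


@pytest.mark.parametrize(("transform", "input"), itertools.product(TRANSFORMS, INPUTS))
def test_smoke(transform, input):
    # Only checks that the call does not raise, not that the output is correct.
    transform(input)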

@datumbox (Contributor) left a comment

@pmeier Thanks for your patience. We are getting there; mostly questions and nits:

raise ValueError("Scale should be between 0 and 1")
if p < 0 or p > 1:
raise ValueError("Random erasing probability should be between 0 and 1")
# TODO: deprecate p in favor of wrapping the transform in a RandomApply
Contributor

This TODO is probably left from earlier when you used more abstraction. Is this still relevant given the changes?

@pmeier (Collaborator, Author)

Yes, this is still relevant. The idea is to remove all probability arguments from the transforms in favor of wrapping them in a RandomApply transform. In this example, this means that

transforms.RandomErasing(p=..., ...)

would change to

transforms.RandomApply(
    transforms.RandomErasing(...),
    p=...,
)

Another example is to no longer have a RandomHorizontalFlip. The term "Random" in the name of a transform should mean "random parameters sampled at runtime"; the flip itself has no parameters to sample, so its randomness is only about whether it is applied, which RandomApply would cover.
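
For illustration, a wrapper along these lines could be as simple as the following sketch; this shows the concept only, not the actual prototype implementation.

import torch


class RandomApply:
    def __init__(self, transform, *, p: float = 0.5) -> None:
        self.transform = transform
        self.p = p

    def __call__(self, sample):
        # Sample once per call: apply the wrapped transform with probability p,
        # otherwise return the input unchanged.
        if torch.rand(()) < self.p:
            return self.transform(sample)
        return sample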

Contributor

I see. Let's discuss this in a separate issue. I have concerns that it will overcomplicate the definition of transforms, since for literally every transform you would need two. So let's take this up on a separate thread.

@datumbox (Contributor) left a comment

I think the proposed solution is a good candidate. I left one nit and one comment which I would address in a separate PR (to reduce the complexity of this one).

Also, have you checked that all transforms work the way we expect them to? If not, we should create an issue to review and verify them individually, but I wouldn't block this merge since it focuses on the API rather than the functionality.

@pmeier merged commit 52e6bd0 into pytorch:main on Feb 16, 2022
@pmeier deleted the refactor-transforms branch on Feb 16, 2022, 12:12
facebook-github-bot pushed a commit that referenced this pull request on Feb 25, 2022:
add prototype transforms that use the prototype dispatchers (#5418)

Summary: * add prototype transforms that use the prototype dispatchers

Reviewed By: jdsgomes

Differential Revision: D34475309

fbshipit-source-id: 8366d044b8e118c7c360bfd6d828ef87c3055ced
Comment on lines +65 to +71
# Backward compatibility with integer value
if isinstance(interpolation, int):
    warnings.warn(
        "Argument interpolation should be of type InterpolationMode instead of int. "
        "Please, use InterpolationMode enum."
    )
    interpolation = _interpolation_modes_from_int(interpolation)
Collaborator

I think we can drop the support for int in the prototype. Otherwise, we have to make this consistent (for example, the Resize class does not do that).
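
For reference, the enum-based spelling that would remain after dropping int support looks like this (shown with the stable API for illustration):

from torchvision import transforms
from torchvision.transforms import InterpolationMode

# Pass the InterpolationMode enum instead of the legacy integer value (2).
resize = transforms.Resize(256, interpolation=InterpolationMode.BILINEAR)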
