[prototype] Fix BC-breakages on input params of F
#6636
Conversation
if isinstance(size, int):
    image_size = [size, size]
else:
    image_size = (size[0], size[0]) if len(size) == 1 else (size[0], size[1])
I dislike the nested if here. Maybe
Suggested change:
- if isinstance(size, int):
-     image_size = [size, size]
- else:
-     image_size = (size[0], size[0]) if len(size) == 1 else (size[0], size[1])
+ if isinstance(size, int):
+     image_size = [size, size]
+ elif len(size) == 1:
+     image_size = [size[0], size[0]]
?
This is almost equivalent to what we had before, with the only difference that the previous solution silently ignored size[2:]. Meaning, it would still work if we passed sizes of arbitrary length. My suggestion will now raise an exception in the kernel.
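For illustration, a minimal sketch of the behavioral difference (hypothetical helper names, not code from this PR; the final else branch in the second variant is added here just so longer sizes are passed through to the kernel):

def _old_image_size(size):
    # Previous behavior: anything beyond the first two elements is silently dropped.
    if isinstance(size, int):
        return [size, size]
    return (size[0], size[0]) if len(size) == 1 else (size[0], size[1])

def _new_image_size(size):
    # Suggested behavior: longer sizes are passed through unchanged, so the
    # kernel is the one that raises for invalid lengths.
    if isinstance(size, int):
        return [size, size]
    elif len(size) == 1:
        return [size[0], size[0]]
    return list(size)

print(_old_image_size([10, 20, 30]))  # (10, 20) -- the 30 is ignored
print(_new_image_size([10, 20, 30]))  # [10, 20, 30] -- left for the kernel to reject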
I tried it, but the bbox expects a tuple and this needs to be handled explicitly to make mypy happy. I'll revert to the previous approach, which handles just the integer case and keeps the previous treatment in the else branch, but I'm happy to iterate on the nits of where and how this should be handled. We need to discuss where exactly the exception should be thrown in this case. Possibly in the new_like method itself?
@pmeier @vfdev-5 Thanks for the comments. The PR is half-baked, so I'll make sure the nits and tests are in order before finalizing. I was hoping you could provide feedback on the overall approach described in the PR description (i.e. where we place the mitigations, how we align the typing, etc.). I take it from your comments that we are aligned?
Force-pushed from 9a23fcb to a5ad323.
Force-pushed from a5ad323 to e3f93fb.
The main changes are mostly done, though there are lots of style nits we may want to discuss.
Several JIT tests are failing, but this might be because #6635 has not been merged yet. Perhaps the following error is related to my comment at https://github.com/pytorch/vision/pull/6635/files#r978934554:
test/test_prototype_transforms_functional.py:57 (TestKernels.test_scripted_vs_eager[cpu-resize_image_tensor-167])
Traceback (most recent call last):
File "test/test_prototype_transforms_functional.py", line 66, in test_scripted_vs_eager
actual = kernel_scripted(*args, **kwargs)
RuntimeError: resize_image_tensor() Expected a value of type 'Union[List[int], int]' for argument 'size' but instead found type 'tuple'.
Position: 1
Value: (23, 11)
Declaration: resize_image_tensor(Tensor image, Union(int[], int) size, Enum<__torch__.torchvision.transforms.functional.InterpolationMode> interpolation=Enum<InterpolationMode.BILINEAR>, int? max_size=None, bool antialias=False) -> Tensor
Cast error details: Expected a member of Union[List[int], int] but instead found type Tuple[int, int]
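For reference, a standalone sketch of the cast error (a toy function, not the actual kernel): TorchScript treats Union[List[int], int] strictly and rejects the Python tuple that the eager function would accept.

import torch
from typing import List, Union

def fn(size: Union[List[int], int]) -> List[int]:
    # Same shape of signature as the kernel above, reduced to a toy function.
    if isinstance(size, int):
        return [size, size]
    return size

scripted = torch.jit.script(fn)
print(scripted([23, 11]))  # OK: a list matches List[int]
print(fn((23, 11)))        # eager Python happily accepts the tuple
try:
    scripted((23, 11))     # TorchScript rejects it
except RuntimeError as e:
    print(e)               # Expected a value of type 'Union[List[int], int]' ... found type 'tuple'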
- if sigma <= 0:
-     raise ValueError("If sigma is a single number, it must be positive.")
- sigma = (sigma, sigma)
+ sigma = float(sigma)
Since the _setup_float_or_seq() will do the job for us, we only need to handle the integer case.
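In other words, the intended flow is roughly the following (a sketch with a hypothetical function name, not the exact PR code; it assumes the _setup_float_or_seq helper shown further down in this diff):

from typing import Sequence, Union

def _canonicalize_sigma(sigma: Union[int, float, Sequence[float]]) -> Sequence[float]:
    # Only the scalar case needs explicit handling here; the helper takes care
    # of expanding/validating floats and sequences.
    if isinstance(sigma, (int, float)):
        sigma = float(sigma)
    return _setup_float_or_seq(sigma, "sigma", req_size=2)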
@@ -16,6 +16,23 @@
from typing_extensions import Literal


def _setup_float_or_seq(arg: Union[float, Sequence[float]], name: str, req_size: int = 2) -> Sequence[float]:
Moved as-is.
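Since the hunk above only shows the signature, here is an illustrative sketch of what a helper with this signature typically does (an approximation, not necessarily the exact code that was moved):

from typing import Sequence, Union

def _setup_float_or_seq(arg: Union[float, Sequence[float]], name: str, req_size: int = 2) -> Sequence[float]:
    # Reject anything that is neither a float nor a sequence; note that a plain
    # int also fails this check, which is why the caller casts ints to float first.
    if not isinstance(arg, (float, Sequence)):
        raise TypeError(f"{name} should be float or a sequence of floats. Got {type(arg)}")
    # A sequence must have exactly req_size float elements.
    if isinstance(arg, Sequence):
        if len(arg) != req_size:
            raise ValueError(f"If {name} is a sequence its length should be {req_size}. Got {len(arg)}")
        for element in arg:
            if not isinstance(element, float):
                raise ValueError(f"{name} should be a sequence of floats. Got {type(element)}")
    # A single float is broadcast to req_size entries.
    if isinstance(arg, float):
        arg = [float(arg)] * req_size
    return arg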
Unfortunately the moment we put all the real types in the signatures (for example
LGTM if CI is green. Thanks Vasilis!
Summary:
* Fix `size` in resize.
* Update torchvision/prototype/features/_bounding_box.py
* Address some of the comments.
* Fix `output_size` in center_crop.
* Fix `CenterCrop` transform
* Fix `size` in five_crop.
* Fix `size` in ten_crop.
* Fix `kernel_size` and `sigma` in gaussian_blur.
* Fix `angle` and `shear` in affine.
* Fixing JIT-scriptability issues.
* Update TODOs.
* Restore fake types for `Union[int, List[int]]` and `Union[int, float, List[float]]`
* Fixing tests
* Fix linter
* revert unnecessary JIT mitigations.
* Cherrypick Philip's 6dfc965
* Linter fix
* Adding center float casting

Reviewed By: datumbox
Differential Revision: D40138728
fbshipit-source-id: 5cacd77921106fab286fb7840c301e62b5a0f905
Co-authored-by: Philip Meier <[email protected]>
Depends on #6635

A number of dispatchers on the stable F define one type in their signature but handle multiple. This comes from the days when JIT didn't support Union. One such example is the size parameter of resize, which is declared as List[int] but int is also handled internally. Due to the use of these fake types, the prototype F doesn't currently have those mitigations. This PR tries to fix the BC-breakages by correcting the types on the prototype F and handling all possible inputs. The issue was first reported by @vfdev-5 for resize. I took a look and found some additional affected dispatchers; here are the proposed changes for each method:

The mitigations are placed on the _Feature methods, the F dispatchers, the actual kernels and the Transform constructors. Edit: Unfortunately this is not possible for the kernels due to JIT. See [prototype] Fix BC-breakages on input params of F #6636 (review).
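For illustration, a rough sketch of the split this ends up describing (simplified, hypothetical function names, not the actual torchvision code): the JIT-scripted kernel keeps a strict List[int] signature, while the Python-only dispatcher accepts the looser Union and normalizes it before forwarding.

from typing import List, Union

import torch

def resize_image_kernel(image: torch.Tensor, size: List[int]) -> torch.Tensor:
    # Strict, scriptable kernel: only List[int] is accepted.
    return torch.nn.functional.interpolate(image, size=size)

def resize_dispatcher(image: torch.Tensor, size: Union[int, List[int]]) -> torch.Tensor:
    # The BC mitigation lives here, outside the scripted kernel.
    if isinstance(size, int):
        size = [size, size]
    elif len(size) == 1:
        size = [size[0], size[0]]
    return resize_image_kernel(image, list(size))

img = torch.rand(1, 3, 32, 32)
print(resize_dispatcher(img, 16).shape)        # torch.Size([1, 3, 16, 16])
print(resize_dispatcher(img, [16, 24]).shape)  # torch.Size([1, 3, 16, 24])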