Add typing to anchor utils #4599
Conversation
I will try once with Tuple[Tuple[int, ...]] and see if it works fine.
-    sizes=((128, 256, 512),),
+    sizes: Tuple[Tuple[int, int, int]] = ((128, 256, 512),),
     aspect_ratios=((0.5, 1.0, 2.0),),
 ):
See line 43 on this file.
Passing Tuple[int] might work, as we are implicitly type casting this to Tuple[Tuple].
But it is wrong and can lead to performance errors.
Also I think this should be Tuple[Tuple[int, ...]], but let's see if JIT would be happy about it.
See #3246
Hmm, so Tuple[Tuple[int, ...]] worked fine. I guess this should be good :) cc @datumbox, I think Phillip is on PTO.
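For context, a minimal sketch of why the nested annotation matters (illustrative helper name, not the torchvision source): the anchor generator pairs every size with every aspect ratio per feature map, so the inner tuple length varies and only `Tuple[int, ...]` describes it.

```python
from typing import List, Tuple

# Sketch (not the torchvision source): the default ((128, 256, 512),) is a
# 1-element tuple holding a 3-element tuple, so an inner Tuple[int, ...]
# accepts any number of sizes per feature map, while Tuple[int] would only
# accept a single-element inner tuple.
def num_anchors_per_location(
    sizes: Tuple[Tuple[int, ...], ...],
    aspect_ratios: Tuple[Tuple[float, ...], ...],
) -> List[int]:
    # One anchor per (size, aspect_ratio) pair at each feature-map location.
    return [len(s) * len(a) for s, a in zip(sizes, aspect_ratios)]

print(num_anchors_per_location(((128, 256, 512),), ((0.5, 1.0, 2.0),)))  # [9]
```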
Hopefully CI goes green
 super(AnchorGenerator, self).__init__()

 if not isinstance(sizes[0], (list, tuple)):
     # TODO change this
-    sizes = tuple((s,) for s in sizes)
+    sizes = tuple((s,) for s in sizes)  # type: ignore[assignment]
Can't do much here: we are checking at runtime whether sizes[0] is a list or tuple, and mypy gets confused by the reassignment.
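The reassignment in question can be sketched in isolation (hypothetical stand-alone function, assuming the same normalization as the diff above): mypy infers one tuple shape for the parameter, then sees it rebound to a tuple of tuples, hence the ignore.

```python
# Hypothetical stand-alone version of the normalization in the diff above:
# a flat tuple of ints is wrapped so downstream code can always assume a
# tuple of tuples. Rebinding `sizes` to a differently-shaped tuple is what
# triggers mypy's [assignment] error once the parameter is annotated.
def normalize_sizes(sizes):
    if not isinstance(sizes[0], (list, tuple)):
        sizes = tuple((s,) for s in sizes)
    return sizes

print(normalize_sizes((128, 256, 512)))     # ((128,), (256,), (512,))
print(normalize_sizes(((128, 256, 512),)))  # already nested, returned as-is
```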
@@ -116,7 +117,7 @@ def grid_anchors(self, grid_sizes: List[List[int]], strides: List[List[Tensor]])
     return anchors

 def forward(self, image_list: ImageList, feature_maps: List[Tensor]) -> List[Tensor]:
-    grid_sizes = [feature_map.shape[-2:] for feature_map in feature_maps]
+    grid_sizes = [list(feature_map.shape[-2:]) for feature_map in feature_maps]
.shape[-2:] gives us a torch.Size() object and we need List[int].
Similarly:
grid_size = [torch.Size([200, 200]), torch.Size([100, 100]), torch.Size([50, 50]), torch.Size([25, 25]), torch.Size([13, 13])]
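The conversion itself is small; a stdlib sketch (assumption: a plain tuple stands in for torch.Size here, which is itself a tuple subclass) shows what changes:

```python
# Assumption: a plain tuple stands in for torch.Size. Slicing a real
# torch.Size returns another torch.Size, not a list, so list() is needed
# to satisfy a List[int] annotation under TorchScript.
feature_map_shape = (1, 256, 38, 38)  # (N, C, H, W)

grid_size = feature_map_shape[-2:]                # still a tuple, not a list
grid_size_as_list = list(feature_map_shape[-2:])  # -> [38, 38]

print(grid_size_as_list)
```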
-    grid_sizes = [feature_map.shape[-2:] for feature_map in feature_maps]
-    image_size = image_list.tensors.shape[-2:]
+    grid_sizes = [list(feature_map.shape[-2:]) for feature_map in feature_maps]
+    image_size = list(image_list.tensors.shape[-2:])
Same as above, we need to convert the torch.Size() object to List[int].
The debugger shows that:
grid_sizes = [torch.Size([38, 38]), torch.Size([19, 19]), torch.Size([10, 10]), torch.Size([5, 5]), torch.Size([3, 3]), torch.Size([1, 1])]
image_size = torch.Size([300, 300])
Yes, but we have annotated grid_sizes as List[List[int]] and image_size as List[int].
So the above was a workaround.
I'm not sure: should we change the original annotations of grid_sizes and image_size instead?
Sorry, I'm not very clear on what should be done. 😕
This should be good to go now. Sorry for the confusion 😅
Does this need more work @datumbox? Feel free to let me know if changes are needed.
Thanks @oke-aditya. Unfortunately I think some of the modifications are not correct.
I believe that some of the original types are incorrect. I think it's better to confirm the actual types by running the tests and setting a breakpoint in the methods.
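The suggested workflow can be sketched like this (illustrative stand-in shapes, no torch needed): drop a breakpoint inside forward() and inspect what the comprehension actually produces at runtime before committing to an annotation.

```python
# Illustrative sketch: tuples stand in for tensor .shape values. Running this
# with the pdb line uncommented lets you confirm the runtime types that the
# annotations should match.
def forward_grid_sizes(feature_map_shapes):
    grid_sizes = [list(shape[-2:]) for shape in feature_map_shapes]
    # import pdb; pdb.set_trace()  # inspect type(grid_sizes[0]) here
    return grid_sizes

print(forward_grid_sizes([(1, 256, 38, 38), (1, 256, 19, 19)]))  # [[38, 38], [19, 19]]
```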
Helps #4582