Improve transforms #466
Hi there!
At the moment, torchvision is quite limited, in particular its transforms. This really restricts our ability to do data augmentation. However, people have already built a few nice libraries for this. For instance:
https://github.com/mdbloice/Augmentor (already has compatibility with PyTorch)
https://github.com/aleju/imgaug
What about contacting the authors and merging this great work into torchvision.transforms?

Comments
Hi, thanks for the feedback! I'm in the process of adding new functionality to torchvision to extend it to work with other data types (like bounding boxes).
@fmassa okay, thanks, that will be cool :) Actually, I strongly recommend looking at these libraries instead of writing new code, but of course you know better :) By the way, don't forget about segmentation!
Don't worry, I won't forget about instance segmentation :-)
@fmassa sorry for the off-topic: I'd like to point out that the first library I linked uses an excellent way to transform two or more images together. Every transform takes a list of images rather than a single image, unlike the transforms in torchvision (see the sketch below).
I suppose it's a great idea ^^
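To make the idea concrete, here is a minimal sketch of that list-based pattern, assuming PIL images and torchvision's functional helpers; the class name `JointRandomHorizontalFlip` is hypothetical and is not the actual API of Augmentor, imgaug, or torchvision:

```python
import random
import torchvision.transforms.functional as F

class JointRandomHorizontalFlip(object):
    """Hypothetical list-based transform: flips all images in the list
    together, so an image and its segmentation mask stay aligned."""

    def __init__(self, p=0.5):
        self.p = p

    def __call__(self, images):
        # Sample the random decision once, then apply it to every input.
        if random.random() < self.p:
            return [F.hflip(img) for img in images]
        return list(images)

# Usage: both outputs are flipped together, or neither is.
# img, mask = JointRandomHorizontalFlip(p=0.5)([img, mask])
```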
It's indeed much simpler in some cases, but also more restrictive in more general setups. For the record, this is something that has been bugging me for a long time, see #9 and #230 for some context. In many cases, you don't want to pass all transforms to all data (no color augmentation for segmentation masks, for example), and the approach you mentioned doesn't easily allow that (without making the API overly complex); see the sketch below.
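For illustration, here is one way to get that selective behavior, sketched with the functional interface from #230: geometric parameters are sampled once and applied to both the image and its mask, while photometric augmentation touches the image only. The crop size and brightness value are arbitrary choices for the example:

```python
import random
import torchvision.transforms as T
import torchvision.transforms.functional as F

def joint_transform(image, mask):
    # Sample crop parameters once so both inputs get the same window.
    i, j, h, w = T.RandomCrop.get_params(image, output_size=(256, 256))
    image = F.crop(image, i, j, h, w)
    mask = F.crop(mask, i, j, h, w)

    # Flip both, or neither, based on a single coin toss.
    if random.random() < 0.5:
        image = F.hflip(image)
        mask = F.hflip(mask)

    # Color augmentation applies to the image only; the mask's label
    # values must stay untouched.
    image = T.ColorJitter(brightness=0.3)(image)
    return image, mask
```

This keeps each transform free to decide which inputs it applies to, at the cost of writing the pipeline out by hand instead of composing a single list-based transform.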
The codes of
@nnop yes, I saw