
[feature request] Make transforms.functional_tensor functions differentiable w.r.t. their parameters #5000

@ain-soph


🚀 The feature

Make the operations in torchvision.transforms.functional_tensor differentiable w.r.t. their hyper-parameters, while keeping backward compatibility with existing code. This is helpful for Faster AutoAugment search, where the hyper-parameters are learnable and updated via backward.
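For illustration, here is a minimal sketch of what this could look like: the magnitude stays inside the autograd graph instead of being a plain Python float. The helper name adjust_brightness_diff is hypothetical; its body mirrors the ratio * img blend against a black image that functional_tensor uses for brightness.

```python
import torch

# Hypothetical sketch: brightness adjustment with the magnitude kept in
# the autograd graph, mirroring the blend-with-black that
# functional_tensor's adjust_brightness performs.
def adjust_brightness_diff(img: torch.Tensor, magnitude: torch.Tensor) -> torch.Tensor:
    return (magnitude * img).clamp(0.0, 1.0)

img = torch.rand(3, 32, 32)
magnitude = torch.tensor(1.2, requires_grad=True)  # learnable hyper-parameter

out = adjust_brightness_diff(img, magnitude)
out.sum().backward()
print(magnitude.grad)  # non-None: the magnitude received a gradient
```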

Some operations are inherently non-differentiable (e.g., Posterize), which might require users to write their own relaxed implementations.
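One common workaround for such operations is a straight-through estimator: apply the hard quantization in the forward pass, but let gradients pass through unchanged in the backward pass. A sketch (posterize_ste is a hypothetical helper for float images in [0, 1]; note it yields gradients w.r.t. the image, while the bits value itself stays discrete):

```python
import torch

# Hypothetical straight-through estimator for Posterize on float images
# in [0, 1]: hard quantization forward, identity gradient backward.
def posterize_ste(img: torch.Tensor, bits: int = 4) -> torch.Tensor:
    levels = 2 ** bits
    quantized = torch.floor(img * (levels - 1)) / (levels - 1)  # non-differentiable floor
    return img + (quantized - img).detach()  # forward: quantized; backward: d(out)/d(img) = 1

img = torch.rand(3, 32, 32, requires_grad=True)
out = posterize_ste(img, bits=3)
out.sum().backward()
print(img.grad is not None)  # True: gradients flow despite the floor()
```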

Motivation, pitch

The main motivation is research. Faster AutoAugment proposes searching for augmentation policies with a DARTS-like framework in which all magnitudes and weights are trainable parameters, which requires every operation to have gradients w.r.t. its magnitudes. This yields a faster search strategy than state-of-the-art AutoAugment policy-search algorithms.
The work is maintained in AutoAlbument and, according to its documentation, has been applied in several industrial scenarios.

I think adding backward support w.r.t. magnitudes would make this kind of research more convenient and support future work as well.

Alternatives

No response

Additional context

Linked PR: #4995

cc @vfdev-5 @datumbox
