Automatic Differentiation and Gradients tf.GradientTape() #285

Open
@danilojsl

Description

Please make sure that this is a feature request. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:feature_template

System information

  • TensorFlow version (you are using): 2.3.1
  • Are you willing to contribute it (Yes/No): Yes, when able and available

Describe the feature and the current behavior/state.
TensorFlow provides the tf.GradientTape API for automatic differentiation. To differentiate automatically, TensorFlow needs to remember what operations happen, and in what order, during the forward pass. Then, during the backward pass, TensorFlow traverses this list of operations in reverse order to compute gradients.

Details about this feature can be found in the official TensorFlow documentation for Gradient Tapes
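To make the request concrete, here is a minimal sketch of how the existing Python tf.GradientTape API works, which is the behavior this issue asks to bring to tensorflow-java (the values and variable names are illustrative):

```python
import tensorflow as tf

# Forward pass: operations on watched variables are recorded on the tape.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x  # recorded: y = x^2

# Backward pass: the tape is traversed in reverse to compute dy/dx = 2x.
dy_dx = tape.gradient(y, x)
print(float(dy_dx))  # 6.0
```

An equivalent tensorflow-core API would presumably expose the same record-then-replay mechanism from Java.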

Will this change the current api? How?
Yes, I think it will add a new feature to the tensorflow-core module.

Who will benefit with this feature?
Anyone who requires very low-level control over the training and evaluation of a deep learning model, and everyone who is already familiar with TF/Keras.

Any Other info.
The tf.GradientTape API is needed when writing a training loop from scratch, as described here.
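As a sketch of that use case, the following custom training loop (in Python, since that is where the API currently exists; the toy model and data are assumptions for illustration) shows why GradientTape is the central primitive:

```python
import tensorflow as tf

# Toy setup: fit y = 2x with a single Dense unit.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.MeanSquaredError()

xs = tf.constant([[0.0], [1.0], [2.0], [3.0]])
ys = tf.constant([[0.0], [2.0], [4.0], [6.0]])

for step in range(100):
    with tf.GradientTape() as tape:
        preds = model(xs, training=True)   # forward pass recorded on the tape
        loss = loss_fn(ys, preds)
    # Backward pass: gradients of the loss w.r.t. every trainable variable.
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

Without a GradientTape equivalent, this kind of from-scratch loop cannot be expressed in tensorflow-java today.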
