
Add info about gradient accumulation in logging #78


Closed
xyhuang opened this issue Jan 9, 2021 · 2 comments


xyhuang commented Jan 9, 2021

No description provided.


mwawrzos commented Feb 7, 2021

This value indicates after how many steps the optimizer step is called.
If a submission uses no gradient accumulation, the value 1 should be reported.

Here is an example in the RNN-T reference: https://github.com/mlcommons/training/blob/651e7c47bcbd7f4708d633afa567205a826438f1/rnn_speech_recognition/pytorch/train.py#L222
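The relationship between batches, accumulation steps, and the reported value can be sketched as follows. This is a minimal, framework-agnostic illustration, not the RNN-T reference code: the `log_event` helper stands in for the mlperf-logging API, and the training loop is a stub where `backward()` and `optimizer.step()` would normally go.

```python
def log_event(key, value):
    # Stand-in for the mlperf-logging event logger (assumption).
    print(f"{key}: {value}")


def train(num_batches, grad_accum_steps):
    """Run a stub training loop that accumulates gradients over
    `grad_accum_steps` batches before each optimizer step.

    Returns the number of optimizer steps taken.
    """
    # Report how many batches are accumulated per optimizer step;
    # 1 means no gradient accumulation.
    log_event("gradient_accumulation_steps", grad_accum_steps)

    optimizer_steps = 0
    accumulated = 0
    for _ in range(num_batches):
        # loss.backward() would accumulate gradients here.
        accumulated += 1
        if accumulated == grad_accum_steps:
            # optimizer.step(); optimizer.zero_grad() would run here.
            optimizer_steps += 1
            accumulated = 0
    return optimizer_steps
```

For example, with 8 batches and accumulation over 4 of them, the optimizer steps twice; with `grad_accum_steps=1`, it steps once per batch.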

xyhuang pushed a commit that referenced this issue Feb 22, 2021
* new keys according to #78 and #80

* RNN-T constants

* RNN-T update constants

* [compliance_checker] update to rules 1.0

* [compliance_checker] add gradient_accumulation_steps

* update constants

* [compliance_checker] RNN-T rules

* add rnnt and unet3d benchmarks

* Revert "RNN-T update constants"

This reverts commit 03b986a.

* Revert "RNN-T constants"

This reverts commit b550182.

* [compliance_checker] check weights_initialization based on metadata

* Add unet3d

* [compliance_checker][RNN-T] missing weights initialization check

* [compliance_checker][Unet3D] target 0.91 -> 0.908

after mlcommons/training@149c2b8

Co-authored-by: michalm <[email protected]>
xyhuang pushed a commit that referenced this issue Apr 13, 2021
* new keys according to #78 and #80

* RNN-T constants

* RNN-T update constants

* [compliance_checker] update to rules 1.0

* [compliance_checker] add gradient_accumulation_steps

* update constants

* [compliance_checker] RNN-T rules

* add rnnt and unet3d benchmarks

* Revert "RNN-T update constants"

This reverts commit 03b986a.

* Revert "RNN-T constants"

This reverts commit b550182.

* RNN-T constants

* [compliance_checker] check weights_initialization based on metadata

* align naming with other constants

* Add unet3d

* [compliance_checker][RNN-T] update compliance checker

* [compliance_checker][RNN-T] missing weights initialization check

* [logging][rnn-t] weights initialization scale constant

* undo unwanted change in unet3d

Co-authored-by: michalm <[email protected]>
@xyhuang xyhuang closed this as completed Aug 3, 2021
xyhuang commented Aug 3, 2021

done by #83
