Commit 0bb3957

RNN-T constants (#84)
* new keys according to #78 and #80
* RNN-T constants
* RNN-T update constants
* [compliance_checker] update to rules 1.0
* [compliance_checker] add gradient_accumulation_steps
* update constants
* [compliance_checker] RNN-T rules
* add rnnt and unet3d benchmarks
* Revert "RNN-T update constants"
  This reverts commit 03b986a.
* Revert "RNN-T constants"
  This reverts commit b550182.
* RNN-T constants
* [compliance_checker] check weights_initialization based on metadata
* align naming with other constants
* Add unet3d
* [compliance_checker][RNN-T] update compliance checker
* [compliance_checker][RNN-T] missing weights initialization check
* [logging][rnn-t] weights initialization scale constant
* undo unwanted change in unet3d

Co-authored-by: michalm <[email protected]>
1 parent a75cd37 commit 0bb3957

File tree

2 files changed: +22 −1 lines changed

mlperf_logging/compliance_checker/1.0.0/closed_rnnt.yaml
(1 addition, 1 deletion)

@@ -40,7 +40,7 @@
     CHECK: " v['value'] == 1e-9 "

 - KEY:
-    NAME: opt_lamb_learning_rate_poly_decay_power
+    NAME: opt_lamb_learning_rate_decay_poly_power
     REQ: EXACTLY_ONE

 - KEY:
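The rules file above pairs each KEY with a NAME, a REQ cardinality, and an optional CHECK expression evaluated against the logged event. As a rough illustration of how such a CHECK might be applied (this is a hypothetical sketch of the mechanics, not the actual compliance_checker implementation; the `check_event` helper is invented for this example):

```python
def check_event(rule, event):
    """Evaluate a rule's CHECK expression against one logged event.

    The YAML rules refer to the event dict as `v`, so we bind it
    under that name when evaluating the expression.
    """
    check = rule.get("CHECK")
    if check is None:
        return True  # no CHECK clause means any value passes
    return bool(eval(check, {}, {"v": event}))

# Rule shaped like the one in the diff above
rule = {
    "NAME": "opt_lamb_epsilon",
    "REQ": "EXACTLY_ONE",
    "CHECK": " v['value'] == 1e-9 ",
}
```

With this sketch, `check_event(rule, {"value": 1e-9})` passes and any other value fails, which is why a rename like `poly_decay_power` → `decay_poly_power` must be mirrored in both the rules file and the logging constants.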

mlperf_logging/mllog/constants.py
(21 additions, 0 deletions)

@@ -106,6 +106,12 @@
 OPT_ADAM_EPSILON = "opt_adam_epsilon"
 OPT_NAME = "opt_name"
 OPT_BASE_LR = "opt_base_learning_rate"
+OPT_LAMB_LR_MIN = "opt_lamb_learning_rate_min"
+OPT_LAMB_LR_DECAY_POLY_POWER = "opt_lamb_learning_rate_decay_poly_power"
+OPT_LAMB_BETA_1 = "opt_lamb_beta_1"
+OPT_LAMB_BETA_2 = "opt_lamb_beta_2"
+OPT_LAMB_EPSILON = "opt_lamb_epsilon"
+OPT_LAMB_LR_HOLD_EPOCHS = "opt_lamb_learning_rate_hold_epochs"
 OPT_LR_ALT_DECAY_FUNC = "opt_learning_rate_alt_decay_func"
 OPT_LR_ALT_WARMUP_FUNC = "opt_learning_rate_alt_warmup_func"
 OPT_LR_DECAY_BOUNDARY_EPOCHS = "opt_learning_rate_decay_boundary_epochs"
@@ -119,6 +125,21 @@
 OPT_LR_WARMUP_FACTOR = "opt_learning_rate_warmup_factor"
 OPT_LR_WARMUP_STEPS = "opt_learning_rate_warmup_steps"
 OPT_WEIGHT_DECAY = "opt_weight_decay"
+OPT_GRADIENT_CLIP_NORM = "opt_gradient_clip_norm"
+DATA_SPEED_PERTURBATON_MAX = "data_speed_perturbaton_max"
+DATA_SPEED_PERTURBATON_MIN = "data_speed_perturbaton_min"
+DATA_SPEC_AUGMENT_FREQ_N = "data_spec_augment_freq_n"
+DATA_SPEC_AUGMENT_FREQ_MIN = "data_spec_augment_freq_min"
+DATA_SPEC_AUGMENT_FREQ_MAX = "data_spec_augment_freq_max"
+DATA_SPEC_AUGMENT_TIME_N = "data_spec_augment_time_n"
+DATA_SPEC_AUGMENT_TIME_MIN = "data_spec_augment_time_min"
+DATA_SPEC_AUGMENT_TIME_MAX = "data_spec_augment_time_max"
+DATA_TRAIN_NUM_BUCKETS = "data_train_num_buckets"
+DATA_TRAIN_MAX_DURATION = "data_train_max_duration"
+DATA_NUM_BUCKETS = "data_num_buckets"
+MODEL_EVAL_EMA_FACTOR = "model_eval_ema_factor"
+MODEL_WEIGHTS_INITIALIZATION_SCALE = "model_weights_initialization_scale"
+EVAL_MAX_PREDICTION_SYMBOLS = "eval_max_prediction_symbols"

 # Log keys - misc.
 BBOX = "bbox"
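These constants are the keys a benchmark implementation passes when emitting structured log events. As a self-contained sketch of how one of the new RNN-T keys might end up in a log line (the `log_event` helper and the exact payload fields are assumptions made for this illustration; they approximate, rather than reproduce, the real mllog output format):

```python
import json
import time

# A couple of the constants added in this commit, copied from
# mllog/constants.py above.
OPT_LAMB_EPSILON = "opt_lamb_epsilon"
MODEL_WEIGHTS_INITIALIZATION_SCALE = "model_weights_initialization_scale"

def log_event(key, value):
    """Build an mllog-style structured log line.

    mllog emits lines with a ":::MLLOG" prefix followed by a JSON
    payload; the field set here is a simplified approximation.
    """
    record = {
        "key": key,
        "value": value,
        "time_ms": int(time.time() * 1000),
        "event_type": "POINT_IN_TIME",
    }
    return ":::MLLOG " + json.dumps(record)

line = log_event(OPT_LAMB_EPSILON, 1e-9)
```

Because the compliance checker matches rules by the string value of the key, renaming a constant (as in the closed_rnnt.yaml change above) is only safe when both the logging side and the rules side move together, which is what this commit does.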
