
Commit 83d5853

NicolasHug authored and facebook-github-bot committed
[fbsync] refactor: replace LambdaLR with PolynomialLR in segmentation training script (#6405)
Reviewed By: datumbox

Differential Revision: D38824250

fbshipit-source-id: b10950254c0ba0471e0443a7cddba42594324185
1 parent 1c517ce commit 83d5853

File tree

1 file changed: +3 -2 lines changed


references/segmentation/train.py

Lines changed: 3 additions & 2 deletions
```diff
@@ -10,6 +10,7 @@
 import utils
 from coco_utils import get_coco
 from torch import nn
+from torch.optim.lr_scheduler import PolynomialLR
 from torchvision.transforms import functional as F, InterpolationMode
 
 
@@ -184,8 +185,8 @@ def main(args):
     scaler = torch.cuda.amp.GradScaler() if args.amp else None
 
     iters_per_epoch = len(data_loader)
-    main_lr_scheduler = torch.optim.lr_scheduler.LambdaLR(
-        optimizer, lambda x: (1 - x / (iters_per_epoch * (args.epochs - args.lr_warmup_epochs))) ** 0.9
-    )
+    main_lr_scheduler = PolynomialLR(
+        optimizer, total_iters=iters_per_epoch * (args.epochs - args.lr_warmup_epochs), power=0.9
+    )
 
     if args.lr_warmup_epochs > 0:
```
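Below is a minimal sketch (not part of the commit) checking that the new scheduler reproduces the old decay curve: `PolynomialLR` with `power=0.9` applies the same `(1 - step / total_iters) ** 0.9` factor that the removed lambda computed by hand. The `total_iters` value and the dummy SGD optimizer are illustrative placeholders, and `PolynomialLR` requires PyTorch 1.13 or later.

```python
import torch
from torch.optim.lr_scheduler import LambdaLR, PolynomialLR

# Placeholder: stands in for iters_per_epoch * (args.epochs - args.lr_warmup_epochs)
total_iters = 100

def make_optimizer():
    # Dummy single-parameter optimizer, just to drive the schedulers.
    return torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.01)

opt_old, opt_new = make_optimizer(), make_optimizer()
sched_old = LambdaLR(opt_old, lambda x: (1 - x / total_iters) ** 0.9)
sched_new = PolynomialLR(opt_new, total_iters=total_iters, power=0.9)

for _ in range(total_iters):
    opt_old.step(); opt_new.step()
    sched_old.step(); sched_new.step()
    # Both schedulers should produce the same learning rate at every step.
    assert abs(opt_old.param_groups[0]["lr"] - opt_new.param_groups[0]["lr"]) < 1e-9
```

The built-in scheduler makes the polynomial decay explicit in the call site instead of hiding it inside a hand-rolled lambda, which is presumably the motivation for the swap.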
