Commit 018a50e

pre-commit-ci[bot] authored and rustamzh committed

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci

1 parent b31134a commit 018a50e

File tree

3 files changed: +4 −2 lines changed


src/lightning/pytorch/CHANGELOG.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -11,7 +11,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

 - Add enable_autolog_hparams argument to Trainer ([#20593](https://github.com/Lightning-AI/pytorch-lightning/pull/20593))

-- Add `toggled_optimizer(optimizer)` method to the LightningModule, which is a context manager version of `toggle_optimize` and `untoggle_optimizer`
+- Add `toggled_optimizer(optimizer)` method to the LightningModule, which is a context manager version of `toggle_optimize` and `untoggle_optimizer`


 ### Changed
```

The removed and added lines render identically; the hook's change here appears to be trimmed trailing whitespace.

src/lightning/pytorch/core/module.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -1144,7 +1144,7 @@ def untoggle_optimizer(self, optimizer: Union[Optimizer, LightningOptimizer]) ->
     @contextmanager
     def toggled_optimizer(self, optimizer: Union[Optimizer, LightningOptimizer]) -> Generator:
         """Makes sure only the gradients of the current optimizer's parameters are calculated in the training step to
-        prevent dangling gradients in multiple-optimizer setup. Combines :meth:`toggle_optimizer` and
+        prevent dangling gradients in multiple-optimizer setup. Combines :meth:`toggle_optimizer` and
         :meth:`untoggle_optimizer` into context manager.

         Args:
@@ -1159,6 +1159,7 @@ def training_step(...):
                 opt.zero_grad()
                 self.manual_backward(loss)
                 opt.step()
+
         """
         try:
             yield self.toggle_optimizer(optimizer)
```
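The method touched by this diff wraps `toggle_optimizer`/`untoggle_optimizer` in a context manager, so the untoggle runs even if the training step raises. Below is a minimal, self-contained sketch of that try/finally pattern using a toy stand-in class (not the real `LightningModule` API; the real `toggle_optimizer` freezes parameters owned by the other optimizers, which is only mimicked here with a flag):

```python
from contextlib import contextmanager


class ToyModule:
    """Toy stand-in for LightningModule (illustration only, not the real API)."""

    def __init__(self):
        self.toggled = False

    def toggle_optimizer(self, optimizer):
        # The real method restricts requires_grad to this optimizer's params;
        # here we only record the state.
        self.toggled = True

    def untoggle_optimizer(self, optimizer):
        self.toggled = False

    @contextmanager
    def toggled_optimizer(self, optimizer):
        # Same shape as the method in the diff: toggle on entry,
        # untoggle on exit, even if the body raises.
        try:
            yield self.toggle_optimizer(optimizer)
        finally:
            self.untoggle_optimizer(optimizer)


m = ToyModule()
with m.toggled_optimizer(optimizer=None):
    assert m.toggled       # gradients restricted inside the block
assert not m.toggled       # state restored on exit

# State is restored even when the training step raises:
try:
    with m.toggled_optimizer(optimizer=None):
        raise RuntimeError("boom")
except RuntimeError:
    pass
assert not m.toggled
```

The context-manager form saves users from forgetting the matching `untoggle_optimizer` call in manual-optimization training steps.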

tests/tests_pytorch/core/test_lightning_module.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -118,6 +118,7 @@ def test_1_optimizer_toggle_model():
     model.untoggle_optimizer(optimizer)
     assert not model._param_requires_grad_state

+
 def test_1_optimizer_toggle_model_context_manager():
     """Test toggle_model runs when only one optimizer is used."""
     model = BoringModel()
```
