
Conversation

@amorehead (Contributor) commented May 3, 2025:

What does this PR do?

Adds gradient norm clipping support for FSDP. Tests pass locally (see the usage sketch below).

For fun, here's a research deep dive ChatGPT came up with comparing norm-based and value-based gradient clipping: https://chatgpt.com/s/dr_68168a3400988191be64b3c743a4ccf3.

Fixes #19235
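
For context, here's a minimal sketch of how the feature would be exercised from the Trainer; the flag values and hardware settings are illustrative assumptions, not taken from this PR:

# Illustrative sketch (assumed usage, not from this PR): with this change,
# gradient_clip_algorithm="norm" should work under the FSDP strategy instead
# of being rejected. Assumes a multi-GPU machine; adjust devices as needed.
import lightning as L

trainer = L.Trainer(
    accelerator="gpu",
    devices=2,
    strategy="fsdp",
    gradient_clip_val=1.0,           # maximum gradient norm
    gradient_clip_algorithm="norm",  # norm-based clipping under FSDP
)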

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--20784.org.readthedocs.build/en/20784/

@github-actions bot added the pl label (Generic label for PyTorch Lightning package) on May 3, 2025
@amorehead (Contributor, Author) commented May 3, 2025:

For some reason, Read the Docs is raising the following build error, which I don't believe my PR caused:

  File "/tmp/pip-build-env-8dnu3z25/normal/lib/python3.9/site-packages/pbr/packaging.py", line 492, in run
    bs_cmd, 'executable', easy_install.sys_executable)
AttributeError: module 'setuptools.command.easy_install' has no attribute 'sys_executable'
[end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for sphinxcontrib-fulltoc
ERROR: Failed to build installable wheels for some pyproject.toml based projects (sphinxcontrib-fulltoc)
Package         Version
--------------- -----------
awscli          1.40.7
botocore        1.38.8
colorama        0.4.6
distlib         0.3.9
docutils        0.19
filelock        3.16.1
jmespath        1.0.1
pip             25.1.1
platformdirs    3.11.0
py-tree         1.0.1
pyasn1          0.6.1
python-dateutil 2.9.0.post0
PyYAML          6.0.2
rsa             4.7.2
s3transfer      0.12.0
setuptools      58.1.0
six             1.17.0
urllib3         1.26.20
virtualenv      20.21.1

[rtd-command-info] start-time: 2025-05-03T21:35:45.185287Z, end-time: 2025-05-03T21:35:45.237650Z, duration: 0, exit-code: 2
bash docs/rtfd-build.sh
+ '[' 20784 == latest -o 20784 == stable ']'
+ export FAST_DOCS_DEV=1
+ FAST_DOCS_DEV=1
++ pwd
+ root=/home/docs/checkouts/readthedocs.org/user_builds/pytorch-lightning/checkouts/20784
+ for pkg in 'fabric' 'pytorch'
+ cd /home/docs/checkouts/readthedocs.org/user_builds/pytorch-lightning/checkouts/20784/docs/source-fabric
++ nproc
+ make html --jobs 2
/bin/sh: 1: sphinx-build: not found
make: *** [Makefile:19: html] Error 127

@amorehead (Contributor, Author) commented:

I'm also not sure what's causing the following "Code check / mypy (pull_request)" error:

src/lightning/pytorch/plugins/precision/fsdp.py:89: error: "Tensor" not callable  [operator]
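
My reading of the error (an assumption on my part, not confirmed from the CI logs): torch annotates nn.Module.__getattr__ as returning Union[Tensor, Module], so any attribute the stubs don't know about, such as the clip_grad_norm_ that FSDP adds at wrap time, is inferred as possibly a Tensor, and calling it trips the operator check. A minimal repro sketch:

# Minimal repro sketch of the suspected cause (assumed, not verified against
# the CI environment): for attributes not declared on nn.Module, mypy falls
# back to Module.__getattr__, annotated to return Union[Tensor, Module];
# Tensor is not callable, hence the error. At runtime this attribute only
# exists on FSDP-wrapped modules.
from torch.nn import Module

def clip(module: Module) -> None:
    module.clip_grad_norm_(1.0)  # mypy: "Tensor" not callable  [operator]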

stale bot commented Jul 19, 2025:

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. If you need further help see our docs: https://lightning.ai/docs/pytorch/latest/generated/CONTRIBUTING.html#pull-request or ask the assistance of a core contributor here or on Discord. Thank you for your contributions.

@stale stale bot added the won't fix label (This will not be worked on) on Jul 19, 2025
@stale stale bot removed the won't fix label on Aug 19, 2025
@Borda (Member) commented Sep 3, 2025:

Let's check the typing, which is now set for PT 2.8.

@amorehead (Contributor, Author) commented:

@Borda,

I have no idea why mypy is flagging the following (last) line of this function:

@override
def clip_grad_by_norm(self, module: Optional[Module], optimizer: Optimizer, clip_val: Union[int, float]) -> None:
    # see https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
    if module is None:
        return
    module.clip_grad_norm_(clip_val)

It thinks that module.clip_grad_norm_ can sometimes be a torch.Tensor, which in practice will never happen (as the other unit tests show). I could add a goofy extra if-check, something like if isinstance(module.clip_grad_norm_, Tensor): return, but I'll leave it up to you how to proceed.
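
A less goofy alternative (a hypothetical sketch, not what the PR does) would be to narrow the module itself rather than the attribute:

# Hypothetical alternative (not the PR's code): narrowing to the FSDP wrapper
# class, whose clip_grad_norm_ is a real method, keeps mypy from falling back
# to Module.__getattr__ and inferring a possible Tensor.
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

if isinstance(module, FSDP):
    module.clip_grad_norm_(clip_val)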

@override
def clip_gradients(
    self,
    module: Optional[Module],
@Borda (Member) commented:

This would be a breaking change; it has to go at the end of the arguments.

@amorehead (Contributor, Author) commented:

Are you referring to how other codebases like Fabric call clip_gradients? As far as I can tell from this PR's unit tests, no references in the Lightning codebase are broken by this change. If so, for clarification: would module have to become module: Optional[Module] = None as the last argument in all of the modified functions below?

@Borda (Member) commented:

I am saying that if a user calls it with positional arguments, this change will break for them.

@amorehead (Contributor, Author) commented Sep 10, 2025:

Great point. I've made the new module argument fully optional by listing it last as module: Optional[Module] = None. Let me know if you see anything else that needs to be addressed.
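
For the record, a hedged sketch of the compatibility point, using a simplified signature (an assumption; the PR's real signature has more parameters):

# Hedged sketch with a simplified signature (assumed shape, not the PR's
# exact one). Appending the new argument with a default keeps existing
# positional calls working.
from typing import Optional, Union
from torch.nn import Module
from torch.optim import Optimizer

def clip_gradients(
    optimizer: Optimizer,
    clip_val: Union[int, float] = 0.0,
    module: Optional[Module] = None,  # new argument, appended last
) -> None:
    ...

# A pre-existing positional call such as clip_gradients(opt, 1.0) still binds
# as intended; had `module` been prepended instead, `opt` would bind to
# `module` and 1.0 to `optimizer`, breaking positional callers.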

@Borda requested a review from SkafteNicki on September 10, 2025
Successfully merging this pull request may close these issues: Support gradient clipping by norm with FSDP.