
Properly implement gradient clipping for FSDP #1740

Merged: 8 commits, Dec 6, 2022

Conversation

@bcui19 (Contributor) commented Nov 18, 2022

What does this PR do?

Because FSDP shards parameters and gradients across ranks, clipping gradients requires FSDP's own `clip_grad_norm_` method rather than `torch.nn.utils.clip_grad_norm_`:

https://pytorch.org/docs/stable/fsdp.html#torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_
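To illustrate why sharding changes things: each rank holds only a shard of the gradients, so the global L2 norm must be computed by combining per-rank squared norms (FSDP does this with an all-reduce) before any clipping coefficient can be applied. Below is a minimal pure-Python sketch of that logic, not the actual FSDP implementation; the helper name `clip_sharded_grad_norm` is hypothetical, shards are modeled as plain lists, and a simple `sum` stands in for the all-reduce.

```python
import math

def clip_sharded_grad_norm(shards, max_norm, eps=1e-6):
    """Clip gradients held as per-rank shards by their *global* L2 norm.

    Hypothetical sketch: each inner list plays the role of one rank's
    gradient shard. A naive per-shard clip would use the wrong norm;
    the per-shard squared norms must be combined first (FSDP uses an
    all-reduce here, simulated by a plain sum over shards).
    """
    local_sq = [sum(g * g for g in shard) for shard in shards]  # per-rank ||g_shard||^2
    total_norm = math.sqrt(sum(local_sq))                       # "all-reduce", then sqrt
    clip_coef = min(1.0, max_norm / (total_norm + eps))         # same rule torch uses
    clipped = [[g * clip_coef for g in shard] for shard in shards]
    return clipped, total_norm
```

For example, shards `[[3.0], [4.0]]` have a global norm of 5.0, so with `max_norm=2.5` every element is scaled by roughly 0.5 on every rank, which is exactly the coordination a per-shard clip would miss.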

What issue(s) does this change relate to?

CO-1427
CO-1452

Before submitting

  • Have you read the contributor guidelines?
  • Is this change a documentation change or typo fix? If so, skip the rest of this checklist.
  • Was this change discussed/approved in a GitHub issue first? It is much more likely to be merged if so.
  • Did you update any related docs and document your change?
  • Did you update any related tests and add any new tests related to your change? (see testing)
  • Did you run the tests locally to make sure they pass?
  • Did you run pre-commit on your change? (see the pre-commit section of prerequisites)

@bcui19 bcui19 changed the title Fsdp grad clip [DRAFT] Fsdp grad clip Nov 18, 2022
@bcui19 bcui19 changed the title [DRAFT] Fsdp grad clip Properly implement gradient clipping for FSDP Dec 6, 2022
@bcui19 bcui19 requested a review from abhi-mosaic December 6, 2022 19:43
@bcui19 bcui19 marked this pull request as ready for review December 6, 2022 19:43

@mvpatel2000 mvpatel2000 left a comment


minor comments, mostly lgtm

@bcui19 bcui19 requested a review from mvpatel2000 December 6, 2022 21:50
@bcui19 bcui19 merged commit 043ef26 into mosaicml:dev Dec 6, 2022
bmosaicml pushed a commit to bmosaicml/composer that referenced this pull request Dec 13, 2022
Properly implement gradient clipping for FSDP: because of sharding, it must be handled differently than in standard DDP
@bcui19 bcui19 deleted the fsdp_grad_clip branch March 10, 2023 18:02