
[Trainer] update clear_grad #8829

Merged: 1 commit merged into PaddlePaddle:develop on Aug 1, 2024

Conversation

DesmonDay (Contributor) commented Jul 29, 2024

PR types

Others

PR changes

Others

Description

Set self.optimizer.clear_grad(set_to_zero=False) in the Trainer, so that gradient storage is released instead of being zero-filled.
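For context, here is a minimal plain-Python sketch of the difference between the two clear_grad modes. The `Param` and `ToyOptimizer` classes are hypothetical stand-ins for illustration, not Paddle APIs; the intended semantics are that `set_to_zero=True` keeps each gradient buffer and overwrites it with zeros, while `set_to_zero=False` drops the storage so memory can be freed between steps:

```python
# Toy illustration of clear_grad(set_to_zero=...) semantics.
# Param and ToyOptimizer are hypothetical stand-ins, not Paddle APIs.

class Param:
    def __init__(self, grad):
        self.grad = grad  # list of floats standing in for a gradient tensor

class ToyOptimizer:
    def __init__(self, params):
        self.params = params

    def clear_grad(self, set_to_zero=True):
        for p in self.params:
            if set_to_zero:
                # Keep the buffer, fill it with zeros (costs a write pass).
                p.grad = [0.0] * len(p.grad)
            else:
                # Release the buffer; nothing is held until the next backward.
                p.grad = None

p = Param([0.5, -1.2])
opt = ToyOptimizer([p])

opt.clear_grad(set_to_zero=True)
print(p.grad)  # [0.0, 0.0] -- buffer kept, zero-filled

p.grad = [0.3, 0.7]
opt.clear_grad(set_to_zero=False)
print(p.grad)  # None -- storage released
```

In a real framework the released-storage mode avoids holding (and rewriting) one full set of gradient buffers between steps, which is the memory saving this PR targets.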

paddle-bot commented Jul 29, 2024

Thanks for your contribution!

DesmonDay force-pushed the add_clear_grad branch 2 times, most recently from 97eb19d to 19311b5 on July 30, 2024 04:06
codecov bot commented Jul 30, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 55.51%. Comparing base (ee4944e) to head (45e8afd).
Report is 2 commits behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #8829      +/-   ##
===========================================
+ Coverage    55.44%   55.51%   +0.07%     
===========================================
  Files          631      631              
  Lines        98542    98545       +3     
===========================================
+ Hits         54632    54710      +78     
+ Misses       43910    43835      -75     


wawltor (Collaborator) previously approved these changes Jul 30, 2024

LGTM

@@ -355,6 +355,8 @@ class TrainingArguments:
Whether to skip the profile timer; the timer records time usage of forward/backward/step, etc.
distributed_dataloader (`bool`, *optional*):
Whether to use distributed dataloader. Default is `False`.
release_grads (`bool`, *optional*):
Whether to release gradients during training. Default is `False`.
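The added `release_grads` flag selects between the two clearing modes. As a hedged sketch (toy code, not the actual PaddleNLP Trainer — `RecordingOptimizer`, `Args`, and `end_of_step` are illustrative names), the branch at the end of an optimizer step could look like:

```python
# Illustrative only: how a release_grads-style flag could choose the
# clear_grad mode at the end of each training step.

class RecordingOptimizer:
    """Toy optimizer that records how clear_grad was called."""
    def __init__(self):
        self.calls = []

    def clear_grad(self, set_to_zero=True):
        self.calls.append(set_to_zero)

class Args:
    def __init__(self, release_grads=False):
        self.release_grads = release_grads  # mirrors the documented default

def end_of_step(optimizer, args):
    if args.release_grads:
        # Free gradient storage between steps to lower peak memory.
        optimizer.clear_grad(set_to_zero=False)
    else:
        # Default behavior: keep the buffers and zero-fill them.
        optimizer.clear_grad(set_to_zero=True)

opt = RecordingOptimizer()
end_of_step(opt, Args(release_grads=True))
end_of_step(opt, Args(release_grads=False))
print(opt.calls)  # [False, True]
```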
ZHUI (Collaborator) left a comment

LGTM

DesmonDay merged commit 12ba7ee into PaddlePaddle:develop Aug 1, 2024
10 of 12 checks passed
DrownFish19 pushed a commit to DrownFish19/PaddleNLP that referenced this pull request Aug 2, 2024
Labels: None yet
Projects: None yet
3 participants