
Fix sharding < 100 limitation bug #8141

Merged

Conversation

@sneaxiy sneaxiy (Collaborator) commented Mar 18, 2024

PR types

Bug fixes

PR changes

Others

Description

Fix the sharding_degree < 100 limitation bug.
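The page does not show the diff, so as a rough, hypothetical sketch only (not the actual PaddleNLP code touched by this PR): a "sharding_degree < 100" cap often comes from a fixed two-digit field for the sharding rank, e.g. in shard filenames or an encoded global index, which stops working once the degree reaches 100. The function and variable names below are illustrative assumptions.

```python
# Hypothetical illustration only -- not the actual PaddleNLP code changed by PR #8141.
# A fixed two-digit rank field implicitly assumes sharding_degree < 100.

def shard_suffix_fixed_width(sharding_rank: int) -> str:
    # Pre-fix style (hypothetical): always pads to two digits, so once
    # sharding_degree >= 100 the field width is no longer uniform and any
    # fixed-width parsing or 2-digit encoding of the rank breaks.
    return f"shard_{sharding_rank:02d}"

def shard_suffix_degree_aware(sharding_rank: int, sharding_degree: int) -> str:
    # Post-fix style (hypothetical): pad to the width actually required by the
    # configured sharding degree, so any degree works.
    width = len(str(sharding_degree - 1))
    return f"shard_{sharding_rank:0{width}d}"

if __name__ == "__main__":
    degree = 128
    print(shard_suffix_fixed_width(5))            # shard_05
    print(shard_suffix_fixed_width(105))          # shard_105 (width no longer uniform)
    print(shard_suffix_degree_aware(5, degree))   # shard_005
    print(shard_suffix_degree_aware(105, degree)) # shard_105 (uniform 3-digit width)
```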


paddle-bot bot commented Mar 18, 2024

Thanks for your contribution!

@wawltor wawltor (Collaborator) left a comment

LGTM

@sneaxiy sneaxiy merged commit e897d8c into PaddlePaddle:paddlenlp-2.7.2-fleetv Mar 19, 2024
4 of 5 checks passed
@sneaxiy sneaxiy deleted the fix_sharding_100_limit branch March 19, 2024 02:36
jerrywgz added a commit to PaddlePaddle/PaddleMIX that referenced this pull request Mar 21, 2024
Update PaddleNLP to fix sharding_degree < 100 limitation bug. PaddleNLP PR is PaddlePaddle/PaddleNLP#8141.

Co-authored-by: wangguanzhong <jerrywgz@126.com>
sneaxiy added a commit to sneaxiy/PaddleNLP that referenced this pull request Apr 9, 2024
sneaxiy added a commit that referenced this pull request Apr 10, 2024
* fix sharding <100 limitation (#8141)

* Fix timer and add loss_cur_dp (#8174)

* remove log_history (#8200)
westfish pushed a commit to westfish/PaddleMIX that referenced this pull request Sep 25, 2024 (PaddlePaddle#478)

Update PaddleNLP to fix sharding_degree < 100 limitation bug. PaddleNLP PR is PaddlePaddle/PaddleNLP#8141.

Co-authored-by: wangguanzhong <jerrywgz@126.com>