
[Cherry-pick] Add config for broadcast data in model parallel #57843

Merged: 2 commits merged into PaddlePaddle:release/2.5 on Oct 17, 2023

Conversation

ForFishes (Member) commented on Sep 28, 2023

PR types: New features

PR changes: Others

Description

[Cherry-pick] Add config for broadcast data in model parallel
Pcard-70448

Supports removing the broadcast of input data in model parallel; cherry-picked from #57567.
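
As context for how this flag would be consumed, the sketch below shows one way the new field could be set through Paddle's hybrid-parallel strategy. This is a minimal illustration, not code from this PR: the nested "mp_configs" key and the exact fleet calls are assumptions based on the usual DistributedStrategy usage pattern.

    import paddle.distributed.fleet as fleet

    # Hypothetical usage sketch (not part of this PR): build a hybrid-parallel
    # strategy and turn off the input-data broadcast controlled by the new
    # need_broadcast_data field. The "mp_configs" key name is assumed here.
    strategy = fleet.DistributedStrategy()
    strategy.hybrid_configs = {
        "dp_degree": 1,
        "mp_degree": 2,
        "pp_degree": 1,
        "mp_configs": {
            # Skip broadcasting input data across model-parallel ranks,
            # e.g. when every rank already loads identical inputs.
            "need_broadcast_data": False,
        },
    }

    # Initialize the collective fleet environment with this strategy.
    fleet.init(is_collective=True, strategy=strategy)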

paddle-bot commented on Sep 28, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

sneaxiy previously approved these changes on Sep 28, 2023
@@ -56,6 +56,8 @@ message MpConfig {
optional bool sync_grad= 2 [ default = false ];
optional bool sync_moment= 3 [ default = false ];
optional string sync_mode= 4 [ default = 'broadcast' ];
// Broadcast mp input data
optional bool need_broadcast_data=8 [default = true];
Contributor:

broadcast_data_for_tp?

ForFishes (Member, Author):

In the mp config, there should be no need to append "for_tp".

sneaxiy merged commit 1be4c8d into PaddlePaddle:release/2.5 on Oct 17, 2023
16 of 18 checks passed
ForFishes deleted the release/2.5 branch on October 17, 2023 at 03:26