
[xdoctest] reformat example code with google style in 192-197 #55926

Merged
merged 15 commits into PaddlePaddle:develop from the xdoc2 branch on Aug 25, 2023

Conversation

Liyulingyue
Contributor

PR types

Others

PR changes

Others

Description

Reformat the example code in the following files to the new style.

#55629
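For context, the target Google/xdoctest style replaces bare example code and trailing result comments with >>> and ... prompts plus output that the doctest runner can check. A minimal before/after sketch (illustrative only, not a literal excerpt from this PR's diff):

# Old style: the result is left in a comment
import paddle
x = paddle.to_tensor([1.0, 2.0, 3.0])
# x.shape = [3]

# New style: doctest prompts plus the printed output
>>> import paddle
>>> x = paddle.to_tensor([1.0, 2.0, 3.0])
>>> print(x.shape)
[3]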

@paddle-bot

paddle-bot bot commented Aug 2, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

Comment on lines 80 to 82
# one_hot_label.shape = [4, 4]
# one_hot_label = [[0., 1., 0., 0.],
# [0., 1., 0., 0.],
Member

These need to be changed to the print form (show the result with a print call).
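A sketch of the print form being asked for, assuming the docstring's label tensor starts with two entries equal to 1 (matching the quoted rows); the remaining values are placeholders:

>>> import paddle
>>> label = paddle.to_tensor([1, 1, 3, 0], dtype="int64")
>>> one_hot_label = paddle.nn.functional.one_hot(label, num_classes=4)
>>> print(one_hot_label.shape)
[4, 4]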

@luotao1 luotao1 added the HappyOpenSource label (happy open source activity issues and PRs) Aug 3, 2023
@Liyulingyue Liyulingyue changed the title from "[xdoctest] reformat example code with google style in No.78" to "[xdoctest] reformat example code with google style in No.78, 192-197" Aug 3, 2023
@Liyulingyue Liyulingyue changed the title from "[xdoctest] reformat example code with google style in No.78, 192-197" to "[xdoctest] reformat example code with google style in 192-197" Aug 3, 2023
@luotao1 luotao1 added the HappyOpenSource Pro label (advanced happy open source activity with more challenging tasks) and removed the HappyOpenSource label (happy open source activity issues and PRs) Aug 7, 2023
@@ -52,18 +52,18 @@ def all_gather(tensor_list, tensor, group=None, sync_op=True):
.. code-block:: python

# required: distributed
Member

Go check what 顺师傅 said; the corresponding requires directive needs to be added.
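Presumably the directive meant here is the xdoctest environment guard that replaces the old # required: distributed comment. A sketch of the converted example header, assuming the +REQUIRES(env:DISTRIBUTED) convention used elsewhere in Paddle's docstrings:

>>> # doctest: +REQUIRES(env:DISTRIBUTED)
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()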

Resolved (outdated) review threads on:
python/paddle/distributed/communication/all_gather.py (3 threads)
python/paddle/distributed/communication/all_reduce.py
python/paddle/distributed/communication/all_to_all.py
python/paddle/distributed/communication/broadcast.py (3 threads)
python/paddle/distributed/communication/gather.py (2 threads)
@@ -43,21 +43,21 @@ class P2POp:

# required: distributed
Member

This part was not changed.

Comment on lines +154 to +155
>>> # paddle.tensor([1, 2]) # Rank-0
>>> # paddle.tensor([0, 1]) # Rank-1
Member

Could this be written as the actual output instead?

Contributor Author

Because I don't know what it is supposed to output! (No good justification, but stated with full confidence.)
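One reason this is hard to express as checked output: the doctest runs in a single process, while the values differ per rank. A common workaround is to keep the per-rank results as comments, roughly like this sketch (tensor values taken from the quoted snippet; the rank handling is illustrative):

>>> # doctest: +REQUIRES(env:DISTRIBUTED)
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()
>>> if dist.get_rank() == 0:
...     data = paddle.to_tensor([1, 2])
... else:
...     data = paddle.to_tensor([0, 1])
>>> # data on rank 0: [1, 2]
>>> # data on rank 1: [0, 1]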

... data = paddle.to_tensor([[1, 2, 3], [1, 2, 3]])
>>> dist.all_gather(tensor_list, data)
>>> print(tensor_list)
[[[4, 5, 6], [4, 5, 6]], [[1, 2, 3], [1, 2, 3]]] (2 GPUs)
Member

Let's restore this to the comment form.
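A sketch of the comment form being asked for: keep the print call, but move the multi-GPU result back into a comment so the single-process doctest does not try to match it (based on the quoted all_gather snippet; the rank-dependent setup is assumed from the surrounding docstring):

>>> # doctest: +REQUIRES(env:DISTRIBUTED)
>>> import paddle
>>> import paddle.distributed as dist
>>> dist.init_parallel_env()
>>> tensor_list = []
>>> if dist.get_rank() == 0:
...     data = paddle.to_tensor([[4, 5, 6], [4, 5, 6]])
... else:
...     data = paddle.to_tensor([[1, 2, 3], [1, 2, 3]])
>>> dist.all_gather(tensor_list, data)
>>> print(tensor_list)
>>> # [[[4, 5, 6], [4, 5, 6]], [[1, 2, 3], [1, 2, 3]]] (2 GPUs)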

@paddle-ci-bot

paddle-ci-bot bot commented Aug 23, 2023

Sorry to inform you that the CI runs for f1ad0c5 passed more than 7 days ago. To prevent PR conflicts, you need to re-run all CIs manually.

@luotao1
Contributor

luotao1 commented Aug 23, 2023

@Liyulingyue please make one more round of changes; this can be merged soon.

Resolved (outdated) review threads on:
python/paddle/distributed/communication/all_gather.py (2 threads)
python/paddle/distributed/communication/all_reduce.py
python/paddle/distributed/communication/all_to_all.py (2 threads)
python/paddle/distributed/communication/broadcast.py (2 threads)
python/paddle/distributed/communication/gather.py
Member

@SigureMo SigureMo left a comment

LGTMeow 🐾

@luotao1 luotao1 merged commit bde1096 into PaddlePaddle:develop Aug 25, 2023
BeingGod pushed a commit to BeingGod/Paddle that referenced this pull request Sep 9, 2023
…Paddle#55926)

* Update input.py

* Update input.py

* Update gather.py

* Update broadcast.py

* Update batch_isend_irecv.py

* Update all_to_all.py

* Update all_reduce.py

* Update all_gather.py

* rollback

* Apply suggestions from code review

* Apply suggestions from code review

* Apply suggestions from code review
@Liyulingyue Liyulingyue deleted the xdoc2 branch September 23, 2023 16:09
Labels
contributor (External developers), HappyOpenSource Pro (advanced happy open source activity with more challenging tasks)
4 participants