
[auto parallel] add softmax backward spmd rule #59039

Merged (4 commits into PaddlePaddle:develop on Nov 30, 2023)

Conversation

@cxxly (Contributor) commented Nov 15, 2023

PR types

New features

PR changes

Others

Description

Pcard-73145
Add softmax backward spmd rule.

@cxxly cxxly force-pushed the softmax_spmd branch 5 times, most recently from ef46bfa to cc8c2f6 on November 20, 2023 02:28
axis = axis < 0 ? out.dims().size() + axis : axis;

// TODO(cxxly): Simplifies the code logic of sharding propagation using
// primitive operators.
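For illustration, the negative-axis normalization in the C++ ternary above can be mirrored in a short Python sketch (the function name `normalize_axis` is mine, not part of the PR):

```python
def normalize_axis(axis: int, ndim: int) -> int:
    # Map a negative axis (e.g. -1 for "last dim") onto its
    # non-negative equivalent, as the C++ ternary above does.
    return ndim + axis if axis < 0 else axis

# softmax over the last axis of a rank-3 tensor resolves to axis 2
print(normalize_axis(-1, 3))  # -> 2
```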
Contributor

Is the rule written here as a composition in order to correspond to the composite operators?

Contributor Author

The current idea is to try a general strategy for the derivation: some complex computation logic is hard to derive directly by intuition, but once it is decomposed into small expressions, the derivation is guaranteed to be mathematically correct.
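As a concrete illustration of that decomposition idea (my own sketch, not code from the PR): the softmax backward itself factors into an elementwise multiply, a reduce-sum along the softmax axis, a subtract, and another multiply, and each of those small expressions has a simple, well-understood sharding-propagation rule.

```python
import math

def softmax(x):
    # Numerically stable softmax over a 1-D list.
    m = max(x)
    e = [math.exp(v - m) for v in x]
    s = sum(e)
    return [v / s for v in e]

def softmax_grad(out, grad_out):
    # grad_x = out * (grad_out - sum(out * grad_out))
    # i.e. elementwise mul -> reduce_sum -> subtract -> mul,
    # each step being an easily propagated primitive.
    dot = sum(o * g for o, g in zip(out, grad_out))
    return [o * (g - dot) for o, g in zip(out, grad_out)]
```

The identity follows from the softmax Jacobian d out_i / d x_j = out_i (delta_ij - out_j), and can be checked numerically against finite differences.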

import paddle.distributed as dist


class TestSoftmaxApiForSemiAutoParallel:
Contributor

Could you add a few more unit tests? For example, a 3-D input with the batch dim and the softmax dim sharded at the same time.

Contributor Author
DONE
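The behavior such a test exercises can be sketched as follows. This is a hypothetical simplification of the rule written by me, not the PR's implementation, using Paddle's dims_mapping convention where -1 means replicated and a non-negative entry names the mesh dimension a tensor dim is sharded on:

```python
def softmax_spmd_infer(x_dims_mapping, axis):
    # Hypothetical sketch: a tensor dim sharded along the softmax
    # axis cannot stay sharded, because the normalization reduces
    # across it, so it is reset to -1 (replicated). Shardings on
    # all other dims propagate to the output unchanged.
    ndim = len(x_dims_mapping)
    axis = ndim + axis if axis < 0 else axis
    out = list(x_dims_mapping)
    out[axis] = -1
    return out

# 3-D input with the batch dim sharded on mesh dim 0 and the
# softmax (last) dim sharded on mesh dim 1: the softmax dim
# gets replicated while the batch sharding is kept.
print(softmax_spmd_infer([0, -1, 1], axis=-1))  # -> [0, -1, -1]
```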

@liuzhenhai93 (Contributor)

LGTM

@liuzhenhai93 liuzhenhai93 self-requested a review November 21, 2023 03:40
@cxxly cxxly force-pushed the softmax_spmd branch 7 times, most recently from 6b15be9 to d236503 on November 27, 2023 10:14
@liuzhenhai93 (Contributor) left a comment

LGTM

@cxxly cxxly force-pushed the softmax_spmd branch 4 times, most recently from 60cfa7c to cd3dab1 on November 28, 2023 09:12
risemeup1 previously approved these changes Nov 28, 2023
@Aurelius84 (Contributor) left a comment

LGTM

@XieYunshen (Contributor) left a comment

LGTM

@cxxly cxxly merged commit d86f686 into PaddlePaddle:develop Nov 30, 2023
29 of 30 checks passed
cxxly added a commit that referenced this pull request Nov 30, 2023
cxxly added a commit that referenced this pull request Nov 30, 2023
cxxly added a commit that referenced this pull request Dec 3, 2023
* [auto parallel] add softmax backward spmd rule

* update test to new eager parallel api

* Revert "[auto parallel] add softmax backward spmd rule (#59039)"

This reverts commit d86f686.

* [auto parallel] add softmax backward spmd rule

* update test to new eager parallel api
6 participants