
[Prim][PIR] Support dynamic shape for relu_grad #65482

Merged · 2 commits · Jun 27, 2024

Conversation

zeroRains
Contributor

@zeroRains zeroRains commented Jun 26, 2024

PR Category

Operator Mechanism

PR Types

New features

Description

Support dynamic shape in the backward decomposition of relu_grad, and add dynamic-shape unit tests for relu_grad and sigmoid_grad.


paddle-bot bot commented Jun 26, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot paddle-bot bot added the contributor External developers label Jun 26, 2024
if (has_dynamic_shape(out.dims())) {  // condition reconstructed from context: dims unknown until runtime
  // dynamic-shape branch: build the zeros tensor from out's runtime shape tensor
  zeros = backend::full_with_tensor<T>(shape<T>(out), 0.0, out.dtype());
} else {
  // static-shape branch: dims are known, so a plain full<T> suffices
  zeros = full<T>(common::vectorize(out.dims()), 0.0, out.dtype());
}
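The snippet above only changes how the zeros tensor is materialized; the composite backward rule itself is `x_grad = where(out > 0, out_grad, 0)`. A minimal NumPy sketch of that rule (function and variable names are illustrative, not Paddle's API; `np.zeros_like` plays the role of both `full` branches since NumPy shapes are always known at call time):

```python
import numpy as np

def relu_grad_decomposed(out, out_grad):
    # Composite rule for relu backward: the gradient flows only
    # where the forward output was positive, and is zero elsewhere.
    zeros = np.zeros_like(out)  # analogue of full / full_with_tensor
    return np.where(out > zeros, out_grad, zeros)
```
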
Contributor


You could unify both branches by using full_scalar(0., variance.dtype()).
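The reviewer's suggestion relies on broadcasting: a scalar zero expands to `out`'s shape whether that shape is static or dynamic, so no explicit zeros tensor (and no shape branch) is needed. A NumPy sketch of the idea (`full_scalar` is Paddle's helper; the NumPy call below is only an analogy):

```python
import numpy as np

def relu_grad_unified(out, out_grad):
    # A scalar 0.0 broadcasts against out and out_grad,
    # so no zeros tensor of matching shape has to be built.
    return np.where(out > 0.0, out_grad, 0.0)
```
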

@cyber-pioneer cyber-pioneer merged commit 7024205 into PaddlePaddle:develop Jun 27, 2024
30 of 32 checks passed
@zeroRains zeroRains deleted the relu_grad branch June 27, 2024 08:16
Labels
contributor External developers
2 participants