
[Hackathon No.62] Improve FP16/BF16 unit tests for the digamma and dirichlet operators #52604

Merged · 2 commits · Apr 14, 2023

Conversation

@co63oc (Contributor) commented Apr 6, 2023

PR types

Others

PR changes

Others

Describe

Improve the FP16/BF16 unit tests for the digamma and dirichlet operators.

For dirichlet, the bfloat16 test results differ from the reference values by less than 0.3.

Docs change: PaddlePaddle/docs#5782
The digamma documentation has been updated; dirichlet has no corresponding documentation.
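The 0.3 tolerance for the bfloat16 dirichlet test above is plausible given bfloat16's 8-bit significand. A small sketch in plain NumPy (not Paddle; `to_bfloat16` is a hypothetical helper that truncates float32 to bfloat16 precision) showing the relative-error bound:

```python
import numpy as np

def to_bfloat16(x):
    """Hypothetical helper: reduce float32 to bfloat16 precision by
    truncating the low 16 bits of the IEEE-754 bit pattern."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

x = np.linspace(0.5, 10.0, 100).astype(np.float32)
rel_err = np.abs(to_bfloat16(x) - x) / np.abs(x)
# bfloat16 keeps an 8-bit significand, so truncation keeps the
# relative error below 2**-7 (~0.8%); absolute errors on outputs
# of magnitude ~10-100 can therefore reach a few tenths.
print(rel_err.max())
```

This is why BF16 tests typically need far looser tolerances than FP32 ones.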

@paddle-bot commented Apr 6, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

using MPTypeScalar = typename phi::dtype::MPTypeTrait<ScalarT>::Type;
using MPTypeAccscalar = typename phi::dtype::MPTypeTrait<AccscalarT>::Type;

// AccscalarT scale = 1.0f;
A reviewer (Contributor) commented:

Please delete the commented-out code.

The author (co63oc) replied:

Done, deleted.
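`MPTypeTrait<T>::Type` in the snippet above maps a storage type such as float16/bfloat16 to the wider type the math is actually done in. A minimal NumPy sketch of that upcast-compute-downcast pattern (illustrative only; the function names here are made up, not Paddle APIs):

```python
import numpy as np

def sum_mp(x):
    """Upcast-compute-downcast: accumulate fp16 inputs in fp32
    (the 'MPType'), then cast the result back to fp16."""
    acc = np.float32(0.0)
    for v in x:
        acc += np.float32(v)
    return np.float16(acc)

def sum_naive(x):
    """Accumulate directly in fp16: once the running sum is large,
    each 0.1-sized addend rounds away entirely and the sum stalls."""
    acc = np.float16(0.0)
    for v in x:
        acc += v
    return acc

x = np.full(4096, 0.1, dtype=np.float16)
print(sum_mp(x), sum_naive(x))  # mixed-precision stays near 409.5
```

The same reasoning motivates computing digamma/dirichlet kernels in float even when inputs and outputs are FP16/BF16.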

@@ -83,33 +87,41 @@ HOSTDEVICE ScalarT
sample_gamma(ScalarT alpha,
A reviewer (Contributor) commented:

When this is called from outside, do the template parameters there also need to be updated?

auto sample =
        sample_gamma<T, T, decltype(uniform_lambda), decltype(normal_lambda)>(
            alpha_[index], standard_uniform, standard_normal);

The author (co63oc) replied:

I don't see the problem here: the call passes alpha and gamma, both of type T, while sample_gamma is declared with the two template types ScalarT and AccscalarT, so the call works as-is.
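For context, a `sample_gamma` taking uniform and normal generators as extra template parameters is consistent with the Marsaglia-Tsang squeeze method commonly used for Gamma sampling; this is an assumption about the kernel, not something visible in the diff. A plain-Python sketch for alpha >= 1:

```python
import math
import random

def sample_gamma(alpha, uniform, normal):
    """Marsaglia-Tsang-style sampler for Gamma(alpha, 1), alpha >= 1.
    uniform/normal play the role of the functor template parameters."""
    d = alpha - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = normal()
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue  # reject candidates outside the support
        u = uniform()
        if math.log(u) < 0.5 * x * x + d - d * v + d * math.log(v):
            return d * v

rng = random.Random(0)
samples = [sample_gamma(4.0, rng.random, lambda: rng.gauss(0.0, 1.0))
           for _ in range(20000)]
mean = sum(samples) / len(samples)  # E[Gamma(4, 1)] = 4
```

Passing the generators in (rather than hard-coding them) is what makes the `decltype(uniform_lambda)` instantiation in the call site above possible.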

def init_dtype_type(self):
    self.dtype = np.float16

def test_check_grad_normal(self):
A reviewer (Contributor) commented:

Since this class inherits directly from TestDigammaOp, these two lines shouldn't need to be added.

The author (co63oc) replied:

Done, deleted.
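The reviewer's point rests on Python method lookup: a subclass that only overrides the dtype hook still runs every inherited test, so re-declaring identical methods is redundant. A minimal sketch (class names modeled on the PR's tests, but simplified stand-ins, not Paddle's actual OpTest machinery):

```python
import unittest
import numpy as np

class TestDigammaOp(unittest.TestCase):
    """Simplified stand-in for the PR's base test class."""
    def init_dtype_type(self):
        self.dtype = np.float32

    def test_check_grad_normal(self):
        self.init_dtype_type()
        self.assertIn(self.dtype, (np.float32, np.float16))

class TestDigammaOpFp16(TestDigammaOp):
    # Overriding only the dtype hook is enough; re-declaring
    # test_check_grad_normal here would be redundant, since the
    # inherited method already calls the overridden hook.
    def init_dtype_type(self):
        self.dtype = np.float16
```

Running `TestDigammaOpFp16` executes the inherited `test_check_grad_normal` with `np.float16`, which is exactly why the duplicated lines could be dropped.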

@co63oc (Contributor) commented Apr 14, 2023

@luotao1 @ZzSean CI has finished. The PR-CI-Inference failure is a file-size issue (see attached image).

@ZzSean (Contributor) left a review:

LGTM

@luotao1 luotao1 merged commit 7ecbcc0 into PaddlePaddle:develop Apr 14, 2023
@co63oc co63oc deleted the digamma branch April 18, 2023 15:45
Labels: contributor (External developers)
5 participants