[Inference] auto mixed precision inference support white list #56535
Conversation
Your PR was submitted successfully. Thank you for your contribution to this open source project!
// cast_op_2_out
// ->
// pre_op -> cast_op_2_out
struct FindTwoCastOpPattern : public PatternBase {
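The pattern above matches two consecutive cast ops so the first one can be folded away. As a minimal illustration of that rewrite (not Paddle's actual IR or pass code; the `Op` struct and function name here are hypothetical), folding back-to-back casts in a linear op chain looks like this:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical simplified IR node: an op in a linear chain.
// out_dtype is only meaningful for "cast" ops.
struct Op {
  std::string type;
  std::string out_dtype;
};

// Fold consecutive cast ops: cast(a->b) followed by cast(b->c) keeps only
// the final cast, mirroring the "pre_op -> cast_op_2_out" rewrite above.
std::vector<Op> FoldConsecutiveCasts(const std::vector<Op>& ops) {
  std::vector<Op> folded;
  for (const Op& op : ops) {
    if (op.type == "cast" && !folded.empty() && folded.back().type == "cast") {
      // Merge the two casts into one that targets the final dtype.
      folded.back().out_dtype = op.out_dtype;
    } else {
      folded.push_back(op);
    }
  }
  return folded;
}
```

The real pass works on a graph via pattern matching rather than a flat vector, but the simplification it performs is the same.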
Why does this PR include changes to this pass?
See the discussion in the linked issue.
LGTM
…Paddle#56535)

* auto mixed precision inference support white list
* update
* update
* update
* move down identity_op_clean_pass
* fix code style
PR types
New features
PR changes
Others
Description
For custom ops, auto mixed precision inference now supports a white list.
Operators in the white list are treated as supporting low precision by default, bypassing the internal capability check; the black list works the opposite way.
NOTE: All API changes are backward-compatible upgrades.
In addition, this PR also does the following work:
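The precedence rule described above can be sketched as follows. This is a hypothetical standalone function, notAadle's actual API; the function and parameter names are made up for illustration:

```cpp
#include <cassert>
#include <set>
#include <string>

// Hypothetical sketch of the white/black list precedence rule:
// - black-listed ops always run in full precision;
// - white-listed ops are assumed to support low precision, skipping the
//   internal capability check;
// - all other ops fall back to that internal check.
bool RunInLowPrecision(const std::string& op_type,
                       const std::set<std::string>& white_list,
                       const std::set<std::string>& black_list,
                       bool kernel_supports_low_precision) {
  if (black_list.count(op_type)) return false;  // never lowered
  if (white_list.count(op_type)) return true;   // lowered unconditionally
  return kernel_supports_low_precision;         // default: internal check
}
```

This is why the white list matters for custom ops: the internal check knows nothing about externally registered kernels, so the white list is the only way to opt them into low precision.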
Related issue: #56491
Related doc: PaddlePaddle/Paddle-Inference-Demo#463
Others
Pcard-71500
Bug fix: #56664