Error when using mixed precision #39881
Comments
Hi! We've received your issue; please be patient while we arrange for technicians to answer your questions as soon as possible. Please make sure you have provided a clear problem description, reproduction code, environment & version, and error messages. You may also check the API docs, FAQ, historical GitHub issues, and the AI community for an answer. Have a nice day!
Hi, could you post your model code so we can take a look?
Hi, here is the model code: https://github.com/jyjfjyjf/DeBERTa
This error occurs because the network uses a PyLayer. The PyLayer's forward runs inside auto cast, so some of its computations may use fp16, but its backward does not run under auto cast, hence the error. You can either (1) add `with auto_cast(enable=False)` in the forward to disable mixed precision there, or (2) add the same `with auto_cast(enable=True)` in the backward as in the forward.
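A minimal sketch of option (1), assuming a toy PyLayer whose body (a plain softmax) is a placeholder, not the actual op from the reported model:

```python
import paddle
from paddle.autograd import PyLayer

class MyOp(PyLayer):  # hypothetical stand-in for the model's PyLayer
    @staticmethod
    def forward(ctx, x):
        # Option (1): force this region to run in fp32 even when the
        # caller is inside paddle.amp.auto_cast(enable=True)
        with paddle.amp.auto_cast(enable=False):
            y = paddle.nn.functional.softmax(x.astype('float32'), axis=-1)
        ctx.save_for_backward(y)
        return y

    @staticmethod
    def backward(ctx, dy):
        y, = ctx.saved_tensor()
        # softmax backward: dx = y * (dy - sum(dy * y))
        return y * (dy - (dy * y).sum(axis=-1, keepdim=True))
```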
Tried both, and neither works. Using option (1) raises a different error.
Does `where` not support fp16?
I added it before backward, but still get the same error as before.
I added auto cast before the `where` in backward, but now I get a new error.
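If `paddle.where` has no fp16 kernel in the Paddle version in use, one possible workaround (an assumption on my part, not something confirmed in this thread) is to run the `where` in float32 and cast the result back:

```python
import paddle

def where_fp32(cond, a, b):
    # Hypothetical helper: run paddle.where in float32 when a/b may be
    # float16 and the fp16 kernel is unavailable, then cast back.
    out = paddle.where(cond, a.astype('float32'), b.astype('float32'))
    return out.astype(a.dtype)
```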
Which version of Paddle are you using?
This version:
How did you change it? Could you post the surrounding code so we can take a look?
For option (2), add the same `with auto_cast(enable=True)` in the backward as in the forward; you can modify your code along the lines of the example below:
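The example originally attached here was not preserved; the following is a minimal sketch of option (2) under assumed op names (`MyOp`, a plain matmul) rather than the model's actual PyLayer:

```python
import paddle
from paddle.autograd import PyLayer

class MyOp(PyLayer):  # hypothetical stand-in for the model's PyLayer
    @staticmethod
    def forward(ctx, x, w):
        # Runs under the caller's auto_cast(enable=True); matmul is on
        # the amp white list, so it may execute in fp16.
        y = paddle.matmul(x, w)
        ctx.save_for_backward(x, w)
        return y

    @staticmethod
    def backward(ctx, dy):
        x, w = ctx.saved_tensor()
        # Option (2): mirror the forward's auto_cast so the backward
        # matmuls see the same fp16/fp32 casting as the forward did.
        with paddle.amp.auto_cast(enable=True):
            dx = paddle.matmul(dy, w, transpose_y=True)   # dy @ w^T
            dw = paddle.matmul(x, dy, transpose_x=True)   # x^T @ dy
        return dx, dw

# Usage: the forward is called under auto_cast, as in an amp training loop.
x = paddle.randn([4, 8])
w = paddle.randn([8, 16])
x.stop_gradient = False
w.stop_gradient = False
with paddle.amp.auto_cast(enable=True):
    out = MyOp.apply(x, w)
out.mean().backward()
```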
Since you haven't replied for more than a year, we have closed this issue/pr. |