From 3f61755d5657a334d0a93728be5c510f022c52ea Mon Sep 17 00:00:00 2001
From: zhiqiu
Date: Sun, 25 Apr 2021 06:32:05 +0000
Subject: [PATCH] add clear_grad for amp sample code

---
 doc/paddle/api/paddle/amp/GradScaler_cn.rst | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/doc/paddle/api/paddle/amp/GradScaler_cn.rst b/doc/paddle/api/paddle/amp/GradScaler_cn.rst
index c40be061bc2..b0f869b398c 100644
--- a/doc/paddle/api/paddle/amp/GradScaler_cn.rst
+++ b/doc/paddle/api/paddle/amp/GradScaler_cn.rst
@@ -46,6 +46,7 @@ GradScaler用于动态图模式下的"自动混合精度"的训练。它控制lo
             scaled = scaler.scale(loss)            # scale the loss
             scaled.backward()                      # do backward
             scaler.minimize(optimizer, scaled)     # update parameters
+            optimizer.clear_grad()
 
 .. py:function:: scale(var)
 
@@ -76,6 +77,7 @@ GradScaler用于动态图模式下的"自动混合精度"的训练。它控制lo
             scaled = scaler.scale(loss)            # scale the loss
             scaled.backward()                      # do backward
             scaler.minimize(optimizer, scaled)     # update parameters
+            optimizer.clear_grad()
 
 .. py:function:: minimize(optimizer, *args, **kwargs)
 
@@ -106,6 +108,7 @@ GradScaler用于动态图模式下的"自动混合精度"的训练。它控制lo
             scaled = scaler.scale(loss)            # scale the loss
             scaled.backward()                      # do backward
             scaler.minimize(optimizer, scaled)     # update parameters
+            optimizer.clear_grad()