Gradient calculation (backward pass) of CrossChannelNormLayer may be incorrect.
Gradient-checker output:

```
I0523 19:34:54.603669 3745645504 LayerGradUtil.cpp:643] layer_type=norm useGpu=0
I0523 19:34:54.604331 3745645504 LayerGradUtil.cpp:683] cost 115.088
I0523 19:34:54.604535 3745645504 LayerGradUtil.cpp:43] cross-channel-norm para_0 step=1e-06 cost1=115.099 cost2=115.077 true_delta=0.0222626 analytic_delta=0.0111297 diff=1.00029 ***
I0523 19:34:54.604568 3745645504 LayerGradUtil.cpp:50] The previous diff might be caused by not accumulating parameter gradients in backward()
I0523 19:34:54.604655 3745645504 LayerGradUtil.cpp:43] cross-channel-norm layer_0 step=1.37416e-05 cost1=115.086 cost2=115.09 true_delta=-0.00338745 analytic_delta=0.00115088 diff=-3.94335 ***
I0523 19:34:54.604702 3745645504 LayerGradUtil.cpp:643] layer_type=norm useGpu=0
I0523 19:34:54.604790 3745645504 LayerGradUtil.cpp:683] cost -107.057
I0523 19:34:54.605002 3745645504 LayerGradUtil.cpp:43] cross-channel-norm layer_0 step=-1.30582e-05 cost1=-107.055 cost2=-107.058 true_delta=0.00298309 analytic_delta=-0.00107057 diff=-3.78646 ***
```

The checker perturbs each input/parameter, re-evaluates the cost, and compares the resulting finite-difference delta (`true_delta`) against the delta predicted by the layer's `backward()` (`analytic_delta`); the `***` lines mark mismatches, both in magnitude (`para_0`, off by roughly 2x) and in sign (`layer_0`).
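For reference, this is the kind of check the log corresponds to: an analytic backward pass compared against central finite differences. Below is a minimal NumPy sketch for a cross-channel (per-position L2) normalization with a learned per-channel scale. This is an illustration only, not PaddlePaddle's actual `CrossChannelNormLayer` code; the function names and shapes are assumptions.

```python
import numpy as np

def forward(x, scale):
    # x: (channels, positions); scale: (channels,)
    # y[c, i] = scale[c] * x[c, i] / ||x[:, i]||_2
    norm = np.sqrt((x * x).sum(axis=0, keepdims=True))
    return scale[:, None] * x / norm

def backward(x, scale, grad_out):
    # Analytic gradients of L = sum(grad_out * forward(x, scale))
    norm = np.sqrt((x * x).sum(axis=0, keepdims=True))
    # dL/dscale[c] = sum_i grad_out[c, i] * x[c, i] / norm[i]
    grad_scale = (grad_out * x / norm).sum(axis=1)
    # dL/dx[k, i] = scale[k]*g[k, i]/norm[i]
    #              - x[k, i]/norm[i]^3 * sum_c g[c, i]*scale[c]*x[c, i]
    dot = (grad_out * scale[:, None] * x).sum(axis=0, keepdims=True)
    grad_x = scale[:, None] * grad_out / norm - x * dot / norm ** 3
    return grad_x, grad_scale

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 6))
scale = rng.standard_normal(4)
g = rng.standard_normal((4, 6))  # upstream gradient dL/dy

grad_x, grad_scale = backward(x, scale, g)

# Central finite differences on L = sum(g * forward(x, scale)),
# analogous to the checker's true_delta vs. analytic_delta comparison.
eps = 1e-6
num_grad_x = np.zeros_like(x)
for idx in np.ndindex(*x.shape):
    xp = x.copy(); xp[idx] += eps
    xm = x.copy(); xm[idx] -= eps
    num_grad_x[idx] = ((g * forward(xp, scale)).sum()
                       - (g * forward(xm, scale)).sum()) / (2 * eps)

# With a correct backward(), the two gradients agree to ~1e-6 or better.
print(np.max(np.abs(num_grad_x - grad_x)))
```

A bug like the one reported would surface here as a large, systematic discrepancy (e.g. a factor-of-two or sign mismatch) rather than floating-point noise.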
Reported by pkuyym.