
add weight_norm & remove_weight_norm #26131

Merged: 8 commits, Aug 20, 2020
Conversation

ceci3 (Contributor) commented Aug 11, 2020

PR types

Others

PR changes

APIs

Describe

Add weight_norm and remove_weight_norm for dygraph
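The PR adds the dygraph `weight_norm` / `remove_weight_norm` pair. As background for the review below, the reparameterization they implement can be sketched in plain Python; the helper names here are hypothetical illustrations, not the Paddle API, and the sketch handles only a flat weight vector:

```python
import math

# Weight normalization splits a weight w into a magnitude g and a
# direction v, so that w = g * v / ||v||, with g initialized to ||w||.

def weight_norm_decompose(w):
    """Split a weight vector into a magnitude g and a unit direction v."""
    g = math.sqrt(sum(x * x for x in w))
    v = [x / g for x in w]
    return g, v

def remove_weight_norm_fold(g, v):
    """Fold g back into a single plain weight, as remove_weight_norm does."""
    return [g * x for x in v]

g, v = weight_norm_decompose([3.0, 4.0])   # g = 5.0, v is the unit direction
w = remove_weight_norm_fold(g, v)          # recovers the original weight
```

After folding, only the plain weight parameter remains; the separate magnitude and direction parameters are gone, which is what `remove_weight_norm` does to a layer.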

paddle-bot-old bot commented Aug 11, 2020

Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

paddle-bot-old bot commented Aug 11, 2020

✅ This PR's description meets the template requirements!
Please wait for other CI results.

__all__ = ['weight_norm', 'remove_weight_norm']


def l2_norm(x, axis, epsilon=1e-12, name=None):

Contributor:

Isn't l2_normalize the same here?

Contributor Author:

No, that doesn't work: layers.l2_normalize only returns out, but here the other output, norm, is needed as well.

Contributor:

oh, understood.
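The point of the exchange above is that the `l2_norm` helper must return two values, not one. A hypothetical 1-D stand-in (not the Paddle implementation) makes the difference from `l2_normalize` concrete:

```python
import math

# Unlike l2_normalize, which returns only the normalized tensor,
# this helper also returns the norm itself: weight_norm needs the
# norm to initialize the magnitude parameter weight_g.

def l2_norm_pair(x, epsilon=1e-12):
    norm = math.sqrt(sum(v * v for v in x) + epsilon)
    out = [v / norm for v in x]
    return out, norm   # both outputs are consumed by weight_norm

out, norm = l2_norm_pair([3.0, 4.0])   # out is the direction, norm is 5.0
```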

@@ -216,6 +216,8 @@ class WeightNormParamAttr(ParamAttr):
It is recommended to use ``minimize(loss, grad_clip=clip)`` to clip gradient.
There are three clipping strategies: :ref:`api_fluid_clip_GradientClipByGlobalNorm` ,
:ref:`api_fluid_clip_GradientClipByNorm` , :ref:`api_fluid_clip_GradientClipByValue` .

Please use 'paddle.nn.utils.weight_norm' in dygraph mode.

Contributor:

Moving this line to the beginning of the docstring (i.e. line 206) would be better.

Contributor Author:

done, thanks

.. code-block:: python

import numpy as np
import paddle.fluid as fluid

Contributor:

no need to import fluid

Contributor Author:

done, thanks

jzhang533 previously approved these changes Aug 17, 2020

jzhang533 (Contributor) left a comment:

lgtm

python/paddle/nn/utils/weight_norm_hook.py (resolved)
conv = Conv2D(3, 5, 3)
wn = weight_norm(conv)
remove_weight_norm(conv)
print(conv.weight.shape)

Contributor:

Does this print demonstrate that remove_weight_norm took effect?

Contributor Author:

Changed it to print conv.weight_g, thanks~
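To see why printing weight_g is the better check: for the Conv2D(3, 5, 3) in the example, the weight has shape (5, 3, 3, 3), and with the default dim=0 weight_norm keeps one magnitude per output channel, so weight_g has shape (5,). After remove_weight_norm that parameter is folded away and the plain weight, with its original shape, is all that remains (so its shape alone proves nothing). A hedged pure-Python sketch of the per-channel magnitudes, with the 4-D filter banks flattened to rows:

```python
import math
import random

random.seed(0)
out_ch, fan_in = 5, 3 * 3 * 3          # 5 filters, each flattened 3x3x3 block
weight = [[random.gauss(0.0, 1.0) for _ in range(fan_in)]
          for _ in range(out_ch)]

# With dim=0, weight_norm stores one L2 norm per output channel,
# so the magnitude parameter weight_g has 5 entries here.
weight_g = [math.sqrt(sum(x * x for x in row)) for row in weight]
```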


def weight_norm(layer, name='weight', dim=0):
"""
This weight_norm layer apply weight normalization to a parameter according to the

Contributor:

applies?

Contributor Author:

done, thanks

python/paddle/nn/utils/weight_norm_hook.py (resolved)
wanghaoshuang (Contributor) left a comment:

LGTM

@ceci3 ceci3 merged commit fd66d76 into PaddlePaddle:develop Aug 20, 2020
@ceci3 ceci3 deleted the weight_norm branch August 20, 2020 11:10