
[2.0 API] add SyncBatchNorm.convert_sync_batch_norm #26688

Merged 6 commits on Aug 28, 2020

Conversation

@ceci3 (Contributor) commented on Aug 26, 2020:

PR types

Others

PR changes

APIs

Describe

Add SyncBatchNorm.convert_sync_batch_norm to convert standard batch norm (bn) layers in a model to SyncBatchNorm (syncbn).
Fix batch norm when weight_attr=False or bias_attr=False.
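To make the new API concrete, here is an illustrative sketch of the conversion pattern the PR adds. This is not the actual Paddle implementation; the Layer, BatchNorm, and SyncBatchNorm classes below are minimal stand-ins. The real method recursively walks a layer tree and swaps each plain batch norm for a sync variant, leaving every other layer untouched:

```python
# Illustrative sketch only: these classes are stand-ins, not the real
# paddle.nn layers. The pattern is a recursive tree walk that replaces
# each plain BatchNorm node with a SyncBatchNorm node.

class Layer:
    def __init__(self):
        self._sub_layers = {}

class BatchNorm(Layer):
    def __init__(self, num_features):
        super().__init__()
        self.num_features = num_features

class SyncBatchNorm(BatchNorm):
    @classmethod
    def convert_sync_batch_norm(cls, layer):
        # Swap this node if it is a plain BatchNorm (not already sync).
        if type(layer) is BatchNorm:
            layer = cls(layer.num_features)
        # Recurse so batch norms nested anywhere in the tree are found.
        for name, sub in layer._sub_layers.items():
            layer._sub_layers[name] = cls.convert_sync_batch_norm(sub)
        return layer

# Usage: a model containing one nested BatchNorm.
model = Layer()
model._sub_layers["bn"] = BatchNorm(16)
model = SyncBatchNorm.convert_sync_batch_norm(model)
```

After conversion, the nested layer is a SyncBatchNorm while the surrounding structure is unchanged.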

@paddle-bot-old

Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

@wanghaoshuang (Contributor) commented:

The PR title is a bit too brief.

@ceci3 ceci3 changed the title add cnvert,test=develop [2.0 API] add SyncBatchNorm.convert_sync_batch_norm Aug 26, 2020
python/paddle/nn/layer/norm.py: three review threads (outdated, resolved)
@willthefrog (Contributor) previously approved these changes on Aug 27, 2020:

LGTM

@willthefrog (Contributor) left a comment:

LGTM

@jzhang533 (Contributor) left a comment:

lgtm

default_initializer=Constant(1.0))
self.weight.stop_gradient = (self._weight_attr is False) or (
self._weight_attr and self._weight_attr.learning_rate == 0.)
if weight_attr == False:
A contributor commented:

False -> None
None != False
This if-else logic looks equivalent to the following:
self.weight = self.create_parameter(
    attr=weight_attr,
    shape=param_shape,
    default_initializer=Constant(1.0))
self.weight.stop_gradient = (weight_attr == None) or (weight_attr.learning_rate == 0.)

@ceci3 (Contributor, Author) replied:

If weight_attr is False, self.weight will be None, so setting stop_gradient on it would raise an error.
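The author's point can be reproduced in plain Python without Paddle. When weight_attr=False, no parameter is created, so the weight is None and assigning stop_gradient on it fails; this is why the code keeps a separate `is False` branch rather than unconditionally setting stop_gradient:

```python
# Minimal reproduction: what create_parameter effectively returns when
# weight_attr=False is None, and None cannot take attribute assignment.

weight = None  # stand-in for self.weight when weight_attr=False

try:
    weight.stop_gradient = True
    failed = False
except AttributeError:
    failed = True  # 'NoneType' object has no attribute 'stop_gradient'

# None and False are also distinct values, so the branch test must use
# `is False`, not a `== None` check.
distinct = (None != False)
```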

attr=self._bias_attr, shape=param_shape, is_bias=True)
self.bias.stop_gradient = (self._bias_attr is False) or (
self._bias_attr and self._bias_attr.learning_rate == 0.)
if bias_attr == False:
A contributor commented:
False -> None

layer._bias_attr, layer._data_format,
layer._name)

if layer._weight_attr != False and layer._bias_attr != False:
A contributor commented:
False -> None
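A side note on the `layer._weight_attr != False` check above, and one reason reviewers tend to prefer identity tests against False: in Python, bool is a subclass of int, so equality tests against False also match 0, while identity tests do not. A short illustration (plain Python, independent of Paddle):

```python
# False == 0 in Python (bool subclasses int), so an `attr != False`
# test silently fails for attr == 0 as well. Identity (`is`) does not.

zero = 0
false = False

eq_matches_zero = (zero == false)  # equality: 0 counts as False
is_matches_zero = (zero is false)  # identity: 0 is not the False object
none_ne_false = (None != false)    # None and False are not equal either
```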

@XiaoguangHu01 (Contributor) left a comment:

LGTM

@ceci3 ceci3 merged commit 42d2915 into PaddlePaddle:develop Aug 28, 2020
@ceci3 ceci3 deleted the convert_syncbn branch August 28, 2020 12:50
6 participants