[2.0 API] add SyncBatchNorm.convert_sync_batch_norm #26688
Conversation
Thanks for your contribution!
The PR title is a bit too brief.
LGTM
LGTM
lgtm
    default_initializer=Constant(1.0))
self.weight.stop_gradient = (self._weight_attr is False) or (
    self._weight_attr and self._weight_attr.learning_rate == 0.)
if weight_attr == False:
False -> None
None != False
This if-else logic looks equivalent to the following:
self.weight = self.create_parameter(
    attr=weight_attr,
    shape=param_shape,
    default_initializer=Constant(1.0))
self.weight.stop_gradient = (weight_attr == None) or (weight_attr.learning_rate == 0.)
If weight_attr is False, self.weight will be None, and the stop_gradient assignment will then raise an error.
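To make the distinction concrete, here is a minimal sketch of the guarded creation logic, as I read the discussion above (not the exact diff; it assumes create_parameter returns None when attr=False, as the reply describes):

# Sketch: weight_attr=False must be special-cased, because
# create_parameter(attr=False) returns None and None.stop_gradient
# would raise an AttributeError.
if weight_attr is False:
    # Fall back to a default, frozen scale parameter.
    self.weight = self.create_parameter(
        attr=None, shape=param_shape,
        default_initializer=Constant(1.0))
    self.weight.stop_gradient = True
else:
    self.weight = self.create_parameter(
        attr=weight_attr, shape=param_shape,
        default_initializer=Constant(1.0))
    # Freeze only when a ParamAttr pins the learning rate to zero.
    self.weight.stop_gradient = (
        weight_attr is not None and weight_attr.learning_rate == 0.)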
    attr=self._bias_attr, shape=param_shape, is_bias=True)
self.bias.stop_gradient = (self._bias_attr is False) or (
    self._bias_attr and self._bias_attr.learning_rate == 0.)
if bias_attr == False:
False -> None
    layer._bias_attr, layer._data_format,
    layer._name)

if layer._weight_attr != False and layer._bias_attr != False:
False -> None
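For context, a sketch of the conversion step this guard sits in, pieced together from the diff fragments above (the constructor argument order and the _BatchNormBase helper name are assumptions, not the confirmed PR code):

# Hypothetical sketch: swap each batch-norm sublayer for a
# SyncBatchNorm built from the original layer's attributes, skipping
# layers whose weight/bias were disabled with False.
@classmethod
def convert_sync_batch_norm(cls, layer):
    layer_output = layer
    if isinstance(layer, _BatchNormBase):
        if layer._weight_attr != False and layer._bias_attr != False:
            layer_output = SyncBatchNorm(
                layer._num_features, layer._momentum, layer._epsilon,
                layer._weight_attr, layer._bias_attr,
                layer._data_format, layer._name)
    # Recurse so nested BatchNorm layers are converted as well.
    for name, sublayer in layer.named_children():
        layer_output.add_sublayer(
            name, cls.convert_sync_batch_norm(sublayer))
    return layer_output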
LGTM
PR types
Others
PR changes
APIs
Describe
Add convert_sync_batch_norm for SyncBatchNorm to convert standard BatchNorm layers to SyncBatchNorm.
Fix BatchNorm when weight_attr=False or bias_attr=False.
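A minimal usage sketch of the new helper, assuming it is exposed on paddle.nn.SyncBatchNorm under the name the PR title gives (the model itself is illustrative):

import paddle.nn as nn

# Illustrative model containing an ordinary BatchNorm layer.
model = nn.Sequential(
    nn.Conv2D(3, 16, kernel_size=3, padding=1),
    nn.BatchNorm2D(16),
    nn.ReLU())

# Recursively replace the BatchNorm layers with SyncBatchNorm so that
# normalization statistics are synchronized across devices during
# distributed training; single-device behavior is unchanged.
sync_model = nn.SyncBatchNorm.convert_sync_batch_norm(model)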