
Add Swish and ThresholdedReLU for API 2.0 #27758

Merged
hong19860320 merged 5 commits into PaddlePaddle:develop on Oct 11, 2020

Conversation

@hong19860320 (Contributor) commented Oct 5, 2020

PR types

Others

PR changes

APIs

Describe

  1. Change paddle.nn.functional.swish(x, beta=1.0, name=None) (an alias of paddle.fluid.layers.swish) to paddle.nn.functional.swish(x, name=None), removing the beta parameter;
  2. Add the paddle.nn.Swish(x, name=None) class;
  3. Change paddle.nn.functional.thresholded_relu(x, threshold=None, name=None) (an alias of paddle.fluid.layers.thresholded_relu) to paddle.nn.functional.thresholded_relu(x, threshold=1.0, name=None), giving threshold a default value of 1.0;
  4. Add the paddle.nn.ThresholdedReLU(x, threshold=1.0, name=None) class;
  5. Remove paddle.disable_static() from the example code of ReLU6, Tanhshrink, SELU, Softplus, Softshrink, and Softsign.

@paddle-bot-old (bot) commented Oct 5, 2020

Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@zhupengyang (Contributor) left a comment

Could thresholded_relu be aligned with torch's threshold op? And could the default behavior of thresholded_relu be made consistent with relu?

Comment on lines -2129 to +2178
- self.attrs = {'beta': beta}
+ x = np.random.uniform(-1, 1, [10, 12]).astype(self.dtype)
+ out = ref_swish(x)
+ self.inputs = {'X': x}
+ self.attrs = {'slope': 1.0}
Contributor commented:

The attr was originally beta, and the C++ side was not changed, so why does it change here?

Contributor Author replied:

This follows the previous fluid implementation, so it looks like it has been wrong all along. I'll submit a follow-up PR to fix it.

@TCChenlong (Contributor) left a comment

LGTM

@hong19860320 (Contributor, Author) commented

> Could thresholded_relu be aligned with torch's threshold op? And could the default behavior of thresholded_relu be made consistent with relu?

It could, indeed. But this is mainly meant to align with TensorFlow's tf.keras.layers.ThresholdedReLU(theta=1.0, **kwargs). PyTorch can implement ThresholdedReLU through its threshold operator, but that operator's threshold and value parameters have no default values.

hong19860320 merged commit 74d3a55 into PaddlePaddle:develop on Oct 11, 2020
chen-zhiyu pushed a commit to chen-zhiyu/Paddle that referenced this pull request Oct 15, 2020