Add Swish and ThresholdedReLU for API 2.0 #27758
Conversation
test=develop
Thanks for your contribution!
… hongming/fix_swish test=develop
Couldn't thresholded_relu be aligned with torch's threshold? And couldn't thresholded_relu's default behavior be made consistent with relu?
self.attrs = {'beta': beta}
x = np.random.uniform(-1, 1, [10, 12]).astype(self.dtype)
out = ref_swish(x)
self.inputs = {'X': x}
self.attrs = {'slope': 1.0}
The attr was originally beta, and the C++ side wasn't changed, so why is it different here?
This follows the earlier fluid implementation; it looks like it was wrong before. I'll submit a PR later to fix it.
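For reference, a minimal NumPy sketch of a swish reference function in the style of the test snippet quoted above; the helper name `ref_swish` appears in the diff, but the `beta` parameter and its default of 1.0 are assumptions for illustration, not the actual Paddle code:

```python
import numpy as np

def ref_swish(x, beta=1.0):
    # swish(x) = x * sigmoid(beta * x); beta=1.0 is assumed here for illustration
    return x * (1.0 / (1.0 + np.exp(-beta * x)))

x = np.random.uniform(-1, 1, [10, 12]).astype('float32')
out = ref_swish(x)  # mirrors the inputs/outputs set up in the quoted test
```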
LGTM
That would indeed work; this is mainly meant to align with TensorFlow's tf.keras.layers.ThresholdedReLU(theta=1.0, **kwargs). PyTorch can implement ThresholdedReLU via its threshold operator, but its threshold and value parameters have no default values.
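For comparison, a minimal NumPy sketch of the two behaviors discussed above; the theta=1.0 default follows tf.keras.layers.ThresholdedReLU, and the threshold/value signature mirrors PyTorch's threshold op. Both functions are illustrative stand-ins, not the actual Paddle implementation:

```python
import numpy as np

def thresholded_relu(x, theta=1.0):
    # y = x if x > theta else 0, matching tf.keras.layers.ThresholdedReLU(theta=1.0)
    return np.where(x > theta, x, 0.0)

def threshold(x, threshold_value, value):
    # PyTorch-style threshold: y = x if x > threshold else value (no defaults)
    return np.where(x > threshold_value, x, value)

x = np.array([-2.0, 0.5, 1.5, 3.0], dtype='float32')
print(thresholded_relu(x))     # [0.  0.  1.5 3. ]
print(threshold(x, 1.0, 0.0))  # same result: thresholded_relu is threshold with value=0
```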
PR types: Others
PR changes: APIs
Describe