add dot_prod_layer #5724
Conversation
config.layerConfig.add_inputs();

for (auto useGpu : {false, true}) {
  testLayerGrad(config, "dot_prod", 100, false, useGpu);
Make the batch size smaller here; there is no need to run the unit test with such a large batch size.
Done
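The reviewer's point that a small batch suffices can be checked directly. Below is a hedged NumPy sketch of a finite-difference gradient check for the row-wise dot product; it is not Paddle's `testLayerGrad`, and all function names here are illustrative:

```python
import numpy as np

def dot_prod_forward(a, b):
    # Row-wise dot product: (batch, dim) x (batch, dim) -> (batch, 1).
    return np.sum(a * b, axis=1, keepdims=True)

def check_grad(batch=4, dim=3, eps=1e-5):
    # Even a tiny batch exposes gradient bugs: compare the analytic
    # gradient of sum(output) w.r.t. `a` (which is simply `b`) against
    # central finite differences.
    rng = np.random.RandomState(0)
    a = rng.randn(batch, dim)
    b = rng.randn(batch, dim)
    analytic = b
    numeric = np.zeros_like(a)
    for i in range(batch):
        for j in range(dim):
            a_pos, a_neg = a.copy(), a.copy()
            a_pos[i, j] += eps
            a_neg[i, j] -= eps
            numeric[i, j] = (dot_prod_forward(a_pos, b).sum()
                             - dot_prod_forward(a_neg, b).sum()) / (2 * eps)
    return np.max(np.abs(numeric - analytic))
```

With `batch=4` the maximum absolute error is already far below any reasonable tolerance, which supports shrinking the batch size in the unit test.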
@@ -4141,6 +4143,45 @@ def maxid_layer(input, name=None, layer_attr=None):

@wrap_name_default()
def dot_prod_layer(input1, input2, name=None, layer_attr=None):
A unit test is needed for the helper.
Done
LGTM.
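For context, the new helper would presumably be wired into a model config like other layers. A hypothetical fragment (the layer names and sizes below are assumptions for illustration, not taken from this PR):

```python
# Hypothetical usage of the dot_prod_layer helper added in this PR.
# The data_layer names and the size 128 are illustrative only.
vec1 = data_layer(name='vec1', size=128)
vec2 = data_layer(name='vec2', size=128)
# Row-wise dot product of two equally sized vectors; output size is 1.
sim = dot_prod_layer(input1=vec1, input2=vec2)
```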
.. autoclass:: paddle.v2.layer.dot_prod
    :noindex:

out_prod
A horizontal underline is needed here, like the one on line 339.
Done
namespace paddle {

/**
 * @brief A layer for computing the dot product of two vectors
Please add a period at the end of the sentence.
Done
Layer::init(layerMap, parameterMap);

CHECK_EQ(inputLayers_.size(), 2U);
CHECK_EQ(1, getSize()) << "Dimension mismatch";
- 1UL
- The output dimensionality of this layer should be fixed to 1.
Done
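The point that the output dimensionality is fixed to 1 can be seen directly: a batched dot product yields exactly one scalar per sample. A quick NumPy illustration (an assumption for exposition, not Paddle code; samples are stored one per row):

```python
import numpy as np

# Two batches of 5 vectors of width 8.
a = np.ones((5, 8))
b = np.ones((5, 8))

# Row-wise dot product collapses the width axis: (5, 8) -> (5, 1).
out = np.sum(a * b, axis=1, keepdims=True)
assert out.shape == (5, 1)  # one scalar per sample, so getSize() must be 1
```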
def __init__(self, name, inputs, device=None):
    super(DotProdLayer, self).__init__(
        name, 'dot_prod', 0, inputs, device=device)
    config_assert(len(inputs) == 2, 'DotProdLayer must have 2 inputs')
Please add a period after a message that is a complete sentence.
Done
LGTM.
def __init__(self, name, inputs, device=None):
    super(DotProdLayer, self).__init__(
        name, 'dot_prod', 0, inputs, device=device)
    config_assert(len(inputs) == 2, 'DotProdLayer must have 2 inputs.')
Add a check here that the dimensions of the two inputs are the same:
self.get_input_layer(0).size == self.get_input_layer(1).size
Done
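A minimal, pure-Python stand-in for the requested config-time check (the real code uses `config_assert` with `self.get_input_layer(...).size`; the function name below is hypothetical):

```python
def check_dot_prod_inputs(size0, size1):
    # Reject configs where the two input layers have different widths,
    # since a row-wise dot product is only defined for equal dimensions.
    assert size0 == size1, (
        'The two inputs of DotProdLayer must have the same dimension.')
    return size0
```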
MatrixPtr inV1 = getInputValue(1);

size_t batchSize = inV0->getHeight();
CHECK_EQ(inV1->getHeight(), batchSize);
Also check here that the widths of inV0 and inV1 are the same.
Done
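The resulting runtime checks can be mirrored in a hedged NumPy sketch (illustrative only, not the PR's C++ code): both the heights (batch sizes) and the widths of the two inputs must match before taking the row-wise dot product.

```python
import numpy as np

def dot_prod_forward(in_v0, in_v1):
    # Mirrors CHECK_EQ(inV1->getHeight(), batchSize) ...
    batch_size = in_v0.shape[0]
    assert in_v1.shape[0] == batch_size, 'batch sizes (heights) must match'
    # ... plus the width check the reviewer asked for.
    assert in_v1.shape[1] == in_v0.shape[1], 'input widths must match'
    return np.sum(in_v0 * in_v1, axis=1, keepdims=True)
```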
LGTM
fixes #5496