
RNN backward create #3490

Merged: 12 commits from rnn-backward-create into PaddlePaddle:develop on Aug 15, 2017
Conversation

@Superjomn (Contributor) commented Aug 15, 2017

Resolves #3472

/*
 * Some special preprocessing after a gradient op is created.
 */
static void Init(const RecurrentOp& op, RecurrentGradientOp* grad_op,
Collaborator commented:

Init is a bad pattern; put Init() into the constructor.
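A minimal sketch of that suggestion, assuming the surrounding Paddle types (the class layout and member names here are illustrative, not the merged code):

#include <memory>
#include <string>
#include <unordered_set>

// Hypothetical sketch: do the work of the former static Init() in the
// constructor, so a RecurrentGradientOp is fully built on construction.
class RecurrentGradientOp : public OperatorBase {
 public:
  RecurrentGradientOp(const RecurrentOp& forward_op,
                      const std::unordered_set<std::string>& no_grad_vars) {
    // Formerly Init(): derive the backward step net from the forward one.
    stepnet_ = Backward(forward_op.stepnet(), no_grad_vars);
  }

 private:
  std::shared_ptr<OperatorBase> stepnet_;  // backward step net (assumed member)
};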

void RecurrentGradientOp::Init(
    const RecurrentOp& op, RecurrentGradientOp* grad_op,
    const std::unordered_set<std::string>& no_grad_vars) {
  auto gradop = Backward(op.stepnet(), no_grad_vars);
Collaborator commented:

If this is a recursive call, the function to call is BackwardRecursive, not Backward. In addition, some arguments, such as the uid, need to be passed along through the recursion.
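A sketch of the change the reviewer is asking for (the exact BackwardRecursive signature is an assumption inferred from this comment; uid stands for the unique-id counter threaded through the recursion):

// Sketch, not the merged code: call the recursive entry point so that
// bookkeeping state such as the uid counter is passed through nested
// step nets. The signature of BackwardRecursive is assumed.
auto grad_stepnet = BackwardRecursive(op.stepnet(), no_grad_vars, uid);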

@@ -178,11 +179,24 @@ std::shared_ptr<OperatorBase> BackwardRecursive(
return false;
});

  // process recurrent gradient op as a special operator.
  if (forwardOp.Type() == "recurrent_op") {
    // NOTE clean up cycle call somewhere (RNN's stepnet contains itself), or
Collaborator commented:

Just writing the create method here should be simplest.
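A sketch of what "writing the create method here" could look like inside BackwardRecursive, handling recurrent_op as a special case (OpRegistry::CreateGradOp as the framework factory and the variable names are assumptions, not the merged code):

// Sketch: special-case recurrent_op inside BackwardRecursive by creating
// its gradient op through the framework factory and recursing into the
// step net, instead of running a separate post-creation Init() pass.
if (forwardOp.Type() == "recurrent_op") {
  auto grad_op = OpRegistry::CreateGradOp(forwardOp);  // factory call assumed
  auto& rnn_op = static_cast<const RecurrentOp&>(forwardOp);
  // Recurse so state such as the uid counter keeps threading through.
  auto grad_stepnet = BackwardRecursive(rnn_op.stepnet(), no_grad_vars, uid);
  // Attach grad_stepnet to grad_op here (setter omitted in this sketch).
  return grad_op;
}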

@Superjomn Superjomn merged commit 9eaef75 into PaddlePaddle:develop Aug 15, 2017
@Superjomn Superjomn deleted the rnn-backward-create branch August 15, 2017 09:44
Successfully merging this pull request may close issue #3472: rnnop adapt framework's create gradient op.