
support testing when training and handle dropout and batch_norm operator in testing mode #5734

Merged: 13 commits merged into PaddlePaddle:develop on Nov 24, 2017

Conversation

QiJune (Member) commented on Nov 17, 2017

Fix #5733 and Fix #5814
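For context on why these operators need an is_test switch: dropout randomly zeroes activations during training but must become a plain identity at test time, and batch_norm must switch from batch statistics to accumulated running statistics. A minimal generic sketch of the dropout case (standard inverted dropout, not Paddle's exact kernel):

```python
import random

def dropout(xs, p, is_test):
    """Inverted dropout: kept units are scaled by 1/(1-p) during training,
    so that test-time inference is a plain identity (no scaling needed)."""
    if is_test:
        return list(xs)  # test mode: no randomness, no scaling
    keep = 1.0 - p
    return [x / keep if random.random() < keep else 0.0 for x in xs]

xs = [1.0, 2.0, 3.0]
assert dropout(xs, p=0.5, is_test=True) == xs  # identity in test mode
```

With is_test left at its training default, an inference pass would keep dropping activations at random, which is what this PR guards against.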

QiJune changed the title from "handle dropout and batch_norm operator in testing mode" to "support testing with training and handle dropout and batch_norm operator in testing mode" on Nov 21, 2017
QiJune changed the title from "support testing with training and handle dropout and batch_norm operator in testing mode" to "support testing when training and handle dropout and batch_norm operator in testing mode" on Nov 21, 2017
```cpp
op_desc.type() == kBatchNormOpType) {
  for (auto& attr : *op_desc.mutable_attrs()) {
    if (attr.name() == "is_test") {
      attr.set_b(true);
```

A Collaborator commented on this line:

```cpp
break;
```

QiJune (Member, Author) replied: Done

```diff
-void Prune(const ProgramDesc& input, ProgramDesc* output) {
-  prune_impl(input, output, 0);
+void Prune(const ProgramDesc& input, ProgramDesc* output, bool is_test) {
+  prune_impl(input, output, 0, is_test);
```
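For readers unfamiliar with Prune: its general idea is a backward dependency sweep over the program that keeps only the ops needed to compute the requested targets. A simplified Python sketch of that idea (ops as plain dicts, hypothetical stand-ins for the protobuf OpDesc messages; not the actual prune_impl):

```python
def prune(ops, target_vars):
    """Keep only ops whose outputs are (transitively) needed for target_vars.
    Walk the op list backward, collecting required inputs as we go."""
    needed = set(target_vars)
    kept = []
    for op in reversed(ops):
        if needed & set(op["outputs"]):   # this op produces something we need
            kept.append(op)
            needed |= set(op["inputs"])   # its inputs become needed too
    kept.reverse()
    return kept

ops = [
    {"type": "mul",     "inputs": ["x", "w"], "outputs": ["h"]},
    {"type": "dropout", "inputs": ["h"],      "outputs": ["h_drop"]},
    {"type": "softmax", "inputs": ["h"],      "outputs": ["pred"]},
]
# pruning for "pred" drops the dropout op, whose output feeds nothing we need
assert [op["type"] for op in prune(ops, ["pred"])] == ["mul", "softmax"]
```

This also illustrates the reviewer's point below: pruning is about dependency reachability, which is logically separate from flipping the is_test attribute.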
A Collaborator commented:

I think setting the is_test attribute to true has no relationship with pruning. We should add an independent function or module to do this.

QiJune (Member, Author) commented on Nov 22, 2017:

We currently have a prune method that transforms one ProgramDesc into another ProgramDesc. In the inference stage we will indeed do some optimization and handle some special operators, then generate an inference ProgramDesc. You are right; maybe we need another method like prune. The interface could be:

```cpp
void InferenceOptimize(const ProgramDesc& input, ProgramDesc* output);
```

reyoung added this to the Release 0.11.0 milestone on Nov 23, 2017
```diff
@@ -106,5 +108,26 @@ void Prune(const ProgramDesc& input, ProgramDesc* output) {
   prune_impl(input, output, 0);
 }
 
+void inference_optimize_impl(const ProgramDesc& input, ProgramDesc* output,
```
A Collaborator commented:

The logic of inference_optimize_impl is quite simple. Maybe we can implement it in Python.
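The suggestion above could be sketched roughly as follows, with ops as plain dicts standing in for the protobuf OpDesc messages (the function and field names here are illustrative, not the actual Paddle API):

```python
def inference_optimize(program_ops):
    """Return a copy of the op list with is_test flipped to True on
    operators that behave differently at inference time."""
    test_mode_op_types = {"dropout", "batch_norm"}
    out = []
    for op in program_ops:
        op = dict(op)  # shallow copy: leave the input program untouched
        if op["type"] in test_mode_op_types:
            attrs = dict(op.get("attrs", {}))
            attrs["is_test"] = True
            op["attrs"] = attrs
        out.append(op)
    return out

train_ops = [
    {"type": "mul", "attrs": {}},
    {"type": "dropout", "attrs": {"is_test": False, "dropout_prob": 0.5}},
]
infer_ops = inference_optimize(train_ops)
assert infer_ops[1]["attrs"]["is_test"] is True
assert train_ops[1]["attrs"]["is_test"] is False  # input program unchanged
```

Copying the program rather than mutating it mirrors the C++ interface, which takes the input ProgramDesc by const reference and writes a separate output.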

QiJune merged commit 3a76062 into PaddlePaddle:develop on Nov 24, 2017