
Enhence shrink_rnn_memory_op. #7176

Merged · 4 commits merged into PaddlePaddle:develop · Jan 10, 2018
Conversation

@pkuyym (Contributor) commented Jan 3, 2018

Resolves #7173


// should consider multiple levels
size_t height = dst_num_rows;
auto lod_level = rank_table.level();

Collaborator:

rank_table.level() is not related to ShrinkMemory.

Also, must lod_level always be zero? You cannot shrink a fine level of the LoD without also shrinking the coarse level.


@pkuyym (Author):

Thanks, followed the comments.

framework::AppendLoD(&remain, lod_offset.first);
for (size_t j = 0; j < remain.size(); ++j) {
out_lod->emplace_back(remain[j]);
}

Collaborator:

This logic is too complex. Since we only shrink the first level of the LoD, we can simply drop its last N elements.

@pkuyym (Author):

Followed the comments.

@pkuyym (Author) left a review comment:

Shrinking is now done for the first level of the LoD.



@JiayiFeng (Collaborator) left a review comment:

LGTM

@pkuyym pkuyym merged commit a320276 into PaddlePaddle:develop Jan 10, 2018