
Refactor doc of NNI model compression #2595

Merged: 15 commits into microsoft:master on Jun 29, 2020

Conversation

QuanluZhang (Contributor)

No description provided.

@@ -1,5 +1,4 @@
Pruner on NNI Compressor
===
# Supported Pruning Algorithms on NNI

Index of supported pruning algorithms
Contributor

Suggest organizing the index by fine-grained pruning / structural pruning.

Contributor Author

Could you give me a version of such an organization?

@colorjam (Contributor), Jun 28, 2020

Contributor

The WeightRankFilterPruner, ActivationRankFilterPruner and GradientRankFilterPruner have been removed from the code. Suggest flattening the structured pruners.

Contributor

And suggest moving the Lottery Ticket pruner to another category, or to the same category as AGP.

Contributor Author

Updated. Could @chicm-ms make further changes in your PR? Thanks!


The experiment code can be found at [examples/model_compress](https://github.com/microsoft/nni/tree/master/examples/model_compress/).

***

## WeightRankFilterPruner
Contributor

Would you remove the WeightRankFilterPruner? It has been removed from the code.

@@ -355,6 +387,7 @@ You can view example for more information
- **sparsity:** The percentage of convolutional filters to be pruned.
- **op_types:** Only Conv2d is supported in ActivationMeanRankFilterPruner (see the usage sketch below).
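
For context, these two parameters go into the standard NNI `config_list`. Below is a minimal usage sketch; the `nni.compression.torch` import path and the two-argument constructor are assumptions based on the NNI 1.x API of that period, and some releases require extra arguments (e.g. an optimizer or a statistics batch count) for activation-based pruners:

```python
import torch
from torchvision.models import resnet18
# Assumed import path for NNI 1.x; later releases reorganized the compression package.
from nni.compression.torch import ActivationMeanRankFilterPruner

model = resnet18()

# Prune half of the convolutional filters, matching the parameters documented above.
config_list = [{
    'sparsity': 0.5,          # fraction of filters to prune in each matched layer
    'op_types': ['Conv2d'],   # only Conv2d is supported by this pruner
}]

pruner = ActivationMeanRankFilterPruner(model, config_list)
model = pruner.compress()     # wrap matched layers so masks are applied during forward passes
```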

***

## GradientRankFilterPruner
Contributor

GradientRankFilterPruner can also be removed.

@@ -292,6 +322,8 @@ pruner.compress()
- **sparsity:** The target sparsity that the specified operations are compressed to.
- **op_types:** Only Conv1d and Conv2d are supported in the L2Filter Pruner (see the sketch below).
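
The hunk header above already shows the `pruner.compress()` call; a rough end-to-end sketch of how this config feeds into it follows. The import path and the `export_model` keyword arguments are assumptions based on the NNI 1.x API and may differ in other releases:

```python
import torch
from torchvision.models import resnet18
# Assumed import path for NNI 1.x; later releases reorganized the compression package.
from nni.compression.torch import L2FilterPruner

model = resnet18()

config_list = [{
    'sparsity': 0.6,                    # target sparsity for the matched layers
    'op_types': ['Conv1d', 'Conv2d'],   # the layer types this pruner supports
}]

pruner = L2FilterPruner(model, config_list)
model = pruner.compress()               # mask filters with the smallest L2 norms

# After fine-tuning the masked model, export the pruned weights and the masks.
pruner.export_model(model_path='pruned_resnet18.pth', mask_path='mask.pth')
```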

***

## ActivationRankFilterPruner
Contributor

ActivationRankFilterPruner also needs to be removed.


To better demonstrate how to customize a new pruning algorithm, it is necessary for users to first understand the framework for supporting various pruning algorithms in NNI.

### Framework overview for pruning algorithms
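
As a conceptual aid for this overview (plain PyTorch, not the NNI `Pruner` interface, whose method signatures changed across releases): the core of most filter pruners in this framework is computing a per-layer binary mask and applying it to the weights. A hypothetical sketch of that idea:

```python
import torch
import torch.nn as nn

def l1_filter_mask(conv: nn.Conv2d, sparsity: float) -> torch.Tensor:
    """Return a 0/1 mask over output filters, zeroing the filters with the smallest L1 norm."""
    num_filters = conv.weight.shape[0]
    num_pruned = int(num_filters * sparsity)
    if num_pruned == 0:
        return torch.ones(num_filters)
    # L1 norm of each filter (flatten everything except the output-channel dimension).
    norms = conv.weight.detach().abs().reshape(num_filters, -1).sum(dim=1)
    threshold = norms.kthvalue(num_pruned).values
    return (norms > threshold).float()

conv = nn.Conv2d(16, 32, kernel_size=3)
mask = l1_filter_mask(conv, sparsity=0.5)       # shape: [32], roughly half zeros
# Broadcasting the mask over the weight tensor zeroes entire filters.
conv.weight.data *= mask.view(-1, 1, 1, 1)
```

In NNI itself, mask bookkeeping and application are handled by the framework, so a custom pruner mainly supplies the ranking logic.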
Member

The title doesn't help readers understand the purpose of its content. The content should read like step-by-step guidance for creating a user's own pruning algorithm.

Contributor Author

Also leaving this to @chicm-ms :)

@chicm-ms (Contributor)

@scarlett2018 @squirrelsc @colorjam I will create a new PR to address your comments.

@chicm-ms chicm-ms merged commit 487d71a into microsoft:master Jun 29, 2020
squirrelsc pushed a commit to squirrelsc/nni that referenced this pull request Jun 29, 2020