Refactor model pruning framework #2504
Conversation
Fix gp tuner (microsoft#1592)
Fix compressor op_types (microsoft#1670)
Filter prune algo implementation (microsoft#1655)
document the dispatcher working dir (microsoft#1866)
src/sdk/pynni/nni/compression/torch/pruning/finegrained_prunning.py
@chicm-ms this refactor looks great. Please update the doc (i.e., the tutorial) about how to write a new pruner.
Thanks, the doc is updated.
return {'weight_mask': torch.ones(weight.shape).type_as(weight)}
# threshold = the largest of the k smallest |weights|; weights above it survive
threshold = torch.topk(w_abs.view(-1), k, largest=False)[0].max()
mask_weight = torch.gt(w_abs, threshold).type_as(weight)
mask = {'weight_mask': mask_weight}
So bias has no mask in our implementation?
No, maybe we will add a bias mask later.
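As a hedged sketch of what that could look like, continuing from the variables in the snippet above (the `bias_mask` key and the channel-wise rule are assumptions, not part of this PR):
```
# Hypothetical follow-up: keep a bias entry only if at least one weight in
# its output channel survived the weight mask computed above.
if wrapper.module.bias is not None:
    surviving = (mask_weight.view(mask_weight.size(0), -1) != 0).any(dim=1)
    mask['bias_mask'] = surviving.type_as(weight)
```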
### Set wrapper attribute
Sometimes `calc_mask` needs to save some state data; users can use the `set_wrappers_attribute` API to register attributes, just like how buffers are registered in PyTorch modules. These buffers will be registered to the `module wrapper`, and users can access them through the `module wrapper`.
You can refer to the NNI-provided [weight masker](https://github.com/microsoft/nni/blob/master/src/sdk/pynni/nni/compression/torch/pruning/structured_pruning.py) implementations when implementing your own weight masker.
Hard-coded links to source code are not a good idea; I recommend linking to the API docs instead. Still, keep it if you feel it's necessary.
Let's keep it for now.
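To make the excerpt above concrete, here is a minimal sketch of a custom weight masker that reuses the threshold logic reviewed earlier in this PR. The import path and the exact `calc_mask` signature are assumptions based on this refactor; treat it as an outline rather than the definitive API.
```
import torch
# import path is an assumption; see the linked structured_pruning.py for real examples
from nni.compression.torch.pruning.weight_masker import WeightMasker

class MyLevelMasker(WeightMasker):
    def calc_mask(self, sparsity, wrapper, wrapper_idx=None):
        weight = wrapper.module.weight.data
        w_abs = weight.abs()
        k = int(weight.numel() * sparsity)
        if k == 0:
            # nothing to prune yet: an all-ones mask keeps every weight
            return {'weight_mask': torch.ones(weight.shape).type_as(weight)}
        # threshold = the largest of the k smallest |weights|
        threshold = torch.topk(w_abs.view(-1), k, largest=False)[0].max()
        return {'weight_mask': torch.gt(w_abs, threshold).type_as(weight)}
```
A pruner that owns such a masker could call `set_wrappers_attribute` to register per-wrapper state (e.g., a flag marking whether a mask was already computed) when `calc_mask` needs to keep state between calls, as the excerpt describes.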
logger = logging.getLogger('torch pruner')

class AGP_Pruner(Pruner):
Suggest `AGPPruner`.
Maybe change it later; this PR is not intended to update the interface.
__all__ = ['AGP_Pruner']

logger = logging.getLogger('torch pruner')
Suggest using the package name. If `torch pruner` is actually used by all existing pruners, feel free to keep it.
OK, let's keep it for now.
assert span > 0
# AGP cubic schedule: target sparsity moves from initial_sparsity toward
# final_sparsity as now_epoch advances from start_epoch across the span
target_sparsity = (final_sparsity +
                   (initial_sparsity - final_sparsity) *
                   (1.0 - ((self.now_epoch - start_epoch) / span)) ** 3)
Could span be 0 in the default setting?
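For reference, a self-contained sketch of the cubic schedule computed above; the epochs and sparsities used here are illustrative defaults chosen only to show the ramp (and why a zero span would be a problem):
```
def agp_target_sparsity(now_epoch, start_epoch=0, end_epoch=10,
                        initial_sparsity=0.0, final_sparsity=0.8):
    span = end_epoch - start_epoch
    assert span > 0  # a zero span would divide by zero, hence the question above
    return (final_sparsity +
            (initial_sparsity - final_sparsity) *
            (1.0 - (now_epoch - start_epoch) / span) ** 3)

# sparsity climbs steeply early, then flattens as it approaches final_sparsity
print([round(agp_target_sparsity(e), 3) for e in range(0, 11, 2)])
# -> [0.0, 0.39, 0.627, 0.749, 0.794, 0.8]
```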
Before refactor:
A pruner class/instance includes 2 parts: the mask calculation algorithm and the pruning framework logic (e.g., the hook on `optimizer.step()`).
Problem: within one Pruner it is not convenient to reuse another Pruner's pruning algorithm, especially for algorithms that need some initialization, such as TaylorFO and APoZ.
After refactor:
The Pruner interface for users is not changed; a new pruning-algorithm interface, `WeightMasker`, is created for code reuse. A previous pruner class is split into 2 classes: a `Pruner` and a `WeightMasker`, as sketched below.
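As an illustration of the reuse this split enables, here is a minimal sketch of a pruner delegating its mask calculation to a masker. Everything beyond the `Pruner`/`WeightMasker` split itself is an assumption: the class names, the constructor arguments, and the `wrapper.config['sparsity']` lookup are illustrative, not the PR's exact code.
```
# Sketch only: the pruner composes a WeightMasker, so a different pruner
# (e.g. an AGP-style scheduler) could reuse the same masking algorithm.
class MySchedulingPruner(Pruner):
    def __init__(self, model, config_list):
        super().__init__(model, config_list)
        self.masker = MyLevelMasker(model, self)  # masker sketched earlier

    def calc_mask(self, wrapper, wrapper_idx=None):
        # delegate the algorithm; the pruner only decides when/how much to prune
        sparsity = wrapper.config['sparsity']
        return self.masker.calc_mask(sparsity=sparsity, wrapper=wrapper,
                                     wrapper_idx=wrapper_idx)
```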