Add network trimming pruning algorithm and fix bias mask(testing) #1867
Conversation
print('# Epoch {} #'.format(epoch))
train(model, device, train_loader, optimizer_finetune)
top1 = test(model, device, test_loader)
if top1 > best_top1:
Do we need to use a validation dataset here?
Strictly speaking it should be a validation dataset, but this approach is commonly used in the released code of other papers, so I followed it.
It's weird that we evaluate directly on the test dataset. I think we could change it.
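To make the discussion concrete, a minimal sketch of what a held-out validation split could look like. The function name `split_indices` and the fractions are hypothetical, not part of this PR; the idea is only that `best_top1` would be selected on the validation subset while the test set stays untouched until the end.

```python
import numpy as np

def split_indices(n, val_frac=0.1, seed=0):
    """Shuffle dataset indices and carve off a validation subset.

    Hypothetical helper: in the PR's fine-tuning loop, `best_top1` would be
    tracked on the validation indices instead of the test loader.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_val = int(n * val_frac)
    return idx[n_val:], idx[:n_val]  # (train indices, validation indices)

# Example: 1000 samples, 20% held out for validation.
train_idx, val_idx = split_indices(1000, val_frac=0.2)
```

The two index arrays are disjoint, so model selection never sees the test data.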
Looks great. Please resolve the comments :)
assert all(mask1['weight'].numpy() == np.array([0., 0., 0., 1., 1.]))
assert all(mask2['weight'].numpy() == np.array([0., 0., 0., 1., 1.]))
assert all(mask1['bias'].numpy() == np.array([0., 0., 0., 1., 1.]))
assert all(mask2['bias'].numpy() == np.array([0., 0., 0., 1., 1.]))
Is it possible to add UTs for the new activation pruners?
Yes, I'm working on it.
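As a sketch of what such a UT could check: network trimming ranks channels by Average Percentage of Zeros (APoZ) over post-ReLU activations and prunes the channels with the most zeros. The helper `apoz_mask` below is a hypothetical standalone reimplementation for illustration, not the PR's actual pruner API; the expected mask mirrors the `[0., 0., 0., 1., 1.]` pattern already asserted in the existing tests.

```python
import numpy as np

def apoz_mask(activations, sparsity):
    """Compute a channel mask from APoZ (Average Percentage of Zeros).

    activations: array of shape (batch, channels, ...) of post-ReLU outputs.
    Channels with the highest fraction of zeros are masked out.
    """
    flat = activations.reshape(activations.shape[0], activations.shape[1], -1)
    apoz = (flat == 0).mean(axis=(0, 2))        # zero fraction per channel
    k = int(sparsity * apoz.size)               # number of channels to prune
    prune = np.argsort(-apoz)[:k]               # most-zero channels first
    mask = np.ones(apoz.size)
    mask[prune] = 0.
    return mask

# UT-style fixture: channels 0-2 are mostly zero, channels 3-4 never zero.
acts = np.ones((2, 5, 4))
acts[:, 0] = 0.       # channel 0: 100% zeros
acts[:, 1, :3] = 0.   # channel 1: 75% zeros
acts[:, 2, :2] = 0.   # channel 2: 50% zeros
mask = apoz_mask(acts, sparsity=0.6)
```

With sparsity 0.6 over 5 channels, the three highest-APoZ channels are pruned, giving the mask `[0., 0., 0., 1., 1.]`.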
Add a pruning algorithm based on activation statistics (network trimming, per the iteration plan; more detailed docs will be updated later); fix the bias mask in pruning.
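The bias-mask fix can be illustrated with a short sketch. The helper below is a hypothetical reimplementation, not the PR's code: when an output channel is pruned (here by L1 norm of its weights, as one common criterion), the corresponding bias entry must be masked as well, so the bias mask is derived from the per-channel weight mask rather than left untouched.

```python
import numpy as np

def channel_masks(weight, sparsity):
    """Prune output channels by L1 norm; the bias mask follows the channel mask.

    weight: array of shape (out_channels, ...). Returns (weight_mask, bias_mask).
    """
    norms = np.abs(weight).reshape(weight.shape[0], -1).sum(axis=1)
    k = int(sparsity * weight.shape[0])
    pruned = np.argsort(norms)[:k]     # channels with smallest L1 norm
    w_mask = np.ones_like(weight)
    w_mask[pruned] = 0.                # zero the whole pruned channel
    b_mask = np.ones(weight.shape[0])
    b_mask[pruned] = 0.                # bias entries mirror their channel
    return w_mask, b_mask

# 5 output channels, 3 weights each; channels 0-2 have the smallest norms.
weight = np.arange(15, dtype=float).reshape(5, 3)
w_mask, b_mask = channel_masks(weight, sparsity=0.6)
```

Here the bias mask comes out as `[0., 0., 0., 1., 1.]`, consistent with the weight mask rows, which is what the updated assertions in the test file verify.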