Add pruning tutorial #605
Conversation
Deploy preview for pytorch-tutorials-preview ready! Built with commit 261d093 https://deploy-preview-605--pytorch-tutorials-preview.netlify.com
@mickypaganini, tutorial looks great. It wasn't building, so I made a tiny change and now it is. @brianjo, is this because @mickypaganini doesn't have write access to the repo? I think that's the issue I was having before. cc @jlin27 FYI
In the build (and when I test this locally), I get the error ModuleNotFoundError: No module named 'torch.nn.utils.prune'. We build on the 1.2 release. Update - prune hasn't been merged into core yet, so this is going to be on hold for a bit. Thanks!
Thanks for taking a look so quickly, @SethHWeidman @brianjo. Yep, as Brian said and as I mentioned above, this depends on pruning actually getting merged into PyTorch (PR here: pytorch/pytorch#24076). Until then, it won't build. I will comment here once I have updates on that.
Summary: Provides implementation for feature request issue #20402. Adds pruning functionalities (structured and unstructured, local and global, as well as pruning from user-provided mask). Associated tutorial here: pytorch/tutorials#605
cc: soumith
Pull Request resolved: #24076
Differential Revision: D18400431
Pulled By: mickypaganini
fbshipit-source-id: a97bd6ca61f8600ae411da9ff6533c232aae1a51
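For readers landing here from the issue: the functionalities listed in that summary correspond to the public helpers in `torch.nn.utils.prune`. A minimal sketch of each flavor (the layer sizes, amounts, and mask below are illustrative, not taken from the tutorial):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(1, 8, kernel_size=3)
fc = nn.Linear(32, 10)

# Local, unstructured: zero out 30% of the conv weights by L1 magnitude.
prune.l1_unstructured(conv, name='weight', amount=0.3)

# Local, structured: remove 2 whole output channels (dim=0) by L2 norm.
prune.ln_structured(conv, name='weight', amount=2, n=2, dim=0)

# Global, unstructured: prune 20% of all listed weights at once,
# letting the sparsity distribute itself across layers.
prune.global_unstructured(
    [(conv, 'weight'), (fc, 'weight')],
    pruning_method=prune.L1Unstructured,
    amount=0.2,
)

# Pruning from a user-provided mask.
mask = torch.ones_like(fc.bias)
mask[::2] = 0
prune.custom_from_mask(fc, name='bias', mask=mask)
```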
@SethHWeidman @brianjo if you build off of pytorch master, this tutorial should build fine now.
Hello... I found this tutorial via a Google search. I think it will be a nice addition. It's not immediately clear to me when the pruning is just an effective pruning implemented with masking, and when it leads to a network that actually has a smaller memory footprint. Sorry if these are naive questions, but hopefully they are helpful in reaching the audience the tutorial is aimed at.
Hi @cranmer,
As is, this module is not intended (by itself) to help you with memory savings. All that pruning does is replace some entries with zeroes. This by itself doesn't buy you anything, unless you represent the sparse tensor in a smarter way (which this module itself doesn't handle for you). You can, however, rely on the sparse representation of tensors, e.g.:

```python
import torch
import torch.nn.utils.prune as prune

t = torch.randn(100, 100)
torch.save(t, 'full.pth')

p = prune.L1Unstructured(amount=0.9)
pruned = p.prune(t)
torch.save(pruned, 'pruned.pth')

sparsified = pruned.to_sparse()
torch.save(sparsified, 'sparsified.pth')
```

When I …
By the way, before calling …
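For what it's worth, here is a minimal sketch of what the masking looks like at the module level: pruning is implemented as a reparametrization (a `weight_orig` parameter plus a `weight_mask` buffer), and `prune.remove` makes it permanent. The layer size and pruning amount are illustrative.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(16, 4)
prune.l1_unstructured(layer, name='weight', amount=0.5)

# The original tensor and the binary mask are stored separately;
# `layer.weight` is recomputed as weight_orig * weight_mask before each forward.
assert 'weight_orig' in dict(layer.named_parameters())
assert 'weight_mask' in dict(layer.named_buffers())

# Make the pruning permanent: drops the reparametrization and leaves the
# zeroed-out tensor as a plain `weight` parameter again.
prune.remove(layer, 'weight')
assert 'weight' in dict(layer.named_parameters())
```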
Add pruning tutorial. Will create another PR to add it to the ToC.
The tutorial mentions pruning only a convolutional layer. Is there any method to prune the complete model?
Yes, I'm also curious about how to apply it to the complete model.
Hi, I used `import torch.nn.utils.prune as prune` to prune attention weights of the BERT model, e.g. `parameters_to_prune = ()`
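In case it helps: one way to prune a whole model (rather than a single layer) is `prune.global_unstructured` over a list of (module, parameter_name) pairs. A minimal sketch on a toy model; the architecture and the 20% amount are illustrative, not the BERT setup mentioned above:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 10),
)

# Collect every parameter you want pruned across the whole model.
parameters_to_prune = [
    (module, 'weight')
    for module in model.modules()
    if isinstance(module, nn.Linear)
]

# Prune 20% of all collected weights globally, by L1 magnitude,
# so the sparsity budget is shared across layers rather than fixed per layer.
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.2,
)
```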
Add tutorial for pruning functionalities discussed in feature request issue pytorch/pytorch#20402
Depends on: pytorch/pytorch#24076