Inquiry about NMPruner Code Source #3

Open
RoraChen opened this issue Jul 2, 2024 · 1 comment

RoraChen commented Jul 2, 2024

Hi, I came across the NMPruner class in your repository and am particularly interested in its structured fine-grained sparsity methods such as group_death and grp_grad_regrow. Could you please let me know which research paper or reference this implementation is based on? Thank you!

mengjian0502 (Collaborator) commented Jul 3, 2024

Hi Rora,

Thank you for your interest. The N:M pruner of torch2chip starts from GraNet (NeurIPS'21), which was designed for element-wise pruning, but I modified the implementation to enable prune-and-regrow for structured fine-grained sparsity.

Specifically, we gradually increase the percentage of weight groups that are constrained to the N:M pattern over the course of training.
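
To make the described scheme concrete, here is a minimal PyTorch sketch of a GraNet-style gradual N:M scheme with group-level prune-and-regrow. This is an illustration, not the torch2chip implementation: the function names (nm_prune_and_regrow, nm_fraction_schedule, partial_nm_mask), the cubic ramp, and the exact death/regrow criteria (death by weight magnitude, regrow by gradient magnitude within each group of M) are assumptions chosen to match the description above.

```python
import torch


def nm_prune_and_regrow(weight, grad, n=2, m=4):
    """One illustrative prune-and-regrow step at N:M-group granularity.

    Within every group of m weights: keep the n-1 largest magnitudes
    ("death" of the rest), then regrow one more position by largest
    gradient magnitude, so each group ends with exactly n active weights.
    Assumes weight.numel() is divisible by m.
    """
    w = weight.abs().reshape(-1, m)   # (num_groups, m) weight magnitudes
    g = grad.abs().reshape(-1, m)     # (num_groups, m) gradient magnitudes

    # death: provisionally keep the n-1 strongest weights in each group
    keep_idx = w.topk(n - 1, dim=1).indices
    mask = torch.zeros_like(w)
    mask.scatter_(1, keep_idx, 1.0)

    # regrow: among the currently dead positions, revive the one with the
    # largest gradient magnitude in each group
    dead_grad = g.masked_fill(mask.bool(), float("-inf"))
    regrow_idx = dead_grad.argmax(dim=1, keepdim=True)
    mask.scatter_(1, regrow_idx, 1.0)
    return mask.reshape_as(weight)


def nm_fraction_schedule(step, t0, t_end, f0=0.0, f_final=1.0):
    """Cubic ramp (as in gradual-pruning/GraNet-style schedules) for the
    fraction of groups constrained to N:M at the given step."""
    if step < t0:
        return f0
    if step >= t_end:
        return f_final
    p = (step - t0) / (t_end - t0)
    return f_final + (f0 - f_final) * (1.0 - p) ** 3


def partial_nm_mask(weight, grad, step, t0, t_end, n=2, m=4):
    """Enforce N:M on only the scheduled fraction of groups (those that lose
    the least magnitude when constrained), leaving the rest dense for now."""
    frac = nm_fraction_schedule(step, t0, t_end)
    group_mask = nm_prune_and_regrow(weight, grad, n, m).reshape(-1, m)
    w = weight.abs().reshape(-1, m)
    lost = (w * (1.0 - group_mask)).sum(dim=1)   # magnitude removed per group
    k = int(frac * lost.numel())
    mask = torch.ones_like(group_mask)
    if k > 0:
        cheap = lost.topk(k, largest=False).indices  # cheapest groups to constrain
        mask[cheap] = group_mask[cheap]
    return mask.reshape_as(weight)


if __name__ == "__main__":
    # Example: partial 2:4 mask for a linear layer, with grad from a backward pass.
    layer = torch.nn.Linear(64, 64)
    loss = layer(torch.randn(8, 64)).pow(2).mean()
    loss.backward()
    mask = partial_nm_mask(layer.weight.detach(), layer.weight.grad,
                           step=500, t0=0, t_end=1000, n=2, m=4)
    layer.weight.data.mul_(mask)  # apply the (partial) N:M mask
    print("constrained fraction:", nm_fraction_schedule(500, 0, 1000))
```

Recomputing the mask at each scheduled update lets previously pruned weights reappear, which is the prune-and-regrow behaviour GraNet uses, here applied per N:M group rather than per element.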
