[BUG] BlockFusion bug in model training after PR#242 #266

Closed
xysmlx opened this issue May 13, 2021 · 2 comments · Fixed by #270
Assignees: xysmlx
Labels: bug (Something isn't working)

Comments

xysmlx (Contributor) commented May 13, 2021

Test cases: mnist training, lstm training

@xysmlx xysmlx added the bug Something isn't working label May 13, 2021
@xysmlx xysmlx self-assigned this May 13, 2021
nnfbot commented May 13, 2021

Thanks for the report @xysmlx! I will look into it ASAP! (I'm a bot).

@xysmlx xysmlx linked a pull request May 19, 2021 that will close this issue
xysmlx (Contributor, Author) commented May 19, 2021

Update: BlockFusion fuses dead gnodes, but the memory allocator does not allocate memory for dead tensors (i.e., computing the required output tensors does not depend on these dead gnodes and their tensors). The fused kernels therefore touch unallocated tensors, which causes illegal memory accesses and the observed training failures.
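For illustration, one way to avoid this class of bug is to exclude dead gnodes from the fusion candidate set, so that fused kernels never reference tensors the allocator skips. The sketch below is illustrative only; `GNode`, `is_dead()`, and `collect_fusion_candidates` are hypothetical names, not the actual NNFusion API, and the real fix may differ.

```cpp
// Minimal sketch (not the actual NNFusion code): skip dead gnodes when
// collecting fusion candidates, so fused kernels never reference tensors
// that the memory allocator will never allocate.
#include <memory>
#include <vector>

struct GNode {
    bool dead = false;               // true if no live output depends on this node
    bool is_dead() const { return dead; }
};

std::vector<std::shared_ptr<GNode>> collect_fusion_candidates(
    const std::vector<std::shared_ptr<GNode>>& gnodes) {
    std::vector<std::shared_ptr<GNode>> candidates;
    for (const auto& node : gnodes) {
        if (node->is_dead())
            continue;                // dead tensors get no memory, so never fuse them
        candidates.push_back(node);
    }
    return candidates;
}
```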
