
Switch to multi-device tensors in llama2_70b #9568

Closed
ayerofieiev-tt opened this issue Jun 20, 2024 · 0 comments
Description
We manually shard tensors in llama2_70b and experimental/llama2_70b.
This blocks removing the all_gather op from tt_lib and replacing it with ttnn.all_gather.
Please switch these models to multi-device tensors.

See the example PR for falcon_40b: #9544
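
For reference, a minimal sketch of the intended change, assuming the ttnn multi-device APIs (`ttnn.open_mesh_device`, `ttnn.ShardTensorToMesh`, `ttnn.ConcatMeshToTensor`, `ttnn.all_gather`). The shapes, mesh shape, and shard dimension below are placeholders for illustration and are not the actual values used in llama2_70b:

```python
import torch
import ttnn

# Illustrative shape only -- the real llama2_70b code shards attention/MLP
# weights along its own model-parallel dimensions.
torch_weight = torch.randn(1, 1, 32, 8192)

# Open an 8-chip mesh (e.g. a T3000); the mesh shape is an assumption for this sketch.
mesh_device = ttnn.open_mesh_device(mesh_shape=ttnn.MeshShape(2, 4))

# Instead of splitting the torch tensor by hand and pushing one chunk per chip,
# create a single multi-device tensor sharded across the mesh on the last dim.
weight = ttnn.from_torch(
    torch_weight,
    dtype=ttnn.bfloat16,
    layout=ttnn.TILE_LAYOUT,
    device=mesh_device,
    mesh_mapper=ttnn.ShardTensorToMesh(mesh_device, dim=3),
)

# Round-trip check on host: concatenating the shards recovers the original tensor.
roundtrip = ttnn.to_torch(weight, mesh_composer=ttnn.ConcatMeshToTensor(mesh_device, dim=3))
assert roundtrip.shape == torch_weight.shape

# Collectives then take the multi-device tensor directly, replacing the tt_lib
# all_gather over a list of per-device tensors. After the gather, every device
# in the mesh holds the full (un-sharded) tensor.
gathered = ttnn.all_gather(weight, dim=3)

ttnn.close_mesh_device(mesh_device)
```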

ayerofieiev-tt added a commit that referenced this issue Jun 24, 2024
* #9486: Revert removal of all_gather bindings from tt_lib

This reverts commit b88cf09.

* #9486: revert from ttnn to tt_lib all_gather in llama2_70 t3k