Timeout Error Depending On Tensor Sizes #56
Based on this line, can you do a card reset using the tt-smi tool, or just do a system reboot? Then try again.
Both.
@jhlee508, any follow-up on the above?
I have tried a few input shapes, but I couldn't figure out how PyBuda determines which dimension is the batch dimension; it doesn't seem that, for example, the 3rd dimension is always the batch dimension.
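For context on why this is confusing: under standard matmul broadcasting rules, every dimension before the last is a batch dimension, so a `[32, 32]` input is inherently ambiguous: it could be a batch of 32 length-32 vectors, or a single unbatched 2-D tensor. A minimal numpy sketch (assuming a 32 → 32 linear layer; numpy stands in for the framework here) shows the two readings even produce the same output shape:

```python
import numpy as np

# Hypothetical weight of a 32 -> 32 linear layer (stand-in for the real model).
W = np.random.rand(32, 32).astype(np.float32)

x = np.random.rand(32, 32).astype(np.float32)

# Reading 1: a batch of 32 independent length-32 vectors.
as_batch = np.stack([row @ W.T for row in x])
# Reading 2: a single unbatched 2-D tensor in one matmul.
as_matrix = x @ W.T

# Both readings yield the same [32, 32] result, so the shape alone
# cannot tell the compiler which dimension is the batch dimension.
assert as_batch.shape == as_matrix.shape == (32, 32)
assert np.allclose(as_batch, as_matrix)
```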
Thanks for the additional context @jhlee508. What seems to be happening, then, is that this particular input shape [32, 32] causes a hang in the software. Fair to label this as a bug. We've just released a new version of Buda: https://github.com/tenstorrent/tt-buda/releases/tag/v0.19.3 — I would suggest you try that one to see if there is any change. Otherwise, let's keep this bug open and we'll take a look at it.
Thank you for your support. Please refer to this comment: #57 (comment).
This single-linear-layer test code causes the error below. It works fine if the `input_tensor` shape is `[1, 32]` or `[1, 32, 32]`. However, when the input is `[32, 32]`, it hits a timeout error when reading the output (`linear.output_add_2`).
Could you help me find the solution?
Test code
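The attached test code itself is not reproduced here; as a rough, shape-only sketch of what it exercises (plain numpy standing in for the PyBuda linear module, with the weight and function names assumed):

```python
import numpy as np

# Hypothetical stand-in for the single linear layer under test (32 -> 32).
W = np.random.rand(32, 32).astype(np.float32)
b = np.zeros(32, dtype=np.float32)

def linear(x):
    # Every dimension before the last is treated as a batch dimension.
    return x @ W.T + b

# The three input shapes from the report. Per the issue, [1, 32] and
# [1, 32, 32] run fine on device, while [32, 32] hangs with a timeout
# when reading the output.
for shape in [(1, 32), (1, 32, 32), (32, 32)]:
    y = linear(np.random.rand(*shape).astype(np.float32))
    print(shape, "->", y.shape)
```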
Error Log