TPU Example / Support #952

Closed
ryanwongsa opened this issue Apr 21, 2020 · 4 comments · Fixed by #956

Comments

@ryanwongsa
Contributor

❓ Questions/Help/Support

I would like to use Ignite with TPUs and was wondering whether there are any examples currently available that use TPUs with Ignite, or whether Ignite has limitations that would prevent easy compatibility with TPUs.

@vfdev-5
Collaborator

vfdev-5 commented Apr 21, 2020

@ryanwongsa unfortunately, there are no examples yet of using Ignite with TPUs.

I think there are no limitations on that, as the PyTorch XLA API is rather similar to the GPU one. But yes, it would be a good idea to add an example with a Colab link.
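
For a single TPU core, the usual torch_xla pattern should plug into an Engine more or less directly. A minimal, untested sketch (the toy model, data and hyperparameters are just placeholders):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

import torch_xla.core.xla_model as xm
from ignite.engine import Engine, Events

device = xm.xla_device()  # the TPU core exposed as an XLA device

# toy model and data, only to keep the sketch self-contained
model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=8,
)

def train_step(engine, batch):
    model.train()
    x, y = batch[0].to(device), batch[1].to(device)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    # on TPU the optimizer step goes through xm.optimizer_step,
    # which triggers execution of the accumulated XLA graph
    xm.optimizer_step(optimizer, barrier=True)
    return loss.item()

trainer = Engine(train_step)

@trainer.on(Events.EPOCH_COMPLETED)
def log_epoch(engine):
    print(f"epoch {engine.state.epoch}: last loss {engine.state.output:.4f}")

trainer.run(loader, max_epochs=2)
```

The same Engine code should work on GPU/CPU by swapping the device and calling a plain optimizer.step(), which is why I do not expect fundamental limitations.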

@vfdev-5 vfdev-5 self-assigned this Apr 21, 2020
@vfdev-5 vfdev-5 mentioned this issue Apr 21, 2020
@vfdev-5
Collaborator

vfdev-5 commented Apr 21, 2020

@ryanwongsa please check out the added example or the notebook in Colab, and feel free to give us any feedback.

@ryanwongsa
Contributor Author

Thanks.

I am not sure if it is suitable for inclusion in an example notebook, but there are currently Kaggle kernels that train on TPUs across 8 cores, such as bert-multi-lingual-tpu-training-8-cores, and I was wondering what the best approach would be to do the same kind of thing with Ignite.

The torch_xla ParallelLoader is also used on top of a DataLoader there, which might cause issues as well, but I am not sure.
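
To show what I mean, here is a rough sketch of how I imagine the kernel's 8-core pattern (xmp.spawn plus ParallelLoader) wrapping an Ignite Engine. The Ignite wiring is only my guess, and the model/data are placeholders rather than the BERT setup from the kernel:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

import torch_xla.core.xla_model as xm
import torch_xla.distributed.parallel_loader as pl
import torch_xla.distributed.xla_multiprocessing as xmp
from ignite.engine import Engine

def _mp_fn(rank, flags):
    device = xm.xla_device()  # each spawned process drives its own TPU core

    # placeholders; a real kernel would build the BERT model and dataset here
    model = nn.Linear(10, 2).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    criterion = nn.CrossEntropyLoss()

    dataset = TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,)))
    sampler = DistributedSampler(
        dataset,
        num_replicas=xm.xrt_world_size(),
        rank=xm.get_ordinal(),
        shuffle=True,
    )
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    def train_step(engine, batch):
        model.train()
        x, y = batch[0].to(device), batch[1].to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        xm.optimizer_step(optimizer)  # all-reduces gradients across the cores
        return loss.item()

    trainer = Engine(train_step)

    # ParallelLoader wraps the regular DataLoader; per_device_loader(device)
    # is the iterable that the engine actually consumes
    para_loader = pl.ParallelLoader(loader, [device])
    trainer.run(para_loader.per_device_loader(device), max_epochs=1)

if __name__ == "__main__":
    xmp.spawn(_mp_fn, args=({},), nprocs=8, start_method="fork")
```

With more than one epoch the ParallelLoader would presumably have to be rebuilt per epoch, which is the part I am unsure how to fit into the Engine.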

@vfdev-5
Collaborator

vfdev-5 commented Apr 22, 2020

@ryanwongsa let me check this kernel with Ignite.
