Will it support cuda.jit? #14
Comments
Could you be a bit more specific about what you mean?
Via `__cuda_array_interface__`, torch, cupy, and jax already work with numba's cuda.jit natively: you just pass a torch CUDA tensor or jax device array to the kernel. TF didn't have that, but it has https://www.tensorflow.org/api_docs/python/tf/experimental/dlpack/from_dlpack, so you could add it pretty easily.
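For context, the protocol mentioned above is just a dictionary exposed as an attribute on the array object; numba reads it to locate the device buffer. A minimal sketch of the interface's shape (the class is hypothetical, and a host pointer stands in for a real device pointer so no GPU is needed to run it):

```python
import numpy as np

class FakeCudaArray:
    """Toy object exposing the __cuda_array_interface__ protocol (v3).

    A real implementation (torch, cupy, jax) would report a CUDA device
    pointer in "data"; here we use a host pointer purely to illustrate
    the dictionary's structure.
    """

    def __init__(self, arr):
        self._arr = np.ascontiguousarray(arr)

    @property
    def __cuda_array_interface__(self):
        return {
            "shape": self._arr.shape,        # tuple of ints
            "typestr": self._arr.dtype.str,  # e.g. "<f4" for little-endian float32
            "data": (self._arr.ctypes.data, False),  # (pointer, read-only flag)
            "version": 3,
        }

iface = FakeCudaArray(np.arange(4, dtype=np.float32)).__cuda_array_interface__
```

Any object with such an attribute can, in principle, be handed to a numba `cuda.jit` kernel without copying, which is why torch and cupy tensors work out of the box.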
Oh, now I understand what you mean. Yes, I am aware of `__cuda_array_interface__`. If your question is specifically about whether you can use EagerPy tensors with numba, then I guess the answer is the same: EagerPy tensors are really just wrapped tensors, and you can at any time get the raw tensor and then use its `__cuda_array_interface__`. If you are asking for something else, please provide a specific example of code that you would want to be supported.
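The wrap/unwrap idea described above can be sketched with a toy wrapper (the class and method names here are hypothetical illustrations, not EagerPy's actual API, and a NumPy array stands in for a framework tensor):

```python
import numpy as np

class WrappedTensor:
    """Toy stand-in for an EagerPy-style wrapper: it holds the original
    framework tensor unchanged and exposes it via .raw."""

    def __init__(self, raw):
        self.raw = raw  # the underlying framework tensor

    def square(self):
        # operations return new wrappers around new raw tensors
        return WrappedTensor(self.raw * self.raw)

# Wrap, compute through the wrapper, then unwrap: the .raw tensor is the
# unmodified framework object, so anything that already understands it
# (e.g. numba via __cuda_array_interface__) keeps working.
t = WrappedTensor(np.array([1.0, 2.0, 3.0]))
raw = t.square().raw
```

Because the wrapper never copies or converts the underlying tensor, unwrapping is free, which is the point of the answer above.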
Dunno what OP wants from cuda.jit, but it's a pretty neat feature to have: working Python CUDA code with GPU arrays. You can convert almost any function written for the CPU to run on the GPU without any overhead, which is especially good for some sort of search over big arrays (for example GlowTTS).
Will this lib support TensorFlow tensors working with cuda.jit (from numba)?