Support for Pascal architecture GPUs #32
Comments
We currently support Turing (e.g. RTX 2080) and Ampere (e.g. RTX 3080) GPUs. We rely on tensor cores for matrix multiplication, which older GPUs lack.
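For anyone unsure where their card falls, here is a minimal sketch of how to check, assuming PyTorch is installed; `has_tensor_cores` is an illustrative helper, not part of this repo. Tensor cores were introduced with Volta (compute capability 7.0), so Pascal (6.x) and Maxwell (5.x) GPUs report a major version below 7.

```python
import torch

def has_tensor_cores(device_index: int = 0) -> bool:
    """Rough check: tensor cores arrived with Volta (compute capability 7.0)."""
    major, _minor = torch.cuda.get_device_capability(device_index)
    return major >= 7

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    print(f"{name}: tensor cores available = {has_tensor_cores(0)}")
```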
Thanks! Your advice helps a lot.
Hi, we have just opened a PR, facebookresearch/xformers#362, which adds V100 and P100 support as well, and dispatches to FlashAttention in the cases where it is supported.
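As a usage sketch only (assuming an xformers build that includes that PR and a CUDA GPU; the shapes and dtype below are illustrative), the high-level `memory_efficient_attention` op is the likely entry point for this dispatch:

```python
import torch
import xformers.ops as xops

# Shapes follow xformers' convention: (batch, seq_len, num_heads, head_dim).
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# The op selects a kernel for the current GPU and inputs; per the comment
# above, FlashAttention is used where supported, while older cards such as
# V100/P100 fall back to another implementation.
out = xops.memory_efficient_attention(q, k, v)
print(out.shape)  # (2, 1024, 8, 64), same as the query shape
```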
Would this PR make support for Maxwell GPUs possible too?
Wonderful work! May I ask whether there is any plan to expand the supported-GPU list to include Pascal-architecture GPUs, such as the Tesla P-series?