Is there a plan to support Windows? #1640
Comments
I really want this as well! It would be so useful for so many things!
I would also be really grateful if it happens.
This is a pretty frequent request. Let me see what we can do about it.
+1
With torch.compile heavily relying on Triton, it seems lots of Hugging Face users are also interested in this :-)
We have a number of interested parties optimizing inference times for Invoke AI on Windows. We're currently evaluating alternatives, but as @patrickvonplaten noted above, torch.compile is the most straightforward option, and it requires Triton.
+1. I get "RuntimeError: Windows not yet supported for torch.compile" with CUDA 12.1 and PyTorch 2.1.0. It seems the main reason is that Triton is not available on Windows. How can we get a Windows version?
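For reference, a minimal sketch of how the error surfaces and one possible workaround, assuming you are willing to simply skip compilation on native Windows (the toy model below is only illustrative, not from any particular project):

```python
import platform

import torch
import torch.nn as nn

# Illustrative toy model; any nn.Module goes through the same code path.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))

if platform.system() == "Windows":
    # On native Windows, torch.compile currently raises
    # "RuntimeError: Windows not yet supported for torch.compile".
    # Triton (used by the default Inductor backend) being Linux-only is
    # reportedly the main blocker, so fall back to eager execution here.
    compiled_model = model
else:
    compiled_model = torch.compile(model)

out = compiled_model(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 1])
```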
+1
+1, many Python packages support Windows, and I hope this one will as well.
+1
3 similar comments
+1
+1
+1
@ptillet is there anything we could do to help you implement this? With PyTorch 2.x becoming more and more dependent on Triton, this feature request will only become more and more important, I think. Can we help you here in any way?
Please add support for Windows. I hate seeing the "Triton not available on Windows" message.
The way to help here is probably to just submit a PR that adds Windows support :) We won't have CI for it any time soon, though.
The issues found so far, and the solutions to them (also somewhat related to #1560):
+1
1 similar comment
+1
Is there any fork for this? There is this repo, but I don't know much about it: https://github.com/PrashantSaikia/Triton-for-Windows
+1
5 similar comments
+1
+1
+1
+1
+1
+1
1 similar comment
+1
please 🥺
+1 to this 🥺
+1
+1
+1
1 similar comment
+1
+10
In #4045 Windows support was declined with "we don't have the bandwidth to commit to supporting Windows at this time".
That's disappointing to hear.
I can't imagine any feature that could possibly be more important than this, such that "there isn't enough bandwidth".
You have the bandwidth, money, and manpower to support Windows. Individual developers have even published pre-compiled wheels for Windows. This is ridiculous.
Link one...
7 months old. Yeah, that's bound to be able to handle all the latest models.
That is not the point, though. The point is that OpenAI has all the resources on earth to make Triton work on Windows natively.
Interrogation with CogVLM2 does not work under Windows 11 (Triton is not supported on Windows, triton-lang/triton#1640).
Where is Triton for Windows? :(
We are still missing Triton in 2024, in the era of ChatGPT 4.
+1
I have noticed that the README lists Linux as the only compatible platform: https://github.com/openai/triton#compatibility
Some people have managed to compile it on Windows in the past (#871), and there is even a really old PR for Windows support (#24). But going by the README, I suppose something has changed and Triton doesn't support Windows anymore? I haven't tried to compile it myself yet.
I'm interested in the development of this repository, but my main OS is Windows. I'm aware that I can probably use WSL2, but I would still prefer to run it natively on Windows. So my question is: is there a plan to officially support Windows? If so, I can help.
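In the meantime, a small sketch of how downstream code can probe whether any Triton build is importable and report the platform; the variable name here is just an example, not an official API:

```python
import importlib.util
import platform

# Probe whether a Triton build is importable in this environment.
# On a native Windows Python this is currently expected to be False,
# while a Linux or WSL2 environment with the published wheels should be True.
has_triton = importlib.util.find_spec("triton") is not None

print(f"platform: {platform.system()}, triton importable: {has_triton}")
```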