How could I run this on Windows 10? #10
Comments
Hey, it probably doesn't make sense to run this on Windows, as you'll need a GPU (which I assume you don't have locally). It's probably best to use a cloud GPU service or just run it on Google Colab. You can find a Mamba-Chat demo on Google Colab here.
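For anyone following the Colab route, here is a minimal sketch of running the model on a CUDA machine. It assumes the mamba-ssm package's MambaLMHeadModel interface and the havenhq/mamba-chat checkpoint, roughly mirroring the repo's chat script; argument names may differ between versions, so treat it as a starting point rather than the project's official usage.

```python
# Minimal sketch for a Colab / CUDA machine; assumes the mamba-ssm package and the
# havenhq/mamba-chat checkpoint -- argument names may differ across versions.
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

device = "cuda"  # mamba-ssm's fused kernels currently require an NVIDIA GPU

tokenizer = AutoTokenizer.from_pretrained("havenhq/mamba-chat")
model = MambaLMHeadModel.from_pretrained(
    "havenhq/mamba-chat", device=device, dtype=torch.float16
)

prompt = "What is the Mamba architecture?"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

# Sampling arguments follow the values used in mamba-chat's chat script.
out = model.generate(
    input_ids=input_ids,
    max_length=200,
    temperature=0.9,
    top_p=0.7,
    eos_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```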
Does Mamba-Chat only run on a GPU?
@SzaremehrjardiMT Currently yes. There's an open issue in llama.cpp to support the Mamba architecture, though, which would make it possible to run without a GPU: ggerganov/llama.cpp#4353
@KevinRyu It won't be optimized, but you can try mamba-minimal.
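If you go the mamba-minimal route, a rough CPU-only sketch is below. It assumes mamba-minimal's model.py is importable, that Mamba.from_pretrained exists, and that a plain forward pass returns next-token logits; check the repo's demo notebook for the exact interface. Without the custom CUDA kernels this will be slow, but it runs without a GPU.

```python
# Rough CPU-only sketch using mamba-minimal (no fused CUDA kernels, so it is slow).
# Assumes mamba-minimal's model.py is on the path and exposes Mamba.from_pretrained,
# and that calling the model returns logits -- see the repo's demo notebook.
import torch
from transformers import AutoTokenizer
from model import Mamba  # model.py from the mamba-minimal repository

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = Mamba.from_pretrained("state-spaces/mamba-370m")

input_ids = tokenizer("Mamba is a state space model that", return_tensors="pt").input_ids

# Naive greedy decoding: re-run the full sequence each step (fine for a quick test).
for _ in range(40):
    logits = model(input_ids)                       # (batch, seq_len, vocab)
    next_id = logits[:, -1].argmax(dim=-1, keepdim=True)
    input_ids = torch.cat([input_ids, next_id], dim=-1)

print(tokenizer.decode(input_ids[0]))
```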
Hello,
When I tried to install the packages from requirements.txt, I got the following error. As far as I know, the triton package only provides Linux builds. What should I do?
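For anyone hitting the same install failure, a quick environment check before running pip can make the cause obvious. The sketch below assumes, as discussed in this thread, that triton only ships Linux wheels, so on native Windows the install is expected to fail and a cloud GPU instance or Google Colab is the practical alternative.

```python
# Small sanity check before `pip install -r requirements.txt`: triton (needed by
# mamba-ssm's kernels) is assumed here to ship Linux wheels only, per this thread.
import importlib.util
import platform

if platform.system() != "Linux":
    print("Non-Linux OS detected: triton is unlikely to install here; "
          "consider a cloud GPU instance or Google Colab instead.")
elif importlib.util.find_spec("triton") is None:
    print("Linux detected but triton is not installed yet; "
          "`pip install -r requirements.txt` should pull it in.")
else:
    import triton
    print(f"triton {triton.__version__} is available.")
```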