Failed to load Model + Config on a fresh venv install #91
Can you also post the output for file inference?
Sure, here is the stack trace for file inference:
I have tried both virtualenv and conda and cannot reproduce. However, I have received other reports besides yours. I missed that you were using venv: ...
Does the CLI work?
Using the CLI, it downloads the HuBERT model first before doing inference:
After that, I tried using the GUI again and it worked:
I believe it failed initially because the GUI did not auto-download the HuBERT model before inference. Once the HuBERT model is present, the issue resolves itself.
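To illustrate the auto-download behaviour being discussed, here is a minimal sketch of the general "fetch the checkpoint if it is missing" pattern. The file name, local path, URL, and helper name below are placeholders I made up for the example, not the project's actual code or download location.

```python
import urllib.request
from pathlib import Path

# Placeholder values -- substitute the HuBERT checkpoint your installation
# actually expects and a download source you trust.
HUBERT_PATH = Path("models/hubert_base.pt")        # assumed local path
HUBERT_URL = "https://example.com/hubert_base.pt"  # assumed download URL


def ensure_hubert(path: Path = HUBERT_PATH, url: str = HUBERT_URL) -> Path:
    """Download the HuBERT checkpoint if it is not already on disk."""
    if not path.exists():
        path.parent.mkdir(parents=True, exist_ok=True)
        print(f"HuBERT checkpoint missing, downloading to {path} ...")
        urllib.request.urlretrieve(url, path)
    return path


if __name__ == "__main__":
    ensure_hubert()
```

If the GUI skips a step like this while the CLI performs it, running the CLI once (or fetching the checkpoint manually) would explain why the GUI then starts working, which matches the behaviour reported above.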
It is designed to work exactly the same way with the GUI, but unfortunately it doesn't seem to in some environments for some reason.
Created a fresh venv, ran:
In the GUI, set the Model path and Config file to a pre-trained model.
Trying Infer / Start real-time inference gives the following error: