Hi, I tried running the very simple Python script from the README on a very short 7-second MP3, and it takes about 15 seconds to return the transcript. Is that speed normal, or is there something wrong with my setup?
Answered by jongwook (Oct 4, 2022):
Yes, that's about the expected throughput, especially if you're running on a CPU. Model loading probably accounts for most of that time, and you can transcribe multiple files once the model is loaded.
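For example, you can load the model once and reuse it for several files so the load cost is paid only once. A minimal sketch based on the API shown in the README (the model size and file paths below are placeholders):

```python
import whisper

# Load the model once; this step (downloading/initializing the weights)
# is typically the slow part.
model = whisper.load_model("base")

# Reuse the same model object for every file to amortize the load time.
for path in ["clip1.mp3", "clip2.mp3", "clip3.mp3"]:
    result = model.transcribe(path)
    print(path, "->", result["text"])
```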