-
Do you think creating neural networks is just adding or removing random things until they reach a certain size? No.
It's not the author's job; there are already plenty of forks optimized for low-end and mid-range GPUs, or even no GPU at all.
It's already free of charge — what more bang do you want?
-
Hey,
Thank you for Whisper.
For what it's worth, I have an 8 GB VRAM GPU, which falls awkwardly between the large (~10 GB) and medium (~5 GB) models.
I also have a 4 GB card, so while you're at it..
In general, consider optimizing model sizes for common VRAM capacities, so people can get the best bang for their hardware.
Also, in terms of compute and energy investment (carbon footprint and climate-change mitigation), it is justified to aim for the best quality per unit of compute on a given piece of hardware.
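Until model sizes line up with common VRAM capacities, the practical workaround is to pick the largest existing model that fits. A minimal sketch of that selection, using the approximate per-model VRAM figures listed in the Whisper README (the `pick_model` helper and the 0.5 GB headroom default are my own assumptions, not part of Whisper's API):

```python
# Approximate VRAM footprint per Whisper model, in GB, as listed in
# the Whisper README.
APPROX_VRAM_GB = {
    "tiny": 1,
    "base": 1,
    "small": 2,
    "medium": 5,
    "large": 10,
}


def pick_model(vram_gb: float, headroom_gb: float = 0.5) -> str:
    """Return the largest Whisper model whose approximate footprint,
    plus some headroom for activations, fits in the given VRAM budget.

    This is a hypothetical helper, not part of the whisper package.
    """
    fitting = [name for name, need in APPROX_VRAM_GB.items()
               if need + headroom_gb <= vram_gb]
    if not fitting:
        raise ValueError(f"no Whisper model fits in {vram_gb:.1f} GB of VRAM")
    # Largest footprint among the models that fit.
    return max(fitting, key=APPROX_VRAM_GB.get)
```

With these figures, an 8 GB card ends up on `medium` and a 4 GB card on `small`, which is exactly the gap described above: roughly 2-3 GB of VRAM sits unused in each case.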
Thanks!
Or what do you people think?