Multiply operations are generally an order of magnitude more expensive than additions, so separate counts would be very useful for people experimenting with hardware that does not have FMA instructions.
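For illustration, here is a minimal sketch of what such separate counting could look like for a plain dense conv2d on non-FMA hardware; `conv2d_mul_add_counts` is a hypothetical helper (not part of this repo), and the shapes are made up:

```python
# Hypothetical helper: count multiplies and adds separately for a dense
# conv2d layer on hardware without FMA instructions.
def conv2d_mul_add_counts(h_out, w_out, k, c_in, c_out, bias=True):
    positions = h_out * w_out * c_out      # number of output elements
    muls = positions * k * k * c_in        # one multiply per kernel tap
    adds = positions * (k * k * c_in - 1)  # accumulating the products
    if bias:
        adds += positions                  # one bias add per output element
    return muls, adds

# Example: 3x3 conv, 3 -> 16 channels, 32x32 output map.
muls, adds = conv2d_mul_add_counts(32, 32, 3, 3, 16)
print(muls, adds)  # 442368 442368 -- inside a conv, muls and adds are ~1:1
```

Note that the multiplies and accumulation adds inside a convolution come out roughly 1:1, which a combined FLOPs number hides.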
Do you mean that mul (or conv2d) layers consume drastically more FLOPs than bias-add/add layers?
When I ran some FLOPs calculations in TensorFlow, the report usually attributed over 90% of the FLOPs to mul (or conv2d) ops, while bias-add/add consumed less than 5%.
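For reference, a per-op FLOPs report like that can be produced with TensorFlow's static profiler. A minimal sketch, using a made-up one-layer graph (shapes must be fully defined for the estimate to be complete):

```python
import tensorflow as tf

# Build a tiny TF1-style graph, since the static profiler estimates
# FLOPs from the graph structure alone. Layer sizes are illustrative.
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [1, 32, 32, 3], name='input')
    w = tf.compat.v1.get_variable('w', [3, 3, 3, 16])
    b = tf.compat.v1.get_variable('b', [16])
    y = tf.nn.bias_add(
        tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME'), b)

# Ask the profiler for float-operation counts.
opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()
prof = tf.compat.v1.profiler.profile(g, options=opts)

print('total float ops:', prof.total_float_ops)
for child in prof.children:  # per-node breakdown, e.g. Conv2D vs BiasAdd
    print(child.name, child.float_ops)
```

The bias-add share looks tiny in such reports partly because the accumulation adds inside the convolution are counted under the conv2d op itself, not under bias-add.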
Hi,
This is a great repo! :-)
Could you add functionality to compute the addition and multiplication counts separately?
Thanks!