Hi @postrational
Actually, neither of them ;) When the batch dimension is omitted in the 1d case, the ptflops PyTorch engine can get a bit wild.
I'd suggest running the following:
The difference between the versions is that newer ptflops counts nn.functional.relu as well. If we omit relu in your sample, the result would be 102.7 MMac. You can also pass backend='aten' to double-check (only in new ptflops), or always use the aten backend (it does not count relu either).
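To see where module-level counts like these come from, the per-layer numbers can be reproduced by hand. The sketch below uses a hypothetical Conv1d layer (not the reporter's actual model, which isn't shown in the thread) and counts one MAC per kernel tap per output element, plus one op per output element for relu, which is the extra term newer ptflops includes:

```python
import math

def conv1d_output_length(length, kernel, stride=1, padding=0, dilation=1):
    # Standard PyTorch Conv1d output-length formula.
    return math.floor((length + 2 * padding - dilation * (kernel - 1) - 1) / stride + 1)

def conv1d_macs(in_ch, out_ch, kernel, length, stride=1, padding=0, groups=1):
    # One multiply-accumulate per kernel tap per output element.
    out_len = conv1d_output_length(length, kernel, stride, padding)
    return out_ch * out_len * (in_ch // groups) * kernel, out_len

# Hypothetical layer: Conv1d(1, 16, kernel_size=64, stride=16) on 48000 samples.
macs, out_len = conv1d_macs(in_ch=1, out_ch=16, kernel=64, length=48000, stride=16)
relu_ops = 16 * out_len  # newer ptflops also counts one op per element for F.relu

print(out_len)   # 2997
print(macs)      # 3068928, i.e. about 3.07 MMac
print(relu_ops)  # 47952
```

Comparing a hand count like this against both the 'pytorch' and 'aten' backends is a quick way to tell whether a discrepancy comes from functional ops (relu, add, etc.) or from a genuinely different layer count.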
For 1d conv we should use one of these input layouts. ptflops automatically prepends a batch dimension to the input it is given, and that was the source of the confusion: with a (48000,) input we should see the following error:
RuntimeError: Expected 2D (unbatched) or 3D (batched) input to conv1d, but got input of size: [48000]
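For reference, the layouts conv1d actually accepts can be checked directly in PyTorch. This is a minimal sketch with made-up layer sizes, assuming torch is installed:

```python
import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=1, out_channels=8, kernel_size=3)

# 3D batched input (N, C, L) works:
print(conv(torch.randn(4, 1, 100)).shape)  # torch.Size([4, 8, 98])

# 2D unbatched input (C, L) also works:
print(conv(torch.randn(1, 100)).shape)     # torch.Size([8, 98])

# A bare 1D input (L,) is rejected with the RuntimeError quoted above:
try:
    conv(torch.randn(48000))
except RuntimeError as e:
    print(e)
```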
I'm having a problem with ptflops 0.7.1 and higher.
I managed to isolate the problem to this sample code:
In ptflops 0.7.0 I get 40.52 KMac. In version 0.7.1 and higher I get a value of 1.96 MMac. Where does the large discrepancy come from, and which value is more reliable?