I added SKAttention after the MobileNetV2 backbone. When I measured the FLOPs and params, I found an abnormally large parameter count. Is there a mistake in my settings or in the SKAttention code?

I used channel=1280 and reduction=8 for SKAttention.
+----------------------------------+----------------------+------------+--------------+
| module | #parameters or shape | #flops | #activations |
+----------------------------------+----------------------+------------+--------------+
| model | 0.141G | 9.218G | 9.055M |
| backbone | 2.224M | 0.409G | 8.722M |
| backbone.conv1 | 0.928K | 15.204M | 0.524M |
| backbone.conv1.conv | 0.864K | 14.156M | 0.524M |
| backbone.conv1.bn | 64 | 1.049M | 0 |
| backbone.layer1.0.conv | 0.896K | 14.68M | 0.786M |
| backbone.layer1.0.conv.0 | 0.352K | 5.767M | 0.524M |
| backbone.layer1.0.conv.1 | 0.544K | 8.913M | 0.262M |
| backbone.layer2 | 13.968K | 78.447M | 3.342M |
| backbone.layer2.0.conv | 5.136K | 42.271M | 2.064M |
| backbone.layer2.1.conv | 8.832K | 36.176M | 1.278M |
| backbone.layer3 | 39.696K | 52.15M | 1.622M |
| backbone.layer3.0.conv | 10K | 21.742M | 0.77M |
| backbone.layer3.1.conv | 14.848K | 15.204M | 0.426M |
| backbone.layer3.2.conv | 14.848K | 15.204M | 0.426M |
| backbone.layer4 | 0.184M | 52.085M | 0.901M |
| backbone.layer4.0.conv | 21.056K | 10.404M | 0.262M |
| backbone.layer4.1.conv | 54.272K | 13.894M | 0.213M |
| backbone.layer4.2.conv | 54.272K | 13.894M | 0.213M |
| backbone.layer4.3.conv | 54.272K | 13.894M | 0.213M |
| backbone.layer5 | 0.303M | 77.611M | 0.86M |
| backbone.layer5.0.conv | 66.624K | 17.056M | 0.221M |
| backbone.layer5.1.conv | 0.118M | 30.278M | 0.319M |
| backbone.layer5.2.conv | 0.118M | 30.278M | 0.319M |
| backbone.layer6 | 0.795M | 61.735M | 0.461M |
| backbone.layer6.0.conv | 0.155M | 20.775M | 0.195M |
| backbone.layer6.1.conv | 0.32M | 20.48M | 0.133M |
| backbone.layer6.2.conv | 0.32M | 20.48M | 0.133M |
| backbone.layer7.0.conv | 0.474M | 30.331M | 0.143M |
| backbone.layer7.0.conv.0 | 0.156M | 9.953M | 61.44K |
| backbone.layer7.0.conv.1 | 10.56K | 0.676M | 61.44K |
| backbone.layer7.0.conv.2 | 0.308M | 19.702M | 20.48K |
| backbone.conv2 | 0.412M | 26.378M | 81.92K |
| backbone.conv2.conv | 0.41M | 26.214M | 81.92K |
| backbone.conv2.bn | 2.56K | 0.164M | 0 |
| neck | 0.139G | 8.81G | 0.333M |
| neck.sk | 0.139G | 8.81G | 0.333M |
| neck.sk.convs | 0.138G | 8.809G | 0.328M |
| neck.sk.fc | 0.205M | 0.205M | 0.16K |
| neck.sk.fcs | 0.824M | 0.819M | 5.12K |
| neck.gap | | 81.92K | 0 |
| head | 74.28K | 20.48K | 16 |
| head.loss_module.flow_model | 53.784K | | |
| head.loss_module.flow_model.s | 26.892K | | |
| head.loss_module.flow_model.t | 26.892K | | |
| head.fc | 20.496K | 20.48K | 16 |
| head.fc.weight | (16, 1280) | | |
| head.fc.bias | (16,) | | |
+----------------------------------+----------------------+------------+--------------+
Here are the parameters I tested. Any help would be greatly appreciated; I look forward to your reply.
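As a rough sanity check (this is a sketch under assumptions, not a confirmed reading of my config: it assumes the common SKAttention design with branch kernel sizes 1/3/5/7 and ungrouped convolutions), the large numbers in the table can be reproduced by hand:

```python
# Hedged sanity check: where the SKAttention parameter counts could come from,
# assuming branch kernels [1, 3, 5, 7] and groups=1 (ungrouped convolutions).
channel = 1280
reduction = 8
kernels = [1, 3, 5, 7]   # assumed branch kernel sizes
groups = 1               # assumed: groups=1 is what blows up the count

# k x k conv per branch: (channel // groups) * channel * k * k weights
convs = sum((channel // groups) * channel * k * k for k in kernels)

# squeeze fc: channel -> d, then one fc per branch: d -> channel (with bias)
d = channel // reduction                      # 160
fc = channel * d                              # 204,800  ~ 0.205M (matches table)
fcs = len(kernels) * (d * channel + channel)  # 824,320  ~ 0.824M (matches table)

print(f"convs: {convs/1e9:.3f}G params")  # ~0.138G, matching neck.sk.convs
print(f"fc:    {fc/1e6:.3f}M params")
print(f"fcs:   {fcs/1e6:.3f}M params")
```

Under these assumptions, the four k×k branch convs alone account for about 0.138G parameters, which matches the neck.sk.convs row exactly, while the fc/fcs rows match the standard squeeze-and-select layers; if this is the cause, grouped or depthwise branch convolutions would shrink the count dramatically.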