Normalization seems to be integral to every multi-loss neural network, with a good rule of thumb being that multi-component losses (like those we use in object detection) should contribute in proportion to their importance to the metric (mAP). Hyperparameter evolution is an effective but extremely slow way to achieve this. A good shortcut might be to train for one epoch with the default normalizers in place, then update the 3 normalizers (box, obj, cls) to bring all of the losses back into balance, targeting a median value of, say, 5 for each at the end of the first epoch. Logic like this might help the community train more effective models more easily, as I've noticed that the COCO hyperparameters are not always optimal on 3rd-party datasets.
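A minimal sketch of what that one-shot rebalancing could look like, assuming the per-batch (gained) box/obj/cls loss values are collected during the first epoch and that the gains live in a `hyp` dict under the keys `'box'`, `'obj'`, `'cls'`. The names and bookkeeping here are illustrative only, not the actual implementation:

```python
import numpy as np

def rebalance_loss_gains(hyp, loss_history, target=5.0):
    """Rescale the box/obj/cls gains so that the median of each (gained) loss
    component observed over the last epoch moves toward `target`.

    hyp          -- dict holding the 'box', 'obj', 'cls' gain hyperparameters
    loss_history -- dict mapping each component name to its per-batch loss values
    target       -- desired median value for every component (e.g. 5)
    """
    for k in ('box', 'obj', 'cls'):
        median = float(np.median(loss_history[k]))
        if median > 0:
            hyp[k] *= target / median  # shrink or grow the gain to hit the target median
    return hyp
```

Calling `rebalance_loss_gains(hyp, loss_history)` once after the first epoch and then continuing training with the updated gains would implement the one-epoch warm-up idea described above.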
Updating the 3 normalizers (box, obj, cls) at the end of each epoch, so that the 3 losses stay roughly equal.
Related to: #4451 (comment)
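Building on the sketch above, a per-epoch version could simply re-run the rebalancing at the end of every epoch. The placeholders below (`fake_component_losses`, the starting gain values) exist only to make the loop runnable; in real training the per-batch component losses would come from the detection criterion:

```python
import random

def fake_component_losses(hyp):
    # Placeholder for the per-batch gained box/obj/cls losses a real criterion would return.
    raw = {'box': random.uniform(50, 150), 'obj': random.uniform(3, 8), 'cls': random.uniform(5, 15)}
    return {k: raw[k] * hyp[k] for k in raw}

hyp = {'box': 0.05, 'obj': 1.0, 'cls': 0.5}  # illustrative starting gains
for epoch in range(3):
    history = {'box': [], 'obj': [], 'cls': []}
    for _ in range(100):  # batches in the epoch
        losses = fake_component_losses(hyp)
        for k in history:
            history[k].append(losses[k])
    hyp = rebalance_loss_gains(hyp, history)  # equalize the 3 components for the next epoch
    print(f'epoch {epoch}:', {k: round(v, 4) for k, v in hyp.items()})
```

This reuses `rebalance_loss_gains` from the earlier sketch; after a couple of epochs the gained medians settle near the common target.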