
Updating the 3 normalizers (box, obj, cls) for each epoch #4463

Open
AlexeyAB opened this issue Dec 6, 2019 · 0 comments
Labels: ToDo, RoadMap

Comments

@AlexeyAB (Owner) commented Dec 6, 2019

Update the 3 normalizers (box, obj, cls) at the end of each epoch so that each of the 3 loss components contributes equally.

Related to: #4451 (comment)

Normalization seems integral to every multi-loss neural network: a good rule of thumb is that the components of a multi-part loss (like the one we use in object detection) should contribute in proportion to their importance to the target metric (mAP). Hyperparameter evolution is an effective but extremely slow way to achieve this.

A good shortcut might be to train for one epoch with the default normalizers in place, then update the 3 normalizers (box, obj, cls) to bring the losses back into balance, targeting a median value of, say, 5 for each component at the end of the first epoch.

Logic like this might help the community train more effective models more easily, as I've noticed that COCO hyperparameters are not always optimal on 3rd-party datasets.
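A minimal sketch of how such a rebalancing step could look, assuming the weighted loss components are recorded per batch over an epoch. All names here (`normalizers`, `record_batch`, `rebalance_normalizers`, the placeholder default values, `TARGET`) are hypothetical illustrations, not darknet's actual API:

```python
import statistics

TARGET = 5.0  # desired median value for each weighted loss component

# Placeholder default normalizers (the real defaults live in the .cfg file)
normalizers = {"box": 1.0, "obj": 1.0, "cls": 1.0}

# Per-batch weighted loss components recorded during the current epoch
epoch_losses = {"box": [], "obj": [], "cls": []}

def record_batch(box_loss, obj_loss, cls_loss):
    """Accumulate the weighted loss components for one training batch."""
    epoch_losses["box"].append(box_loss)
    epoch_losses["obj"].append(obj_loss)
    epoch_losses["cls"].append(cls_loss)

def rebalance_normalizers():
    """Rescale each normalizer so its component's median moves toward TARGET."""
    for key, values in epoch_losses.items():
        median = statistics.median(values)
        if median > 0:
            normalizers[key] *= TARGET / median
        values.clear()  # start fresh for the next epoch
    return normalizers
```

The same update could be re-run at the end of every epoch rather than only the first, matching the issue title, at the cost of a loss scale that keeps shifting during training.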
