Log multiple losses #1375
Conversation
Is there any way we can get rid of the trailing [EDIT]? Or wait, maybe it has to be that way for WandB to put plots together in the same place? Hmmm. Also, I wonder if we can drop the `|`
Thanks for the suggestion Abhi, good points! Yeah, that would be cleaner; let me try to set it up 🙂 Hmmm, that might actually make the closure stuff easier as well!
This looks great to me!!
@Landanjs I'm assuming you have some convergence tests on DeepLabV3 that work? And hopefully we have some trainer tests that do loss as a Tensor, tuple, dict?
Approving now for velocity and trusting you on the tests :)
I added some convergence tests for tuple losses and dict losses.
I mentioned this to Evan, but just for the record, it looks like I need someone from @mosaicml/composer-team-eng to review as well.
Resolves CO-842. Adds logic to log multiple losses. Also adds a loss dictionary to DeepLabv3+ as an example. A couple of scenarios:

1. Scalar tensor loss: logged as `loss/train/total` (BERT example).
2. Tuple of tensor losses: individual losses are logged as `loss/train/loss{i}`, where `i` is the index of the individual loss. `loss/train/total`, the sum of the individual losses, is also logged.
3. Dictionary of losses without a `'total'` key: individual losses are logged as `loss/train/{loss_name}`. `loss/train/total`, the sum of the individual losses, is also logged.
4. Dictionary of losses with a `'total'` key: individual losses are logged as `loss/train/{loss_name}`. The loss given at `loss/train/total` is used for backpropagation; individual losses are not summed when `'total'` is present.

Questions

- `torch.Tensor`, is this fine?
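The four scenarios above can be sketched as a single dispatch function. This is only an illustrative sketch of the behavior described in the PR description, not Composer's actual implementation; the function name and the `log_fn` callback are hypothetical stand-ins for the trainer's metric logger.

```python
import torch

def log_multiple_losses(loss, log_fn):
    """Hypothetical sketch: log individual losses and return the tensor
    to use for the backward pass. `loss` may be a scalar tensor, a tuple
    of tensors, or a dict of named tensors (optionally with a 'total' key).
    """
    if isinstance(loss, torch.Tensor):
        # Scenario 1: a single scalar tensor is logged as the total.
        total = loss
        log_fn('loss/train/total', total)
    elif isinstance(loss, tuple):
        # Scenario 2: log each element by index, plus their sum.
        for i, individual in enumerate(loss):
            log_fn(f'loss/train/loss{i}', individual)
        total = sum(loss)
        log_fn('loss/train/total', total)
    elif isinstance(loss, dict):
        # Scenarios 3 & 4: log each named loss (including 'total' if given).
        for name, individual in loss.items():
            log_fn(f'loss/train/{name}', individual)
        if 'total' in loss:
            # Scenario 4: the provided total is used for backprop as-is;
            # individual losses are not summed.
            total = loss['total']
        else:
            # Scenario 3: the total is the sum of the individual losses.
            total = sum(loss.values())
            log_fn('loss/train/total', total)
    else:
        raise TypeError(f'Unsupported loss type: {type(loss)}')
    return total
```

The shared `loss/train/` prefix is what lets loggers like WandB group all of these curves into the same panel section.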