
Add wandb log #92

Open · wants to merge 3 commits into dev

Conversation


@Sentz98 Sentz98 commented Jun 28, 2024

Add wandb logging; it is not active by default.
To enable logging, set the hyperparameter 'log_wandb' = True.
During the training loop it automatically logs the train and validation metrics, and the learning rate if an lr_scheduler is present.
It starts a wandb project named 'micromind' by default, and each run is named after the experiment. Logging automatically resumes if the experiment restarts after a crash.
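For example (a hedged sketch based on the description above; how these hyperparameters are passed to micromind's training setup is assumed, not spelled out in the PR):

```python
# Hyperparameters enabling wandb logging (sketch).
hparams = {
    "experiment_name": "micromind_exp",  # also used as the wandb run name/id
    "log_wandb": True,                   # off by default; True turns logging on
    "wandb_resume": "auto",              # resume the same run after a crash
}
```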

Collaborator

@fpaissan fpaissan left a comment


Thanks @Sentz98!

I left some comments on the code. Next week I will test the code's functionality. If it works well, we can merge once the changes are approved.

"output_folder": "results",
"experiment_name": "micromind_exp",
"opt": "adam", # this is ignored if you are overriding the configure_optimizers
"lr": 0.001, # this is ignored if you are overriding the configure_optimizers
"debug": False,
"log_wandb": False,
"wandb_resume": "auto", # ["allow", "must", "never", "auto" or None]
Collaborator


I'm not sure what the best way to do this is, but we need to better define the effect of each field of default_cfg.

Maybe a long comment ("""xx""") after the closing bracket?
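One way to realize this suggestion (a sketch only; the field descriptions below are my guesses, not the maintainers' definitions):

```python
default_cfg = {
    "output_folder": "results",
    "experiment_name": "micromind_exp",
    "log_wandb": False,
    "wandb_resume": "auto",
}
"""Effect of each default_cfg field (descriptions to be refined by the maintainers):
output_folder   -- directory where checkpoints and results are written.
experiment_name -- run name; also used as the wandb run id.
log_wandb       -- enable wandb logging (off by default).
wandb_resume    -- wandb resume policy: "allow", "must", "never", "auto", or None.
"""
```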

name=self.hparams.experiment_name,
resume=self.hparams.wandb_resume,
id=self.hparams.experiment_name,
config=self.hparams,
Collaborator


Here we can check whether the configuration provides any extra arguments to be passed to the init, and load them if so (see the usage of the ** operator).
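A minimal sketch of that idea (the helper name and the `wandb_extra` config key are hypothetical, not part of the PR):

```python
def build_wandb_init_kwargs(hparams):
    """Merge the fixed wandb.init arguments with any extra ones from the config."""
    kwargs = {
        "name": hparams["experiment_name"],
        "id": hparams["experiment_name"],
        "resume": hparams.get("wandb_resume", "auto"),
        "config": hparams,
    }
    # Load extra user-provided arguments, if any, on top of the defaults.
    kwargs.update(hparams.get("wandb_extra", {}))
    return kwargs

# usage (sketch): wandb.init(**build_wandb_init_kwargs(hparams))
```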

@@ -574,6 +594,10 @@ def train(
else:
val_metrics = train_metrics.update({"val_loss": loss_epoch / (idx + 1)})

if self.hparams.log_wandb: # wandb log
Collaborator


I might be wrong, but it would be nicer if we divided the train and valid panes inside the wandb logging: val/{metric_name} would be logged instead of val_{metric_name}. Of course, this change only applies to the .log call and should not be propagated to the entire core.
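A hedged sketch of how the rename could be confined to the `.log` call (the helper name is hypothetical):

```python
def pane_keys(metrics, split):
    """Turn keys like 'val_loss' into 'val/loss' so wandb groups them into panes.

    Only the logged copy is renamed; the metric dicts used elsewhere in
    core stay untouched.
    """
    prefix = split + "_"
    return {
        (f"{split}/{k[len(prefix):]}" if k.startswith(prefix) else k): v
        for k, v in metrics.items()
    }

# usage (sketch):
# wandb.log({**pane_keys(train_metrics, "train"), **pane_keys(val_metrics, "val")})
```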

@fpaissan fpaissan assigned fpaissan and Sentz98 and unassigned fpaissan Jul 5, 2024
@fpaissan fpaissan added the enhancement New feature or request label Jul 5, 2024