
Trainers should ignore the ignore parameter #2314

Closed · wants to merge 1 commit

Conversation

calebrob6
Member

If I create a new class that extends SemanticSegmentationTask:

from torchgeo.trainers import SemanticSegmentationTask

class CustomSemanticSegmentationTask(SemanticSegmentationTask):

    # any keyword arguments we add here between *args and **kwargs will be
    # found in self.hparams
    def __init__(self, *args, tmax=50, eta_min=1e-6, **kwargs) -> None:
        super().__init__(*args, **kwargs)  # pass args and kwargs to the parent class

Everything works great when I first instantiate the class:

task = CustomSemanticSegmentationTask(model="unet", tmax=100, ...)

However, when I go to load a checkpoint from file:

task = CustomSemanticSegmentationTask.load_from_checkpoint("lightning_logs/version_3/checkpoints/epoch=0-step=117.ckpt")

I get an error:

TypeError: SemanticSegmentationTask.__init__() got an unexpected keyword argument 'ignore'

This happens because an ignore parameter is stored in task.hparams: SemanticSegmentationTask passes ignore through to BaseTask, which saves it alongside the real hyperparameters (see https://github.com/microsoft/torchgeo/blob/main/torchgeo/trainers/segmentation.py#L98), and load_from_checkpoint then feeds every stored hparam back into __init__ as a keyword argument.
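To make the failure mode concrete, here is a minimal, self-contained sketch (not TorchGeo's actual code; Base and Sub are hypothetical stand-ins for BaseTask and SemanticSegmentationTask) of how Lightning's save_hyperparameters ends up recording ignore:

```python
from lightning.pytorch import LightningModule

class Base(LightningModule):
    def __init__(self, ignore=None):
        super().__init__()
        # save_hyperparameters() inspects the whole __init__ call chain and
        # records every argument it finds, including `ignore` itself.
        self.save_hyperparameters(ignore=ignore)

class Sub(Base):
    def __init__(self, model="unet"):
        super().__init__(ignore="weights")

task = Sub()
print(task.hparams)  # contains 'model': 'unet' and, problematically, 'ignore': 'weights'
```

When load_from_checkpoint later replays those stored hparams as keyword arguments, Sub.__init__ receives ignore=... and raises the same TypeError shown above.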

I can add del kwargs["ignore"] before the super().__init__() call in the constructor of CustomSemanticSegmentationTask, but this feels like a bad hack, so I just make BaseTask ignore "ignore" as a hyperparameter.
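For reference, the subclass-side hack would look like this (a sketch; kwargs.pop is used rather than del so it also tolerates the key being absent):

```python
from torchgeo.trainers import SemanticSegmentationTask

class CustomSemanticSegmentationTask(SemanticSegmentationTask):
    def __init__(self, *args, tmax=50, eta_min=1e-6, **kwargs) -> None:
        # The hack: drop the stored 'ignore' hparam before the parent sees it.
        kwargs.pop("ignore", None)
        super().__init__(*args, **kwargs)
```

The PR's approach instead amounts to excluding "ignore" itself when BaseTask calls save_hyperparameters, along the lines of self.save_hyperparameters(ignore=["ignore", ...]) (my guess at the exact form; see the diff for the real change).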

@github-actions github-actions bot added the trainers PyTorch Lightning trainers label Sep 23, 2024
@adamjstewart adamjstewart added this to the 0.6.1 milestone Sep 24, 2024
@adamjstewart
Collaborator

Alternatives:

  1. Remove save_hyperparameters from BaseTask, add to all subclasses. But we use these hparams in configure_*, so we would have to move all of those too.
  2. Add ignore as a parameter to all subclasses. Kind of feels silly to do this.
  3. Add **kwargs as a parameter to all subclasses. This silently ignores any unknown keys, which is very bad in the case of typos (see the sketch below).
  4. Remove MoCo and SimCLR parameters that require ignoring.
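To illustrate the typo hazard in alternative 3, a standalone sketch (the class below is a stand-in, not a real TorchGeo trainer):

```python
class ClassificationTask:  # stand-in for a trainer that accepts **kwargs
    def __init__(self, model: str = "resnet50", lr: float = 1e-3, **kwargs) -> None:
        self.model = model
        self.lr = lr

task = ClassificationTask(modle="resnet18")  # note the typo in "model"
print(task.model)  # prints "resnet50": the typo was silently swallowed, no TypeError
```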

EDIT: I think I have a better idea. Let me open a PR to demonstrate it.

@adamjstewart
Collaborator

P.S. I don't actually think 4 is a horrible idea, as it isn't possible to restore these hparams from a checkpoint right now.

@calebrob6
Member Author

What's wrong with this fix?

@adamjstewart
Collaborator

It feels like a workaround, that's all. It actually makes sense if you think about it, but it doesn't look as clean as what I'm now thinking of.

@calebrob6
Member Author

Ah yep, completely agree that it is a workaround.

@calebrob6
Member Author

Closing in favor of #2317

@calebrob6 calebrob6 closed this Sep 27, 2024
@calebrob6 calebrob6 deleted the trainer_fix branch September 27, 2024 15:55
@adamjstewart adamjstewart removed this from the 0.6.1 milestone Sep 28, 2024