Video Classification lr_schedulers and custom optimizer #1223
Replies: 1 comment 2 replies
-
Hello @dudeperf3ct, the issue is being tracked, but until then a fix has to be implemented for its compatibility with Flash. Regarding the custom optimizers, I don't currently think it is possible to do something like that.
The above lines implement the instantiation of the optimizer inside a Flash task. As you can see, the model parameters that are un-frozen are passed on to the function that is registered. Since a generator of the parameters is passed directly, it is difficult to differentiate between the head and the backbone using just the parameters. So, the answer would be NO. Thanks for pointing out this use case, which I had clearly missed during the implementation. Will fix it soon.
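To make that concrete, here is a minimal, simplified sketch of the mechanism described above. This is not the actual Flash source; `instantiate_optimizer` is a hypothetical stand-in for the logic inside the task:

```python
import torch
from torch import nn


def instantiate_optimizer(model: nn.Module, optimizer_fn, **optimizer_kwargs):
    """Hypothetical stand-in for how a Flash task builds its optimizer."""
    # Only the un-frozen parameters (requires_grad=True) are handed over.
    trainable_params = filter(lambda p: p.requires_grad, model.parameters())
    # Because this is a flat generator of tensors, the registered optimizer
    # callable cannot tell which parameters belong to the head vs. the
    # backbone, so per-group learning rates are not expressible here.
    return optimizer_fn(trainable_params, **optimizer_kwargs)


# Example usage with a plain torch optimizer:
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = instantiate_optimizer(model, torch.optim.Adam, lr=1e-3)
```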
-
What is the best way to access the length of `train_dataloader`? Using `OneCycleLR` requires specifying either `total_steps` or both `epochs` and `steps_per_epoch`. There's a `get_num_training_steps()` to obtain the total steps, but how can we use it in the above example? (A rough sketch of the idea is shown below.)
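For context, here is a rough, self-contained sketch (plain PyTorch, not the Flash API; the toy data and model are placeholders) of how `total_steps` is usually derived from the dataloader length and the number of epochs before being passed to `OneCycleLR`:

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import OneCycleLR
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset/model so the example is self-contained.
dataset = TensorDataset(torch.randn(64, 8), torch.randint(0, 2, (64,)))
train_dataloader = DataLoader(dataset, batch_size=16)
model = nn.Linear(8, 2)

# OneCycleLR needs the total number of optimizer steps up front.
max_epochs = 10
steps_per_epoch = len(train_dataloader)      # number of batches per epoch
total_steps = steps_per_epoch * max_epochs   # roughly what a helper like get_num_training_steps() computes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = OneCycleLR(optimizer, max_lr=1e-2, total_steps=total_steps)
```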
Bug Report:
Replacing `onecyclelr` with a Hugging Face scheduler (installed transformers before running):

Bug report:
Versions
pl/flash/transformers/torch : 1.5.10 / 0.7.1 / 4.17.0 / 1.9.0
Is it possible to write a custom optimizer with different learning rates?
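For reference, outside of Flash this is usually expressed with standard PyTorch parameter groups rather than a custom optimizer class. A minimal sketch, assuming a model that exposes separate `backbone` and `head` modules:

```python
import torch
from torch import nn


# Toy model with an explicit backbone/head split so the example is self-contained.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
        self.head = nn.Linear(16, 2)

    def forward(self, x):
        return self.head(self.backbone(x))


model = ToyModel()

# One optimizer, two parameter groups with different learning rates.
optimizer = torch.optim.Adam(
    [
        {"params": model.backbone.parameters(), "lr": 1e-4},
        {"params": model.head.parameters(), "lr": 1e-3},
    ]
)
```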