Setting gradient attributes to Trainer when using manual optimization
#20463
Unanswered
clement-pages asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
Hey lightning team,
I am currently working on adding manual optimization to pyannote.audio in this pull request. The goal is to let the user choose between manual and automatic optimization by setting `model.automatic_optimization`. In our code, we want to retrieve gradient attributes (for instance `gradient_clip_val` or `gradient_clip_algorithm`) from the trainer, for example via `model.trainer.gradient_clip_val`.

The problem is that we cannot instantiate a `Trainer` object with `gradient_clip_val` if we set `automatic_optimization` to `False`. Lightning throws the following exception:

So my question is: is there any way to instantiate a trainer with gradient attributes and manual optimization without raising this error? Maybe it could be a warning message instead of an error? I have found a workaround, but not having this error would clearly simplify the code.
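For reference, here is a minimal sketch of the setup I am describing (the module, layer, and optimizer are made up for illustration, and I am assuming the `lightning` 2.x import style). It shows the manual-optimization pattern where the clipping values have to be hard-coded inside `training_step` instead of being read from the trainer:

```python
import torch
from torch import nn
import lightning as L


class ManualOptModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # opt in to manual optimization
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        opt.zero_grad()
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        self.manual_backward(loss)
        # Current workaround: hard-code the clipping settings here instead of
        # reading them from the trainer (e.g. self.trainer.gradient_clip_val)
        self.clip_gradients(opt, gradient_clip_val=0.5, gradient_clip_algorithm="norm")
        opt.step()
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-2)


# This is the combination that triggers the exception described above:
# trainer = L.Trainer(gradient_clip_val=0.5, gradient_clip_algorithm="norm")
# trainer.fit(ManualOptModel(), train_dataloaders=...)
```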
Have a nice day!