restart chgnet training (finetune) #172
chtchelkatchev started this conversation in General · Replies: 2 comments
-
For fine-tuning I tried the following procedure (is it correct?): here 803 is the last epoch of the previous fine-tuning run, and CosRestartLR replaces CosLR.
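The idea of swapping CosLR for CosRestartLR when resuming can be illustrated with the standard cosine-annealing-with-warm-restarts formula (SGDR). This is a generic, library-free sketch, not chgnet code; the function name, `lr_max`, `lr_min`, and `period` values are illustrative assumptions:

```python
import math

def cos_restart_lr(epoch, lr_max=1e-3, lr_min=1e-5, period=100):
    """Cosine-annealing LR with warm restarts (SGDR): the schedule
    restarts from lr_max every `period` epochs, so a run resumed at
    an arbitrary epoch (e.g. 803) still gets a well-defined LR."""
    t = epoch % period  # position inside the current restart cycle
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / period))

# Resuming at epoch 803 (the last epoch of the previous run): the LR
# is simply evaluated at that epoch, with no dependence on how the
# earlier 24-hour jobs were split.
start_epoch = 803
lrs = [cos_restart_lr(e) for e in range(start_epoch, start_epoch + 5)]
print(lrs)
```

A plain cosine schedule (CosLR) keeps decaying toward `lr_min` over the whole run, whereas the restart variant periodically returns to `lr_max`, which is why it is the natural choice when continuing past the original schedule's end.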
-
You can try directly loading the trainer. This way you don't have to redefine parameters.
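"Directly loading the trainer" means checkpointing the trainer's full state (model weights, optimizer and scheduler state, epoch counter) and restoring it in the next job, so nothing has to be redefined. A minimal stdlib sketch of that pattern, assuming a hypothetical `Trainer` class — this is not the chgnet API:

```python
import os
import pickle
import tempfile

class Trainer:
    """Hypothetical stand-in for a trainer whose full state is
    checkpointed, so a resumed run needs no redefined parameters."""
    def __init__(self, lr=1e-3, scheduler="CosRestartLR"):
        self.lr, self.scheduler, self.epoch = lr, scheduler, 0

    def train(self, n_epochs):
        self.epoch += n_epochs  # placeholder for the real training loop

    def save(self, path):
        # Persist everything needed to continue training later.
        with open(path, "wb") as f:
            pickle.dump(self.__dict__, f)

    @classmethod
    def load(cls, path):
        # Rebuild the trainer exactly as it was saved.
        trainer = cls.__new__(cls)
        with open(path, "rb") as f:
            trainer.__dict__.update(pickle.load(f))
        return trainer

# First 24-hour job: train, then checkpoint before the walltime limit.
path = os.path.join(tempfile.mkdtemp(), "trainer.ckpt")
t1 = Trainer()
t1.train(803)
t1.save(path)

# Next job: reload the trainer as-is and continue training.
t2 = Trainer.load(path)
print(t2.epoch, t2.scheduler)
```

The design point is that the checkpoint carries the scheduler and optimizer state along with the weights, so the learning-rate schedule continues from where the previous job stopped rather than restarting from scratch.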
-
Is there a possibility to restart training (fine-tuning) of CHGNet?
Due to supercomputer limitations I can use A100 GPUs for only 24 hours at a time, after which I must restart any calculation. 24 hours is sometimes not enough for training (fine-tuning) CHGNet.
Is there a possibility to use several GPUs for training (if this speeds it up)?