MACE 0.3.9 can not jit compile model trained with previous versions #741
Comments
What is your torch version? It is normal that the torch jit does not work. I think it is just that you cannot compile old models, only models trained with 0.3.9 (we have tests for that). If you want to compile a model trained with an old version, then you need to use that version.
Thanks, I'll check.
Exactly, mace-off was trained with an older version of mace, so you cannot compile it with 0.3.9. You need to use an older version (probably anything up to 0.3.8).
Thanks, that makes sense. However, if that's the case, shouldn't there be logic to download a compatible version at runtime? Surely the models encode the version of MACE that they were trained with? I guess this is probably a question for the MACE-OFF team so I can ask there.
Unfortunately no, models at the time did not encode the version. You can still evaluate the model with the current version, just not jit compile it. For now, I recommend you just compile using 0.3.6 and save the compiled model somewhere.
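For reference, a minimal sketch of that workaround, run in an environment pinned to an older mace-torch (e.g. 0.3.6); the mace_off call and the calculator's models attribute are assumptions about the API, not verified here:

```python
# Sketch only: run with an older mace-torch (e.g. 0.3.6), where MACE-OFF models still compile.
# The loading details below (mace_off, calc.models) are assumptions, not a verified API.
import torch
from mace.calculators import mace_off

calc = mace_off(model="medium", device="cpu")   # downloads/loads the MACE-OFF medium model
model = calc.models[0]                          # assumption: the raw torch module lives in .models

compiled = torch.jit.script(model)              # succeeds with the older version
compiled.save("mace_off_medium_scripted.pt")    # keep the serialized ScriptModule around

# Later, in any environment with torch (mace-torch no longer required):
reloaded = torch.jit.load("mace_off_medium_scripted.pt")
```

Saving the scripted module once avoids having to keep the old mace-torch pin around at runtime.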
Thanks, no problem. I'll just add a pin for now.
Just to confirm that, as suggested, MACE-OFF works with MACE up to and including version 0.3.8.
Thanks, this would be really helpful. Our use case is to create a dual MACE-EMLE model at runtime (EMLE does the electrostatic embedding). We need it to be serializable so that it can be loaded with OpenMM-ML, or directly in C++. Having self-consistency between the version of MACE and MACE-OFF would be ideal.
@lohedges, while you wait for the update you can just load the state dict of the old model into an instance of the model with the latest MACE version.
For the medium mace_off the hyperparams are:
Thanks @RokasEl, that's a good idea. One can even extract the hypers directly from a source model (see the sketch below).
You can swap in any MACE model as the source model; it will extract the hypers.
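A minimal sketch of that extract-and-rebuild route; extract_config_mace_model, its import path, and the return_raw_model flag on mace_off are assumptions about recent mace-torch releases, not verified here:

```python
# Sketch only: rebuild an older MACE-OFF model with the currently installed mace-torch,
# then copy the weights across so the rebuilt model can be TorchScript-compiled.
from e3nn.util import jit
from mace.calculators import mace_off
from mace.tools.scripts_utils import extract_config_mace_model  # assumption: present in recent releases

# Source model: MACE-OFF medium, loaded as a raw torch module.
# Assumption: return_raw_model=True returns the torch module rather than an ASE calculator.
source_model = mace_off(model="medium", return_raw_model=True)

# Pull the hyperparameters (config) out of the source model, rebuild the same architecture
# with the current code, and load the old weights into the new instance.
config = extract_config_mace_model(source_model)
target_model = source_model.__class__(**config)
target_model.load_state_dict(source_model.state_dict())

# The rebuilt model was constructed with the current mace-torch, so it should now jit-compile.
compiled = jit.script(target_model)
compiled.save("mace_off_medium_scripted.pt")
```

Any MACE model can stand in as the source model; the config is read off the instance rather than hard-coded.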
Brilliant, many thanks for this @RokasEl. I'll add this in now.
Can confirm that this works perfectly. Thanks again for your help.
It appears that the MACE models generated by version 0.3.9 aren't TorchScript compliant. To reproduce:
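A minimal sketch of such a reproducer; the exact loading calls (mace_off and the calculator's models attribute) are assumptions, not verified here:

```python
# Sketch of a reproducer; loading details are assumptions about the mace-torch API.
import torch
from mace.calculators import mace_off

calc = mace_off(model="medium", device="cpu")
model = calc.models[0]   # assumption: the underlying torch module lives in .models

# With mace-torch 0.3.9 and a model trained under an older version this raises a
# TorchScript compilation error; with 0.3.6 it succeeds.
compiled = torch.jit.script(model)
```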
Gives (truncated):
The same error is triggered using e3nn to compile the model (a sketch follows below). This works fine with version 0.3.6, which is what I was using previously. For reference, I am using Python 3.10 on Linux x86 and installed mace-torch via pip.
Cheers.
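For completeness, a minimal sketch of the e3nn route, assuming the same model object as in the reproducer sketch above and that e3nn.util.jit.script is the entry point used:

```python
# Same failure mode via e3nn's scripting helper; `model` is assumed to be the
# MACE-OFF module loaded in the reproducer sketch above.
from e3nn.util import jit

compiled = jit.script(model)
```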