v1.12
- Fix a bug where transferring the optimizer to another device failed, caused by a new feature in PyTorch 1.12, i.e. the "capturable" parameter in Adam and AdamW.
- Add utility functions for saving (`save_random_states`) and loading (`load_random_states`) Python's, NumPy's and PyTorch's (both CPU and GPU) random states. Furthermore, we also add the `RandomStatesCheckpoint` callback. This callback is now used in `ModelBundle`.
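To illustrate the idea behind these utilities, here is a minimal, stdlib-only sketch of saving and restoring a random state. The function names mirror the ones in the changelog, but the body is a simplified assumption: it only handles Python's built-in `random` module, whereas the library's versions also cover NumPy's and PyTorch's (CPU and GPU) states.

```python
import pickle
import random

def save_random_states(path):
    # Sketch: persist Python's RNG state to disk. The library's version
    # also saves NumPy's and PyTorch's (CPU and GPU) states.
    with open(path, "wb") as f:
        pickle.dump(random.getstate(), f)

def load_random_states(path):
    # Restore the previously saved RNG state.
    with open(path, "rb") as f:
        random.setstate(pickle.load(f))

# Draws made after loading reproduce the draws made after saving.
save_random_states("rng.pkl")
first = [random.random() for _ in range(3)]
load_random_states("rng.pkl")
assert [random.random() for _ in range(3)] == first
```

A checkpoint callback built on these functions can save the states at the end of each epoch so that an interrupted training run resumes with identical randomness.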