Releases · GRAAL-Research/poutyne
v1.17.3
v1.17.2
v1.17.1
v1.17
`FBeta` uses the non-deterministic PyTorch function `torch.bincount`. You can now make it deterministic, either by passing the `make_deterministic` argument to the `FBeta` class or by using one of the PyTorch functions `torch.set_deterministic_debug_mode` or `torch.use_deterministic_algorithms`. Note that this might make your code slower.
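A minimal sketch of both options (assuming `FBeta`'s other constructor arguments keep their defaults):

```python
import torch

from poutyne import FBeta

# Option 1: ask the metric itself to use a deterministic implementation.
f1 = FBeta(make_deterministic=True)

# Option 2: enable PyTorch's global deterministic mode; per these notes,
# FBeta then switches to a deterministic implementation. This applies to
# all of PyTorch and might slow down your code.
torch.use_deterministic_algorithms(True)
f1 = FBeta()
```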
v1.16
- Add `run_id` and `terminate_on_end` arguments to `MLFlowLogger`.
Breaking change:
- In `MLFlowLogger`, all arguments except `experiment_name` must now be passed as keyword arguments. Passing `experiment_name` as a positional argument is also deprecated and will be removed in future versions. (See the sketch below.)
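A minimal sketch of the new calling convention (the argument values are illustrative, and the behavior described in the comments for the two new v1.16 arguments is an assumption based on their names):

```python
from poutyne import MLFlowLogger

# All arguments other than experiment_name must now be keyword arguments;
# passing experiment_name positionally still works but is deprecated.
logger = MLFlowLogger(
    experiment_name="my-experiment",
    run_id=None,            # new in v1.16 (assumed: attach to an existing run)
    terminate_on_end=True,  # new in v1.16 (assumed: end the run after training)
)
```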
v1.15
v1.14
v1.13
Breaking changes:
- The deprecated `torch_metrics` keyword argument has been removed. Users should use the `batch_metrics` or `epoch_metrics` keyword argument for torchmetrics' metrics (see the sketch below).
- The deprecated `EpochMetric` class has been removed. Users should implement the `Metric` class instead.
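A minimal sketch of the replacement (the network, optimizer, loss, and torchmetrics metric shown are illustrative; the torchmetrics call follows the current torchmetrics API):

```python
import torch.nn as nn
import torchmetrics

from poutyne import Model

network = nn.Linear(10, 3)  # placeholder network for illustration

# Pass torchmetrics' metrics through batch_metrics or epoch_metrics
# instead of the removed torch_metrics argument.
model = Model(
    network,
    "sgd",
    "cross_entropy",
    epoch_metrics=[torchmetrics.Accuracy(task="multiclass", num_classes=3)],
)
```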
v1.12.1
v1.12
- Fix a bug when transferring the optimizer to another device, caused by a new feature in PyTorch 1.12, i.e., the "capturable" parameter in Adam and AdamW.
- Add utility functions for saving (`save_random_states`) and loading (`load_random_states`) Python's, NumPy's, and PyTorch's (both CPU and GPU) random states (see the sketch below). Furthermore, we also add the `RandomStatesCheckpoint` callback. This callback is now used in `ModelBundle`.
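A minimal sketch of the two utility functions (assuming each takes a path-like argument; the filename is illustrative, so check the API reference for the exact signature):

```python
from poutyne import load_random_states, save_random_states

# Snapshot Python's, NumPy's, and PyTorch's (CPU and GPU) random states.
# (Signature assumed: a path-like argument.)
save_random_states("random_states.pkl")

# Restore them later, e.g., when resuming an interrupted training.
load_random_states("random_states.pkl")
```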