1.2.0
Misc changes
- Updated to PyTorch 1.4.0 and torchvision 0.5.0.
- Changed license from MIT to Apache 2.0 and removed the Uber CLA as part of Pyro's move to the Linux Foundation.
Reparameterization
This release adds a new effect handler and a collection of strategies that reparameterize models to improve geometry. These tools are largely orthogonal to other inference tools in Pyro, and can be used with SVI, MCMC, and other inference algorithms.
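The core idea behind these reparameterizers can be sketched without Pyro: replace a random variable with a deterministic transform of simpler auxiliary noise, so inference runs in a better-conditioned space. A minimal pure-Python illustration of the location-scale case (function names here are hypothetical, not Pyro's API):

```python
import random

def sample_centered(loc, scale, rng):
    # Directly sample x ~ Normal(loc, scale): the "centered" parameterization.
    return rng.gauss(loc, scale)

def sample_decentered(loc, scale, rng):
    # Sample auxiliary noise z ~ Normal(0, 1), then apply a deterministic
    # affine transform: the "non-centered" (decentered) parameterization.
    # Both functions define the same marginal distribution over x.
    z = rng.gauss(0.0, 1.0)
    return loc + scale * z

rng = random.Random(0)
xs = [sample_decentered(10.0, 2.0, rng) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# mean ≈ 10.0, var ≈ 4.0, matching Normal(10, 2) drawn the centered way.
```

The transformed model exposes the standardized `z` to the inference algorithm, which is exactly the kind of rewriting `poutine.reparam()` performs on `pyro.sample` sites.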
- poutine.reparam() is a new effect handler that transforms models into other models for which inference may be easier (Gorinova et al. 2019).
- pyro.infer.reparam is a collection of reparameterization strategies following a standard Reparam interface:
- Decentering transforms for location-scale families (Gorinova et al. 2019).
- Transform unwrapping to deconstruct `TransformedDistribution`s.
- Discrete Cosine transforms for frequency-domain parameterizations (useful for inference in time series).
- Auxiliary variable methods for Levy Stable and StudentT distributions.
- Linear Hidden Markov Model reparameterization, allowing a range of non-Gaussian HMMs to be treated as conditional Gaussian processes.
- Neural Transport uses SVI to learn the geometry of a model before drawing samples using HMC (Hoffman et al. 2019).
- A tutorial on inference with Levy Stable distributions, demonstrating StableReparam, DiscreteCosineReparam, and EnergyDistance.
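A frequency-domain parameterization relies on an invertible orthonormal transform between a time series and its cosine coefficients. The naive O(n²) DCT-II/DCT-III pair below illustrates the round trip; it is a sketch of the underlying math, not Pyro's implementation:

```python
import math

def dct_ii(x):
    # Orthonormal DCT-II: time-domain vector -> frequency coefficients.
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[t] * math.cos(math.pi * (t + 0.5) * k / n) for t in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def idct_ii(c):
    # Inverse (orthonormal DCT-III): frequency coefficients -> time series.
    n = len(c)
    out = []
    for t in range(n):
        s = c[0] / math.sqrt(n)
        s += sum(math.sqrt(2.0 / n) * c[k] * math.cos(math.pi * (t + 0.5) * k / n)
                 for k in range(1, n))
        out.append(s)
    return out

x = [0.5, -1.0, 2.0, 0.25]
# The transform is invertible, so a model may sample coefficients and
# deterministically map them back to the time domain.
assert all(abs(a - b) < 1e-9 for a, b in zip(x, idct_ii(dct_ii(x))))
```

Because the transform is orthonormal, a white-noise prior on the coefficients corresponds to a white-noise prior on the series, while inference operates on the often better-conditioned frequency components.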
Other new features
- A tutorial on Dirichlet process mixture modeling, contributed by @m-k-S.
- Added a `LinearHMM` distribution with an `.rsample()` method. This supports non-Gaussian noise such as Levy Stable and StudentT, but requires reparameterization for inference.
- Implemented a `GaussianHMM.rsample()` method for drawing joint samples from a linear-Gaussian HMM.
- Added a LowerCholeskyAffine transform.
- #2264 improves speed and numerical stability of `MultivariateNormal` conversion from `scale_tril` to `precision`.
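Joint sampling from a linear-Gaussian HMM can be done by ancestral sampling, where each step is a deterministic affine function of standard noise; writing it this way keeps the sample differentiable in the HMM parameters, which is the point of an `.rsample()` method. A scalar pure-Python sketch (not Pyro's vectorized implementation):

```python
import random

def hmm_joint_sample(trans, proc_scale, obs_scale, init, num_steps, rng):
    # Ancestral sampling of a scalar linear-Gaussian HMM:
    #   z_t = trans * z_{t-1} + proc_scale * eps_t,   eps_t ~ Normal(0, 1)
    #   x_t = z_t + obs_scale * eta_t,                eta_t ~ Normal(0, 1)
    # Each state and observation is an affine function of standard noise,
    # so gradients flow through (trans, proc_scale, obs_scale, init).
    z = init
    states, observations = [], []
    for _ in range(num_steps):
        z = trans * z + proc_scale * rng.gauss(0.0, 1.0)
        states.append(z)
        observations.append(z + obs_scale * rng.gauss(0.0, 1.0))
    return states, observations

rng = random.Random(1)
zs, xs = hmm_joint_sample(0.9, 1.0, 0.5, 0.0, 5, rng)
```

The same ancestral structure underlies reparameterizing a non-Gaussian `LinearHMM` into a conditionally Gaussian one, with the heavy-tailed noise handled by auxiliary variables.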