Bernstein Polynomials as a TensorFlow Probability Bijector

This repository contains an implementation of a normalizing flow for conditional density estimation using Bernstein polynomials, as proposed in:

Beate Sick, Torsten Hothorn, and Oliver Dürr: Deep transformation models: Tackling complex regression problems with neural network based transformation models, 2020. online

The tfb.Bijector interface is used for the implementation to benefit from the powerful TensorFlow Probability framework.

The Need for Flexible Distributions

Traditional regression models assume normality and homoscedasticity of the data, i.e., the residuals for each input value are expected to be normally distributed with constant variance. In many real use cases, however, the shape of the data distribution is much more complex.

The classic data set of waiting times between eruptions of the Old Faithful geyser in Yellowstone National Park serves as an example.

[Figure: Gaussian model (left) and normalizing flow (right) fitted to the Old Faithful waiting times]

As shown in the left figure, the normality assumption is clearly violated by the bimodal nature of the data. However, the proposed transformation model has the flexibility to adapt to this complexity.

Getting Started

To start using my code, follow these simple steps.

Installation

Pull and install it directly from git using pip:

pip install git+https://github.com/MArpogaus/TensorFlow-Probability-Bernstein-Polynomial-Bijector.git

Or clone this repository and install it from there:

git clone https://github.com/MArpogaus/TensorFlow-Probability-Bernstein-Polynomial-Bijector.git ./bernstein_flow
cd bernstein_flow
pip install -e .
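
A quick way to verify that the installation worked is to import the distribution class used throughout this README:

python -c "from bernstein_flow.distributions import BernsteinFlow"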

Prerequisites

Pip should take care of installing the required dependencies on its own. The implementation mainly builds on TensorFlow and TensorFlow Probability.

Usage

Package Structure

This Python package consists of four main components:

  • bernstein_flow.bijectors.BernsteinBijector: The implementation of Bernstein polynomials using the tfb.Bijector interface for transformations of tfd.Distribution samples.
  • bernstein_flow.distributions.BernsteinFlow: The implementation of a tfd.TransformedDistribution using the Bernstein polynomials as the bijector (see the sketch below).
  • bernstein_flow.losses.BernsteinFlowLoss: The implementation of a tfk.losses.Loss function to calculate the negative log-likelihood using the BernsteinFlow distribution.
  • bernstein_flow.util.visualization: Contains some convenient helper functions for visualization.
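
Internally, BernsteinFlow follows the standard TensorFlow Probability pattern of a tfd.TransformedDistribution that wraps a base distribution with a bijector. The following minimal sketch illustrates this generic pattern using standard TFP bijectors in place of the BernsteinBijector (whose exact constructor arguments are not shown here):

import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# Generic pattern: push a standard normal base distribution through a
# (chain of) bijector(s) to obtain a more flexible distribution.
base = tfd.Normal(loc=0.0, scale=1.0)
flow = tfd.TransformedDistribution(
    distribution=base,
    bijector=tfb.Chain([tfb.Shift(1.0), tfb.Scale(2.0)]),
)

flow.sample(5)       # samples: base samples mapped through the forward bijector
flow.log_prob(0.5)   # density: change-of-variables formula via the inverse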

Using the Model as a tfpl.DistributionLambda

A tfd.TransformedDistribution using the BernsteinBijector is provided in the module bernstein_flow.distributions.BernsteinFlow:

from bernstein_flow.distributions import BernsteinFlow

Use it like any other distribution, i.e. as a tfpl.DistributionLambda.

The two example plots shown above have been generated using the following two models.

Gaussian Model

import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow.keras.layers import Dense, InputLayer

tfd = tfp.distributions

gauss_model = tf.keras.Sequential()
gauss_model.add(InputLayer(input_shape=(1,)))
# Here could come a gigantic network
gauss_model.add(Dense(2))  # mean and std of the Gaussian
gauss_model.add(tfp.layers.DistributionLambda(
    lambda pv: tfd.Normal(loc=pv[:, 0],
                          scale=1e-3 + tf.math.softplus(0.05 * pv[:, 1]))))

Normalizing Flow

flow_model = tf.keras.Sequential()
flow_model.add(InputLayer(input_shape=(1,)))
# Here could come a gigantic network
flow_model.add(Dense(4 + 5))  # 5 Bernstein coefficients and 2 times (scale, shift)
flow_model.add(tfp.layers.DistributionLambda(BernsteinFlow))
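
Both models output a tfd.Distribution object, so either one can be trained by minimizing the negative log-likelihood of the observed data. A minimal training sketch follows (x_train and y_train are placeholders for your own data; the package also provides bernstein_flow.losses.BernsteinFlowLoss for this purpose):

negloglik = lambda y, dist: -dist.log_prob(y)  # the model's output is a distribution

flow_model.compile(optimizer=tf.optimizers.Adam(learning_rate=0.01),
                   loss=negloglik)
flow_model.fit(x_train, y_train, epochs=100)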

Examples

You can find two examples in the ipynb directory:

  • TheoreticalBackground.ipynb: Some explanation of the theoretical fundamentals
  • Gaussian_vs_Transformation_Model.ipynb: Bimodal data example shown in the figures above.

Contributing

If you have any technical issues or suggestions regarding my implementation, please feel free to contact me, open an issue, or send me a pull request:

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

Any contributions are greatly appreciated.

License

Distributed under the Apache License 2.0.