Mark Mikofski edited this page Apr 10, 2016 · 7 revisions

Welcome to the UncertaintyWrapper wiki!

Uncertainty theory

The uncertainty of a set of functions f_i(x_j) can be determined from the covariance matrix of the inputs, S = cov(x_i, x_j), and the Jacobian matrix, J = df_i/dx_j, as

S_f = J * S * J.T

where J.T is the transpose of J. The diagonal of S_f gives the variance of each function, and the square root of each variance is its standard deviation. For more detail see the Wikipedia entry on propagation of uncertainty.
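The propagation formula above is a single matrix product. A minimal NumPy sketch, using a hypothetical two-input, two-output model (the function and its standard deviations are made up for illustration):

```python
import numpy as np

# Hypothetical model: f(x) = [x0 * x1, x0 + x1], evaluated at x = (2, 3).
x = np.array([2.0, 3.0])

# Jacobian of f at x, J[i][j] = df_i/dx_j.
J = np.array([[x[1], x[0]],   # df0/dx0 = x1, df0/dx1 = x0
              [1.0,  1.0]])   # df1/dx0 = 1,  df1/dx1 = 1

# Covariance of the inputs: independent, standard deviations 0.1 and 0.2.
S = np.diag([0.1**2, 0.2**2])

# Propagate: covariance of the outputs is J * S * J.T.
S_f = J @ S @ J.T

# Standard deviation of each output is the square root of the diagonal.
df = np.sqrt(np.diag(S_f))
```

The off-diagonal terms of `S_f` show that the two outputs are correlated even though the inputs are independent, because both depend on the same inputs.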

Jacobians and Derivatives

There are several methods to derive the Jacobian matrix. All of these methods require the calculation of derivatives.

Automatic Differentiation (AD)

Automatic Differentiation, also known as Algorithmic Differentiation, is a method of calculating derivatives automatically. Actual AD implementations vary, but most overload basic numerical operations, so using AD generally requires some alteration of the source code. For more detailed information and a list of AD tools by platform and application, consult the Community Portal for Automatic Differentiation. The COIN-OR project also hosts several AD projects, including ADOL-C and CppAD.
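The operator-overloading mechanism can be shown in a few lines of pure Python. This is an illustrative sketch of forward-mode AD with dual numbers, not the API of any of the tools named above:

```python
import math

class Dual:
    """Carries a value and its derivative through arithmetic (toy forward-mode AD)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def sin(x):
    # Overloaded elementary function applies the chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x)] at x = 2 is sin(2) + 2*cos(2)
x = Dual(2.0, 1.0)  # seed derivative dx/dx = 1
y = x * sin(x)
```

Because the derivative rides along with the value, no source transformation is needed; the only requirement is that the function under study is written in terms of the overloaded operations, which is exactly why AD fails on opaque compiled libraries.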

Univariate Taylor Series Polynomials and Matrices (UTP/M)

This method of AD uses a property of Taylor series expansions to propagate derivatives. Essentially, truncated Taylor series can be combined using the same algebra as the underlying elementary operations, and the derivatives can be read off from the coefficients of the resulting series. Whether this is faster than propagating analytical derivatives with the chain rule depends on the implementation and the problem.
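The key property can be demonstrated for multiplication: the product of two truncated Taylor series is the convolution of their coefficient lists, and the k-th coefficient of the result is the k-th derivative divided by k!. A minimal sketch (illustrative only, not AlgoPy's internals):

```python
def utp_mul(u, v):
    """Convolve two equal-length lists of Taylor coefficients, truncated to len(u)."""
    D = len(u)
    return [sum(u[j] * v[k - j] for j in range(k + 1)) for k in range(D)]

# Taylor coefficients about t = 0, truncated at degree 2:
u = [1.0, 1.0, 0.0]   # u(t) = 1 + t
v = [2.0, 3.0, 0.0]   # v(t) = 2 + 3t

w = utp_mul(u, v)     # coefficients of (1 + t)(2 + 3t) = 2 + 5t + 3t^2
```

The second derivative of the product at t = 0 is 2! times `w[2]`, i.e. 6, recovered without ever forming a symbolic derivative.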

Analytical Derivatives and the Chain Rule

Most elementary functions have known derivatives, so the derivative of a composite analytical expression can be built up by applying the chain rule.
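For example, the chain rule gives the derivative of sin(x²) as the outer derivative evaluated at the inner function, times the inner derivative, which can be checked against a finite difference:

```python
import math

def f(x):
    return math.sin(x**2)

def dfdx(x):
    # Chain rule: d/dx sin(x**2) = cos(x**2) * 2x
    return math.cos(x**2) * 2.0 * x

# Sanity check against a central finite difference.
x, h = 1.5, 1e-6
fd = (f(x + h) - f(x - h)) / (2.0 * h)
```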

Symbolic Derivatives

Symbolic math packages such as SymPy, MAPLE, MathCAD, Mathematica, and MuPAD represent functions symbolically and can differentiate them exactly.
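The idea behind these packages can be sketched with a toy differentiator over nested-tuple expressions. This is illustrative only and bears no relation to the actual APIs of the packages above:

```python
def diff(expr, var):
    """Differentiate a toy expression tree with respect to var.

    Expressions are either the variable name, a numeric constant, or a
    tuple (op, left, right) with op in {'+', '*'}.
    """
    if expr == var:
        return 1
    if not isinstance(expr, tuple):  # numeric constant
        return 0
    op, a, b = expr
    if op == '+':
        return ('+', diff(a, var), diff(b, var))
    if op == '*':                    # product rule
        return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
    raise ValueError(f"unknown operator: {op}")

# d/dx (x*x + 3) -> (1*x + x*1) + 0, i.e. 2x before simplification
expr = ('+', ('*', 'x', 'x'), 3)
dexpr = diff(expr, 'x')
```

Real symbolic packages add simplification, a large table of elementary derivatives, and common-subexpression handling on top of this basic recursion.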

AlgoPy

UncertaintyWrapper uses AlgoPy, a Python AD package by Sebastian F. Walter that is based on ADOL-C and CppAD. Walter published his work on AlgoPy in the Journal of Computational Science. The AlgoPy documentation is also hosted online and the source is on GitHub.

Numdifftools

UncertaintyWrapper first attempts to use AlgoPy to calculate derivatives automatically. If the underlying function can't be overloaded, for example if it is a C/C++ or FORTRAN library, then UncertaintyWrapper falls back on numerical differentiation using Numdifftools by Per A. Brodtkorb. Numdifftools conveniently wraps calls to AlgoPy so that both methods can be used interchangeably in most situations. The numerical differentiation technique in Numdifftools is based on John D'Errico's adaptive numerical differentiation tools for MATLAB. The documentation for Numdifftools is online and the source is on GitHub.
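Numdifftools' adaptive step selection is beyond a short example, but the core technique it refines, central differences column by column, can be sketched in plain Python. This is a simplified illustration of numerical Jacobians in general, not Numdifftools' actual algorithm or API:

```python
import math

def jacobian(f, x, h=1e-5):
    """Numerical Jacobian of a vector function f at point x (list of floats),
    using second-order central differences with a fixed step. Adaptive tools
    like Numdifftools additionally tune h and extrapolate to reduce error."""
    fx = f(x)
    J = []
    for i in range(len(fx)):
        row = []
        for j in range(len(x)):
            xp = list(x); xp[j] += h
            xm = list(x); xm[j] -= h
            row.append((f(xp)[i] - f(xm)[i]) / (2.0 * h))
        J.append(row)
    return J

# Works on any callable, even one wrapping compiled code:
f = lambda x: [x[0] * x[1], math.sin(x[0])]
J = jacobian(f, [2.0, 3.0])
```

The only requirement on `f` is that it can be evaluated at perturbed inputs, which is why numerical differentiation is the fallback when operator overloading is impossible.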

Other tools

There are many tools that provide either uncertainty propagation, derivatives or Jacobian matrices. Attempting to benchmark and choose the best tool is an ongoing process. Notes and links are below.

Uncertainties

  • Uncertainties - Docs and source. Uses analytical derivatives and the chain rule to propagate uncertainty.
  • Quantities - Linear combinations only, using analytical derivatives and the chain rule.
  • Pint - Uses Uncertainties if available; otherwise performs linear combinations like Quantities.
  • mcerp - Monte-Carlo Error Propagation, by Abraham Lee, author of ad.
  • soerp - Second-Order Error Propagation, by Abraham Lee, author of ad.
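Monte-Carlo error propagation, the approach taken by mcerp, sidesteps Jacobians entirely: sample the inputs from their distributions, push the samples through the model, and read the spread of the outputs. A minimal NumPy sketch of the idea (not mcerp's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Input x ~ N(mean=1.0, std=0.1); hypothetical model f(x) = x**2.
x = rng.normal(1.0, 0.1, size=100_000)
y = x**2

y_mean = y.mean()
y_std = y.std()
# Linear (Jacobian-based) propagation predicts std ~ |df/dx| * sigma = 2 * 1.0 * 0.1 = 0.2;
# the Monte-Carlo estimate agrees here but also captures nonlinear effects.
```

The trade-off is cost: each output requires many model evaluations instead of one Jacobian, but no derivatives of the model are needed at all.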

Numerical Differentiation

Automatic Differentiation

  • ad - Docs and source. Like Uncertainties, ad computes derivatives analytically and propagates them using the chain rule.
  • PyADOLC - Wrappers around ADOL-C by AlgoPy author Sebastian F. Walter.
  • PyCppAD - Wrappers around CppAD by AlgoPy author Sebastian F. Walter.
  • AUDi - Source.
  • CasADi - Source.