Home
Welcome to the UncertaintyWrapper wiki!
The uncertainty of f_i(x_j) can be determined from the covariance matrix, S (with elements S_{x_i,x_j}), and the Jacobian matrix, J = df_i/dx_j, as

    df * df = J * S * J.T

where J.T is the transpose of J. The diagonals of df * df are the variances of each function, and the square root of each variance is the corresponding standard deviation. For more detail see the Wikipedia entry on propagation of uncertainty.
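For example, the propagation formula can be evaluated in a few lines of NumPy; the function, covariance, and Jacobian below are arbitrary placeholders for illustration, not part of UncertaintyWrapper's API.

```python
import numpy as np

# hypothetical 2-input, 2-output example: f1 = x1 * x2, f2 = x1 + x2
x = np.array([3.0, 4.0])                # nominal input values
S_x = np.diag([0.1**2, 0.2**2])         # input covariance matrix, S
J = np.array([[x[1], x[0]],             # Jacobian, J = df_i/dx_j,
              [1.0,  1.0]])             # written out analytically here

S_f = J @ S_x @ J.T                     # output covariance: J * S * J.T
std_dev = np.sqrt(np.diag(S_f))         # standard deviations of f1 and f2
```

The off-diagonal elements of the resulting matrix are the covariances between the outputs.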
There are several methods to derive the Jacobian matrix. All of these methods require the calculation of derivatives.
Automatic Differentiation (AD), also known as Algorithmic Differentiation, is a method of automatically calculating derivatives. Actual AD methods vary, but they usually involve overloading basic numerical operations, so use of AD generally requires alteration of the source code. For more detailed information and a list of AD tools by platform and application, consult the Community Portal for Automatic Differentiation, a site that promotes auto-differentiation. The COIN|OR project also hosts several AD projects, including ADOL-C and CppAD.
This method of AD uses a property of Taylor series expansions of elementary operations to propagate derivatives. Essentially, it can be shown that Taylor series can be convolved using the same algebra as the underlying functions, and the derivatives will be the coefficients of the resulting Taylor series. Whether this method is faster than propagating analytical derivatives using the chain rule has not been determined here.
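A rough sketch of the idea, in plain Python rather than any particular AD library: each value carries its truncated Taylor coefficients, multiplication becomes a convolution of those coefficients, and the derivative falls out as the first-order coefficient.

```python
import numpy as np

def taylor_mul(a, b):
    """Multiply two truncated Taylor series (coefficient arrays) by convolution."""
    d = len(a)
    c = np.zeros(d)
    for k in range(d):
        for i in range(k + 1):
            c[k] += a[i] * b[k - i]
    return c

# represent x = 3 as a degree-1 Taylor series about 3: [value, dx/dx] = [3, 1]
x = np.array([3.0, 1.0])

# y = x * x * x computed purely by convolving coefficients
y = taylor_mul(taylor_mul(x, x), x)

print(y[0])  # value of x**3 at x = 3      -> 27.0
print(y[1])  # derivative 3*x**2 at x = 3  -> 27.0
```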
Most analytical expressions have known derivatives; therefore, derivatives of composed analytical expressions can be obtained using the chain rule.
Functions can also be represented symbolically and differentiated with a computer algebra system such as SymPy, MAPLE, MathCAD, Mathematica or MuPAD.
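Both of these approaches can be illustrated with SymPy, which applies the chain rule to known derivatives and returns the Jacobian in closed form; the vector function below is an arbitrary example, not part of UncertaintyWrapper.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = sp.Matrix([x1 * sp.sin(x2), x1 + x2**2])    # arbitrary example vector function

# symbolic Jacobian, built by applying known derivatives and the chain rule
J = f.jacobian(sp.Matrix([x1, x2]))
print(J)                            # Matrix([[sin(x2), x1*cos(x2)], [1, 2*x2]])
print(J.subs({x1: 3.0, x2: 0.5}))   # numeric Jacobian at a point
```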
UncertaintyWrapper uses AlgoPy, a Python AD package by Heidelberg University professor Sebastian Walter, which is based on ADOL-C and CppAD. Prof. Walter published his work on AlgoPy in the Journal of Computational Science. The AlgoPy documentation is also hosted online and the source is on GitHub.
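A minimal sketch of evaluating a Jacobian with AlgoPy's forward-mode UTPM class, based on the init_jacobian/extract_jacobian helpers in the AlgoPy documentation; the function itself is just an illustration.

```python
import algopy

def f(x):
    # arbitrary example; algopy supplies overloaded versions of the NumPy functions
    y = algopy.zeros(2, dtype=x)
    y[0] = x[0] * algopy.sin(x[1])
    y[1] = x[0] + x[1]**2
    return y

x = algopy.UTPM.init_jacobian([3.0, 0.5])   # seed inputs with univariate Taylor polynomials
J = algopy.UTPM.extract_jacobian(f(x))      # 2x2 Jacobian evaluated at (3.0, 0.5)
print(J)
```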
UncertaintyWrapper attempts to use AlgoPy to automatically calculate derivatives. If the underlying function can't be overloaded, for example if it is a C/C++ or FORTRAN library, then UncertaintyWrapper falls back on numerical differentiation using Numdifftools by Per A. Brodtkorb. Numdifftools conveniently wraps calls to AlgoPy so that both methods can be used interchangeably in most situations. The numerical differentiation technique in Numdifftools is based on adaptive numerical differentiation in MATLAB by John D'Errico. The documentation for Numdifftools is online and the source is on GitHub.
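And a corresponding sketch of the numerical fallback with Numdifftools, which only needs to call the function and therefore works even when the function cannot be overloaded:

```python
import numpy as np
import numdifftools as nd

def f(x):
    # same example function as above, written with plain NumPy only,
    # as if it were a black box that cannot be overloaded
    x = np.asarray(x)
    return np.array([x[0] * np.sin(x[1]), x[0] + x[1]**2])

jac_fun = nd.Jacobian(f)      # adaptive finite-difference Jacobian
print(jac_fun([3.0, 0.5]))    # should agree with the AlgoPy result above
```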
There are many tools that provide uncertainty propagation, derivatives, or Jacobian matrices. Benchmarking them and choosing the best tool is an ongoing process. Notes and links are below.
- Uncertainties - Docs and source. Uses analytical derivatives and the chain rule to propagate uncertainty (see the sketch after this list).
- Quantities - Linear combinations only using analytical derivatives and the chain rule.
- Pint - Will use Uncertainties if available, otherwise performs linear combinations like Quantities.
- mcerp - Monte-Carlo Error Propagation, by Abraham Lee, author of the ad package.
- soerp - Second Order Error Propagation, also by Abraham Lee.
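For comparison, here is what the Uncertainties package referenced in the first bullet above looks like in use; it propagates first-order (linear) uncertainty through overloaded operations:

```python
from uncertainties import ufloat
from uncertainties.umath import sin   # overloaded math functions

x1 = ufloat(3.0, 0.1)    # nominal value 3.0, standard deviation 0.1
x2 = ufloat(0.5, 0.2)

y = x1 * sin(x2)         # derivatives are tracked automatically via the chain rule
print(y.nominal_value, y.std_dev)
```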