
Documentation fixes #1120

Merged
merged 16 commits into from Oct 2, 2023
2 changes: 2 additions & 0 deletions doc/api.rst
@@ -12,6 +12,8 @@ API reference
pypesto.history
pypesto.logging
pypesto.objective
pypesto.objective.aesara
pypesto.objective.jax
pypesto.objective.julia
pypesto.optimize
pypesto.petab
27 changes: 26 additions & 1 deletion pypesto/hierarchical/__init__.py
@@ -2,7 +2,32 @@
Hierarchical
============

Hierarchical optimization sub-package.
Contains an implementation of the hierarchical optimization approach, which
decomposes the parameter estimation problem into an outer and an inner problem.
In the outer problem, only dynamic parameters are optimized.
In the inner problem, conditional on the outer solution, static parameters are
optimized.
Static parameters are parameters that directly affect the model observables,
such as scaling factors, offsets, and noise parameters.

Hierarchical optimization has the advantage that the outer problem is typically
less complex than the full problem and can thus be solved more efficiently.
Further, the inner problem can often be solved analytically.
Together, this speeds up parameter estimation, finding optimal values more
efficiently and reliably.

The implementation in this package is based on:

* Loos et al. 2018 (https://doi.org/10.1093/bioinformatics/bty514),
who give an analytic solution for the inner problem for scaling factors and
noise standard deviations, for Gaussian and Laplace noise, using forward
sensitivity analysis (FSA).
* Schmiester et al. 2020 (https://doi.org/10.1093/bioinformatics/btz581),
who give an analytic solution for the inner problem for scaling factors,
offsets and noise standard deviations, for Gaussian and Laplace noise,
using adjoint sensitivity analysis (ASA). ASA allows gradients to be
calculated substantially more efficiently in high dimensions.
"""

from .calculator import HierarchicalAmiciCalculator
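To illustrate the decomposition the docstring describes: for Gaussian noise, the inner problem for a scaling factor has a closed-form minimizer, and the maximum-likelihood noise standard deviation follows from the resulting residuals (cf. Loos et al. 2018). The sketch below is a minimal, self-contained illustration of that analytic inner solution; the function name and signature are illustrative only and are not part of the pyPESTO API.

```python
import numpy as np


def analytic_inner_solution(y, h):
    """Closed-form inner solution under Gaussian noise (illustrative sketch).

    Given measurements ``y`` and the unscaled simulated observable ``h``,
    return the scaling factor ``s`` minimizing sum((y - s * h)**2) and the
    maximum-likelihood noise standard deviation at that optimum.
    """
    y = np.asarray(y, dtype=float)
    h = np.asarray(h, dtype=float)
    # Optimal scaling factor: least-squares projection of y onto h.
    s = np.sum(y * h) / np.sum(h * h)
    # ML estimate of the noise standard deviation, given s.
    sigma = np.sqrt(np.mean((y - s * h) ** 2))
    return s, sigma
```

Because this solution is exact, the outer optimizer never needs to search over ``s`` or ``sigma``, which is the efficiency gain hierarchical optimization exploits.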
2 changes: 1 addition & 1 deletion tox.ini
@@ -171,7 +171,7 @@ description =

[testenv:doc]
extras =
doc,amici,petab
doc,amici,petab,aesara,jax
commands =
sphinx-build -W -b html doc/ doc/_build/html
description =