diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index 592b7753d..2375428f3 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -120,9 +120,9 @@ jobs:
strategy:
fail-fast: false
steps:
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v4
- name: create build environment
- uses: mamba-org/provision-with-micromamba@main
+ uses: mamba-org/setup-micromamba@v1
with:
environment-file: ./.envs/testenv-linux.yml
environment-name: optimagic
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 43ba1e7cf..ab79e1952 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -50,8 +50,6 @@ repos:
- id: trailing-whitespace
exclude: docs/
- id: check-ast
- - id: check-docstring-first
- exclude: src/optimagic/optimization/algo_options.py
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.35.1
hooks:
diff --git a/CHANGES.md b/CHANGES.md
index 51683d4cb..d2fab99e8 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -18,10 +18,43 @@ the major changes are:
- Rename the package from `estimagic` to `optimagic` (while keeping the `estimagic`
namespace for the estimation capabilities).
-### Breaking changes
- {gh}`500` removes the dashboard, the support for simopt optimizers and the
`derivative_plot` ({ghuser}`janosg`)
+- {gh}`504` aligns `maximize` and `minimize` more closely with SciPy. All related
+ deprecations and breaking changes are listed below. As a result, SciPy code that
+ calls `minimize` with the arguments `x0`, `fun`, `jac` and `method` will run without
+ changes in optimagic. Similarly, `OptimizeResult` gets some aliases so it behaves
+ more like SciPy's.
+
+### Breaking changes
+
+
+### Deprecations
+
+- The `criterion` argument of `maximize` and `minimize` is renamed to `fun` (as in
+ SciPy).
+- The `derivative` argument of `maximize` and `minimize` is renamed to `jac` (as
+ in SciPy).
+- The `criterion_and_derivative` argument of `maximize` and `minimize` is renamed
+ to `fun_and_jac` to align it with the other names.
+- The `criterion_kwargs` argument of `maximize` and `minimize` is renamed to
+ `fun_kwargs` to align it with the other names.
+- The `derivative_kwargs` argument of `maximize` and `minimize` is renamed to
+ `jac_kwargs` to align it with the other names.
+- The `criterion_and_derivative_kwargs` argument of `maximize` and `minimize` is
+ renamed to `fun_and_jac_kwargs` to align it with the other names.
+- Algorithm-specific convergence and stopping criteria are renamed to align them more
+ closely with NLopt and SciPy names.
+ - `convergence_relative_criterion_tolerance` -> `convergence_ftol_rel`
+ - `convergence_absolute_criterion_tolerance` -> `convergence_ftol_abs`
+ - `convergence_relative_params_tolerance` -> `convergence_xtol_rel`
+ - `convergence_absolute_params_tolerance` -> `convergence_xtol_abs`
+ - `convergence_relative_gradient_tolerance` -> `convergence_gtol_rel`
+ - `convergence_absolute_gradient_tolerance` -> `convergence_gtol_abs`
+ - `convergence_scaled_gradient_tolerance` -> `convergence_gtol_scaled`
+ - `stopping_max_criterion_evaluations` -> `stopping_maxfun`
+ - `stopping_max_iterations` -> `stopping_maxiter`
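+
+ Under the new names, a call reads as follows. This is a minimal sketch with a
+ hypothetical sphere criterion; it uses `scipy.optimize.minimize`, since the point of
+ the alignment is that the same `fun`/`x0`/`jac`/`method` call also runs under
+ optimagic's `minimize`:
+
+ ```python
+ import numpy as np
+ from scipy.optimize import minimize  # optimagic's minimize accepts the same names
+
+ def sphere(x):
+     """Simple convex criterion: sum of squares."""
+     return float(np.sum(x**2))
+
+ def sphere_jac(x):
+     """Analytic gradient of the sphere criterion."""
+     return 2 * x
+
+ # New-style names: fun (was criterion), jac (was derivative), plus x0 and method.
+ res = minimize(fun=sphere, x0=np.array([1.0, 2.0]), jac=sphere_jac, method="L-BFGS-B")
+ assert res.success
+ ```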
## 0.4.7
diff --git a/README.md b/README.md
index 874e520df..af35ebef2 100644
--- a/README.md
+++ b/README.md
@@ -12,39 +12,21 @@
[![image](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![image](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](CODE_OF_CONDUCT.md)
[![image](https://pepy.tech/badge/estimagic/month)](https://pepy.tech/project/estimagic)
+[![image](https://img.shields.io/badge/NumFOCUS-affiliated%20project-orange.svg?style=flat&colorA=E1523D&colorB=007D8A)](https://numfocus.org/sponsored-projects/affiliated-projects)
+[![image](https://img.shields.io/twitter/follow/optimagic.svg?style=social&label=Follow)](https://x.com/optimagic)
## Introduction
-optimagic is a Python package for nonlinear optimization with or without constraints. It
-is particularly suited to solve difficult nonlinear estimation problems. On top, it
-provides functionality to perform statistical inference on estimated parameters.
+*optimagic* is a Python package for numerical optimization. It is a unified interface to
+optimizers from SciPy, NLopt, and many other Python packages.
-### Optimization
+*optimagic*'s `minimize` function works just like SciPy's, so you don't have to adjust
+your code. You simply get more optimizers for free. On top, you get powerful
+diagnostic tools, parallel numerical derivatives, and more.
-- optimagic wraps algorithms from *scipy.optimize*, *nlopt*, *pygmo* and more.
-- optimagic implements constraints efficiently via reparametrization, so you can solve
- constrained problems with any optimizer that supports bounds.
-- The parameters of an optimization problem can be arbitrary pytrees
-- The complete history of parameters and function evaluations can be saved in a database
- for maximum reproducibility.
-- Painless and efficient multistart optimization.
-- The progress of the optimization can be displayed in `criterion_plot` and
- `params_plot` while the optimization is still running.
-
-### Estimation and Inference
-
-- You can estimate a model using method of simulated moments (MSM), calculate standard
- errors and do sensitivity analysis with just one function call.
-- Asymptotic standard errors for maximum likelihood estimation.
-- estimagic also provides bootstrap confidence intervals and standard errors. Of course
- the bootstrap procedures are parallelized.
-
-### Numerical differentiation
-
-- optimagic can calculate precise numerical derivatives using
- [Richardson extrapolations](https://en.wikipedia.org/wiki/Richardson_extrapolation).
-- Function evaluations needed for numerical derivatives can be done in parallel with
- pre-implemented or user provided batch evaluators.
+*optimagic* was formerly called *estimagic*, because it also provides functionality to
+perform statistical inference on estimated parameters. *estimagic* is now a subpackage
+of *optimagic*.
## Installation
@@ -101,3 +83,14 @@ If you use optimagic for your research, please do not forget to cite it.
Url = {https://github.com/OpenSourceEconomics/optimagic}
}
```
+
+## Acknowledgements
+
+We thank all institutions that have funded or supported optimagic (formerly estimagic).
+
diff --git a/docs/source/algorithms.md b/docs/source/algorithms.md
index 192070a90..c222a0566 100644
--- a/docs/source/algorithms.md
+++ b/docs/source/algorithms.md
@@ -43,7 +43,7 @@ install optimagic.
The lbfgsb algorithm is almost perfectly scale invariant. Thus, it is not necessary
to scale the parameters.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative improvement
+ - **convergence.ftol_rel** (float): Stop when the relative improvement
between two iterations is smaller than this. More formally, this is expressed as
.. math::
@@ -52,11 +52,11 @@ install optimagic.
\text{relative_criterion_tolerance}
- - **convergence.absolute_gradient_tolerance** (float): Stop if all elements of the projected
+ - **convergence.gtol_abs** (float): Stop if all elements of the projected
gradient are smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as convergence.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
- **limited_memory_storage_length** (int): Maximum number of saved gradients used to approximate the hessian matrix.
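
The renamed lbfgsb options correspond directly to scipy's `L-BFGS-B` options; a sketch
of the mapping using plain scipy (the optimagic-side names above map onto scipy's
shorter spellings as noted in the comment, which is an interpretation, not a verbatim
optimagic API call):

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """2-d Rosenbrock function, a standard smooth test problem."""
    return float(100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2)

# scipy's L-BFGS-B exposes the same concepts under shorter names:
#   convergence.ftol_rel -> ftol, convergence.gtol_abs -> gtol,
#   stopping.maxfun -> maxfun, stopping.maxiter -> maxiter.
res = minimize(
    rosen,
    x0=np.array([-1.2, 1.0]),
    method="L-BFGS-B",
    options={"ftol": 1e-12, "gtol": 1e-8, "maxfun": 10_000, "maxiter": 5_000},
)
assert res.success
```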
@@ -82,9 +82,9 @@ install optimagic.
.. note::
SLSQP's general nonlinear constraints are not supported yet by optimagic.
- - **convergence.absolute_criterion_tolerance** (float): Precision goal for the value of
+ - **convergence.ftol_abs** (float): Precision goal for the value of
f in the stopping criterion.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
```
@@ -112,14 +112,14 @@ install optimagic.
The argument `initial_simplex` is not supported by optimagic as it is not
compatible with optimagic's handling of constraints.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached, the optimization stops,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached, the optimization stops,
but we do not count this as convergence.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function evaluation is reached,
+ - **stopping.maxfun** (int): If the maximum number of function evaluations is reached,
the optimization stops but we do not count this as convergence.
- - **convergence.absolute_params_tolerance** (float): Absolute difference in parameters between iterations
+ - **convergence.xtol_abs** (float): Absolute difference in parameters between iterations
that is tolerated to declare convergence. As no relative tolerances can be passed to Nelder-Mead,
optimagic sets a non zero default for this.
- - **convergence.absolute_criterion_tolerance** (float): Absolute difference in the criterion value between
+ - **convergence.ftol_abs** (float): Absolute difference in the criterion value between
iterations that is tolerated to declare convergence. As no relative tolerances can be passed to Nelder-Mead,
optimagic sets a non zero default for this.
- **adaptive** (bool): Adapt algorithm parameters to dimensionality of problem.
@@ -151,9 +151,9 @@ install optimagic.
is part of the scipy interface is not supported by optimagic because it is
incompatible with how optimagic handles constraints.
- - **convergence.relative_params_tolerance (float)**: Stop when the relative movement between parameter
+ - **convergence.xtol_rel (float)**: Stop when the relative movement between parameter
vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative improvement between two
+ - **convergence.ftol_rel** (float): Stop when the relative improvement between two
iterations is smaller than this. More formally, this is expressed as
.. math::
@@ -161,9 +161,9 @@ install optimagic.
\frac{(f^k - f^{k+1})}{\\max{{\{|f^k|, |f^{k+1}|, 1\}}}} \leq
\text{relative_criterion_tolerance}
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function evaluation is reached,
+ - **stopping.maxfun** (int): If the maximum number of function evaluations is reached,
the optimization stops but we do not count this as convergence.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached, the optimization stops,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached, the optimization stops,
but we do not count this as convergence.
```
@@ -184,8 +184,8 @@ install optimagic.
expansion near an optimum. However, BFGS can have acceptable performance even
for non-smooth optimization instances.
- - **convergence.absolute_gradient_tolerance** (float): Stop if all elements of the gradient are smaller than this.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached, the optimization stops,
+ - **convergence.gtol_abs** (float): Stop if all elements of the gradient are smaller than this.
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached, the optimization stops,
but we do not count this as convergence.
- **norm** (float): Order of the vector norm that is used to calculate the gradient's "score" that
is compared to the gradient tolerance to determine convergence. Default is infinite which means that
@@ -217,9 +217,9 @@ install optimagic.
- the gradient is not too large, e.g., has a norm less than 1000.
- The initial guess is reasonably close to the criterion's global minimizer.
- - **convergence.absolute_gradient_tolerance** (float): Stop if all elements of the
+ - **convergence.gtol_abs** (float): Stop if all elements of the
gradient are smaller than this.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
- **norm** (float): Order of the vector norm that is used to calculate the gradient's
"score" that is compared to the gradient tolerance to determine convergence.
@@ -266,10 +266,10 @@ install optimagic.
- the gradient is not too large, e.g., has a norm less than 1000.
- The initial guess is reasonably close to the criterion's global minimizer.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this. Newton CG uses the average
relative change in the parameters for determining the convergence.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
@@ -296,9 +296,9 @@ install optimagic.
For more information on COBYLA see :cite:`Powell1994`, :cite:`Powell1998` and
:cite:`Powell2007`.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this. In case of COBYLA this is
a lower bound on the size of the trust region and can be seen as the
required accuracy in the variables but this accuracy is not guaranteed.
@@ -344,20 +344,20 @@ install optimagic.
``messg_num`` which is an additional way to control the verbosity of the optimizer.
- **func_min_estimate** (float): Minimum function value estimate. Defaults to 0.
- stopping_max_iterations (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as
convergence.
- - **convergence.absolute_params_tolerance** (float): Absolute difference in parameters
+ - **convergence.xtol_abs** (float): Absolute difference in parameters
between iterations after scaling that is tolerated to declare convergence.
- - **convergence.absolute_criterion_tolerance** (float): Absolute difference in the
+ - **convergence.ftol_abs** (float): Absolute difference in the
criterion value between iterations after scaling that is tolerated
to declare convergence.
- - **convergence.absolute_gradient_tolerance** (float): Stop if the value of the
+ - **convergence.gtol_abs** (float): Stop if the value of the
projected gradient (after applying x scaling factors) is smaller than this.
- If convergence.absolute_gradient_tolerance < 0.0,
- convergence.absolute_gradient_tolerance is set to
+ If convergence.gtol_abs < 0.0,
+ convergence.gtol_abs is set to
1e-2 * sqrt(accuracy).
- **max_hess_evaluations_per_iteration** (int): Maximum number of hessian*vector
evaluations per main iteration. If ``max_hess_evaluations == 0``, the
@@ -415,19 +415,19 @@ install optimagic.
It approximates the Hessian using the Broyden-Fletcher-Goldfarb-Shanno (BFGS)
Hessian update strategy.
- - **convergence.absolute_gradient_tolerance** (float): Tolerance for termination
+ - **convergence.gtol_abs** (float): Tolerance for termination
by the norm of the Lagrangian gradient. The algorithm will terminate
when both the infinity norm (i.e., max abs value) of the Lagrangian
gradient and the constraint violation are smaller than the
- convergence.absolute_gradient_tolerance.
+ convergence.gtol_abs.
For this algorithm we use scipy's gradient tolerance for trust_constr.
This smaller tolerance is needed for the sum of squares tests to pass.
- - **stopping.max_iterations** (int): If the maximum number of iterations is reached,
+ - **stopping.maxiter** (int): If the maximum number of iterations is reached,
the optimization stops, but we do not count this as convergence.
- - **convergence.relative_params_tolerance** (float): Tolerance for termination by
+ - **convergence.xtol_rel** (float): Tolerance for termination by
the change of the independent variable. The algorithm will terminate when
the radius of the trust region used in the algorithm is smaller than the
- convergence.relative_params_tolerance.
+ convergence.xtol_rel.
- **trustregion.initial_radius** (float): Initial value of the trust region radius.
The trust radius gives the maximum distance between solution points in
consecutive iterations. It reflects the trust the algorithm puts in the
@@ -453,11 +453,11 @@ install optimagic.
The algorithm supports the following options:
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is below this.
- - **convergence.relative_gradient_tolerance** (float): Stop when the gradient,
+ - **convergence.gtol_rel** (float): Stop when the gradient,
divided by the absolute value of the criterion function is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as
convergence.
- **tr_solver** (str): Method for solving trust-region subproblems, relevant only
@@ -495,11 +495,11 @@ install optimagic.
The algorithm supports the following options:
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is below this.
- - **convergence.relative_gradient_tolerance** (float): Stop when the gradient,
+ - **convergence.gtol_rel** (float): Stop when the gradient,
divided by the absolute value of the criterion function is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as
convergence.
- **tr_solver** (str): Method for solving trust-region subproblems, relevant only
@@ -537,11 +537,11 @@ install optimagic.
The algorithm supports the following options:
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is below this.
- - **convergence.relative_gradient_tolerance** (float): Stop when the gradient,
+ - **convergence.gtol_rel** (float): Stop when the gradient,
divided by the absolute value of the criterion function is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as
convergence.
- **tr_solver** (str): Method for solving trust-region subproblems, relevant only
@@ -675,13 +675,13 @@ install optimagic.
- ‘rand2bin’
- ‘rand1bin’
- - **stopping.max_iterations** (int): The maximum number of criterion evaluations
- without polishing is(stopping.max_iterations + 1) * population_size * number of
+ - **stopping.maxiter** (int): The maximum number of criterion evaluations
+ without polishing is (stopping.maxiter + 1) * population_size * number of
parameters
- **population_size_multiplier** (int): A multiplier setting the population size.
The number of individuals in the population is population_size * number of
parameters. The default 15.
- - **convergence.relative_criterion_tolerance** (float): Default 0.01.
+ - **convergence.ftol_rel** (float): Default 0.01.
- **mutation_constant** (float/tuple): The differential weight denoted by F in
literature. Should be within 0 and 2. The tuple form is used to specify
(min, max) dithering which can help speed convergence. Default is (0.5, 1).
@@ -702,7 +702,7 @@ install optimagic.
number of parameters). The initial population is clipped to bounds before use.
Default is 'latinhypercube'
- - **convergence.absolute_criterion_tolerance** (float):
+ - **convergence.ftol_abs** (float):
CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE
- **n_cores** (int): The number of cores on which the function is evaluated in
parallel. Default 1.
@@ -743,8 +743,8 @@ install optimagic.
- **convergence.minimum_criterion_tolerance** (float): Specify the relative error
between the current best minimum and the supplied global criterion_minimum
allowed. Default is scipy's default, 1e-4.
- - **stopping.max_iterations** (int): The maximum number of iterations.
- - **stopping.max_criterion_evaluations** (int): The maximum number of criterion
+ - **stopping.maxiter** (int): The maximum number of iterations.
+ - **stopping.maxfun** (int): The maximum number of criterion
evaluations.
- **stopping.max_processing_time** (int): The maximum time allowed for the
optimization.
@@ -771,7 +771,7 @@ install optimagic.
The algorithm supports the following options:
- - **stopping.max_iterations** (int): Specify the maximum number of global searh
+ - **stopping.maxiter** (int): Specify the maximum number of global search
iterations.
- **local_algorithm** (str): The local optimization algorithm to be used. valid
options are: "Nelder-Mead", "Powell", "CG", "BFGS", "Newton-CG", "L-BFGS-B",
@@ -784,7 +784,7 @@ install optimagic.
- **restart_temperature_ratio** (float): Reannealing starts when the temperature is decreased to initial_temperature * restart_temperature_ratio. Default is 2e-05.
- **visit** (float): Specify the thickness of visiting distribution's tails. Range is (1, 3] and default is scipy's default, 2.62.
- **accept** (float): Controls the probability of acceptance. Range is (-1e4, -5] and default is scipy's default, -5.0. Smaller values lead to lower acceptance probability.
- - **stopping.max_criterion_evaluations** (int): soft limit for the number of criterion evaluations.
+ - **stopping.maxfun** (int): soft limit for the number of criterion evaluations.
- **seed** (int, None or RNG): Dual annealing is a stochastic process. Seed or
random number generator. Default None.
- **no_local_search** (bool): Specify whether to apply a traditional Generalized Simulated Annealing with no local search. Default is False.
@@ -803,8 +803,8 @@ install optimagic.
The algorithm supports the following options:
- **eps** (float): Specify the minimum difference of the criterion values between the current best hyperrectangle and the next potentially best hyperrectangle to be divided determining the trade off between global and local search. Default is 1e-6 differing from scipy's default 1e-4.
- - **stopping_max_criterion_evaluations** (int/None): Maximum number of criterion evaluations allowed. Default is None which caps the number of evaluations at 1000 * number of dimentions automatically.
- - **stopping_max_iterations** (int): Maximum number of iterations allowed.
+ - **stopping.maxfun** (int/None): Maximum number of criterion evaluations allowed. Default is None, which caps the number of evaluations at 1000 * number of dimensions automatically.
+ - **stopping.maxiter** (int): Maximum number of iterations allowed.
- **locally_biased** (bool): Determine whether to use the locally biased variant of the algorithm DIRECT_L. Default is True.
- **convergence.minimum_criterion_value** (float): Specify the global minimum when it is known. Default is minus infinity. For maximization problems, flip the sign.
- **convergence.minimum_criterion_tolerance** (float): Specify the relative error between the current best minimum and the supplied global criterion_minimum allowed. Default is scipy's default, 1e-4.
@@ -841,9 +841,9 @@ We implement a few algorithms from scratch. They are currently considered experi
bhhh supports the following options:
- - **convergence_absolute_gradient_tolerance** (float): Stopping criterion for the
+ - **convergence.gtol_abs** (float): Stopping criterion for the
gradient tolerance. Default is 1e-8.
- - **stopping_max_iterations** (int): Maximum number of iterations.
+ - **stopping.maxiter** (int): Maximum number of iterations.
If reached, terminate. Default is 200.
```
@@ -874,14 +874,14 @@ We implement a few algorithms from scratch. They are currently considered experi
- **adaptive** (bool): Adjust parameters of Nelder-Mead algorithm to account
for simplex size. The default is True.
- - **stopping.max_iterations** (int): Maximum number of algorithm iterations.
+ - **stopping.maxiter** (int): Maximum number of algorithm iterations.
The default is STOPPING_MAX_ITERATIONS.
- - **convergence.absolute_criterion_tolerance** (float): maximal difference between
+ - **convergence.ftol_abs** (float): maximal difference between
function value evaluated on simplex points.
The default is CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE.
- - **convergence.absolute_params_tolerance** (float): maximal distance between points
+ - **convergence.xtol_abs** (float): maximal distance between points
in the simplex. The default is CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE.
- **batch_evaluator** (string or callable): See :ref:`batch_evaluators` for
@@ -921,19 +921,19 @@ We implement a few algorithms from scratch. They are currently considered experi
pounders supports the following options:
- - **convergence_absolute_gradient_tolerance**: Convergence tolerance for the
+ - **convergence.gtol_abs**: Convergence tolerance for the
absolute gradient norm. Stop if norm of the gradient is less than this.
Default is 1e-8.
- - **convergence_relative_gradient_tolerance**: Convergence tolerance for the
+ - **convergence.gtol_rel**: Convergence tolerance for the
relative gradient norm. Stop if norm of the gradient relative to the criterion
value is less than this. Default is 1e-8.
- - **convergence_scaled_gradient_tolerance**: Convergence tolerance for the
+ - **convergence.gtol_scaled**: Convergence tolerance for the
scaled gradient norm. Stop if norm of the gradient divided by norm of the
gradient at the initial parameters is less than this.
Disabled, i.e. set to False, by default.
- **max_interpolation_points** (int): Maximum number of interpolation points.
Default is `2 * n + 1`, where `n` is the length of the parameter vector.
- - **stopping_max_iterations** (int): Maximum number of iterations.
+ - **stopping.maxiter** (int): Maximum number of iterations.
If reached, terminate. Default is 2000.
- **trustregion_initial_radius (float)**: Delta, initial trust-region radius.
0.1 by default.
@@ -1052,17 +1052,17 @@ need to have [petsc4py](https://pypi.org/project/petsc4py/) installed.
\frac{||g(X)||}{||g(X0)||} < \epsilon
- - **convergence.absolute_gradient_tolerance** (float): Stop if norm of gradient is less than this.
- If set to False the algorithm will not consider convergence.absolute_gradient_tolerance.
- - **convergence.relative_gradient_tolerance** (float): Stop if relative norm of gradient is less
+ - **convergence.gtol_abs** (float): Stop if norm of gradient is less than this.
+ If set to False the algorithm will not consider convergence.gtol_abs.
+ - **convergence.gtol_rel** (float): Stop if relative norm of gradient is less
than this. If set to False the algorithm will not consider
- convergence.relative_gradient_tolerance.
+ convergence.gtol_rel.
- **convergence.scaled_gradient_tolerance** (float): Stop if scaled norm of gradient is smaller
than this. If set to False the algorithm will not consider
convergence.scaled_gradient_tolerance.
- **trustregion.initial_radius** (float): Initial value of the trust region radius.
It must be :math:`> 0`.
- - **stopping.max_iterations** (int): Alternative Stopping criterion.
+ - **stopping.maxiter** (int): Alternative Stopping criterion.
If set the routine will stop after the number of specified iterations or
after the step size is sufficiently small. If the variable is set the
default criteria will all be ignored.
@@ -1136,7 +1136,7 @@ install each of them separately:
3. when a sufficient reduction to the criterion value at the start parameters
has been reached, i.e. when
:math:`\frac{f(x)}{f(x_0)} \leq
- \text{convergence.scaled_criterion_tolerance}`
+ \text{convergence.ftol_scaled}`
4. when all evaluations on the interpolation points fall within a scaled version of
the noise level of the criterion function. This is only applicable if the
@@ -1161,11 +1161,11 @@ install each of them separately:
.. warning::
Very small values, as in most other tolerances don't make sense here.
- - **convergence.scaled_criterion_tolerance** (float):
+ - **convergence.ftol_scaled** (float):
Terminate if a point is reached where the ratio of the criterion value
to the criterion value at the start params is below this value, i.e. if
:math:`f(x_k)/f(x_0) \leq
- \text{convergence.scaled_criterion_tolerance}`. Note this is
+ \text{convergence.ftol_scaled}`. Note this is
deactivated unless the lowest mathematically possible criterion value (0.0)
is actually achieved.
- **convergence.slow_progress** (dict): Arguments for converging when the evaluations
@@ -1193,7 +1193,7 @@ install each of them separately:
Default is no averaging (i.e.
``noise_n_evals_per_point(...) = 1``).
- **random_directions_orthogonal** (bool): see :ref:`algo_options`.
- - **stopping.max_criterion_evaluations** (int): see :ref:`algo_options`.
+ - **stopping.maxfun** (int): see :ref:`algo_options`.
- **threshold_for_safety_step** (float): see :ref:`algo_options`.
- **trustregion.expansion_factor_successful** (float): see :ref:`algo_options`.
- **trustregion.expansion_factor_very_successful** (float): see :ref:`algo_options`.
@@ -1310,7 +1310,7 @@ install each of them separately:
- **seek_global_optimum** (bool): whether to apply the heuristic to escape local
minima presented in :cite:`Cartis2018a`. Only applies for noisy criterion
functions.
- - **stopping.max_criterion_evaluations** (int): see :ref:`algo_options`.
+ - **stopping.maxfun** (int): see :ref:`algo_options`.
- **threshold_for_safety_step** (float): see :ref:`algo_options`.
- **trustregion.expansion_factor_successful** (float): see :ref:`algo_options`.
- **trustregion.expansion_factor_very_successful** (float): see :ref:`algo_options`.
@@ -1387,7 +1387,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
evaluation that cannot be done in parallel with other evaluations. Default
False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **kernel_size** (int): Number of solutions stored in the solution archive.
- **speed_parameter_q** (float): This parameter manages the convergence speed
towards the found minima (the smaller the faster). In the pygmo
@@ -1399,7 +1399,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
function's values distances.
- **threshold** (int): when the iteration counter reaches the threshold the
convergence speed is set to 0.01 automatically. To deactivate this effect
- set the threshold to stopping.max_iterations which is the largest allowed
+ set the threshold to stopping.maxiter which is the largest allowed
value.
- **speed_of_std_values_convergence** (int): parameter that determines the
convergence speed of the standard deviations. This must be an integer
@@ -1407,7 +1407,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **stopping.max_n_without_improvements** (int): if a positive integer is
assigned here, the algorithm will count the runs without improvements, if
this number exceeds the given value, the algorithm will be stopped.
- - **stopping.max_criterion_evaluations** (int): maximum number of function
+ - **stopping.maxfun** (int): maximum number of function
evaluations.
- **focus** (float): this parameter makes the search for the optimum greedier
and more focused on local improvements (the higher the greedier). If the
@@ -1431,7 +1431,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
in :cite:`Mernik2015`. The algorithm is only suited for bounded parameter
spaces.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **seed** (int): seed used by the internal random number generator.
- **discard_start_params** (bool): If True, the start params are not guaranteed
to be part of the initial population. This saves one criterion function
@@ -1463,7 +1463,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
to be part of the initial population. This saves one criterion function
evaluation that cannot be done in parallel with other evaluations. Default
False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **weight_coefficient** (float): Weight coefficient. It is denoted by $F$ in
the main paper and must lie in [0, 2]. It controls the amplification of
the differential variation $(x_{r_2, G} - x_{r_3, G})$.
@@ -1484,7 +1484,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **convergence.criterion_tolerance**: stopping criteria on the criterion
tolerance. Default is 1e-6. It is not clear whether this is the absolute
or relative criterion tolerance.
- - **convergence.relative_params_tolerance**: stopping criteria on the x
+ - **convergence.xtol_rel**: stopping criterion on the x
tolerance. In pygmo the default is 1e-6 but we use our default value of
1e-5.
```
@@ -1513,7 +1513,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): number of generations to consider. Each generation
+ - **stopping.maxiter** (int): number of generations to consider. Each generation
will compute the objective function once.
```
@@ -1538,7 +1538,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **crossover_probability** (float): Crossover probability.
- **crossover_strategy** (str): the crossover strategy. One of “exponential”,“binomial”,
“single” or “sbx”. Default is "exponential".
@@ -1587,7 +1587,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
cannot be done in parallel with other evaluations. Default False.
- **jde** (bool): Whether to use the jDE self-adaptation variant to control the $F$ and
$CR$ parameter. If True jDE is used, else iDE.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **mutation_variant** (int or str): code for the mutation variant to create a new
candidate individual. The default is "rand/1/exp". The first ten are the
classical mutation variants introduced in the original DE algorithm, the remaining
@@ -1644,7 +1644,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **backward_horizon** (float): backward time horizon for the evolution path. It must
lie between 0 and 1.
- **variance_loss_compensation** (float): makes partly up for the small variance loss in
@@ -1730,7 +1730,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **omega** (float): depending on the variant chosen, :math:`\omega` is the particles'
inertia weight or the construction coefficient. It must lie between 0 and 1.
@@ -1802,7 +1802,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **omega** (float): depending on the variant chosen, :math:`\omega` is the particles'
inertia weight or the construction coefficient. It must lie between 0 and 1.
@@ -1907,7 +1907,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **learning_rate_mean_update** (float): learning rate for the mean update
(:math:`\eta_\mu`). It must be between 0 and 1 or None.
@@ -1952,7 +1952,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
```
@@ -1975,7 +1975,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_criterion_evaluations** (int): maximum number of function evaluations.
+ - **stopping.maxfun** (int): maximum number of function evaluations.
- **start_range** (float): the start range. Must be in (0, 1].
- **stop_range** (float): the stop range. Must be in (0, start_range].
- **reduction_coeff** (float): the range reduction coefficient. Must be in (0, 1).
@@ -1999,7 +1999,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
- **discard_start_params** (bool): If True, the start params are not guaranteed to be
part of the initial population. This saves one criterion function evaluation that
cannot be done in parallel with other evaluations. Default False.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **choose_from_memory_probability** (float): probability of choosing from memory
(similar to a crossover probability).
- **min_pitch_adjustment_rate** (float): minimum pitch adjustment rate. (similar to a
@@ -2032,7 +2032,7 @@ supports the following [pygmo2](https://esa.github.io/pygmo2) optimizers.
cannot be done in parallel with other evaluations. Default False.
- **jde** (bool): Whether to use the jDE self-adaptation variant to control the $F$ and
$CR$ parameter. If True jDE is used, else iDE.
- - **stopping.max_iterations** (int): Number of generations to evolve.
+ - **stopping.maxiter** (int): Number of generations to evolve.
- **allowed_variants** (array-like object): allowed mutation variants (can be codes
or strings). Each code refers to one mutation variant to create a new candidate
individual. The first ten refer to the classical mutation variants introduced in
@@ -2117,7 +2117,7 @@ To use ipopt, you need to have
- print options.
- - **convergence.relative_criterion_tolerance** (float): The algorithm
+ - **convergence.ftol_rel** (float): The algorithm
terminates successfully, if the (scaled) non linear programming error
becomes smaller than this value.
@@ -2133,7 +2133,7 @@ To use ipopt, you need to have
- **s_max** (float): Scaling threshold for the NLP error.
- - **stopping.max_iterations** (int): If the maximum number of iterations is
+ - **stopping.maxiter** (int): If the maximum number of iterations is
reached, the optimization stops, but we do not count this as successful
- convergence. The difference to ``max_criterion_evaluations`` is that one
+ convergence. The difference to ``stopping.maxfun`` is that one
iteration might need several criterion evaluations, for example in a line
@@ -3315,28 +3315,28 @@ need to have [the fides package](https://github.com/fides-dev/fides) installed
- **sr1**: Symmetric Rank 1 update strategy as described in :cite:`Nocedal1999`,
Chapter 6.2.
- - **convergence.absolute_criterion_tolerance** (float): absolute convergence criterion
+ - **convergence.ftol_abs** (float): absolute convergence criterion
tolerance. This is only the interpretation of this parameter if the relative
criterion tolerance is set to 0. Denoting the absolute criterion tolerance by
:math:`\alpha` and the relative criterion tolerance by :math:`\beta`, the
convergence condition on the criterion improvement is
:math:`|f(x_k) - f(x_{k-1})| < \alpha + \beta \cdot |f(x_{k-1})|`
- - **convergence.relative_criterion_tolerance** (float): relative convergence criterion
+ - **convergence.ftol_rel** (float): relative convergence criterion
tolerance. This is only the interpretation of this parameter if the absolute
criterion tolerance is set to 0 (as is the default). Denoting the absolute
criterion tolerance by :math:`\alpha` and the relative criterion tolerance by
:math:`\beta`, the convergence condition on the criterion improvement is
:math:`|f(x_k) - f(x_{k-1})| < \alpha + \beta \cdot |f(x_{k-1})|`
- - **convergence.absolute_params_tolerance** (float): The optimization terminates
+ - **convergence.xtol_abs** (float): The optimization terminates
successfully when the step size falls below this number, i.e. when
:math:`||x_{k+1} - x_k||` is smaller than this tolerance.
- - **convergence.absolute_gradient_tolerance** (float): The optimization terminates
+ - **convergence.gtol_abs** (float): The optimization terminates
successfully when the gradient norm is less or equal than this tolerance.
- - **convergence.relative_gradient_tolerance** (float): The optimization terminates
+ - **convergence.gtol_rel** (float): The optimization terminates
successfully when the norm of the gradient divided by the absolute function value
is less or equal to this tolerance.
- - **stopping.max_iterations** (int): maximum number of allowed iterations.
+ - **stopping.maxiter** (int): maximum number of allowed iterations.
- **stopping.max_seconds** (int): maximum number of walltime seconds, deactivated by
default.
@@ -3409,15 +3409,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
For details see :cite:`Powell2009`.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3437,15 +3437,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
that the nlopt version supports bounds. This is done by moving all new points that
would lie outside the bounds exactly on the bounds.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3481,15 +3481,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
In case of bounded constraints, this method is dominated by `nlopt_bobyqa`
and `nlopt_cobyla`.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
@@ -3524,15 +3524,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
- Supports unequal initial-step sizes in the different parameters.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3556,15 +3556,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
of Rowan, is that it explicitly supports bound constraints providing big
improvement in the case where the optimum lies against one of the constraints.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3591,15 +3591,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
`NEWUOA` requires the dimension n of the parameter space to be `≥ 2`, i.e. the
implementation does not handle one-dimensional optimization problems.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3623,15 +3623,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
Detailed description of algorithms is given in :cite:`Dembo1983`.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3655,15 +3655,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
Detailed description of algorithms is given in :cite:`Nocedal1989`, :cite:`Nocedal1980`.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3687,15 +3687,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
The implementation is based on CCSA algorithm described in :cite:`Svanberg2002`.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3719,15 +3719,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
verge to a local optimum from any feasible starting point.
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3748,15 +3748,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
Detailed explanation of the algorithm, including its two variations of rank-2 and
rank-1 methods can be found in the following paper :cite:`Vlcek2006` .
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
- **rank_1_update** (bool): Whether a rank-1 or a rank-2 update is used.
@@ -3778,15 +3778,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
The implementation is based on the procedure described in :cite:`Kraft1988`
and :cite:`Kraft1994` .
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3823,15 +3823,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
- "DIRECT_L_RAND_NOSCAL"
- "DIRECT_RAND"
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
- **locally_biased** (bool): Whether the "L" version of the algorithm is selected.
@@ -3854,15 +3854,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
More information on this method can be found in
:cite:`DaSilva2010` , :cite:`DaSilva2010a` , :cite:`Beyer2002` and :cite:`Vent1975` .
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this
as convergence.
```
@@ -3885,15 +3885,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
:cite:`PhilipRunarsson2005` and :cite:`Thomas2000` .
- - **convergence.relative_params_tolerance** (float): Stop when the relative
+ - **convergence.xtol_rel** (float): Stop when the relative
movement between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute
+ - **convergence.xtol_abs** (float): Stop when the absolute
movement between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of
+ - **convergence.ftol_abs** (float): Stop when the change of
the criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of
+ - **stopping.maxfun** (int): If the maximum number of
function evaluation is reached, the optimization stops but we do not count
this as convergence.
```
@@ -3917,15 +3917,15 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run
meter population_size. If the user doesn't specify a value, it is set to the nlopt
default of 10*(n+1).
- - **convergence.relative_params_tolerance** (float): Stop when the relative movement
+ - **convergence.xtol_rel** (float): Stop when the relative movement
between parameter vectors is smaller than this.
- - **convergence.absolute_params_tolerance** (float): Stop when the absolute movement
+ - **convergence.xtol_abs** (float): Stop when the absolute movement
between parameter vectors is smaller than this.
- - **convergence.relative_criterion_tolerance** (float): Stop when the relative
+ - **convergence.ftol_rel** (float): Stop when the relative
improvement between two iterations is smaller than this.
- - **convergence.absolute_criterion_tolerance** (float): Stop when the change of the
+ - **convergence.ftol_abs** (float): Stop when the change of the
criterion function between two iterations is smaller than this.
- - **stopping.max_criterion_evaluations** (int): If the maximum number of function
+ - **stopping.maxfun** (int): If the maximum number of function
evaluation is reached, the optimization stops but we do not count this as
convergence.
- **population_size** (int): Size of the population. If None, it's set to be
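The hunks above rename the convergence and stopping options to NLOPT/scipy-style names (`xtol_rel`, `ftol_abs`, `maxfun`, ...). As a hedged sketch of how old-style option dictionaries map to the new names, here is a plain rename table; the mapping is taken from the hunks above, while the `translate_algo_options` helper is hypothetical and not part of optimagic:

```python
# Hypothetical helper: translate old estimagic-style algo_options keys
# to the scipy/NLOPT-style names introduced in the hunks above.
RENAMES = {
    "convergence.relative_params_tolerance": "convergence.xtol_rel",
    "convergence.absolute_params_tolerance": "convergence.xtol_abs",
    "convergence.relative_criterion_tolerance": "convergence.ftol_rel",
    "convergence.absolute_criterion_tolerance": "convergence.ftol_abs",
    "stopping.max_criterion_evaluations": "stopping.maxfun",
}


def translate_algo_options(options):
    """Return a copy of *options* with old keys replaced by new names."""
    return {RENAMES.get(key, key): value for key, value in options.items()}


old = {
    "convergence.relative_params_tolerance": 1e-6,
    "stopping.max_criterion_evaluations": 1_000,
}
print(translate_algo_options(old))
```

Keys that are not renamed (e.g. `population_size`) pass through unchanged.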
diff --git a/docs/source/estimagic/tutorials/bootstrap_overview.ipynb b/docs/source/estimagic/tutorials/bootstrap_overview.ipynb
index 5f6dbd467..c27bfa24b 100644
--- a/docs/source/estimagic/tutorials/bootstrap_overview.ipynb
+++ b/docs/source/estimagic/tutorials/bootstrap_overview.ipynb
@@ -15,7 +15,7 @@
},
{
"cell_type": "code",
- "execution_count": 2,
+ "execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
@@ -42,9 +42,9 @@
"name": "stderr",
"output_type": "stream",
"text": [
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_82008/2496026297.py:3: FutureWarning: Downcasting behavior in `replace` is deprecated and will be removed in a future version. To retain the old behavior, explicitly call `result.infer_objects(copy=False)`. To opt-in to the future behavior, set `pd.set_option('future.no_silent_downcasting', True)`\n",
+ "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_90869/2496026297.py:3: FutureWarning: Downcasting behavior in `replace` is deprecated and will be removed in a future version. To retain the old behavior, explicitly call `result.infer_objects(copy=False)`. To opt-in to the future behavior, set `pd.set_option('future.no_silent_downcasting', True)`\n",
" df = df.replace({\"time\": replacements})\n",
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_82008/2496026297.py:3: FutureWarning: The behavior of Series.replace (and DataFrame.replace) with CategoricalDtype is deprecated. In a future version, replace will only be used for cases that preserve the categories. To change the categories, use ser.cat.rename_categories instead.\n",
+ "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_90869/2496026297.py:3: FutureWarning: The behavior of Series.replace (and DataFrame.replace) with CategoricalDtype is deprecated. In a future version, replace will only be used for cases that preserve the categories. To change the categories, use ser.cat.rename_categories instead.\n",
" df = df.replace({\"time\": replacements})\n"
]
},
@@ -202,11 +202,11 @@
{
"data": {
"text/plain": [
- "(constant 90.857208\n",
- " time 0.141391\n",
+ "(constant 90.810164\n",
+ " time 0.147692\n",
" dtype: float64,\n",
- " constant 96.738019\n",
- " time 0.633684\n",
+ " constant 96.676859\n",
+ " time 0.632926\n",
" dtype: float64)"
]
},
@@ -228,8 +228,8 @@
{
"data": {
"text/plain": [
- "constant 1.496127\n",
- "time 0.127750\n",
+ "constant 1.494204\n",
+ "time 0.123496\n",
"dtype: float64"
]
},
@@ -259,11 +259,11 @@
{
"data": {
"text/plain": [
- "(constant 91.345295\n",
- " time 0.199327\n",
+ "(constant 91.416070\n",
+ " time 0.194815\n",
" dtype: float64,\n",
- " constant 96.264179\n",
- " time 0.612123\n",
+ " constant 96.377676\n",
+ " time 0.606278\n",
" dtype: float64)"
]
},
@@ -304,8 +304,8 @@
{
"data": {
"text/plain": [
- "constant 1.207991\n",
- "time 0.100024\n",
+ "constant 1.156068\n",
+ "time 0.100937\n",
"dtype: float64"
]
},
@@ -633,7 +633,7 @@
},
{
"cell_type": "code",
- "execution_count": 19,
+ "execution_count": 17,
"metadata": {},
"outputs": [
{
@@ -787,7 +787,7 @@
"[90 rows x 6 columns]"
]
},
- "execution_count": 19,
+ "execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
diff --git a/docs/source/estimagic/tutorials/likelihood_overview.ipynb b/docs/source/estimagic/tutorials/likelihood_overview.ipynb
index 8b46b836c..fc94478f8 100644
--- a/docs/source/estimagic/tutorials/likelihood_overview.ipynb
+++ b/docs/source/estimagic/tutorials/likelihood_overview.ipynb
@@ -30,7 +30,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
@@ -51,7 +51,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
@@ -64,9 +64,66 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 3,
"metadata": {},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "
\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " value | \n",
+ " lower_bound | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " intercept | \n",
+ " 2 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " slope | \n",
+ " -1 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " sd | \n",
+ " 1 | \n",
+ " 1.000000e-10 | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " value lower_bound\n",
+ "intercept 2 -inf\n",
+ "slope -1 -inf\n",
+ "sd 1 1.000000e-10"
+ ]
+ },
+ "execution_count": 3,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"true_params = pd.DataFrame(\n",
" data=[[2, -np.inf], [-1, -np.inf], [1, 1e-10]],\n",
@@ -78,7 +135,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
@@ -94,7 +151,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
@@ -127,7 +184,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
@@ -143,9 +200,86 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 7,
"metadata": {},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " value | \n",
+ " standard_error | \n",
+ " ci_lower | \n",
+ " ci_upper | \n",
+ " p_value | \n",
+ " free | \n",
+ " stars | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " intercept | \n",
+ " 1.945 | \n",
+ " 0.104 | \n",
+ " 1.742 | \n",
+ " 2.148 | \n",
+ " 0.0 | \n",
+ " True | \n",
+ " *** | \n",
+ "
\n",
+ " \n",
+ " slope | \n",
+ " -0.945 | \n",
+ " 0.113 | \n",
+ " -1.167 | \n",
+ " -0.723 | \n",
+ " 0.0 | \n",
+ " True | \n",
+ " *** | \n",
+ "
\n",
+ " \n",
+ " sd | \n",
+ " 0.954 | \n",
+ " 0.079 | \n",
+ " 0.799 | \n",
+ " 1.109 | \n",
+ " 0.0 | \n",
+ " True | \n",
+ " *** | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " value standard_error ci_lower ci_upper p_value free stars\n",
+ "intercept 1.945 0.104 1.742 2.148 0.0 True ***\n",
+ "slope -0.945 0.113 -1.167 -0.723 0.0 True ***\n",
+ "sd 0.954 0.079 0.799 1.109 0.0 True ***"
+ ]
+ },
+ "execution_count": 7,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"res.summary().round(3)"
]
@@ -163,33 +297,213 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 8,
"metadata": {},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " value | \n",
+ " lower_bound | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " intercept | \n",
+ " 1.944964 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " slope | \n",
+ " -0.944918 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " sd | \n",
+ " 0.954230 | \n",
+ " 1.000000e-10 | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " value lower_bound\n",
+ "intercept 1.944964 -inf\n",
+ "slope -0.944918 -inf\n",
+ "sd 0.954230 1.000000e-10"
+ ]
+ },
+ "execution_count": 8,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"res.params"
]
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 9,
"metadata": {},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " intercept | \n",
+ " slope | \n",
+ " sd | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " intercept | \n",
+ " 0.008986 | \n",
+ " 0.000426 | \n",
+ " -0.001904 | \n",
+ "
\n",
+ " \n",
+ " slope | \n",
+ " 0.000426 | \n",
+ " 0.007734 | \n",
+ " 0.000303 | \n",
+ "
\n",
+ " \n",
+ " sd | \n",
+ " -0.001904 | \n",
+ " 0.000303 | \n",
+ " 0.003748 | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " intercept slope sd\n",
+ "intercept 0.008986 0.000426 -0.001904\n",
+ "slope 0.000426 0.007734 0.000303\n",
+ "sd -0.001904 0.000303 0.003748"
+ ]
+ },
+ "execution_count": 9,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"res.cov(method=\"robust\")"
]
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 10,
"metadata": {},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " value | \n",
+ " lower_bound | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " intercept | \n",
+ " 0.103760 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " slope | \n",
+ " 0.113342 | \n",
+ " -inf | \n",
+ "
\n",
+ " \n",
+ " sd | \n",
+ " 0.078960 | \n",
+ " 1.000000e-10 | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " value lower_bound\n",
+ "intercept 0.103760 -inf\n",
+ "slope 0.113342 -inf\n",
+ "sd 0.078960 1.000000e-10"
+ ]
+ },
+ "execution_count": 10,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
"source": [
"res.se()"
]
}
],
"metadata": {
+ "kernelspec": {
+ "display_name": "optimagic",
+ "language": "python",
+ "name": "python3"
+ },
"language_info": {
"codemirror_mode": {
"name": "ipython",
@@ -200,7 +514,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.9.13"
+ "version": "3.10.14"
}
},
"nbformat": 4,
diff --git a/docs/source/explanation/internal_optimizers.md b/docs/source/explanation/internal_optimizers.md
index a99b56348..3ee33f806 100644
--- a/docs/source/explanation/internal_optimizers.md
+++ b/docs/source/explanation/internal_optimizers.md
@@ -56,8 +56,8 @@ should return a dictionary with the following entries:
- solution_x: The best parameter achieved so far
- solution_criterion: The value of the criterion at solution_x. This can be a scalar or
dictionary.
-- n_criterion_evaluations: The number of criterion evaluations.
-- n_derivative_evaluations: The number of derivative evaluations.
+- n_fun_evals: The number of criterion evaluations.
+- n_jac_evals: The number of derivative evaluations.
- n_iterations: The number of iterations
- success: True if convergence was achieved
- message: A string with additional information.
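The renamed entries (`n_fun_evals`, `n_jac_evals`) slot into the result dictionary described in the hunk above. A minimal sketch with made-up values, purely for illustration:

```python
# Sketch of the dictionary an internal optimizer returns after the
# renames above; all values are illustrative, not real optimizer output.
raw_result = {
    "solution_x": [0.0, 0.0, 0.0],
    "solution_criterion": 0.0,
    "n_fun_evals": 12,  # was: n_criterion_evaluations
    "n_jac_evals": 12,  # was: n_derivative_evaluations
    "n_iterations": 10,
    "success": True,
    "message": "CONVERGENCE: gradient norm below tolerance",
}
print(sorted(raw_result))
```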
diff --git a/docs/source/explanation/why_optimization_is_hard.ipynb b/docs/source/explanation/why_optimization_is_hard.ipynb
index c9b99aaaa..bd59bd76a 100644
--- a/docs/source/explanation/why_optimization_is_hard.ipynb
+++ b/docs/source/explanation/why_optimization_is_hard.ipynb
@@ -29,7 +29,6 @@
"source": [
"import optimagic as om\n",
"import numpy as np\n",
- "import pandas as pd\n",
"import seaborn as sns"
]
},
@@ -39,12 +38,12 @@
"metadata": {},
"outputs": [],
"source": [
- "def sphere(params):\n",
- " return (params[\"value\"] ** 2).sum()\n",
+ "def sphere(x):\n",
+ " return x @ x\n",
"\n",
"\n",
- "def sphere_gradient(params):\n",
- " return params * 2"
+ "def sphere_gradient(x):\n",
+ " return 2 * x"
]
},
{
@@ -151,85 +150,17 @@
"metadata": {},
"outputs": [],
"source": [
- "def sphere_with_noise(params, rng):\n",
- " return sphere(params) + rng.normal(scale=0.02)"
+ "def sphere_with_noise(x, rng):\n",
+ " return sphere(x) + rng.normal(scale=0.02)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
- "outputs": [
- {
- "data": {
- "text/html": [
- "\n",
- "\n",
- "
\n",
- " \n",
- " \n",
- " | \n",
- " value | \n",
- "
\n",
- " \n",
- " \n",
- " \n",
- " x_0 | \n",
- " 1 | \n",
- "
\n",
- " \n",
- " x_1 | \n",
- " 2 | \n",
- "
\n",
- " \n",
- " x_2 | \n",
- " 3 | \n",
- "
\n",
- " \n",
- " x_3 | \n",
- " 4 | \n",
- "
\n",
- " \n",
- " x_4 | \n",
- " 5 | \n",
- "
\n",
- " \n",
- "
\n",
- "
"
- ],
- "text/plain": [
- " value\n",
- "x_0 1\n",
- "x_1 2\n",
- "x_2 3\n",
- "x_3 4\n",
- "x_4 5"
- ]
- },
- "execution_count": 7,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
+ "outputs": [],
"source": [
- "start_params = pd.DataFrame(\n",
- " data=np.arange(5) + 1,\n",
- " columns=[\"value\"],\n",
- " index=[f\"x_{i}\" for i in range(5)],\n",
- ")\n",
- "start_params"
+ "start_params = np.arange(5)"
]
},
{
@@ -274,11 +205,11 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere_with_noise,\n",
+ " fun=sphere_with_noise,\n",
" params=start_params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
" logging=False,\n",
- " criterion_kwargs={\"rng\": rng},\n",
+ " fun_kwargs={\"rng\": rng},\n",
")\n",
"\n",
"res.success"
@@ -291,60 +222,8 @@
"outputs": [
{
"data": {
- "text/html": [
- "\n",
- "\n",
- "
\n",
- " \n",
- " \n",
- " | \n",
- " value | \n",
- "
\n",
- " \n",
- " \n",
- " \n",
- " x_0 | \n",
- " 1.0 | \n",
- "
\n",
- " \n",
- " x_1 | \n",
- " 2.0 | \n",
- "
\n",
- " \n",
- " x_2 | \n",
- " 3.0 | \n",
- "
\n",
- " \n",
- " x_3 | \n",
- " 4.0 | \n",
- "
\n",
- " \n",
- " x_4 | \n",
- " 5.0 | \n",
- "
\n",
- " \n",
- "
\n",
- "
"
- ],
"text/plain": [
- " value\n",
- "x_0 1.0\n",
- "x_1 2.0\n",
- "x_2 3.0\n",
- "x_3 4.0\n",
- "x_4 5.0"
+ "array([0., 1., 2., 3., 4.])"
]
},
"execution_count": 10,
@@ -389,10 +268,8 @@
"metadata": {},
"outputs": [],
"source": [
- "def piecewise_constant_sphere(params):\n",
- " params = params.copy(deep=True)\n",
- " params[\"value\"] = params[\"value\"].round(2)\n",
- " return sphere(params)"
+ "def piecewise_constant_sphere(x):\n",
+ " return sphere(x.round(2))"
]
},
{
@@ -425,7 +302,7 @@
"text/plain": [
"Minimize with 5 free parameters terminated successfully after 1 criterion evaluations, 1 derivative evaluations and 0 iterations.\n",
"\n",
- "The value of criterion improved from 55.0 to 55.0.\n",
+ "The value of criterion improved from 30 to 30.0.\n",
"\n",
"The scipy_lbfgsb algorithm reported: CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL"
]
@@ -437,10 +314,9 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=piecewise_constant_sphere,\n",
+ " fun=piecewise_constant_sphere,\n",
" params=start_params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
- " logging=False,\n",
")\n",
"\n",
"res"
diff --git a/docs/source/how_to/how_to_algorithm_selection.ipynb b/docs/source/how_to/how_to_algorithm_selection.ipynb
index 95a2b769c..e7a898172 100644
--- a/docs/source/how_to/how_to_algorithm_selection.ipynb
+++ b/docs/source/how_to/how_to_algorithm_selection.ipynb
@@ -67,7 +67,7 @@
},
{
"cell_type": "code",
- "execution_count": 3,
+ "execution_count": 4,
"metadata": {},
"outputs": [
{
@@ -76,19 +76,19 @@
"3"
]
},
- "execution_count": 3,
+ "execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=start_params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
- " derivative=sphere_gradient,\n",
+ " jac=sphere_gradient,\n",
")\n",
- "res.n_criterion_evaluations"
+ "res.n_fun_evals"
]
},
{
@@ -120,7 +120,7 @@
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": 5,
"metadata": {},
"outputs": [
{
@@ -129,18 +129,18 @@
"33"
]
},
- "execution_count": 4,
+ "execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=start_params,\n",
" algorithm=\"nag_pybobyqa\",\n",
")\n",
- "res.n_criterion_evaluations"
+ "res.n_fun_evals"
]
},
{
@@ -159,7 +159,7 @@
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 6,
"metadata": {},
"outputs": [
{
@@ -168,18 +168,18 @@
"9"
]
},
- "execution_count": 5,
+ "execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=start_params,\n",
" algorithm=\"nag_dfols\",\n",
")\n",
- "res.n_criterion_evaluations"
+ "res.n_fun_evals"
]
}
],
diff --git a/docs/source/how_to/how_to_batch_evaluators.ipynb b/docs/source/how_to/how_to_batch_evaluators.ipynb
deleted file mode 100644
index 58dff24c6..000000000
--- a/docs/source/how_to/how_to_batch_evaluators.ipynb
+++ /dev/null
@@ -1,42 +0,0 @@
-{
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# How to use batch evaluators"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "(to be written.)\n",
- "\n",
- "In case of an urgent request for this guide, feel free to open an issue \n",
- "[here](https://github.com/OpenSourceEconomics/optimagic/issues)."
- ]
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3 (ipykernel)",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.10.8"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
diff --git a/docs/source/how_to/how_to_benchmarking.ipynb b/docs/source/how_to/how_to_benchmarking.ipynb
index c2d979fc5..6a433476e 100644
--- a/docs/source/how_to/how_to_benchmarking.ipynb
+++ b/docs/source/how_to/how_to_benchmarking.ipynb
@@ -619,7 +619,7 @@
},
{
"cell_type": "code",
- "execution_count": 14,
+ "execution_count": 13,
"id": "96614437",
"metadata": {},
"outputs": [],
@@ -629,7 +629,7 @@
},
{
"cell_type": "code",
- "execution_count": 15,
+ "execution_count": 14,
"id": "f9d63ee9",
"metadata": {},
"outputs": [
@@ -676,7 +676,7 @@
"Index: []"
]
},
- "execution_count": 15,
+ "execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
diff --git a/docs/source/how_to/how_to_bounds.ipynb b/docs/source/how_to/how_to_bounds.ipynb
index bf1d2cb36..a1939dd7f 100644
--- a/docs/source/how_to/how_to_bounds.ipynb
+++ b/docs/source/how_to/how_to_bounds.ipynb
@@ -24,7 +24,7 @@
},
{
"cell_type": "code",
- "execution_count": 1,
+ "execution_count": 2,
"id": "ec477eb7",
"metadata": {},
"outputs": [],
@@ -35,18 +35,18 @@
},
{
"cell_type": "code",
- "execution_count": 2,
+ "execution_count": 3,
"id": "b0eb906d",
"metadata": {},
"outputs": [],
"source": [
- "def criterion(x):\n",
+ "def fun(x):\n",
" return x @ x"
]
},
{
"cell_type": "code",
- "execution_count": 3,
+ "execution_count": 4,
"id": "6b43b46e",
"metadata": {},
"outputs": [
@@ -56,13 +56,13 @@
"array([ 0.00000000e+00, -1.33177530e-08, 7.18836679e-09])"
]
},
- "execution_count": 3,
+ "execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
- "res = om.minimize(criterion, params=np.arange(3), algorithm=\"scipy_lbfgsb\")\n",
+ "res = om.minimize(fun=fun, params=np.arange(3), algorithm=\"scipy_lbfgsb\")\n",
"res.params"
]
},
@@ -80,7 +80,7 @@
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": 5,
"id": "0c450bdd",
"metadata": {},
"outputs": [
@@ -90,14 +90,14 @@
"array([1., 1., 1.])"
]
},
- "execution_count": 4,
+ "execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion, params=np.arange(3), lower_bounds=np.ones(3), algorithm=\"scipy_lbfgsb\"\n",
+ " fun=fun, params=np.arange(3), lower_bounds=np.ones(3), algorithm=\"scipy_lbfgsb\"\n",
")\n",
"res.params"
]
@@ -112,7 +112,7 @@
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 6,
"id": "26c5c0df",
"metadata": {},
"outputs": [
@@ -122,14 +122,14 @@
"array([-1.00000000e+00, -3.57647467e-08, 1.00000000e+00])"
]
},
- "execution_count": 5,
+ "execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion,\n",
+ " fun=fun,\n",
" params=np.arange(3),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" lower_bounds=np.array([-2, -np.inf, 1]),\n",
@@ -150,7 +150,7 @@
},
{
"cell_type": "code",
- "execution_count": 6,
+ "execution_count": 7,
"id": "9c05eb78",
"metadata": {},
"outputs": [],
@@ -158,13 +158,13 @@
"params = {\"x\": np.arange(3), \"intercept\": 3}\n",
"\n",
"\n",
- "def criterion(params):\n",
+ "def fun(params):\n",
" return params[\"x\"] @ params[\"x\"] + params[\"intercept\"]"
]
},
{
"cell_type": "code",
- "execution_count": 7,
+ "execution_count": 8,
"id": "ddcc54d4",
"metadata": {},
"outputs": [
@@ -175,14 +175,14 @@
" 'intercept': -2.0}"
]
},
- "execution_count": 7,
+ "execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion,\n",
+ " fun=fun,\n",
" params=params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
" lower_bounds={\"intercept\": -2},\n",
@@ -213,7 +213,7 @@
},
{
"cell_type": "code",
- "execution_count": 8,
+ "execution_count": 9,
"id": "b4a95453",
"metadata": {},
"outputs": [
@@ -278,7 +278,7 @@
"intercept 0 3 -2"
]
},
- "execution_count": 8,
+ "execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
@@ -295,41 +295,24 @@
},
{
"cell_type": "code",
- "execution_count": 9,
+ "execution_count": 24,
"id": "34d59f01",
"metadata": {},
"outputs": [],
"source": [
- "def criterion(params):\n",
- " value = (\n",
- " params.loc[\"x\"][\"value\"] @ params.loc[\"x\"][\"value\"]\n",
- " + params.loc[\"intercept\"][\"value\"]\n",
- " )\n",
- " return float(value) # necessary since value is a pd.Series"
+ "def fun(params):\n",
+ " x = params.loc[\"x\"][\"value\"].to_numpy()\n",
+ " intercept = params.loc[\"intercept\"][\"value\"].iloc[0]\n",
+ " value = x @ x + intercept\n",
+ " return float(value)"
]
},
{
"cell_type": "code",
- "execution_count": 10,
+ "execution_count": 25,
"id": "b284ad8a",
"metadata": {},
"outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_45054/313144487.py:6: FutureWarning: Calling float on a single element Series is deprecated and will raise a TypeError in the future. Use float(ser.iloc[0]) instead\n",
- " return float(value) # necessary since value is a pd.Series\n",
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_45054/313144487.py:6: FutureWarning: Calling float on a single element Series is deprecated and will raise a TypeError in the future. Use float(ser.iloc[0]) instead\n",
- " return float(value) # necessary since value is a pd.Series\n",
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_45054/313144487.py:6: FutureWarning: Calling float on a single element Series is deprecated and will raise a TypeError in the future. Use float(ser.iloc[0]) instead\n",
- " return float(value) # necessary since value is a pd.Series\n",
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_45054/313144487.py:6: FutureWarning: Calling float on a single element Series is deprecated and will raise a TypeError in the future. Use float(ser.iloc[0]) instead\n",
- " return float(value) # necessary since value is a pd.Series\n",
- "/var/folders/gf/_b8vq9wn2sv2221129y0c3sh0000gn/T/ipykernel_45054/313144487.py:6: FutureWarning: Calling float on a single element Series is deprecated and will raise a TypeError in the future. Use float(ser.iloc[0]) instead\n",
- " return float(value) # necessary since value is a pd.Series\n"
- ]
- },
{
"data": {
"text/html": [
@@ -391,14 +374,14 @@
"intercept 0 -2.0 -2"
]
},
- "execution_count": 10,
+ "execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"res = om.minimize(\n",
- " criterion,\n",
+ " fun,\n",
" params=params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
")\n",
diff --git a/docs/source/how_to/how_to_constraints.md b/docs/source/how_to/how_to_constraints.md
index 07f1e820d..07c8e14c9 100644
--- a/docs/source/how_to/how_to_constraints.md
+++ b/docs/source/how_to/how_to_constraints.md
@@ -34,7 +34,7 @@ you can impose and how you specify them in optimagic:
>>> import numpy as np
>>> import optimagic as om
- >>> def criterion(params):
+ >>> def fun(params):
... offset = np.linspace(1, 0, len(params))
... x = params - offset
... return x @ x
@@ -48,7 +48,7 @@ The unconstrained optimum of a six-dimensional version of this problem is:
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([2.5, 1, 1, 1, 1, -2.5]),
... algorithm="scipy_lbfgsb",
... )
@@ -78,7 +78,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([2.5, 1, 1, 1, 1, -2.5]),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [0, 5], "type": "fixed"},
@@ -105,7 +105,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([1, 1, 1, 1, 1, 1]),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [1, 2, 3], "type": "increasing"},
@@ -144,7 +144,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([1, 1, 1, 1, 1, 1]),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [3, 0, 4], "type": "decreasing"},
@@ -176,7 +176,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([1, 1, 1, 1, 1, 1]),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [0, 5], "type": "equality"},
@@ -201,7 +201,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([1, 1, 1, 1, 1, 1]),
... algorithm="scipy_lbfgsb",
... constraints={"locs": [[0, 1], [2, 3]], "type": "pairwise_equality"},
@@ -226,7 +226,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.array([0.3, 0.2, 0.25, 0.25, 1, 1]),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [0, 1, 2, 3], "type": "probability"},
@@ -257,7 +257,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.ones(6),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [0, 1, 2], "type": "covariance"},
@@ -295,7 +295,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.ones(6),
... algorithm="scipy_lbfgsb",
... constraints={"loc": [0, 1, 2], "type": "sdcorr"},
@@ -342,7 +342,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.ones(6),
... algorithm="scipy_lbfgsb",
... constraints={
@@ -388,7 +388,7 @@ constraint is imposed via the "loc" key. Generalizations for selecting subsets o
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.ones(6),
... algorithm="scipy_slsqp",
... constraints={
@@ -422,7 +422,7 @@ constraints simultaneously, simple pass in a list of constraints. For example:
.. code-block:: python
>>> res = om.minimize(
- ... criterion=criterion,
+ ... fun=fun,
... params=np.ones(6),
... algorithm="scipy_lbfgsb",
... constraints=[
@@ -493,7 +493,7 @@ Below we show how to use each of these selection methods in simple examples
.. code-block:: python
res = om.minimize(
- criterion=some_criterion,
+ fun=some_fun,
params=params,
algorithm="scipy_lbfgsb",
constraints={"loc": "cutoffs", "type": "increasing"},
@@ -536,7 +536,7 @@ Below we show how to use each of these selection methods in simple examples
.. code-block:: python
res = om.minimize(
- criterion=some_criterion,
+ fun=some_fun,
params=params,
algorithm="scipy_lbfgsb",
constraints={"query": "category == 'betas' | name == 'a'", "type": "fixed"},
@@ -567,7 +567,7 @@ Below we show how to use each of these selection methods in simple examples
.. code-block:: python
res = om.minimize(
- criterion=some_criterion,
+ fun=some_fun,
params=params,
algorithm="scipy_lbfgsb",
constraints={"selector": lambda params: params["b"]["d"], "type": "fixed"},
@@ -587,7 +587,7 @@ Below we show how to use each of these selection methods in simple examples
res = om.minimize(
- criterion=some_criterion,
+ fun=some_fun,
params=params,
algorithm="scipy_lbfgsb",
constraints={"selector": my_selector, "type": "fixed"},
diff --git a/docs/source/how_to/how_to_errors_during_optimization.ipynb b/docs/source/how_to/how_to_errors_during_optimization.ipynb
index 3d5de4bb3..aa296ad05 100644
--- a/docs/source/how_to/how_to_errors_during_optimization.ipynb
+++ b/docs/source/how_to/how_to_errors_during_optimization.ipynb
@@ -165,7 +165,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=jennrich_sampson,\n",
+ " fun=jennrich_sampson,\n",
" params=start_x,\n",
" algorithm=\"scipy_lbfgsb\",\n",
" error_handling=\"continue\",\n",
diff --git a/docs/source/how_to/how_to_logging.ipynb b/docs/source/how_to/how_to_logging.ipynb
index 86eee78a2..e1c75203d 100644
--- a/docs/source/how_to/how_to_logging.ipynb
+++ b/docs/source/how_to/how_to_logging.ipynb
@@ -46,7 +46,7 @@
"outputs": [],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" logging=\"my_log.db\",\n",
@@ -73,7 +73,7 @@
"outputs": [],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" logging=\"my_log.db\",\n",
@@ -97,7 +97,7 @@
"outputs": [],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" logging=\"my_log.db\",\n",
diff --git a/docs/source/how_to/how_to_multistart.ipynb b/docs/source/how_to/how_to_multistart.ipynb
index 9597fa1bc..409036bcf 100644
--- a/docs/source/how_to/how_to_multistart.ipynb
+++ b/docs/source/how_to/how_to_multistart.ipynb
@@ -152,7 +152,7 @@
" \n",
" \n",
" 1 | \n",
- " -0.0 | \n",
+ " 0.0 | \n",
" -5 | \n",
" 10 | \n",
"
\n",
@@ -169,7 +169,7 @@
"text/plain": [
" value soft_lower_bound soft_upper_bound\n",
"0 -0.0 -5 10\n",
- "1 -0.0 -5 10\n",
+ "1 0.0 -5 10\n",
"2 0.0 -5 10"
]
},
@@ -180,7 +180,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
" multistart=True,\n",
@@ -291,17 +291,17 @@
"text/plain": [
"Minimize with 3 free parameters terminated successfully after 6 criterion evaluations, 6 derivative evaluations and 4 iterations.\n",
"\n",
- "The value of criterion improved from 14.0 to 6.38558109434918e-18.\n",
+ "The value of criterion improved from 14.0 to 9.468366080598408e-18.\n",
"\n",
"The multistart_scipy_lbfgsb algorithm reported: CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL\n",
"\n",
"Independent of the convergence criteria used by multistart_scipy_lbfgsb, the strength of convergence can be assessed by the following criteria:\n",
"\n",
" one_step five_steps \n",
- "relative_criterion_change 3.06e-14*** 3.06e-14***\n",
- "relative_params_change 5.482e-07* 5.482e-07* \n",
- "absolute_criterion_change 3.06e-15*** 3.06e-15***\n",
- "absolute_params_change 5.482e-08* 5.482e-08* \n",
+ "relative_criterion_change 4.859e-14*** 4.859e-14***\n",
+ "relative_params_change 7.279e-07* 7.279e-07* \n",
+ "absolute_criterion_change 4.859e-15*** 4.859e-15***\n",
+ "absolute_params_change 7.279e-08* 7.279e-08* \n",
"\n",
"(***: change <= 1e-10, **: change <= 1e-8, *: change <= 1e-5. Change refers to a change between accepted steps. The first column only considers the last step. The second column considers the last five steps.)"
]
@@ -313,7 +313,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=params,\n",
" algorithm=\"scipy_lbfgsb\",\n",
" multistart=True,\n",
@@ -364,7 +364,7 @@
" \n",
" \n",
" 1 | \n",
- " -0.0 | \n",
+ " 0.0 | \n",
" -5 | \n",
" 10 | \n",
"
\n",
@@ -381,7 +381,7 @@
"text/plain": [
" value soft_lower_bound soft_upper_bound\n",
"0 -0.0 -5 10\n",
- "1 -0.0 -5 10\n",
+ "1 0.0 -5 10\n",
"2 0.0 -5 10"
]
},
@@ -541,7 +541,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.10.8"
+ "version": "3.10.14"
}
},
"nbformat": 4,
diff --git a/docs/source/how_to/how_to_scaling.md b/docs/source/how_to/how_to_scaling.md
index d127cdc5a..22f4efab2 100644
--- a/docs/source/how_to/how_to_scaling.md
+++ b/docs/source/how_to/how_to_scaling.md
@@ -50,7 +50,7 @@ start_params["lower_bound"] = 0
start_params["upper_bound"] = 2 * np.arange(5) + 1
res = om.minimize(
- criterion=sphere,
+ fun=sphere,
params=start_params,
algorithm="scipy_lbfgsb",
scaling=True,
@@ -99,7 +99,7 @@ start_params["lower_bound"] = 0
start_params["upper_bound"] = 2 * np.arange(5) + 1
res = om.minimize(
- criterion=sphere,
+ fun=sphere,
params=start_params,
algorithm="scipy_lbfgsb",
scaling=True,
@@ -134,7 +134,7 @@ start_params["lower_bound"] = 0
start_params["upper_bound"] = 2 * np.arange(5) + 1
res = om.minimize(
- criterion=sphere,
+ fun=sphere,
params=start_params,
algorithm="scipy_lbfgsb",
scaling=True,
diff --git a/docs/source/how_to/how_to_specify_algorithm_and_algo_options.md b/docs/source/how_to/how_to_specify_algorithm_and_algo_options.md
index 4472c1f52..3e9550bb5 100644
--- a/docs/source/how_to/how_to_specify_algorithm_and_algo_options.md
+++ b/docs/source/how_to/how_to_specify_algorithm_and_algo_options.md
@@ -35,7 +35,6 @@ algo_options = {
"trustregion.shrinking_factor.not_successful": 0.4,
"trustregion.shrinking_factor.lower_radius": 0.2,
"trustregion.shrinking_factor.upper_radius": 0.8,
- "convergence.scaled_criterion_tolerance": 0.0,
"convergence.noise_corrected_criterion_tolerance": 1.1,
}
```
diff --git a/docs/source/how_to/how_to_start_parameters.md b/docs/source/how_to/how_to_start_parameters.md
index 55b5daab1..fc5a031e9 100644
--- a/docs/source/how_to/how_to_start_parameters.md
+++ b/docs/source/how_to/how_to_start_parameters.md
@@ -37,7 +37,7 @@ Again, we use the simple `sphere` function you know from other tutorials as an e
om.minimize(
- criterion=sphere,
+ fun=sphere,
params=np.arange(3),
algorithm="scipy_lbfgsb",
)
@@ -66,7 +66,7 @@ Again, we use the simple `sphere` function you know from other tutorials as an e
)
om.minimize(
- criterion=sphere,
+ fun=sphere,
params=params,
algorithm="scipy_lbfgsb",
)
@@ -103,7 +103,7 @@ Again, we use the simple `sphere` function you know from other tutorials as an e
res = om.minimize(
- criterion=sphere,
+ fun=sphere,
params={"a": 0, "b": 1, "c": pd.Series([2, 3, 4])},
algorithm="scipy_neldermead",
)
@@ -131,7 +131,7 @@ Again, we use the simple `sphere` function you know from other tutorials as an e
om.minimize(
- criterion=sphere,
+ fun=sphere,
params=3,
algorithm="scipy_lbfgsb",
)
diff --git a/docs/source/how_to/index.md b/docs/source/how_to/index.md
index f16634510..747a50c24 100644
--- a/docs/source/how_to/index.md
+++ b/docs/source/how_to/index.md
@@ -21,7 +21,6 @@ how_to_scaling
how_to_logging
how_to_errors_during_optimization
how_to_slice_plot
-how_to_batch_evaluators
how_to_benchmarking
how_to_first_derivative
how_to_second_derivative
diff --git a/docs/source/tutorials/optimization_overview.ipynb b/docs/source/tutorials/optimization_overview.ipynb
index 93cb7e5f9..51a4a4b66 100644
--- a/docs/source/tutorials/optimization_overview.ipynb
+++ b/docs/source/tutorials/optimization_overview.ipynb
@@ -11,7 +11,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
@@ -55,7 +55,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
")\n",
@@ -105,7 +105,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=dict_sphere,\n",
+ " fun=dict_sphere,\n",
" params={\"a\": 0, \"b\": 1, \"c\": pd.Series([2, 3, 4])},\n",
" algorithm=\"scipy_powell\",\n",
")\n",
@@ -152,7 +152,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=dict_sphere,\n",
+ " fun=dict_sphere,\n",
" params={\"a\": 0, \"b\": 1, \"c\": pd.Series([2, 3, 4])},\n",
" algorithm=\"scipy_neldermead\",\n",
")\n",
@@ -173,7 +173,7 @@
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7f... [base64 PNG output truncated]"
...AAAASUVORK5CYII="
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAA...
GPqxu45CFhUVSevWreWnP/2pCcszZsyQZcuWye9//3vTFp2ecccdd5h/6301eP3zn/8099qzZ0/gXjqS/Jvf/Ea+/PJLmTBhgsyfP9/UpWvXrsbp+9//vqmH/vcZZ5xhPoJXl507d8qkSZPkkUcekRYtWsiNN95ozquurfp9W08dzdZ6FhcXy6BBg+Taa6+VzMzMKvtN2/7888+b/tJfVrQ/tN15eXmiof66664zo+J6LXt8+umn8qc//UmGDx8u5513nrH56KOPzBSPJk2aiI5O6/eaN29uipQPvOqn5X77298GrjlmzBhp1KiR3HbbbeZr1V2zKu9f/epXcvXVV8utt95q6qCH1uuhhx4yv4i0bNlSfvnLX8rQoUOlfv36Af+MjAzRsPw///M/Uq9ePVOvK6+8MnCOTovRftZpLBqkO3XqZIz00wMOBBDwnwCB1399TosRCAho4P2v//ovadWqlVx//fXmo/iePXtWCLxvvvmmDBs2zAQSDVU651I/6tZQ1KdPn5BFdXRV5+RqENRQpodOpWjatKm8/fbboh9vjx07VhYsWCCjRo0y3//5z38uGm7KHw8//LAJ4FpfPWft2rXyzDPPmOkKf/nLX+Spp54yRbp06WLC4+DBg03YtR/Ta1DUe2k77L000J5zzjkycOBA2bdvnwlQGgafe+45E6r13O985zsm6K9Zs8ZcX+9fp04dcy39JUBN9b9raquGNFtPDZmXX3651K1bV8aPH28Cs4bPqg4tpyPTGvK0vmo3e/ZsefXVV02g03C4evVqE+p1xFkPDeMaqvWXCv3lRs/54Q9/aAKlhv7HH3/cBPrp06dXGnits/6SYI8rrrhCmjVrZn450KO6a1bl3b59e9MGbbe66vzxXr16yfe+9z0ZMmSIcdZgr7+I6DQbPay//lKkfa9ldGqKPks6PUfbc8EFF5h/1OjAgQMyd+5c8wmG/WUk5IeWExFAwBMCBF5PdCONQCAyARt4NdjpaKB+JP/WW2+ZkcfgObwXX3yx+chZRzPtoUFXR2jnzZsX8s01KOsooIZR+0KchlsNITq/U8OLfk9flNIRuqoODczdunUrE4L03B07dphwawPhyy+/LBqo7FF+1LKyj9gLCgrkpptuCoRHLauj2tpeHcXVUWQNXDraqL8gaCC2hwZlG3hDaavWc9asWfLee++ZEKqHhmV9mbAq15KSEjPi/Mc//lFuuOEGU+bYsWMm2Ov97777bhPMdZRYA66eq+H7oosuMvXW0engQ/tw79698uyzz5pfGPQXBw3w5a1CCbz2ulVdszJvHZEODryjR4824XX58uWBqSfjxo2Tv/71r/Kvf/3L9K+2Q4O6PjP6iYQe+ouYtvW+++4zz45a6P306/Y4ePBgwDnkh5YTEUDAEwIEXk90I41AIDKB4MC7YsUKM8qoI20afG3g1fDygx/8oMKoox2d/PjjjwMfI9dUCw0oGkL0XrqagR46f1PDmgZh/Wg8lMCr0xU0pE+dOtWMBpY/tG4aODVMBx+hBF4NtY899ph5+cweOoVAQ699sUoDl4Y0O+JYWeANpa2V1VNDp07jqOqFQZ3brKOWGvh0tNkeOhKqoVRDuNb3xz/+sRnh1P60v1Soic7Z1kP/W9uq7Qo+bH9GEnhrumYogVdHcvWZ05F4e+hItX4CoUFY21SZv46u66Gj/keOHDGfRuhIr/6ypiPZv/jFL+T000+v6RHl+wgg4FEBAq9HO5ZmIRCKQHDg1fN1pFWD01VXXSU6ZUBDl34crIFBpzTcfPPNgcs++eST5mPwVatWhTxqpuFLA5nO0bQvKB06dMgsSaXzK3WeaCiB1wag4BftgtvrEnhtHXVEsfyh83Z1+kcogTeUtlZWT52bq3Njqwq8CxcuNNNB7FSU4DrqS346V1UPddY6aEAeMWKE+cVC5zzrYUeABwwYYEbJtU06oqyrWFQXeDU86rXsETylIZRrhhJ4tU5paWlmdNoeNuTrdAud
71yZv47Ka9DXwKuHtlctdR6wfSmxql+QQvlZ4RwEEEhuAQJvcvcftUfASaB84NWPs/v27Ws+ptfRMRu6dFRNz9URNntoUNJ5rTZMaDDWj6e1rM5FrezQj+81MAUHVS2vwUlfUvr1r38dUuDdvHmzedmt/FxXDTz6cXyogVdD4ZQpU8pMn3jjjTfMy2w6L9Yu22bb8vXXX5uP0EMJvKG0NZLAa+e46uiy9kHwYeunX9PpHTqNQac96C8ROmqto5x62DCuv9zoXGI99MVFfXmwqsCrUwT05UGdXhD8DOj6zTqHN5RrVuZdfkqDzl3WqTPBv0jZ0XJ9iVBfTqwp8NrnwNZT52Nr/du2bWs+FeBAAAH/CRB4/dfntBiBgED5wKvfsIFD/9sGXjtaqC8l6RxWHQ3Uj8P1XPsSkI5K6oiavjxV1QYCOlVCXyrS0HL77beb8KhBTAOsjlzqNIdQRni1bjrKqVMj9N86T1NXYNBgpHOQQw28dq6nhm3dfEPro6ss6PUaNGgQWI1h06ZNokFYg5ZOoQgl8IbS1kgCr7Zd+0H74MEHHzTTQXbv3m1eUNP6B7/sptNEdJqBjpjqLxZ2VF3nCGsQ1oCrL3LpiLv2g/6SU1XgtWFVw63ON1YP/Ud/QdKvhXLNyrx1akbwHF47tUb7QOch6y9hOr1C+0ene+hRU+DVKRz6S5WW15CuK1TopxY6LUJXquBAAAH/CRB4/dfntBiBMoFXX/LREVZ72BHE4MCrcyJ1ZE8DrT30DXp9ccouFaXhSz9yri7walkNPTpXV5f70kNXKNCgapeL0o+kNUxX99KaltOQp6OcGujsoR+56wtvOtqoI7Tl5/BqqNORTLuZgo4EagCyL+OlYGluAAADd0lEQVTZaRUannXlBp0rbA+d06sG+hKcfuyu85qrm8MbSlsrq2coayDrx/W66oKdoqD30pF1neZgR3H1a7oqhfZT+ZFwfclNR7F19Q1bVqet6At7NvCWt9q2bZvce++9ATu11l+IbFgN5ZqVeWvddBqGXaVB66OrTQS/XKfTGNTeLtVWmb/+EqDX12Cu85J1eoyuDGHbZ6djaPjnQAAB/wkQeP3X57QYgYgF9C33zz//3GxDbFcViORi+tF7cOC1b9pHci19wUnXwdWlzXRUNpJDd3fTj9Z1LeDguuj8Yl0RQufGRrqZRDTbWr5tGjK17dru4NUiQjXQj/o1PGdlZZmpIKEcOlVC72c37yhfJpRrVuUdfC1tm4ZsHfWPdAMSXU9Z66Ptc3nGQnHhHAQQSGwBAm9i9w+1QwABBBBAAAEEEHAUIPA6AlIcAQQQQAABBBBAILEFCLyJ3T/UDgEEEEAAAQQQQMBRgMDrCEhxBBBAAAEEEEAAgcQWIPAmdv9QOwQQQAABBBBAAAFHAQKvIyDFEUAAAQQQQAABBBJbgMCb2P1D7RBAAAEEEEAAAQQcBQi8joAURwABBBBAAAEEEEhsAQJvYvcPtUMAAQQQQAABBBBwFCDwOgJSHAEEEEAAAQQQQCCxBQi8id0/1A4BBBBAAAEEEEDAUYDA6whIcQQQQAABBBBAAIHEFiDwJnb/UDsEEEAAAQQQQAABRwECryMgxRFAAAEEEEAAAQQSW4DAm9j9Q+0QQAABBBBAAAEEHAUIvI6AFEcAAQQQQAABBBBIbAECb2L3D7VDAAEEEEAAAQQQcBQg8DoCUhwBBBBAAAEEEEAgsQUIvIndP9QOAQQQQAABBBBAwFGAwOsISHEEEEAAAQQQQACBxBYg8CZ2/1A7BBBAAAEEEEAAAUcBAq8jIMURQAABBBBAAAEEEluAwJvY/UPtEEAAAQQQQAABBBwFCLyOgBRHAAEEEEAAAQQQSGwBAm9i9w+1QwABBBBAAAEEEHAUIPA6AlIcAQQQQAABBBBAILEFCLyJ3T/UDgEEEEAAAQQQQMBRgMDrCEhxBBBAAAEEEEAAgcQWIPAmdv9QOwQQQAABBBBAAAFHAQKvIyDFEUAA
AQQQQAABBBJbgMCb2P1D7RBAAAEEEEAAAQQcBQi8joAURwABBBBAAAEEEEhsgf8Po7g/18y6UBMAAAAASUVORK5CYII="
},
"metadata": {},
"output_type": "display_data"
@@ -191,7 +191,7 @@
"outputs": [
{
"data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7fAAAgAElEQVR4XuzdCXyNV/4/8I8lxJIgtBK71r63qCW0RdW+dNBhFI22tv+MrS1p/RTVTulULdOR0ilaFMWMrSqUaIktGEGCqCXWWEpEkFj/r++5OdeT/d48Nzc31+d5veZVbp7lPO/zxHxy8n3OyfPo0aNH4EYBClCAAhSgAAUoQAE3FcjDwOumPcvbogAFKEABClCAAhRQAgy8fBAoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuX
N0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMDLZ4ACFKAABShAAQpQwK0FGHjdunt5cxSgAAUoQAEKUIACDLx8BihAAQpQgAIUoAAF3FqAgdetu5c3RwEKUIACFKAABSjAwMtngAIUoAAFKEABClDArQUYeN26e3lzFKAABShAAQpQgAIMvHwGKEABClCAAhSgAAXcWoCB1627lzdHAQpQgAIUoAAFKMDAy2eAAhSgAAUoQAEKUMCtBRh43bp7eXMUoAAFKEABClCAAgy8fAYoQAEKUIACFKAABdxagIHXrbuXN0cBClCAAhSgAAUowMBr4hmoWrUqjh8/buIMPJQCFKAABShAAQpQILsFGHhNCDPwmsDjoRSgAAUoQAEKUMBJAgy8JqAZeE3g8VAKUIACFKAABSjgJAEGXhPQDLwm8HgoBShAAQpQgAIUcJIAA68JaAZeE3g8lAIUoAAFKEABCjhJgIHXBDQDrwk8HkoBClCAAhSgAAWcJMDAawKagdcEHg+lAAUoQAEKUIACThJg4DUBzcBrAo+HUoACFKAABShAAScJMPCagGbgNYHHQylAAQpQgAIUoICTBBh4TUAz8JrA46EUoAAFKEABClDASQIMvCagGXhN4PFQClCAAhSgAAUo4CQBBl4T0Ay8JvB4KAUoQAEKUIACFHCSAAOvCWgGXhN4PJQCFKAABShAAQo4SYCB1wQ0A68JPB5KAQpQgAIUoAAFnCTAwGsCmoHXBB4PpQAFKEABClCAAk4SYOA1AW1L4N17aS/8ivihbNGyJq7EQylAAQpQgAIUoAAFsirAwJtVOQCZBd7z8efRfmV7dYVDAw6ZuBIPpQAFKEABClCAAhTIqgADb1blbAi8m89sxsiQkeoK89rNQ2PfxiauxkMpQAEKUIACFKAABbIiwMCbFbWkYzIb4Z19YDaCwoMYeE0Y81AKUIACFKAABShgVoCB14RgZoE3YEMApIZXtsn+k9G9SncTV+OhFKAABShAAQpQgAJZEWDgTaG2bds2DBw4EDt27MBTTz2VoWlmgbf5kua4efemOsfQ+kMxrMGwrPQRj6EABShAAQpQgAIUMCHAwGvAi4+PR//+/ZGQkIDvvvvOVOA1vrAml+hbsy8CXwg00VU8lAIUoAAFKEABClAgKwIMvAa1Dz74AM2bN8fcuXMxb948U4HX+MKaXKJR6UaY335+VvqIx1CAAhSgAAUoQAEKmBBg4E3C+/XXX7F8+XJ89dVX6NChA77//ntTgVe/sCZBV+p4GXhNPKU8lAIUoAAFKEABCpgQYOAFcPPmTfTt2xcLFiyAj49PmoH3H//4RypmGQn2/D/PDPnHNB6Dz8M+h1cBL+zos8NEV/FQClCAAhSgAAUoQIGsCDDwAhg7dixefvllFXRlS2uENzg4OJXvX//610wD74YeG7j4RFaeTB5DAQpQgAIUoAAFHCTAwAugdu3aKFCggJX01q1bKFKkCCZNmoSuXbumS53ZLA36wGY/NEP8vXiE9gmFdwFvB3UdT0MBClCAAhSggCMFDh48iMDAQFy5cgU1atTAtGnT8PTTTzvyEjxXDgkw8KYB74gaXuNp9Xy8XG0th55yXpYCFKAABSiQicCDBw/Qtm1bTJw4ES+++KKarUmmKJ0zZw7t3ECAgZeB1w0eY94CBShAAQpQIG2BkydP4qOPPsKJEydQsGBB9b5OpUqVUu0cHh6OTz75RL3ALtvDhw/R
rFkz/PLLL/Dy8iJvLhdg4DXRgbaWNEzZMwWLjyyGvMDWr1Y/E1fkoRSgAAUoQAHXFpjxy/EcaeDIV6qmuq6M2nbq1AkjRoxQ7+dcv34dxYoVQ968eVPtu3r1ajWiO3XqVOvXevbsqcJyvXr1cuSeeFHHCTDwmrC0NfDqKcq42poJbB5KAQpQgAK5QqBS4E850s7TUzqlum5kZCTGjBmDdevWZdqmZcuWQfaX93f09sYbb0BeUG/atGmmx3MH1xZg4DXRP7YG3oWRC9XUZHq1NVmFTebmLVOkDBr7NjbRAh5KAQpQgAIUcC0BVxrhDQkJwaJFi/Dtt99mirRmzRrInPzyopreunXrho8//hj169fP9Hju4NoCDLwm+sfWwBsWE4aBwQOti08M3zIcIWdDUKZoGSzvspwzN5joAx5KAQpQgAIUSE/g0KFD+PDDD7F27dpMkSIiIjBu3DisWrVK7Xv//n288MIL2LJlC4oXL57p8dzBtQUYeE30T1YDb8+1PXHs2jF1ZT3qa6IZPJQCFKAABShAgTQEJLS2a9dOTTUmMzDEx8cjT548aurRlJu8pNa+fXuMHz8eLVu2VLM0yAtrCxcupK0bCDDwmuhEWwOvXKLud3XVlQ4NOGT9s760jPLW8KlhoiU8lAIUoAAFKECBtASOHj2KCRMmIDo6GoULF1blDZUrV04TS/aVmt8LFy7g2WefxRdffIHy5csT1g0EGHhNdKI9gbfHmh6Iuh6lZmqQel6/In5oXaG1mr2h67Nd8WmLT020hIdSgAIUoAAFKEABCqQnwMBr4tmwJ/CO2z4Oa06sUXW88sKa/PeTFp9Ylx2WJYjLFi1rojU8lAIUoAAFKECBzAT279+vRnxTbkFBQShXrlxmh/PruVSAgddEx9kTePVMDfpyeooyHYRZy2uiI3goBShAAQpQgAIUyECAgdfE42FP4NUzNejLTfafjO5VuuPotaPotbYXvAp4YUefHSZaw0MpQAEKUIACFKAABdISYOA18VzYE3jlMvrFNfmz8UU1Xd87r908zstroj94KAUoQAEKUIACFGDgdfAzYG/g1cFWmiGzNehNLz3Mldgc3EE8HQUoQAEKUIACFADAEV4Tj4G9gVfX61YrUQ0ru660Xnnzmc0YGTIS1X2qY0WXFSZaxEMpQAEKUIACFKAABVIKMPCaeCbsDbyrfl+F1b+vVsE28IXAZFfW5Q6hfUK58pqJPuGhFKAABShAAQpQgIHXgc+AvYE3o0sHbAhQ05XNaDUDbSq0cWAreSoKUIACFKAABSjwZAtwhNdE/zsy8M4+MBtB4UFcathEf/BQClCAAhSgAAUokJYAA6+J58KRgVdPWybTk8lsDVxq2ETH8FAKUIACFKBAFgVCQkIwevRoLF68GLVq1criWXiYqwkw8JroEUcGXmmGLmuQP+t5ek00j4dSgAIUoAAFKGCHwLfffostW7bg9u3b+PTTTxl47bBz9V0ZeE30kKMDrzTFuATx/PbzTbSOh1KAAhSgAAUocPLkSXz00Uc4ceIEChYsiAULFqBSpUppwuzatQvPP/88BgwYgPHjxzPwutHjw8BrojOzI/Cejz+P9ivbo0zRMgjuEWyidTyUAhSgAAUokAMCW6fkwEUBvJx89iNpxIMHD9CpUyeMGDECHTp0wPXr11GsWDHkzZs3wzb++c9/xoQJExh4c6Yns+WqDLwmWLMj8Epz9BRlxsUpTDSTh1KAAhSgAAWcJzCxmPOuZbzSxBuprhsZGYkxY8Zg3bp1drWJgdcurlyxMwOviW7KrsCra3m51LCJzuGhFKAABSiQMwIuNMIrL6AtWrQIUptrz8bAa49W7tiXgddEP2VX4NV1vHxxzUTn8FAKUIACFHjiBQ4dOoQPP/wQa9eutcuCgdcurlyxMwOviW7KrsCr5+QdWn8ohjUYZqKFPJQCFKAABSjw5Arcv38f7dq1Q2BgINq2bYv4+HjkyZMHRYoUyRCFgdf9nhkG
XhN9ml2Bd/OZzRgZMhKNSjcCZ2ow0UE8lAIUoAAFnniBo0ePqhfQoqOjUbhwYVXeULlyZQbeJ+zJYOA10eHZFXiPXjuKXmt7caYGE33DQylAAQpQgAIUoIAWYOA18SxkV+CVJnGmBhMdw0MpQAEKUIAC6Qjs379fjfim3IKCglCuXDm6uakAA6+Jjs3OwNtjTQ9EXY9Syww39m1sopU8lAIUoAAFKEABCjzZAgy8Jvo/OwPv8C3DEXI2hEsMm+gfHkoBClCAAhSgAAVEgIHXxHOQnYF3YeRCfB72Oar7VFejvN4FvE20lIdSgAIUoAAFKECBJ1eAgddE32dn4I27GwdZgELKGlpXaI2ZrWaaaCkPpQAFKEABClCAAk+uAAOvib7PzsArzToffx491/RE/L141vKa6CceSgEKUIACFKDAky3AwGui/7M78ErTuOqaiQ7ioRSgAAUoQAEKUIA1vOaegQwD764gIOEG0KAPULxili/EVdeyTMcDKUABClCAAhSggBLgCK+JByHdwJsQC0xJCrkvBQKtPkh+Ffm6Z3GbrszAaxMTd6IABShAAQpQgALpCjDwmng40g28R9cBS/tazlzRHwhYb/lzbLTl8+qdUofgdNoRFhOGgcEDLcsMN50IbPgAiD0DvLnO5tBs4hZ5KAUoQAEKUOCJELh37x5k8YkVK1bgwYMHqFatGv7+97/Dz8/vibh/d79JBl4Ap06dwpdffok9e/agQIECaNKkCT755BN4enpm2P/pBt6fxwK7v358bGA0cPQnS1iVMofqHYE+S2x6tpIFXt9XgFXDLMfV6AT0/sGmc3AnClCAAhSgAAUyFoiNjcUPP/yA/v37o2jRovjnP/+J48ePY9asWaRzAwEGXkAF3cTERPj7+6uf6t59913Url0bgwcPzlrgnVHXMgpb0BtIjAOaDEkegItXAEYesunxkZka2q9sjzJFyyA4b2Ug3BCUey8GanS26TzciQIUoAAFKPAkCpw8eRIfffQRTpw4gYIFC2LBggWoVKlSphSRkZEYM2YM1q1bl+m+3MH1BRh40+ijRYsWISoqCh9//LH9gVfKFmbUs4TdpsOAX6c8PofU8+6abQnBMuprYx1v3e/qqnMcui5lEWeA+n0swde3LjBke9ptlHaYeFnO9R9dtpACFKAABVxRICg8KEeaNbT+0FTXlUGsTp06YcSIEejQoQOuX7+OYsWKIW/evJm2cfHixYiIiFBlDdxyvwADb4o+lJ8AR40ahQ8//BBNmza1flUCcMpNvomOj6uW/GMpWYg5ZClbaDYMWJA0Aqtreed3BKJDLTW4MtIbexbw9AZ866X7NFkD76mkUeMPzgKflbcE5yHbkh974Adg62eAZ7H0w3Duf255BxSgAAUo4KIC+v+znN28QwNS/+Y0q6O0MTEx6NevnxoNLlu2rLNvhdfLBgEG3iTUNWvWYOLEiYiPj8dbb72F0aNHw8PDw0o+bFhS7ayhEzZt2oTjfS+n3S3dZwMN+gL/HWIZlX0tyDLiGvKZZdRXRnujtwOnDSO0MmIrIVjCb6UWgG8dNQosK67tvbQX8y5eRuOKrS31v3JeGeWVcokOUy1tWNAp+fnsGEXOhmeLp6QABShAgSdQwJVGeENCQiC/tf32229t7gkZBR4wYADef/99tGzZ0ubjuKNrCzDwpugfedBnzpyJO3fuYOrUpCCZTh+ql9Y2zUv7qxJe0ypZ0DM4yNdlJFi20nWAS4fTPk/xCggoXhB78yRaAu/LEyylEjEHga9bWgKy1APrUgo5S7HywI2zllHkSvxmde1vQbaOAhSgAAWyS+DQoUPqN7Zr16616RI3b95EQECAGviSEghu7iPAwJtGX54+fVo97Js3b86wp7O00poxmMrZpR73taQZHU5vs4wGXzxoCcNS+iCrrZXywRqvohjzx3XU7DYHKFpaTVOG6XUswVbKGmR/mcFBSikkBMssEWnNAew+zy7vhAIUoAAFKJCh
wP3799GuXTsEBgaibdu26re4efLkQZEiRVIdd+vWLbz99tuqlKFjx46UdTMBBl4A4eHhqFKlivoGuHv3LqZNm4arV6+q/2a0ZSnwygl1/a38ObNR2NhozN47HUHnN6MaCiIKiapJqlZJT38moVlqh4+tB6SUQup3Zb5f4xzAbvbg8nYoQAEKUIACtggcPXoUEyZMQHR0NAoXLqzKGypXrpzq0KVLl2L8+PGpXmhbtmwZGjRoYMuluI8LCzDwApg/fz6+++47yKTTUrfbrFkz9dOgvMmZLYFXv7gmpQej0illMFxYr7ZmbMuGHhtQ9v59y4wQsknIldA78qDlz3qlt4k3XPjxY9MoQAEKUIACFKBA9gsw8JowzvIIr5QuyMtrz/W1vNiWyaYXnzDuNq/dPDT2bfz45TX5otQCD7WUQSQrd8hgBojMrs2vU4ACFKAABdxJYP/+/WrEN+Umq6yVK1fOnW6V92IQYOA18ThkOfDaec20Au+YxmPQr1a/5C+rGWds0LM4tP/M8pIbNwpQgAIUoAAFKPCECjDwmuh4ZwXeuLtx8F/in6ylfWv2ReALgZbPDiwGrp8Bmg19PDOEfCYvsRlfijNxrzyUAhSgAAUoQAEK5FYBBl4TPeeswCtN1BN5d322K9acWKNmaZjffn76rdezQUg9b+AZE3fJQylAAQpQgAIUoEDuFmDgNdF/zgy88uKajPS2qdAGA4MHokzRMgjuEZxx643TlrGO10RP81AKUIACFKAABXKzAAOvid5zZuA1NtO61HAayygmux09bZmxttfE/fJQClCAAhSgAAUokBsFGHhN9FpOBd4ea3og6noUlndZjho+NdK/A5kNYkFnQFZ1k03KG14O5OprJvqch1KAAhSgAAUokPsEGHhN9FlOBd6ADQHYe2kvZrSaoUocMtyMi1zIjnyJzUSP81AKUIACFKAABXKjAAOviV7LqcCrF6IYWn8ohjXIZMqxJX0sK7DpTZYdltXdZAYH2d78KbXAjLqWJY4zWwXOhB0PpQAFKEABClCAAs4SYOA1IZ1TgXfV76swPnQ8ZMaGT1t8mvEdxBwEjvwE+NUF/jsUSIwDpKZ399eW44ZsA1K+0DYxaYU5zuFr4ungoRSgAAUoQAEKuIoAA6+JnsipwKsXoqjuUx0ruqyw/Q70YhTGI9IKtTrw8mU32225JwUoQAEKUIACLivAwGuia3Iq8EqTbZ6pwXh/ejEK42fVOwJ9ljz+RL/oJp9U9AcCDOUQJqx4KAUoQAEKUIACFMgpAQZeE/I5GXj1TA3z2s1DY9/Gtt2FXoxCh9noUEBqekceSjvwyqcTb1i+FvIZ8OsUIGVA3hUEJNywzP7AjQIUoAAFKEABCrigAAOviU7JycA7bvs4teLamMZj0K9WP9vvYtds4OJBoNUHQFALS03vyINA8YqIuBAHz4ileDb0/cfnS/oadDlEyoCsyx/4gpvtfcA9KUABClCAAhRwqgADrwnunAy8dr24lt496hkcus8GGvRFx5nb8OrleRjp8Z/HR+ggO78jICPCxlFfeSHu65aWz14KtIRobhSgAAUoQAEKUMDFBBh4TXRITgbeo9eOotfaXrYtMZzePeoyhSZDEPfyJ6g3aSNG5luRPPDqIKtHcuVcOgSz3tfE08NDKUABClCAAhRwlgADrwnpnAy80mz94lpon1B4F/C2/04MK7EFt1yBwQv3YZnHx2iS76ilVlfm75WFKjpMAaZUfHz+pBFhGAOvceTX/pbwCApQgAIUoAAFKJBtAgy8JmhzOvDqFdfsenEt5f0mjdz+X/WfsCj8xuPAKyO78pKazNQgpQqyRLHe9KivHiHWn7OO18TTxEMpQAEKUIACFMguAQZeE7I5HXin7JmCxUcWw6YV19K7z6Ta3L8VnIy1N57F9oIjUC7PFcuCFLo+V0Z0ZWW2YuWBG2cfT1eWMvCyjtfE08RDKUABClCAAhTILgEGXhOyOR147VpiOL37/Hms
WnVtxr0/4VuP3jiE1y17ynRkQf7ApcOAb10g5pClvCF8yeOpzJKORek6lv04b6+Jp4mHUoACFKAABSiQXQIMvCZkczrw6pkaWpVvhVmtZ2XtTo6uA5b2xe4HNfDvqv/CNyfbqPPs7H8SzS4vAzYYZl7ovVjtq7bAaGDJXywzN+jPPYsBgWey1g4eRQEKUIACFKAABbJJgIHXBGxOB169xHCj0o0wv/38NO9k+b5zWLnvHNrWKo23WlROtc/5mBiU/bo64h4VxrFWX6Px1v4q/D4MWI9mPvHAjHqPj5Eyh58DLSFX6nWlpEH/+b9DLeUOso+v4RgTvjyUAhSgAAUoQAEKOEKAgdeEYm4IvDK3buTFODSp7INlg5slu9uz1+/g/eXhmHauL8rmvQq0/0yN6Erg3fHiQoxqWw3XvmgMn/goddzZETEov3WUpaxB9j3wg6XUQULuztmPP286zIQqD6UABShAAQpQgAKOFWDgNeHpKoG3uk91rOiyItWdSKBtOXWL9fPTUzpZ/zzo+73YGHlJ/T2o8Fx0eLjVWqu78n5LnHnpSxV4v5s+BgNuzMH5h6Ww4dVf8Na9pZbZG5oMUbW/apN63wOLLS+2yXRmMqtDQhzgWwfwLG5CmIdSgAIUoAAFKEAB8wIMvCYMczrwStP1XLyHBhxKdSffbj+FyesirZ8vGdQUzZ4piZ0n/0CfubvU5z2eL4d3n96LMltHW/eTF9h2VhikRoTrffAjauWJRhwKI65YTWz/s4dlijJ5Qc248lpstKX8Qep4ZUu4wdXXTDxbPJQCFKAABShAAccJMPCasHT1wKvLGcoWL4TzsXcwvnMtVcf7zvd7sSnyEka0qapGcdX23yGWkgRAzdgggXdk22oqGNf088K563dwM+E+NgVUQtUlzS3BVkKtTFU26rDlHNPrWOp49cZZG0w8XTyUAhSgAAUoQAFHCTDwmpB05cCryxm8PPPjoy61Va2uvLgmf9ZlDoPsOtIAACAASURBVAcnvArvQh6PBX4ei8T7D9B4xwuIe1REBeKZm48jwL8S4u7cx8r959S+hz3fRlHcthxnDLWG0Ky+VrwCMDL1yLMJch5KAQpQgAIUoAAF7BZg4LWb7PEBrhB4e6zpgajrUVjeZTlq+NSwNm76pigVVqVkQUZqJeSWK1FIhd75oafV59Ner5/m3VcK/El9LvvLyO6cfg1RrkRhvLf8AI5cvPl4NbaUgVeWGpaX1yTo6vpemb6MdbwmnjIeSgEKUIACFKCAWQEGXhOCrhB401teuMXULSqs6rrduhODVUmC3n4a3hK1y3inefcdZv6mgq3eUo4E/2/OIDx3cZn68sUaA+DXO405gJNWcFPTl1VqaUKZh1KAAhSgAAUoQAFzAgy8JvxcNfAGR8Rg8MJ9kNrd0MDW6g513a6UOIx8pVqac/JqCpm799y124hLuKdKGVKNBO+abV2QYmOpN/HqX2emVtTlDTJ9GacpM/GU8VAKUIACFKAABcwKMPCaEHSFwDt8y3CEnA3BjFYz0KaCZZU0HW71S2rymZQ4yHy803rVT163m5X7l9IFmalBpjR71BNDJ32b+iw6FMtyxK8lTV+WlWvxGApQgAIUoAAFKGBSgIHXBGBOBV55IU1WT4u7cw+lym9FUHgQhtYfiue8/4yZvxzHrpN/qLtK9VKaiXtNdmhCLDClovpo0L3R6PGXwWhX2zf52XUo5kwNjlLneShAAQpQgAIUyKKAUwPvvXv3cPXqVfj5+WWxua51WE4F3klrI9SLZ7L9qXUENl1ciKfud8HJ4/7qM1vKFkxLflYeSIxD77v/B68arfBN/0bJT2kIxWphCm4UoAAFKEABClAghwScEnhv3LiBCRMmYMOGDXj06BGOHTumbnf16tWIjo7G8OHDc+j2zV3WmYE34kIcVuw7iwldakPPr6vCre9moMQmJF5pg4LxHTDQv7Kqz0023Zi520z76JiDuB13DbXm3VJfl+uOfKUq
Ii/eRNNnfCzH6Hl5Rx4EiltGhLlRgAIUoAAFKEABZws4JfC+9957SExMVMG2e/fuiIiIUPcpwXfQoEH49ddfnX3fDrmeswKvhN0+c3ciLuE+ZHaFTrO2qfY3qeyDvVe3oXC5hahQsDGWdJuT/UE3hZy84CZz/Bo3mc7s1Vq++PDyu8h/dgfQezFQw1Lzy40CFKAABShAAQo4W8ApgbdRo0bYuHEjfHx8ULt2bWvglZHfZs2aITLy8fK3zgaQ60mZxdSpU7Ft2zZ4eHjgxRdfxMSJE9WfM9rSCry95+5SNbQZzXMr55Q63GKe+TMMqLLPpsgYzNgUpcKubHpuXJlPV2Zb+Nt/l+NK0RloVLoR5refnxN8kEA+6Pu9ajU3vaqbNOS7Sr/gpZh5XGI4R3qFF6UABShAAQpQQAs4JfA2bNgQ69evR+nSpZMF3j179mD06NHYvn17jvbI//73P5w6dQpdu3bFw4cPMWTIELRp0wZ9+/a1K/Dq1c3kIG/P/Ng+tnWyQCvThcloqA6v/+hVH70alkt2jY2Rl7DrxB/YGBmj5tHVmwRcWQ5Yb3oGhrCYMAwMHpijgTcl0s6Tf6gliV/NtxdzPb4EqncE+izBDMMLdfLCncwakZZBjj4MvDgFKEABClCAAm4n4JTAO2nSJFy4cEGNmr7yyisICwvDvn371N87duyId99916Vg58+fj3PnzmH8+PF2BV7jy2RyoA5zEu4k7M0LPZXsfBJi5WUvKQvYFBEDCbvGTV4+k9KAps+WVMH43R/Drcv76oUjjl47il5re6G6T3Ws6LLCZRxlGrRNIZuwvsCHauW1iNdDraUYKRspI9VS/8uNAhSgAAUoQAEKZIeAUwLv3bt3MWPGDCxcuBAJCQnqPgoUKICAgACMHDkS+fPnz457y/I5pdZYRni7deuWaeBdGLzbus+g78PU6G2AfyU1i0ItP2+1rK+UJMhopmwyMivhtd6kjervxppc+XtNPy8Vcl+t7ZtqJTQZQX7vxwNq1Ng4K0Ld7+qqcx0acCjL9+zoAyXkd5i5DaEJr6lTT/cPsy513LNROXh7eigTXf/bs2E5fNEr7aWOHd02no8CFKAABShAgSdLwCmBV5PKtGTnz5/HgwcPUL58eRV6XW2T8orp06dj2bJlyYL4qFGjUjV13bp1uNdjRrLPJbD+POJFpFzKVz7/olcDa4jVi0NI6YOEZBntzeqiEK4YeAXl2+2nUCe4N5rkO4pAr8+w9EpFzOnXMNmcvcaX3ka0qYpRbau52iPB9lCAAhSgAAUokMsFnBp4Xd3q4MGDGDNmDKSkIeVcwYcPH07V/Ndeew3Pv/d9ss8HtqisAp2EvY0RMeprtcp4Y9Qr1ZLV86ac3WDb2NYoX6JQlohcNfDKiPSeab3QI/82fHy/H+bd74DTUzqlukejRbYtlpElWR5EAQpQgAIUoIA7CDgl8GZWoztt2rQctzx69ChkFPerr77Cs88+a1N7zE5LpkeBM5vRIbPG9FjTA1HXo7C8y3LU8KmR2e5O/frcz9/DoNvfYN79dthZbUzqBSqSWvPnOTux+9Q1LBnUFM2eKenUNvJiFKAABShAAQq4t4BTAu+sWbOSKcqcvGfOnMGWLVvU3LyDBw/OUeUTJ06odsycORNVqlSxuS1mA69M5yVTmEm9blZHd6WxARsCsPfSXsxrNw+NfRvb3H5n7Pjrxv/gpR0B2P2gBs50X5lqVgrdBnnJbebm42BZgzN6hdegAAUoQAEKPFkCTgm86ZHKTA3//ve/ERQUlKPqMrIr9bh58+a1tqNgwYKQEoeMNrOB11E3rQNvmaJl0L1KdwytP9RRpzZ9npuxV+A1w/JDxNkRMekGe13WoGeuMH1hnoACFKAABShAAQokCeRo4JU2tGrVCiEhIbmyQ1wl8C6MXIjZB2Yj/l68ctzQYwPKFi3rOqZ6ieGXAoFWH6TZLhntlhXkZGENmb+YGwUoQAEKUIACFHCUQI4GXllp
rUuXLvjtt98cdT9OPY+rBF5907qW1+VKG05vAxYkLS088iBQvGKa/VQp8Cf1ubzAd/76Hcg8xLXLeDu1T3kxClCAAhSgAAXcT8ApgffLL79MJSfz8W7duhXNmzdXC1Dkxs3VAu+UPVOw+MhiVdIwrMEw1yL9eSyw+2ugUgvgTUuwTbnpF9f08slprVbnWjfF1lCAAhSgAAUokBsEnBJ4ZaqvlFuRIkVQp04ddO/eHfny5csNVqna6GqBV0obPg/7HF2f7YpPW3zqWqYJscD0ukBiHJBOaUPKlerkBmQRjwldarvWvbA1FKAABShAAQrkKgGnBN5cJWJHY10t8IbFhGFg8EA0Kt0I89vPt+NOnLSrsbRhyDbAt16yCxvn421S2UdNUyabmTmKnXRnvAwFKEABClCAAi4skG2B98qVKzbf9lNPPWXzvq60o6sF3ri7cfBf4g+vAl7Y0WeHK1E9bosubajeEeizJFkb5cW13nN34mbCfRVyZUnmlfvPcZTXNXuSraIABShAAQrkGoFsC7wSBm3djh8/buuuLrWfqwVewdGrroX2CYV3ARd84Ss2GphRD/AsBgSeybA/OXODSz3ubAwFKEABClAg1wpkW+C9deuWzShSz5sbN1cMvK68CIW1j/U0ZWmUNaR8DvRqdD8Nb8kZG3LjNwnbTAEKUIACFHABgWwLvC5wb9neBFcMvOO2j8OaE2swpvEY9KvVL9sNsnQBXdaQwby8+rzv/hiuyhrGd66Ft1pUztLleBAFKEABClCAAk+2gFMC7507d7B48WJERUVBlhVOucmSvrlxc8XAKwtQBIUHuebUZLqTj64DlvYFKvoDAesz7PrgiBgMXrgPtfy8sX5ES6c9Ju8tD0e5EoXRo2E5U8s+O63BvBAFKEABClCAAukKOCXwjhgxAmfOnEHbtm2xaNEi9OnTB6dOnVILTnzyySdo3759ruwiVwy8Lj9Tg/S0TFE2JWnxiYk3Muz7uDv3UG/SRrXPwQmvwruQB+SziItxaPZMyWx7bvQiGEsGNc3W62TbDfDEFKAABShAAQpYBZwSeJ9//nls2rQJJUuWVCurrV27VjVg9erVKvROmzYtV3aJKwbeo9eOotfaXihTtAyCewS7rmuQP3DpMNB7MVAjaRW2dFr7zvd7sSnykhrlHdiiMj5eG6FGX1OO+O46eQ1Nn/Exfc/6ZTk50Yg2VTGqbTXT5+QJKEABClCAAhTIOQGnBd7t27ejcOHC6Nq1K1atWoW8efNCVltr0qQJwsPDc07AxJVdMfDK7eiZGg4NOGTi7rL50JDPgF+npLsIhfHqMqL757k7ceTizWSNkqnLzl2/jZm/HMeuk3+orzliRFaXUTDwZvMzwNNTgAIUoAAFnCTglMDbv39/vPPOO2jZsiWGDx+OV155RQVfCbqDBg3C7t27nXS7jr2MqwbeXDFTgx11vNJrOvSeu34H3p4eOB97B//oVR+7TvyhXmrTW9NnSmLpoKamOnr6pijM3GyZKq9trdL4pn8jU+fjwRSgAAUoQAEK5KyAUwLv4cOH4e3tjQoVKuDgwYMYMGCA+rssTiEBeMiQITmrkMWru2rgzRUzNdhRx6u7R0LvjYT7ajT3/eXhkNXYjlyMQ1zCfci0ZXrRCrOjvLqEQq4r11g2uFkWnxAeRgEKUIACFKCAKwg4JfCmvNGYmBhICC5btixq1qzpCg5ZaoOrBt6FkQvxedjn6PpsV3za4tMs3ZtTDrJjPl5je85ev4OWU7dYP9KhVI/Mmh3lbTF1C2QkWTZvz/w4OLGdUzh4EQpQgAIUoAAFskfAKYG3Q4cOqoRBXlgrV65c9txJDpzVVQOvnqmhuk91rOiyIgdkbLzkf4cA4UuA7rOBBn1tPMiyW4eZv1lreqW0oVfDcqrswX/qFrU0sZmFKvQMDbpBp6d0sqtt3JkCFKAABShAAdcScErgXblyJX766Sfs3LkT9erVU+FXQrCPj/k36nOS01UDr5jk
ihfXds0GNnwA1O8DvPa1XV05aW0E5oeeVsfo6crkz3+esxO7T11L9vKajAjP335K7ftRl1oZXmfnyT/QZ+4u1PTzQtyd+6pWWF6OK1+ikF3t484UoAAFKEABCriOgFMCr77d2NhYbNy4EevXr8fevXvRvHlzFX47d854WirX4UreElcOvD3W9EDU9Sgs77IcNXxquCbh6W3Ags6Ab11gyHa72ighdsXesyjnU1iN7upNlzUYpxMzvoSmR4P1/hJwV+47hybPlFTnWb7vnKoP7vF8OTUDRMrwbFcjuTMFKEABClCAAi4h4NTAa7zjEydOYPLkyQgNDcXx45Y34nPb5sqBd/iW4Qg5G4LJ/pPRvUp316WdWMzStsBowLO46XZ+u/0UJq+LRIB/JUzoUludzxh45e/Gl9p6z92lXoLTtbp65FgCc+TFODX/b8qQbLqRPAEFKEABClCAAk4VcGrgvXbtGjZs2KBGeA8cOIAWLVqoEd6OHTs69aYddTFXDrx6ieG+Nfsi8IVAR92y488zvyMQHQq8uQ6olLR0sMzgEHMY8PQGfOvZdU1dkmCcXeHdH8PV1GVSpiBz+cro7bTX61tHc/UFpDRCZmjQo7oy5ZlMT8bFJ+zqAu5MAQpQgAIUcDkBpwTe5cuXqxpemW+3QYMG1hre4sXNj+jlpKgrB95cscSwGm7tAxxbn3zFNV3bW9EfCFhvVxfrGRyMsyvoul4ZqZVyBfna9rGt0XHWNutsDHrkd/D3e9U0Z1K3uzEiJtVosV2N4c4UoAAFKEABCriEgFMCr4zgduvWTc3SUKZMGZe4cUc0goHXAYpprbimPyteARhp/2pxepYFPbuC8UU2WZZYRnll6jIpZZCRYFmmWEaAx3eupQKul2d+HJrYDmmNFjvgjnkKClCAAhSgAAWcLOCUwOvke3La5Rh4HUCdVuDVo75y+ok37L6InrJM1+oaA2/khTgVavUm++jSBQm/Us6gyyF04K3l5431I5LKLexuDQ+gAAUoQAEKUCCnBRh4TfSAKwfeuLtx8F/iD68CXtjRZ4eJu8zmQ/USw9U7An2WWC6m63rlz1l4mU2vlDanX0O0q+2LehODVZmC1OjKSm160QpdyxscEYPBC/dZb9T4wlvK0eJs1uDpKUABClCAAhTIBgEGXhOorhx45bZyxVy8emoyY73ulApAQtLIrvFlNhv7KuXUZClDq0w9du7abfRsVF7NrxtxIQ6dZm2znl1KG95qUVn93X/KFs7Fa6M7d6MABShAAQq4qgADr4meYeA1gacPTWsuXj1VmeyThcCr59JtW6s0vunfCLaM0hpXVzNOW6ZHizk1mQP6mqegAAUoQAEK5JCAUwLvCy+8gNWrV8PPzy+HbjN7LsvA6yBXHXClXlcHYH3q9p8BTYfZdSHjy2YSeOtN2qiOz2iJYONSxcb99Ly+uvzBroZwZwpQgAIUoAAFXELAKYH35ZdfxqxZs9Sywu605ZbAG9onFN4FvF2XPqPA+1Ig0OoDu9oed+eeNeTKaK0sFWyclzetk+mR3LLFCyE0sLV1F764Zhc9d6YABShAAQq4pIBTAu+WLVswZ84cTJ06FZUqVXJJiKw0ytUDb8CGAOy9tBfz2s1DY9/GWblF5xwzvQ5w4yww8iDwvx+AX6cABb2BxDigfh/gta/tbocuUbA18Oq637SCsT6XvPTmXcjD7rbwAApQgAIUoAAFclbAKYF32LBhOHLkCM6dOwdfX18UKlQo2V1v3Gj5lXNu2xh4HdRjxtXWTm23BF55iU1WYMvC4hPSKl2iIKukyWppmY3wyoIV567fVnPyyotsxs04rVmzZ0qq+Xl3n7yGJs/4QP7OjQIUoAAFKEAB1xZwSuCVEd6MttatH/8K2bW5kreOgddBvWUMvDIvrwTdJkOA3V8DvnWBIdvtvpAOqVJ7K4tKmKnBnbQ2AvNDT1uXGNbn1i/F2d04HkABClCAAhSggFMFnBJ4nXpHTrwYA6+D
sI2LT8hLaxJ4ZXaGBZ0tF8jC4hPv/hiugq6smnYz4b41rGalxXqeXj1KrOf1NS5fnJXz8hgKUIACFKAABZwj4JTA+/DhQyxbtgzr1q3DpUuX8Msvv6i72759O27duoV27do5524dfBVXD7zjto/DmhNrMNl/MrpX6e7gu3fg6YyBV8oZdMg1vsxm5+V0Ta4+TEobRrWtZudZLLtLuYMsViEBd/vY1tYX4uRrsriFLFVcq4w3Xq1VOkvn50EUoAAFKEABCmSvgFMC71dffaXC7htvvIG///3viIy0LO26d+9efPTRR1i/fn323mU2nd3VA+/sA7MRFB6EofWHYlgD+6b2yiaytE+7azaw4QNAVls7lvQsyKhukD9w6TAwZBvga98MH3o6MUcEXjmHXoBCFqUwLk0sIVhWcZPtp+EtUbuMC8+G4dRO5cUoQAEKUIACriPglMDbsmVLfPfdd3jmmWdQu3ZtREREKIHr169Dvnb48OEcF3n06BH+9a9/Yf78+di37/Eysxk1jIHXQd2Wcu5dvcywsba3Uku7LqanE9MH6WWG7TqJYWddIiHTlp2PvaNegtt96lqy00kYllkcVu47h5p+XpjQpXZWL8fjKEABClCAAhRwoIBTAm+dOnWwa9cuFC1aNFng/f333/H6669j//79Drwl+0917949jBo1Ck8//TTWrl2LsLAwm07CwGsTU+Y7pQy83WcDDfoCS/pYRnz13zM/k3WPlMsFG1dPs+M01l1TjhhLuJ23/ZQKvxJupaxBXmKTbVPkpUxnhchKG3gMBShAAQpQgAJZE3BK4H3nnXfUohN/+9vfrIE3NjYWo0ePRrFixTB9+vSstd6BR/32229o0aIFGjZsiP/97382nZmB1yamzHdKiAWmVLTsV6w8MCppxN9Y22vn4hNyqvSWC868Qan3SCtAyxRmMpWZt6cHOs3apmp8dXmDnCGjld2y0gYeQwEKUIACFKBA1gScEngvXryowq6UMJw9exYy4iujuzVq1MDs2bNRqlSprLXewUfdv38fjRs3dpvAu/nMZowMGYlW5VthVutZDtZy8On0C2rGldV04JUpyjpMtfuCjgy8KQN0ykUodI2vsZFcqMLuLuMBFKAABShAgWwRcErg1S2X2lgJug8ePECVKlXwwgsvZMtNZfWkGQXerVu3pjqtjFwfP348q5fL9uPCYsIwMHggGpVuhPnt52f79UxdQF5cux5tWUbYs7jlVLrUIYuLT+j5ctWppnQy1Tw5WJ8v5fLD8jVd42u8iNkyCtMN5gkoQAEKUIACFFACTgm8spLaq6++moo8MTFRTVHWqZP5MOKI/swo8H7yySepLiEv4jHwOkI+nXO4WODNaPlhPVev3InU8kod7z961UevhuWyEYinpgAFKEABClDAFgGnBF4pE0jrRTAJmA0aNHCJWRoEy91KGnLVCG9aT6uu7fUsBgSeseV5TraPXiFNPnTECG9Gyw/H3bmHiItxqp53Y0SMWs7YzNy/dt8sD6AABShAAQpQIF2BbA28UVFR6sK9e/fG0qVLkzVCyhqkTODHH39ESEiIS3SRuwbe6j7VsaLLCpcwtrsRDlh8Iq0SBLvbYccBerSXSw/bgcZdKUABClCAAtkokK2BNzAwEKGhoYiJiUGBAgWS3UbevHlRrlw5jBs3Ts2O4AqbuwVeMa37XV1Fe2jAIVcgtr8Nn5UHEuOAkQeB4kkzOdh4luX7zuH95eFOnyJMzwGslyK2sbncjQIUoAAFKECBbBLI1sArbb579y569uyJNWvWZNMt5NxpXX1aMmPgDe0TCu8CuXAVMBOLT0gJwoq9Z1HOp7DTa2n1DBFSSiHlDrIgBTcKUIACFKAABXJGINsDr9yWjJzmz58fssDD1atX4efnlzN36+Cr5qbAKzM17L20F/PazUNj38YOlsjG05kIvNnYqkxPXXdiMG4m3IcsUDHzlyhsH9uaoTdTNe5AAQpQgAIUyB4BpwTeGzdu
YMKECdiwYQNkCd9jx46pu1m9ejWio6MxfPjw7Lm7bD5rbgq8XgW8cPPuTQytPxTDGgzLUGbKnimIuh6FMY3HoIZPjWxWzOT0JhefkLOv+n0V1pxYg67PdkX3Kt2dcj/GKdHkggH+lbjUsFPkeREKUIACFKBAagGnBN733nsPMgWZBNvu3bsjIiJCtUSC76BBg/Drr7/myr7JDYH31RWv4uKti1ZfWwJvwIYA1xkNdkDgnX1gNoLCg1Tg/bTFp1l+1s7Hn1eWfkX8ULZo2QzPw3l5s8zMAylAAQpQgAIOF3BK4G3UqBFkLl4fHx/r0sJyJzLy26xZM0RGRjr8xpxxwtwQeHV41R65LvAeXQcs7QtU7wj0WZKlbtWB1+wCHPacR+qHZXoyqd09d+22mqasXIlCqrRBb7LP+et3ULZEIZQvUShL98aDKEABClCAAhTIXMApgbdhw4ZYv349SpcunSzw7tmzB6NHj8b27dszb6kL7sHA64ROMbn4hLTQnqCa0R1JqcfiI4uztHKdXnr4p+EtUbuM5eVBvZAF5+t1wnPES1CAAhSgwBMt4JTAO2nSJFy4cAETJ07EK6+8ohahkGWG5e8dO3bEu+++mys7wV0Db/MlzVW9r0u84BZzEPi6JVC8AjAya1Or6VFusyO8Zs6jSxzkJba3WlRWz7uu8+V8vbny25+NpgAFKECBXCTglMArU5PNmDEDCxcuREJCguKReXkDAgIwcuRINYNDbtxyY+BtVb4VZrWelSG3nrvXJQKvtNTE4hNyuJmgaoQyc55vt5/C5HWR6PF8OUx7vX6ywOvo+XrnhZ5C3J376NGwHEslcuM/LGwzBShAAQo4XMApgVe3WqYlO3/+PGSVtfLly6dajMLhd5fNJ8wNgVf/Gl5T2DLK6a6BVwzMLMChXWwxTPnoRVyIQ6dZ25LV8eoR3pS1vWYfWz0H8JJBTdHsmZJmT8fjKUABClCAArlewKmBN9drpbiB3BB4df1qrg68ei7eBn8BYs8Ab/5k16NkfHHPEYG3TNEyCO4RbFcbZGcdRA9OeFW9zKb/Ll+TBSoctTHwOkqS56EABShAAXcRcErgPXv2LKZNm4YjR47g1q1bqez40lr2PU468Bb1KIr4e/E2vXDlciO8OvBqJjuXGe65tieOXbPM/ZzVwBsWE4aBwQOtHZWV8+gR3Tn9GqJdbd9kgVeHYLNPgsz80HLqFnUajvCa1eTxFKAABSjgLgJOCbx9+vRB5cqV0a5dO3h6eqaya9KkSa70zA0jvDJ37IX4C5D/jg8db1fgtWUKM6d03IHFwPUzgPz3xlngzXVApZY2X1oH+IwCr/jI4hzpLb/siMCbclYG4wivo8LpzpN/oM/cXQy8Nj8d3JECFKAABZ4EAacE3pYtW2Lbtm1u55kbAq9GP3rtKHqt7YXqPtWxosuKDPtCB0SXCby6tVlcZtgYeJd3WZ7m6nGZLU6RsjQkKyO8wRExGLxwH+QltZFtq1mDqSNHYxl43e6fGd4QBShAAQo4QMApgVdGdtesWYOCBQs6oMmuc4rcFHhFTQe/zMKaOwfe9GaeGL5lOELOhqQ7Au6IwBt35x7qTdpoHX3VI7HygaPm4tWzQTgyRLvOdxxbQgEKUIACFMiagFMC7/Lly7F582a8/fbb8PPzQ758+ZK11tfXN2utz+GjGHid3AH/HQKELwG6zwYa9LX54sYR3vQCb2ZTjulArC8a2ic03fKHjBpmfKEsOwKvLptg4LX58eCOFKAABSjwBAg4JfDKssKjRo2CzMeb1nb8+PFcSc3A6+RuC/kM+HUK8FIg0OoDmy4edzcO/kv8rfumF3gzm3Is5RLNWZ2jWL+4JmUNu09dQ9nihXA+9k6y+XlturF0dpq0NgLzQ0+rrzpq1NhMe1IdGxsNrBpmqcF+OdChp+bJKEABhPT60QAAIABJREFUClCAAukJOCXwtm7dGoMHD0b79u3TfGktt5Y6MPCm/Vit+n0V1pxY
g67PdkX3Kt0d992XhcCb8mWzyf6TU7VJXlhrv7K9tZ1plXzo1ef8ivjh4q2LWV6F7p3v92JT5CXU9PPCkYs3VT2vBF9HLT6hA7XLBl69VLQ0cOINxz0bPBMFKEABClAgAwGnBN4XX3wRv/32m9t1BANv2l2q610d/tKbAwJvWm1KcwYGWdI4IQ7wrQN4FrfWP8uiE3sv7c1y4DWWHIieLCssAbiWnzfWjzDMPJEQa5lz2LeeXd83Lh94dR8y8NrVr9yZAhSgAAXMCTgl8Pbu3RszZ85E6dKlzbXWxY7ObYG3x5oeiLoehfRmKtC8Zl9a07/+d3jgPboOWNoXqN4R6LPEpqchZZhNq00LIxfi87DPk4/wzqibtMjFOhz1fkrNcCGju2WLljUVeI0vlckFZToyXcubbPEJGQmVe23/mV31yi2mbsG563fUvbhkSYOuw5YGDtlmd6C3qdO5EwUoQAEKUCCFgFMC77p16zB9+nS89tprKFOmTKolhTt37pwrOya3BV4dRDOrP3XZwKt/HV7RHwhYb9MzY0vgTTkDw4YeG1D2i1qW87+5Dqvu/6HmMG5VvhVu3r1pKvAapw3LNPAuSPq+qNHJ8qKeZ/FM79k4t6+jAq+ts3tARqVjDgOe3ukHWeMiInbOp5zpzXMHClCAAhSgQDoCTgm83btnXMe5atWqXNlBDLxpd1tmMx5kubOzEHilnljCqt7SGuFN9UJaq1loPC/pmX1zHcad26Bqksc0HqMCb1B4ELI6em1cCU0H3o/XRqh63mSLTxh/9S872viinqnAKwt7HPgBqNjC+lKgsb5Z/SBQtGz63WdLfe7EYo+PZ+DN8rcCD6QABShAAfsEnBJ47WtS7tmbgdf1A68evdW1t/Lf+e3nJ2t4u5Xt1Gp0evnlefVHofGqUZZ93lyH5jvHqqArgW/176tNBV45pTGUShmDrrs1G3gjLsSh06zHC7zYPcKryw0MI+jGEfLMfjOAXbOBDUmzZ6T1QprM0DDDUJNsY4jXnSUzbsgPG2Mbj809/0iwpRSgAAUo4BICDLwmusHdA6/8Cn9W61l2C2XbCK+0RI8Q2viGvy2BV//KXu5XFp+YUXsQ2qz7P3XfR7v+A70O/VPV727suRGOeCGvw8zf1IiubBJ43/0xHCv3n8M/etVHr4blLN76V/9NhgC7vwZsKONIWS5hd+DV18xq4DXW5448CBSvaLkXGfk9HQo8emSZVk5vdgReCbsDgwfi2LVjmNFqBtpUaGP3c8kDKEABClDgyRVwSuA9efIkPvvsM0RFRSExMTGZtixCERoamit7ILcFXr14QmaBIbM5aTPrrNwUePWSy9VKVFMhSpUrlGyEYXv/o25zdv2OCIo7jL41+yLwhUCHBF7jTAoSePXMDcaAGje/HaIuH0DRBgNQY+ecLAVemQHim/6NMusu69dnz6iAoBLF0CrhHmYNPmq5/wOzlYlsmY7wplef+3ULIOZQ6nZImO8wNdP2GcOu7CzT3X3a4tNMj+MOFKAABShAAS3glMArszRIOJRa3rFjx+Lvf/87Tp06hQULFmDatGmoVSvpBaFc1i+5LfDaOjrpqMBbpmgZBPcIdmyvOniEV//KXkodGvs2tgRe7zoYFm55Ka5n9Xo4djfWOqpoq2FGN60Xh9Bz7+qZGwL8K2FCl9rq0LAvn8HAkkXQyKsy5h/8FfCtCwzZnqGlDs42ze07pQKQcEOVbKhFIGKjMfyH1ggpUhiN7iRg/hDLYjDGwJtp3bKxPrf3YqBG0kt3xs/lpPX7WFbMq+iPVS0HqfpoGV3vV6tfmvc3bvs4tY+eA9mrgBd29Nnh2OeKZ6MABShAAbcWcErgrVu3LsLCwtSiEzIjg8zaINv+/fvx+eefY+nSpbkS2R0Dr/ElpbTqXW3pKF0TK/umtYiDLedId5/PygOJcUBgtE2zFuiwJC+cydRjKcNSmoG3cFUMi9ismlC3cgX1X30f+iU4W8s90poZwxhM
lw1uBl2KYFx8YuGX5fB5yRKW9h6LsHBkUsaRMvB61bSsZJZqGWSZTWFKUrmBDrxHk2qV8+aF18OHiI+agIDWDXC10HcqbMqWYeCVeYu/NswjrMsVUtbtyonka1LaUNEfAX6l1awX1X2qY0WXFZb7NBwT9+FZ60p5UkMtnrLwh/4thYzQLzqyCGWKlMGwBsNMPVo8mAIUoAAF3FfAKYG3WbNm+Omnn+Dj44Nu3bphyZIlKFy4sFpq+LnnnkNERNL/oecyZ3cMvMaXlLIaePUIcbYEXv1rcxvf8LcGztKvYOClX5KFV/mLMfC+UesNjAwZiVaPCmHW6WNpBl7j/ilffkvr8dUrtBnLAWSmhl0n/0CDojdQteB17I/zwp9+OAff6vPRrGIV3D/WFDUTh6vyAmV46oxNgVePHKvFLI6dhle1Seq4VCUsxtkUkhzPbxqH9hcswVa22sf6w6tGK9x/+l8qkGYaePUcyfoEOvAaZ9aQUWqZWq1mJ0s49q2Ldk8VUS8MyiaB9nB0PoRs+A+m3LS8/Da7/VgEHVuiRoClnnzKnilYfGSxGo3X/Sf/TRaYc9m/I2wuBShAAQpkv4BTAu/o0aPx0ksvqbA7adIkSN3um2++iS1btuC7777D5s2W0bTctjHwpt1jLhl4L17GQL+nVYONo87GACsjhPJiVKNHBTH/tOVX+ilHeO0NvNpCRphT/co+6SWvxObvo+aO0ihS2fKC4LOnO+PVYoutgTc0JhbedzIf1da1wVIL/NXOYBSuOFedT9cfW3vLOOVZUjDdvLgTRt5PCtYyk8TZdnjg1Q2lq8+zBt4Ma2f1OUvXAS4dfrw4iEx1tmqYKmNonnBQzXahFj6Z1SSZr/xFln3+aFEhvHpvC77w+Fp9vXnV6rh5/461fljXXOt70TNrpOzX3PZvCdtLAQpQgALZK+CUwHvt2jV4eXnBw8MDly5dwqBBgxAZGYnixYvjiy++UGE4N24MvNkTeDMMlVkd4bUn8EoNa8xlhHl6qpBsHOm2J/DKy1b+S/wVUprlAPpeXgpE1bA4eJa2lPrUPvcSXiyyxhp45z16Go1P731cb5vON0t6gTfV6OeSPsCxpIU7kgLvlH83xGKPu9YzN79YG8Gx/aDLIuQL2iHuzj30nrsLcQn3ML5zLbSr7Zv+rBJJQTiu5Wj4n7OULKgAvv4zq6++aMOnWmDrb53xUZHVGPhgmfXr8kLhyq4rrW17dcWrqqxBziM/pIzYMsLUYiC2/tujS0+8PfPj4MR2th7G/ShAAQpQwAUEnBJ407rPmzdvomjRosiTJ48LMGStCbkt8G4+s9nyK/sMphtzdElDpm/2p0GfYajUU1/JymMN+qY6Wv/6XUKSdwFvVfMpn827eBnDfUsjPs+jZDWtellhCU/dq3RXSwhXv3sXK87HmA68Rss0A69evriiP567Wwz3PQ+r++l4rjoqFtnjsMAr50xWx6uvK19ICrwBX1fF3kKeaFTwaexNvIzn/iiD3y4PTxV4Z740V4XdyItxqq3WmSUMSzFDVogrXgEYeQhICtdhHSdj4JFv1DHqZcYz57HwUayqU9ZzJMvX7t1oCN+i4dh65iRmFy+mDFKOUMuLdFLSoMsadJlDejXGEtA3Rl5Ck2dKonyJQtZnRkaL4+/FQz8rmf0rsHzfOby/PFztlmzO5MwO5NcpQAEKUCDHBZwSeF944QWsXr0afn5+OX7DjmxAbgu8toxO2ht4pQRAjjGWCRhLGrISeI1z56aqk9W/Ok9nDld9bX1dXUMbGn0WIypUwd48iWhdobX6nwT/RZGLki0kYV1G99SZNAOv/pW6LTWjmQZePXtBRX/Uz3sVD3HHGngvlwxXAVS2oQXKY9ixUMvywmmEfP1MpzfCK1+31vEaX1iTL8iMCc/1Rd1fLS98yQwVMg1bretPY88fAShaZap1QQ655w4l/oHJ6yKt30Y9ni+Haa/XTz4/snEmjaRR7FXtx2H8sYXW42Y8
KomwuFNYXMxbrWAnLwNGXY+yfn35+YtYVbSI9evpzeAgB+gf5CQ4Sx22hFjpW/mBR7aOM7ch8vIlVPG7h1/+2kd9JqPv7Ve2VyUWmc4+kdQq/VKg/NXuOY4d+Q8Pz0UBClCAAnYLOCXwvvzyy5g1axbq1TOssmR3U13vAAZeIGXAlF5yVOCVc6Wa5cHOwGsMsAGVqqrAqzcJxRJKjUsFZxZ4jfeX2QwUGU7pZZjV4GjFxuiV95K1XQGXC+CQV9zjwKunSctkoQYdeGUBi8ANP6BwuccB0zpKanxhTa5Y0R9hz/XEwPDp8HuUD582GK7+XO1OXhy4/LaqAzaOwLZ+NAvHoyJRs0ELtVhGsZqTVFCffOUPdLubD7t7h6Pp95UtofLdwxi49BWM/SMWYa+MRVDE4xXuuj4qjAsJ19Q96n74Zvc2JOIq8nnGYMalK1jk7WX9uh7NTetfAWPpiMxqISFW3+97y8Pxn4g91vpoqROWkfyUS06rumKfGqlOL/vJYiRv1HwDS38roO5ZNuOMGmm1SVa92xR5CWVLFHq8mIjr/RPGFlGAAhR4YgScEnjl5bQ5c+Zg6tSpqFSpktvgPkmBV6Yrk7pJmQu1bNGylkBjqFE1zgSQU4HXOKWaDjauEnhTzXhhCJ4LK9XD53lird8XXa554GKhm9bA26pQWcyK3GktP0jvG8i4RPGA/0xGwac2W8OqdUQ65ctlFf3RzjMOF+7eUCPJ3Z4bhPa7x6Pk/Tw4e/6dVIE38GRt9M3zM674NEKzWy+hUJnlqjldb8bjr/cro1nMu1jm8TFqehzDwJov4Njtixh6/QYOVH8dOy8Hq9FUvZCFvg8pt8CDQqg3aSMKlNqk2i3HLCrmhZt586rZG/Qzl96967pe/fXqXhUwvdUKtJy6BV5P7wRKrrYeKs+GTGUmq7ZJOYOMLEuZhYRqGRk2ruKmp9iTF/Z+j+iM3aeuWc8ji4bo7wMpm5E+1qPKesaMWn7eWD/CMF2b2/zrxxuhAAUokLsEnBJ4hw0bhiNHjuDcuXPw9fVFoUKP6+iEa+PGjblLLam17h54jQtH6NFKY/1vWr+yT/kWvQ6e9nSwcWQ01Siqnv6qekegz5Jkp02rPcbAO66UD9Z4FbUek9YIb48fWiLqXiyWPyyN81cOY2Tpp1LVPFvPOSCN1cMMLdIr28lHqQLvrtnABsvUW7pdjx54Ik++BLwVexvhBfNaA2+jIuXR7fQBeJd5Do16LrGGqpSmOvAuf7Mm+q6bZQmOhoCp6nh/GmNZ9CFp8YfZFesgKG8c/O7dx8Zn++N4ma74067e6tQJlzqrF+lktFSmApNt2Yk8qJU3Gg+LlUddLx/k9biuPpe5e2efKYbud8bhX4Un4tPyt1VYVfd+Jz92PyyPfEVOqdFcKUHQ5/PLVxgb39iN4IgYDF64D00rhyPCcwkaJD7EgYKW4w+9NNuyOIbM+BB7FmjQ5/GyxUkIer7loo/yqDpt2fr6foevQy6iQd3dOHH/v3iQ4KtGj/UmMzwE9wxGzzU91Q9zsqUsVTH+8Iboybh52wM1/bzU0tBz+jVUL+z1XNtThWc5Vu5PQm+LqVtw7rqlREUHY3u+B7gvBShAAQo4VsApgVdGeDPaWrdu7di7ctLZclvg1WE0oxXQjKFRhY2kUJdWXa1xX/0r5JTH21ofaewyY+BNtWiCcV7XgKSZBpIOzizwym6ykIG8yCajemkF3oClbdRLWzIzQljscfXSVMp7SBl45cU3qQlOOQqpX5hToa90IySrRzZMDdaufBlcyJ8f92/WQn6vSDW6uaVIIRwrUMASwoqWxc3YaLVPRj9A1JsYDO/Ei9hecASmFy+FeSUKq7aLi4xAqlH4zdOA6FDLjA8LOqN5xXIqmMpLfY2bjMTOioMwYUsLXPQwtKf+UBVSxUxqa2vcvYfNhQupHwYe3i2OGl55EZV4DV0ulsUPsX/DC2U/wxHvG6j2MC+i8j5EoYd5cPtRQRXm
9Wjt8B/bI+TOeXQtUhmf9lwDPSL6XsPjmHP7W+vjoFZ9azoRqNHp8WIZaZR2yD3K/8rsCMIW3FYrxhW68RdcvlAPDZssQVRcOG6f64d2dYphW+xXlkBcoR0CK3cHfOuovpZ6dNn0M2f8DYZ8nnilDe5ebavqd2f9tgfdG5bA9O6dkpXwyHMwqPpkdJq1zXoPWX3BTfpMvmelnCKtTdqnR5Sd9E8eL0MBClAg1wo4JfDmWp1MGp7bAq/cTmajk1kNvDrQOTrwpnrpLYPAq2dckPuUkWiZskpmXaj2KD9Wnj5p6c2JNx7P3JBGDW/AIn/sfRCHeUXqIOxcqE2BV0yldlRerJLgr0OIHvlLM/AmzV4QlzcP/CuWV02TQKV/na8XnUj5CKb7A0TMQVSacRbT8gehR/5t1hkOZH/ZpIxA/VAS8VuywGudZ1gWt+i9GDsLNMPcjU3V6LLHg7K4l+88Xn/mrzh5a5d1tovGCQnWUWkZBR5V9bR60a3KjZI4HjsSBctPxN28D7Dh7Hn0LOOH+HyWkVrZ9A9QR7Z/j7JbhuOilz+qj/oJenR6c5vz6H7hn9b9W926jVk13wZKVLDM5ytbkyFAh6lpf3dOLIaF3l5q9odSNyri7u3RqNZgoWr77ehB6P9ca9SveRzjQ8djw+U4lL0Va53uTf+AIj8YHDxWHoeu7UdYwqfWl/ZkBP5pvIqyT1/DgT+2o2jCy5jVrb8KylLqI7XD8sJcx1IfY9k2yw8rssnUbW+1sNQ1Szvke+RRXAscOJWIL3rVx6OQz1BszzTcrP82vF6bpvYz/qYkrT7Xtefy/cGNAhSgAAUyF3Ba4D106BCioqKQmPj4pSHdvL/85S+ZtzSb91i5ciVmzJihVn9r27atdYGMjC7LwBtmHRXTo8Y5GXiNI8MSwNNaSCKzwDt8wQsIyXMHk5/piQv7vk0z8KZcLtj4a29jyYfx81SzOiTNXqDn+q1boh52H2hqqZm9k2AtZ0j5/KUZeJPKIyIfVlTlBrLpMgmZAUFexpJQptpw9bY18B794TX0Kuv3+AeCN9dh58Na+O/al/Bz8cfTBb5Xbwa2XlpkDbyNHhaAv5+3GhmO/30sFtbagqF3w+DxMD8qFm2B329vVfcgcxn/zdcXWwtZwp+UFEQO3qT+/O26rXhrbzfcylsERcZEYtOXA+GVcAF1SwJ/LfjH4xf2rt/AMBSzTHN2eruFo6I/kGJ0X32e9CLg+fz50L58WXg+zIvOpX7AimuWEo2bR6ZYXzaT57Tx169Yztf+M6DpMOjnp2fV3pi/pgHyFT5prWH+/foZxN69nKw7Ht4rgVGN38LMA19AanxlhF9+sCh4vxquHh+oriU1vwH+lTChS20Ya8xlNP/Ouf4oV6IQZtz+EI3yHEFkgbqo9aHlHvVUa/qCukZeXoa7cuciPtwToAK28QdCOf+IkBEY23isdco2a4PlB8WtUyyO3YMy/ddUXtaT0WqOIGdKxR0oQIFcIuCUwPuvf/0L33zzDerXr4/w8HDUqVMH0dHRiI+Ph4Td999/P0e5Tp06hf79+2Pp0qV4+umn8e6776JBgwYYONDyK870ticx8MpI5o4+OxSJcURV/i6jd9keeOVCxmmvDJ1jDAkZBV5d7ynlAbKsrXGWhtnz/VVN69CKHYEDSzINvHJ5PcKn60D1KGay+s+UM04k3cPsslURVCARfcu+jG92VEsVeB/cqqxqX/WWatU0CXkLOqsXCGXzfvgIKOiNgBKeyWY40G0JfVgG3tG71Khm2LqhGOiVB43uAfPPnQGGbMP0Q56otrMvdvhctdY7S+DdHvMzdl0OVrMx1KjUBr1uHUDxBx44GzUZ25/+B0YUvYiogo9HNVWJRP03Mc7DG2vOL1Vtk3v5tv18NHumJGSKr+a/9UOTfEctI7a7LSuryWastR52GRh6K2kFuILeQGLc4zl+U35jJo3+33xUGD3KF1dlGXPuP4XB+a+gcP4iuHRovAqY
28e2BoyzVSSVSOhnt1TByjh1YDA8fLZba5gL3vgTvtqzCvWqnkXbajUx58ACPMxzB88UrYeT8QfV1GrdqnTDKz++ijsPbiHPmf4IrnYOb4VXg1fF57FsQE0sXNQWn3skWFutSyQOFnwb3nluq2nYwhq9h+41WmJq2FRVFyw/QMksEfJ9J+UgUzccxepL45GngGU5ZnnRTo/y6udafy7Pt7XMZkGnxz8wjDyYqgbaSLn80B58vP8t9HhmICa2HJX5v80HfgAa5PygReYN5R4UoMCTLOCUwNu8eXPMmzcPNWrUQJcuXbB27Vo8fPgQU6ZMQYkSJTB0qOXXrjm1zZ07F3FxcXjvvfdUE+QFuw8++ACrVq3KsElPYuAVkJR1vRpJ/g9ZAqSuhZTPM1rkIj1c48teadasphN4jTWzEhAkhMivrmUKrE9PH7Vc7s11mB17yBpy5VfHEij0CFpWA68EbL2QgdSASlhRyxSXbmRdmjfZC3hJ9zC8UnXLiHL1ftj5R1Wsv/pRshHelIE3VS3w1y2wOe53fO5XDt6XnsF7N6LRpO94BGwdlSzwWn9df7cw2pw/qsJt2IZRGJjnsnU0Vka/JYjW+W0I2ubbBxkpDSz1NDrWW4jjl+dh+Y3t6B97H0Xqd0VQ9Hp0vFUAy858jM0+U/FUwkEM8O2E3wseRJ57XuiQOAxT/98bmL0rGEHHLN9XideaY2zjQPXrfanZvbnre+sSwsZnQZckyGcSHA+WWmVZrjjpRTu178QbwM9jgWbDrOFt41cj8OrVBVj0qAPCfQ9iQ+F8kJIIqedtVLIOQrZbamHVS2T65Uf5wDBirH8wkNFg64wR9YfiRJS/mpJMlye89uPf8PudrdZm65FW/6/HIq7Q49ry5xLuIfqRL+bkTcSIfLGqDrtKoZfVsR55CqN5fCC+uvymOo+u5S5X5Fmcu3VClVLs/MtOawmOjCJvOhqNOx7hKHyvIB56AAlIxHv1ZqLOyfV4Mz5YnadQviIqdNf0aoX2pUdhYNlz6ocivZ1v8wFCSpWz1gfr6djOXr+Djv8MxsMyX6qXEfM+KoT1f/oZZb1LWo+VZ1xmuJDnUKZ3w6qhgATepFFy2VEW+vAu5KGOMf5Zf6/pMpt0/3GVuaI9i9v8fwdyXvl+z2wmD5tPyB0pQAG3FHBK4K1bty727duHAgUKoHPnzli3zrKE6o0bN9CuXTvs2rUrR3E//PBDNGrUCH/6058s/8ecmKj+LmUYGW0MvLOTTTGlR5qMgTdVQLOhp43BNc1f4dsQeOUyeoaCoQ+9MSzasopZysCrX+jSgcUYeI9GrlBhyTjlmpzCWNIg/2f7edjnqj5WQq5a1S2prjLdwGuoQ26X9wIu4AGWN/wQN0tVsYwW37uvRifzP/TBnbhK8Ci+36rWqFhVzO/+H0uY+D0Y4zcNw5bChS3P7bXmuHupK06PLI+AVT1U4J3q/zU6VvG3/rpe9usWH48x74Rjyw9dMD7PNRUKZ12+qkKkBN48Wz/DSA/LNWS7UqQalnnGYo63J5rdroJrfg9w7MYpfHE1HoOvfAU9Qjm7yWY0aeCJ178JQZGH1dTyu3K+eRd6WNp3pQ26VAhQC1VIze6R02cRWmAEvPLcVl+fXHQcxsd/ijBvHwwsaZlNQ4Jn6MiGKDu/MfBaEPBzIHDjrArs+Lol4FlM9eny8z64+d/RGJg/GJefH4VDTxXCyKhF1nvo6vEU9l+ZqGZXUC+RRc8Ffp1iaVfZZij4zgb15/YruuH8rZOq3tej2F5lLz84rdtWRZUn6BfQZu9diqCIT63n/+ZBB2zxbI11kafx6Jl/Jqtblp1kFgspAZFZKeoWmIufr79nmW+49iC0Wfd/6ocLKcMwbrXzVEDnin/Hv/YdRHzJz61fyvfAA0tjzqh+l1rvwonFUPuRLIVdUE0PNyz2hvVcr5+uj2FFo5Bw53eElKuNVfcuW1+G1CcMuPkA/69SJ0yIboBVDzbBo9g+67Vq
PWiJZa3/gtWXdmH16Q0IS3g8X3SzOG9MuHEEFz081LVDS7VH+ImO8MMVJFb8N4o9LIzEy9VR4J4XPEr9geOFwtR5SzwqhQHFh+FCoTK4ezcala7vVJ/fKlAKVa/+gtv3zuNgqefgWeH/t3ce4FFUbRt+Ewi9CUiTAEovARRCMRQFpCrwCaiISrEBFkD9EEQURWkfCjZAVMACqBRpIqCgIggK/EiRLr1KDy0kgfzXcyZnMrvZbGZ3UtjNc67LC9idc2bmPpN477vveU9TqZ+zrMTEnJLFZ3+T6nnLy+0VnpF1B2fIxtOr5FT8WdkXY0S680iYtEgoJ8Uv55O4awlSKHeYlCpkVAO6mBArhy7tkdBrV+VqvtKSkC2XxOa+Wa4mZJeTMSFSIOaY3BEWJjdly+nCH7U+kNhzOTZeLl29pt47nu26HMrhWmXIpZOI5Iy/KHkTLkqeHNnc35Jr1xPMsZK96ecL4ZJf8iQYHzBSapdD4uSQXPDzDOx2oxB4u0fS7+Ub5ZoC6ToyRHjbt28vI0eOlOrVq0v37t1l4MCB6u8nT55U+bJ//fVXpjJ74YUXpEWLFtK2bVvzOiCzyDnWWx8/8MADya5x48aNsnv37ky9dl9P7p5/6t4/tUVrOD6lCC/EALm82L5YtwwR3nNG3mqr5U+pCLNu+utg7OrVfP9G42W3CK+78M6b1lRJYPvi9eXo/l89bnxgZWjduAI5lAv+WaAECQ0ijKgcXrNy09HF6LINJCrUuN4t9UdK9G1NJGpmlHn9N2evKgcP1pAKt22XvFd2yK6QeLEK7/LFfaX/yaRqAIgGXz74tIpg6q2CkY7Q/fbm8snaDfL5to/lfDZDLnCNF/6YZKRvIE+R3NNsAAAgAElEQVQ2oYDIgK1mtQREMq9eOC0d1jwot4SeMhfBFZcWckJ+Mq5530GpefUT2ZzzSfXvWff9rTZZKDfoe/VvXAeE95O9zyi5g0RWLlhb1aXVi9T0Irs58Y3l2/Ah6qv/6NBQtQPe7PWHZO+eRoZk5jkiUqKmSGLus0saRK6CMr/Ec1Lin9lGikSPRRJdqpYLS9zjthzvyaKDOVQ5sWYb+0nYHkNyVUPE+K8Z8uCeTbLt0mKplquTbDn9f2YptfWTJsvt17dIeN95ckuJEi41qMtcTZDvjx6SN+MflcMJN8vksHdFiteQ6Md/kLkTH5Ld2bab6SHdEvLJ2UvPSI2rg9TCuva5bpG3t68xq14kXZCo9JFmFxJkyrW28kWxEIkr+IckxBSTr87ultox5wULHluEl5EroUYZNrTPjoVKvZj9MqpwIbVLXan4eCXbuuIHjsl37brEXK4oFRNOqWoa6nk4fVYdN/TmIpIvQaTvmbPq+vAa/kNkWvdtdvmyS3k/6zU3vSCSLdtl80OY9T3d37qI0f19f/6tPyD605d9SCCQCKS22VEg3UtmXGuGCO+CBQvUtsKRkZGyePFitSCsadOmKp+3atWqarFYZrYhQ4ao/GIttcgtbtCggWzdmhgVFJEzZ5IKzutrrV+/fpYUXh0N1Tmz+OoVq9OtFQH0V/npLry6vFflthIRa8yX3kxAzxNKjEXuX2/8MxXhXTetufE1f8GKIse3eBRefd+QRggvUiLc84FxKp0bjAVAyO/Fbl44FouvKu9aLkertJbRZzYYKQWVe4ncPdilxFWJHNVk96bHjG1sD/eXiNBD6hbMDxyz7peJl3dLt0IRMv3cFkEVgYu7hinRbPFZDTmRPUTGVh8nreq2UOL53vLdMrBEHzMvWXb/pPor4S1YQy0Es25esfaf0/Ljzz/Kd7lHyIZcYfJ0sfzmj2ndhJwydf9ueSruBSV4yJvd2n2rys+1jjH+x11mZLTrZOObHFyf2u73mJF3fE+14mpXMvz5yWN1zXNYx8G4qiGNwZLvi3xl5PWeCC0mp+NzG4v2EP0tUVM6LehkbleMndvK5qonLY4+rXg+sr2P3Hwm8ZkQkZXtlkuT75vLd4XL
y2sF4wTsj5y9Yghv5KsS+e1TxvmRVvEfI9+43udt5IocVlHVt0+dkX9uaiLnC1WVO/Z9bG4S8tmqfbL9h0mytvxZ9WEDlSvWxtwpDXL9rqKw+a8nyO8HDsmo8rVl+vUzgm8jdpStI3nOnJbndvwgt8QbkcUVUkX6lSghLx+NlUdCfxdBHeqdi5X0zs+XT74sWEAiK7STt5v+TyTmnEQf/kNa/T5YLibEqf4h13JI7IUa8njcEfnvRSPSioa8YUiutUG0O168JD1LFDMXDxaOC5Hqp8vKnZfipY4clh/y55CpxWIlx/UwuXKltLS/tk9+zXvdjGxjwWDrK6XkYO5/JTQkROKvJ8gzUlAqXMsuvUL/lX3ZrynxrhIbK1ez55f4kBySPSFWroQVkrDQfFIz+phcSTgly/PmlesSIk0v5ZANuWPlZHaRYvHXpdOFvFI0LkRuuxgi/+atJAfyh8jqXIckNvS6ZM8WIldir0nctevqtnIlZJP810vI5ZC8clPscckdGieFrp2VvAmXJE/CJTmTq6yszZb0oUGzuBISJ7kTwiR7aIhkCzUWcd58PZuUi73kwsv9H1ez5ZdLofnk8tX4ZMdlyxbqMfLrdcBU3jwUEi1XJPm5rN1yS3YJx4datoAmMLKX9zTLgL65DLj4DBFe9/tYvny5SmMoXbq0PPTQQ5Izp+tXSRlw3y6nQH7xiRMnVN4u2ubNmwUSjFxjby0QUxqsC7ZUDp5bsxPh1cKrI516YwJEVJEPCNHT0dVk1QlsTK41pcFjDvC4GsbX2lh8s3GG+dW0LrFlzZvF6cyFWviHu/AeXSPrT/5lSE21B2Xdu7dJryJ5pe7NtUUOrvUovHo1v7XGrXsaA3IKtQjP3zNfpTpAkBH1dW/dzkfLoIje3oU3+xyJODhDddXC23NaXbVV8pRaA+TZvyfL5fhLku/ESFkz8F5TnL+pNEqqNWxnSqiL8O77TZUT8ya8kOTJETvlWMxWGZs9SZTG56kizf9eJj9eq6Pyff+4VkWqvrJK5W5qUUUkdcqqfabwvrnwbzOlQMuvlYUS+3sqmS+9+O0ml7xZ9YalfrH6N+oJf9fHeB50Q7RWxCWNQy2ii4mRtderyrC8r8iSK4+qY/aGlpPbru+XH0LvkjbXf1ECqcvE6eE+DOksTfe+mzT+XYNFzh2QLy6HyXv/JsikmG/V2KoCQsHwpLJv5RqrHNao0SvkQky8vFr+bXniyN/mOC1Ll1KpK5DxibfWVLvSTTl9SSJf2CsyqZH6wIVIdtyGryQs/qI8FPuqfJ3jLaM/nn2wwCYiaLW6SnS7MS5VFbCodPO/2+SHP4tKyOUa0ivqVuldfLvknvtY0r10nCAT5LyZmmTUJu6gNvrA74IpWz6XbBebyj8Hi0uvRrepfvgQ0+C2Iio1BW3N3tPSsFQ22bHnB+myYYRxDGo+l2nu/qirf5v1g7GRCJhhUxGbTVXXKBFp82geRgIkQAJJBDJFeG+0CcAOcKgWMXPmTLNKQ6VKleTZZ5/1eqmBKLxWWUPZLvfmj/DqXFmIJv5npCObegtZX7+G8bphAy5Yf60N2dm3SgnvjhxhRomtmyqp/+lDMHXbcuy8SIwhQVhcMyFXUvR13f6fZP353Wqjicgey2XdqOLSq2QxtShHRXgThTKydlLFDmzCgLQNHHP00lGVRoEFeygThbq/SOvQqRVYwNZvRT91PZpTyYRsUuXyBblQqpZEXz4pjx79RzrWeS6Z8JbNEyFbN3QzIp/hPyYTXrPyQtuvpeeqkWpzhXKxL8nCJ7ubwruwaHcp1+4lU0JnFeqVdH/nDhqbbCRuOoEIszWqClwQU5TXqhgeLfOOv6o2j1ALCS9eSqqLi/qyCVWl7htGBFdHkyGwU1fvk+iYePnt5Wby0rd/JYv2Wp8/d+G1jmOKsLW6AqK7gw/J/u/HSrl1w42hIJwDjEi/tZat2q1t5sMqGrzsWh1pmW2DikrnKF1Lch5Zo1IRSoec
lG1hEdK/+GklorrNPBkuNS6ulkM5bpPw2MR6zsg5LVFfTka+KOUWJk93kkEHzIVX+j7GFl0snS8aecUnQ4vJZ8VzyvRcIYIUAZ2HjTQRvSmIil4PPiTyXW8lttEJeVQ1BzPKbF14lxjV9vQLC6XMwm/KbS4k83QMFori+XXZHMWPX9T4NgPcB9Ub5EdvdiEBEiCB9COQrsIbExOjcnex09q1a9fUgjWUIAsL855gn363m/LIWEg3evRowTUj3WLEiBFqkZ23FojCq0uJJStvlXijWub0feudsawS6h7hhQChGgJEr0P5DhkrvIhyHVgtup6tVbpxDyqlYq0l0b/pIJlwU0HzGt2F98hbhdVXzbiX/BdPyk6JUwvKqtToaj4Kup4qoriQBDRPpcgg33PazzEXuZmR8GvZ5f2De42v3lFbFlsMJ35V3nJabTkWYnyN3azU/TJ/eT2jdmylX6TT7mmq9BdSI9Ag12pL4Ce2S78fX5cVR+dK8fgH5KfHh5rCu/7kdcmZ7yaZeKyy/BpXRV4sMDZJeKOPyvorR5XwVqr5jBRs+5q5Je73zzeW6Jg4U3gR0UO095Em2eWttq3U1+YyqqzJZHGeDtJ24Bfq3/gaf/iibdLpjtIqQouGNAa9m5r1devPl96qV7/mUXgTa+2qYxK3l/7yl03y6C9NjG5uNXrxAQ5NRQVjzknc2OoqWqrmLFt1iWh0n/kNAV5rLR/Kv4UXSVzBbeqYonEJsvDIGcmXcEm+qv2VPHJslLFQDrvVlYgQuWuQyNfdXH9NWKRbvxE1aoWER28wI7Qrb+okxaoUlS7HfzD76jQRNT4+oOn0CeSoj6+ZdA580NNRUXwAQPMhSup6sca/zLJ2Ofi1tyc+fI0ESCDwCaSr8L7zzjsqZxd1bbH464MPPpB27drJM888E/jkRCQQhVdHcFPKrbVu3oBJcpdbvKYrJ1gXb+nKDGZlhFp9zK9J0zXCmyi8eqtbd+FVKRErP0963poOknnhVY1yZeXby9FT240IL6Kcg06oGr86NUJ3UtFBN6FoOKOhyltG0+Wj8Hdr3qj+UKE56VQLlUJw7ryxUMpt5zi92Axj3Vumu8xcWtUQ3lbx0nNJLzPFYseexTLmn9lGybUef8jw3z6Wb/d+KDfFN5OVj7+XtKMeIoYiKuVg3LXOMizfaBUJR6pJ6JVY2X5ln7r362Wfk/pdX3VZcIavqnWEV41hqVKgBk3cLQ5/XVa0h7R89j31MspbNR69Qgrkyq6iu/lzZZctiRUbIM16Qwb3XwLuW/Dq86N27synGqoopWq6SkdiKSykPty5eYjaYc6aY+vxl4wlJWLbze2kWp1GxgcOrMbPXVKqnX1HshfcILlLGR8q9AYaR64Xla0PrpZW1Uu4XgNq+CZWezDPZ8nz1a/N2nBY/jtrk+zPZdSrVfLcurHsOLFReq5+RT1L3YrWkUGblxu1htGsUVvNOqVNNzzeLF8kARIgARLQBNJVeJs3by7Dhg2Txo2NHK1t27bJE088Ib//bmxcEOgtkIU3pdxaX4S31ZxW5tf5ree0VtOZFsKrx8V41o0uzOfFmtKQKLwTChU0FmNhw4hCZU3ZVnI+zxAa1ZoOknVVW5g1cuXyaVl/wZA+X4Q3pbQL6+s6j9F91ywlvFcSjK+rvQjvw5WekI/nVzA2S3gwzEV45/35riw4u1UGZispjz6yTD7fuFzGbu4vea5VkrkPvC+YDxX9PWxUgcDX4VhghhxQLfQ3h90kJ+POqoVUM4uNkZeeesI34UUOZuJ2v2tKPykNnxhrYtapEXhBCfvTDVWupzV3FyKM3FbdkPZgSq2I4Kv4hyavUcdAnv/XpZYhnFi4dnyL7L7zf1K8TCV5aPJaiT7+j4zr3kQiqxh5pim2mHOS8G6EhGCjDsjqrY3MGrW/57tHHj7VU0oWuSwXi72phtAl26bEt5L6fT+V6qUSI6ATo4zawBBQRHsTF5GpTpaatNbrQJT3yLkr
6iVrNBvRVTw3SDFSea/guu83c3Gc6oAo77mDRs5roaTIuveb5bskQAIkQAKaQLoKL0qPrVy5UooUMVYBJyQkSLVq1VRN3jyJtUMDeSoCUXjBW+d+eoq8+iK81nH0363Ci/SIXWd3qa/gdXF7O/PtdYcyDOBJeIsUk4kFckmf4lESWftxc/MLlW/6Vfek07oJ786Tm+TC9ThZfeCQFHjtnO0Ir/s2xjr3ccL6d2Xi31PV+VIq36bkungdY3tcL8ILlmO+NuQGtXWHzO6gykE9VrmfzN06US6GxcqsEm2kSqsxsnzXfum/5j4JldzyaauP1P1XuyLyzfHEXcpEzEVPySLY+w7K4/K6fDbsBRfh1ZFaCPcthXInj/Ba0hrW3fWFRN7VweSsI5p4ISXhdU9tUBtCuDVcAxa7uVdxgAx3nWyUWEMUWTHy0N/j84bcV6QNQB6ROoB6viIyVPrKlzGN1PWeLfK6qqzR6+xlGXDulFGN4u3Xk4azRLfViw9NT0ptSCGf1srEPZrNlAI7vxl4DAmQAAn4TyBdhRdCiBq7efPmNa8QpclQ/aBEicSvBv2/9kzvmVWFV39VbxXelrNbKkHQJcEgmro6gU6LsDth7sIL8cOiL5xXRcD019KI0EEYD6yWIeWqyIKQyzK8SAO5pe5TpvCqc09qkXTqWl1lXUNDiF12QcNX/1hoNKqstAy/RY5lTyoa7ymlwZrrbN0cY/m87jL61J9yS/a8MrWHsXAu2YcICG/l/xgRPJ2fCfl6aLq5YQT6uQjvqHYyYXwZFcWukLO17Llq1JDdUmuQSO1uKnr65C+t1aIyRJaxqA51eZ+sOFYGrDZWtb9ZZIy8dnqgtClVVg7nTCrDhIVSja6+J/06t1Bfu0NuVw9qpvromro6DcFd1BCN3H/kmBSJ6i75C93sMsURw5aq6Kx1MZoeDwfidaQ46JaSsKLSQc03lqnDNr/e0vDLyWvNsmb4d9WS+eWHfol5vHYftMTjoifcIwX+/VPVFI5OyKsWCeYu/YVs+nebyIFqcu3cHZJQsIzJRHVzrxYBydULI73k0yIvGZUssNscGwmQAAmQQMYRSHfhxS5roaGh5h2htm3lypVdFq7Nnj074+44Dc8UqMKr80w9iaidCK/O/7UKr/WrfCDG2Fggp0tzeSqBltJUuAuvXhyGhWSIFhdY/ZGRNwnh/WOiEo2elWrL+rgzMqV4C8lf72m1oAsNVRIKjAhPOlXZKFnX5g3PwovFQNPulZ7lKqrqDLphUwipkrQ1K17XC9fwd5fd4EYlRg0tuZbuTFU0ufHLqiqDajonFSXT5nVTUqsZ9p92QX0Njq/7F35e2Xjvym0iufeq/NIxjSbJzTWaK+F9fElPVTdWR9ndhffX215Q5bXuLV5RDuRJur81e09Jjaufmrm1OiKLa9CCqvNxIZx629jUfpQgd6i126vRrao+r3U8/B3Cu3bvaRU5tp7T07hPfrFeRXmR1vDH3tMye8NhJblju9RWpc8K5M4ur99XPbVL8vi+Vaj1daEqhPX1ZNdnrZCg5jCxCohfV8BOJEACJEAC6U0gXYV3zpw5tq6/Uydj69FAa4EqvN52W0tL4YUUYnGYr7V4tfC619M15fLs+SThTVww1LNKXaPEVvEWEtl6nIqqoqnSa1oo8YI34e04QeWkWoW30tVYmVOxR5KcWh5SvXDN/OBglSCL8OrKGLqrKj2Fc9VOXN3vRXjHzo83Uwn2LG0qY/IYBfAVi7Pn5fb2G5VMLv37uDy37HXJWfh3swayFt4uW56S0tEb5XCB29Wf9xSvLsfzGNUlcMyD4SNl6ur95riehFe/aTttIIUfZmtur7VOb2rCq9MBrAvhvn6qYVJOrcNfHtbIM3aZ0xFY/bp7yTSxVovQ5cMcXgO7kwAJkAAJpB+BdBXe9LvsG2PkQBVe605hj1YzCvDrZld4IZI6LQD5q54ivCgFlUwKbUydJ+HV8oto75JiraTAb+8aEV4t
vDWiZP2lQzIlbw2J7JxYjF+fy67wJo5nFV7rLmjul46ao6i326FCB7kl3y1mvVR1nEV4rbWNUYN32f59Rq1V/dW3vj5sAvDLYJcIr1V4s698QO0Cp9vIE9FSpPP/KeFFNHXCxqmSq/gitdAP5dLizt0hT1cbIu02PC6VYjbLmaKRUvjUOnmsQAPZWMRYzAbhHdd0soz/aZfaFAIto4QX6RGz1x9Wpct6RpXzGqF1j8Ii0ottjNOqWUXcmrahF5p5PJ+eN1ZOSKtp4DgkQAIkkG4EKLwO0Aaq8HrbfMKO8CJi+3Lkyy7C616JQEc99XgoAfZ2o7dt0fYkvMhL/WrbVyqXV0Vx105xFd7azV02kHA5kY/Cq/OBMYY34XU5h1tdWlWjtfcqdYhVeOtejZepR48a+cK5ChlD6J3jKreVCSdWuwjv1ytzKCGEcJXb0t1FeN/fm1v+avSV2p0MwvvhmqWSp+xk87KunmwufWr1lVZ/9pJqcVskulg9lavqLrxrexrfxDw4eY2SXmuE0yqCOMZphFfX1sVYEEs0bGHcoHwRM+0hpYdE1/FNTY5tPWRuB+mUCX1dOgUDkeXDZy5L57rhLhUk3OdNurp9yPLnItiHBEiABEgg3QhQeB2gDVThRWRS16F1l1A7wgtkEFprhDelftZcV7v1eLXw6u2JcT7k4o7+c7Qs+GeBDC9xt3Rc87nadlX+mKRmsGdKwqu/esbXzqhvWqiMHHliiSrbZd0RTaUZJEZ4J5StIRNDjVqotoVXL2LSZarQOTGv0yq8usyVS86nrjpRNkqWn/w/6V/cWPyFfOUfNoSqhV34Sr3N+eHS5dJf6j2UHGu7r4Ek3DXYFN73ftks+Su9kUx4667pK42vr5PYotUkx6lt0jd3E/mthJHCcPXMnbJrwMfq74iiollzdK3Ca13M5u+Pjd6UAv2dyrO/1+Cpn1XEbV+Xnjc8NzofOy0vimORAAmQAAmkGQEKrwOUgSq83jafSGvhBV5dwcFOeTKUZ4qaGaU2c0C6BbYn1ovkzMh02bbS95dJxjay5w+pP3veWslzhNda9gv1UhNFNFnpMyfCi+gudsLCKv3EhW/6PPjTKryqBm/OcJE+ideCA7Q4FSoj62L+VTuhoeEDghYxRDV7xX0tbS/MU+9BnHMc6Sz5GzymUgH0cYWrvSlxCZcNmU2M8Ib8MlL6hyXtNvdc2H3yS+lN6picF1rL+mf/l+JPgad6ug5+ZFxq8doWSycntNnXWjLM9nXh2cKcI5rP2rg2SfMwEiABEsgcAhReB9wDVXix1z2qGCDXE/VxUUpsUL1BioS78OrNE9xzdPV2wjpVISVRxpjeFsm547fKOPKEEY1GLjCqPOhSYHULVpSpfy1P6lo2SiJCDxmSeOy8yKCk2rMudW5TE17skLVppkyoHCUTY43xbEV4rdFd1NbVKRSJEV6Xig4Q3oI1jBq8ullKXOktkrXwWnc7G3vzYlN4B54+K9+f6icJZRupTR208Fas9YUcjzW2xYXwdqv8pBRcO9ZVeIv3k1/yfKeOuTnuXlnxxMgME17Uz233vrEdrm2xdPAzarcrrgv1fkvflEfeeaCW3W48jgRIgARIIEAIUHgdTFSgCi9u2RrhtG4z7C6uuuSW3v0MkVdsgwrRRXqBft8axcT4qhxYDmNXqiGrhhipCNgEokJHr8S9RZ+1qJfKVViWbje+2lfNKryI1FpLRNmI8OrtY/Vw8+o+IENPr1X/VBFZ7N6GmrkpNV2KTC9EcxNeK2+16UStHiJtRieNhgjxuAiVcuEuvNbNH14ut0deiflA9cM4Xc5NMReYaeGtX3elbLtkyPSVo10k/nwd6Z9ttovwji83Wj4L+chAF9JVFj32Soq3hm17kUOMllolBQc/SuxKAiRAAiRAAulKgMLrAG8gCy8ip2juZcNSEl73hWS6akJKwmvN1/W2SM4dvzfhtYqjyrnVLY2Fd12DXtLrxE9JwusekXW/aHfB
HRlu5AtbFqZpfkp46/dPnvOZGOXdkSNMutxSUp1BM9SlscZGRkvnLb3Ve/H5S0uFk2OMbYdfbmZGeFs02Cl/nDd2ert84Cm5dvk26ZxtpYwNSxL22RGT5I2LI9QxbYu+KaPb/SfFnwJrbis2ZPjksboOfmLYlQRIgARIgAQyhwCF1wH3QBZefdvu2wynh/DqOrR2KjWkJrx604xZR45JlVhjkZXtCO/xLaaIRnxjbCeL5h7hdSy81q2PE0uPac5q04nHFiaVJNMXgSjvzIeN190WQLV5b6WqntAs106ZIomL0spGSbmdz6jeSA3QYvpg41hZfOo1F+FtELpNvs7xlnm/21rNkAe2D1O7svkivMlq0Tr42WFXEiABEiABEshIAhReB7QpvOtTTGmwRnhTk1jrFKR2rM4HHn/ipDS/fMXoWqurREQbi8BU5BfbvJaoabxnTWnAv5HH22ORRPzaN22EV1eBKF4jaSGaB+FVJ5vUSATSba3Ba+P50wvHCoRcks05nzR6NB0k5ZYa9wjh1ceMf7iyDN1oRGx1hNddeE92nisNF66W0LAz8kbzHtL19topXoU1wkvhtTFZPIQESIAESOCGJEDhdTAtWVF4de6uxmbdntiaF2wVXp17a2fHtdSE10yPQG7tucTtXJsOkoiDM5KE1yqUfgjvkahnpfXRBWo8lcPrLaXBOr5eiJaS8HrI7bXz+LmUzMr1sNGl4wRp82u4ivyinu34H3eZO7K9sPZelWedkvAixzli2FK5EBOv+uqas56uhcJrZ4Z4DAmQAAmQwI1OgMLrYIaCQXh1ioD7RhEai87R1TKLqg39f+5vUrMjvDjYPXUiJeypCa+uIWzWs8VAfghvy/XD5dilY+oy3FMarOMNP3laOsZnF6nSzogaN+jjeul2hVdvTOHHNrQuJbO08PZYJA8uzW5KrlV4IbC6ugMutnTISVmVs1/SdQ87L6hKEB0TJ9VLFnCpu+s+L9iy+OkvN6iXGeF18MuCXUmABEiABDKVAIXXAf5gEF73kmGp5fAicmvdVc1X4bVWb/CEPjXhNd+/EiNTjxvb7OqIrMdte7WQVm5r1ExNTGnouWOK2rXNo/BaUh7UIrOYmKRLte6Qhlc9Ce/MriI7F4s8NF2kyr1GX0/H2Xz2rPJqrZSgdwf7+NE6MmXVPlN+3YVX9TnW2kV4bZ7aRZxTiwbbHZPHkQAJkAAJkEBGE6DwOiCeVYUXyLAQDTuVoUauLj/WcEZD9VU6mvuuanZr8dqp6KCjxUsOHZFb4q+JXmRWN2cxmbpjvbED24mtxsyWbSTy6yhjFzXk2yaKaM/9c/wTXvf82x2LRL7uJgKh1tvL6rq61h24/pouMq+v63E2nz3sgFbzjWXqaGulBJ1ugMjr2r2nUxRetRXvhjspvDZ58zASIAESIIHgI0DhdTCnwSC87jVytZi6lx2zk5Jg7Tu1tVEaS7e0FF59zdh84dHoC7KuyXPS69B8MYXXur1v4nbBSnjREuW359VdXoVXpzwki/C6byPrSW6tr93+sMj+1SL7VqpNLdR1+LENrS5NZk0r8Ca8un4vbnnovdXk8Z8sC9OsdYpTef6t0WVGeB38smBXEiABEiCBTCVA4XWAPxiE1z2iml7Cq8+DXd1alG0hd4ffrXZ5c292Irx6x7VS8fGy9NBRWddxnPTaNC5JeHMVNNIX0GwIr1qYphfAoU+PRTIv/rQcvXhUOix6VUWRUfpMpUPgT0+7pFlF1iq8SGXQO7xh7I4TRGp38/mp01UYlLw2ulX117m9ne4oLYfPXnaJ8OJ9LQcnw3gAACAASURBVMlKVJe2N6Le1moSNq7CujMahdcGMB5CAiRAAiRwQxKg8DqYFgqvKzxvEV733GDr7m7WUewIL45vOa22HAu5pnYck9YjXYXXOiDSG/6YZIhv7oIiSwarMmY9c181I7yehFfVw0VDxYVzB0W6zhCZlPiaNUKaWoT3j4lJ8p0o0+bYPjx7
nhaZWbcdxlB/7DvjUnVBC+/m11tKga87eBZ2G9fgMk7uMBs9eAgJkAAJkAAJ3FgEKLwO5iMYhFdvCtGtajcZVG+QaGlFdYaJmyaKFlM7KQ061cCTzOqorMbtVHhHTWso00MuSvsLF6Vjo6GG8BasKFP/Wu46ozoyC+G9tZHItHtVlLZnyeJJwhubU/oe2Z3UL6U6uROjjCip9f3UhBcpFNbWf7NIobIOnrqkrlbh3X4MVRfi5beXm0n4TbnNg3CMKjumS6W5R6htXIkWXtT7ZSMBEiABEiCBQCRA4XUwa8EgvO5VEZwIr47OepJZfZ60Et5105pLr5B/VUmxvg0GOxPe6wWk74HERW64QOvGFdbn44eXk6LFOg83tYoMOp0BqQwlIpI2xHDw3Omu1sVs+rUUpdSB8EKa0bzV602D2+EQJEACJEACJJBuBCi8DtBmFeF9ud7L0mVhF0H+7Zz2c1Ik5k14o2OjBdK74uAKWfDPAjNy7D6Y3ZQGv4QXC8jG1xQpVEaG1G2vrgOtj7vwprSoS1dkKFRGpP8W49I9bTKhS5Dpm/Mxb9aXR1JHX1MVXh2J9iPC68v18FgSIAESIAESuBEJUHgdzEpWEd6+tftKr6W9UpRUjdCb8OpjUquza1t4Z3eVXpe2qghvZIMBKv2iT/n7pe9P411n1JrSgKhs4m5nEzqOVH18El4cPK6GyPlDSTV27QhvOkqmz8Jbq6vIfyY5eOrZlQRIgARIgAQCjwCF18GcBYPw4vat+bmeUhrsCi9kFv+hPm/HCh09kj1y8Yi0ntNaHbO009Jkxzy/4nn5+dDPgh3dmpdpnvLs/DzS3E5Y5xurXeHmDXbtUzDcEFRdRcGp8K6dYCx803V3M1l4dfUG2xFeP8uiOfgxYVcSIAESIAESyHQCFF4HU0Dh9Q+etwVwduv1il3h1ZfYeqRIg75mhHZC65dl4s6ZSRHe4wdFrkYbR3urU4stgsdFGMdiARoWwaGKgzXv1z2lIR0l07bw+jdV7EUCJEACJEACQUGAwutgGim8/sFzF15EhSsXrqx2bEsz4dWRXX2JurJCYkR2wl29ZeKBxUnCKwWT6uWmtjHDd72TNpHQVRjc+yRGktUJ0lF49fbCqUZ4/Zsq9iIBEiABEiCBoCBA4XUwjcEivJ0WdJJdZ3fJrPtmyeg/R6tyXdayZHZTGuyitAqvLmUG4X0r6i3z/FNaTVHbFqfYUorw/jDSiL5ad1vDIF6Ed0rxFhJ5bKd94bWWIrMjvH5uNmGHp95tjcJrhxaPIQESIAESyKoEKLwOZj5YhNcaVcWisYwUXn1uTEP+HPml8k2V1fntCG/DfV/JxWyhghrC07dPV5IOOVdNS6l7hDcxOjuh7v0y8fR69a6L8NqpqKCrNVil2luEN6W6vg6ePd3VKrz5c2WXLcNapcGoHIIESIAESIAEgosAhdfBfFJ4/YPnaZGc+0ipCu+ORdLzlwGyPncuVT1CS7opvFpK3YU3UYQn1GorE6ON2rsuwmunooLO0b0BhPezVftk+KJt6j7q31pYvnm6oX+Twl4kQAIkQAIkEMQEKLwOJjdYhFdXRhgeNVzm75mfoRHeO2feKRdiL6gav0ir0C1V4d3/m/Rc0ksJb8m8JeXYpWOC6zerQ7gvHNNR1kThnVf3ARl6eq1/wouFa6Msu6V5igpbc3gHHRDJVcjBk5ZyV73bGoU3XfByUBIgARIggSAhQOF1MJHBIrzW2rdYQKZTClB7V0VAW02xVYfXLkprCoU+h84Zti28MecM4T2ftCWwiySnJLyJkd91lZpKr7h9ScKbkMPYRc1OhBe9rELrqY8uV6aOPW8Xjc/HUXh9RsYOJEACJEACWZAAhdfBpFN4/YPnSXgRnR26eqg5IBbQVSlcxesJrPm/WszNhW4pCW/i6+vK1VVbE6t+WLSWs5gIFqDZFV69AQUGyEThPXT2ijQevULdB1Ma/Hse2YsESIAESCD4CVB4E+c4ISFB
PvroI5k6daps2LDB1sxTeG1hSnaQJ+HVUWR98JbuiVv3ejmFTsXQh9iK8HoS3vAOEhmS1zfhtUZwvQmvXYH2D6XqpXdbo/A6gMiuJEACJEACQU2AwisicXFxMmDAAClWrJgsXLhQ1q1bZ2vSs4rwYtez/j/3l7vD75b3m71vi423gzwJ7+quqyVqZpTZzY7w6lQMj8KLF61pBzqHNzH/dl2uXNKrZDHVdUqtARJ59oRvwqtr8WIAT9v1aiGm8Dp+XjgACZAACZAACTglQOFNJLhy5Upp1KiR1KlTRzZu3GiLa7AIL/J2kUsLocUCMvccXpete3XZL1uEPB+khVeLdL6wfLLm4TXmFsfolebCi13RCiUuNBtWUKzCO6vOK1Ll5H5DeO1uEmEte+apjxZevQWxA16pdW3z3krZfuwCUxpSA8X3SYAESIAEsiwBCq9l6uPj4yUyMjLLCi/Kex29dFSOXjwqSzotkdZzWis66SW81s0tpraeau6yli7Ca1045ia8W5pOECnX2LdfAmsniCwZbPTxJLzWzSnuTjzOtzPYPlpvL8yUBtvIeCAJkAAJkEAWI5BlhBe5uXPnzk02vdOnT5cCBQqo170JL45zb8OGDZPdu5OqBATqs6MjvLqerRZOXS83rYVX766mN4zAeSG8+vV0F96pbWXdiQ1mSoNfwmtdFOdJePH+mgkiVe8Vqd0tXR8NLbz9mleUAfdUStdzcXASIAESIAESCEQCWUZ47UyON+GFMLu3ESNGUHjtgHU7RufeasHWwmvNybWT0jBvzzyXyg7J+owMN7YZRrNGeL/rLUf+/lZah9+i3vJLeI9vFpmUGBW2mwbhBys7XfRuaxReO7R4DAmQAAmQQFYkQOG1zDpTGowdy3SEtdOCTmozCOT2/nzoZ9etex38tKQkvMsPLleL4/T5UzuFjkzr45IJb0q1cP+aLjKvr0TcWsZ/4VUSXdA4dTpuHZwaA7wP4V227bj0anSbdKlT2k4XHkMCJEACJEACWYoAhZfCKymlNOjFZR637nXwY5KS8Orr0IvYUjuF38J77oDI+JpJwttqukiJmqmdLvn7OoKcycLr+4WzBwmQAAmQAAlkLQIUXgpvpgmvRo8c4b5+VH/wW3hVWLSGTAgx0h369j/o30/9Dy+LHN8i0maUf8Ls31nZiwRIgARIgARIwEcCFF4fgVkPD7ayZO6L1tI7wpupwmuto5uOW/86eLzYlQRIgARIgARIII0IUHgdgAwW4d1xZod0WdhFKheuLDvP7FREkA97owtvdGy0980qEIH9Y5Ixw+5Sm5jH6/E9B88Eu5IACZAACZAACdx4BCi8DuYkWIQXCHQJMo0jPYXXPRXB35QG9+tOtmjNujmEu/Baqywwwuvgp4BdSYAESIAESODGJ0DhdTBHwS687lv3OhFTK+b0EN6SeUvKss7LXGdTC2/xGiJ9VjuYaXYlARIgARIgARIIZAIUXgezR+H1D5678GKL4eZlmvs1mI5M61q+LoNo4S0bJdJzsV/jsxMJkAAJkAAJkEDgE6DwOpjDrCa8w6OGS8cKHR0QM7q6C++UVlMkskSkX+NSeP3Cxk4kQAIkQAIkkKUIUHgdTHdWE14nYmrFnJbCqzfH8Bjh1dv/MsLr4ClnVxIgARIgARIIfAIUXgdzSOH1D15aCq+1ksTU1m7bP1N4/Zsg9iIBEiABEiCBICNA4XUwocEuvF9u+1LGrBtjEkqrCK8ug6YHdjKuV+HFjmobZ4hUbceNIRw85+xKAiRAAiRAAoFOgMLrYAaDXXjTMhLrjtlaBm3WfbOkSuEqfs2EV+H1a0R2IgESIAESIAESCDYCFF4HMxpMwqtzYTUO1LTNKOFNVj/XhznRwnt3+N3yfrP3fejJQ0mABEiABEiABLIKAQqvg5kOJuHV4hhowqtrBadVjWAHjwO7kgAJkAAJkAAJ3KAEKLwOJobC6z88a0qDkwgvhdf/OWBPEiABEiABEsgqBCi8
DmY6WIU3X1g+WfPwmoBIadDCOzByoDxa7VEHs8muJEACJEACJEACwUqAwutgZoNVeHVN2yMXj0jrOa1NQk6qKbhjTqsIr4PpY1cSIAESIAESIIEsQoDC62Cig114gcYqpmkpvF43jHAwJ+xKAiRAAiRAAiRAAu4EKLwOngkKr//wWE7Mf3bsSQIkQAIkQAIk4BsBCq9vvFyOpvD6D4/C6z879iQBEiABEiABEvCNAIXXN15ZWnhXd10tBXIUcEAsqSuFN00wchASIAESIAESIAEbBCi8NiCldEgwRXh1tQPcq160hr+n1+IyCq+DB49dSYAESIAESIAEfCJA4fUJl+vBWUF4rTuwOamX645ZC2+3qt1kUL1BDmaBXUmABEiABEiABEjAOwEKr4MnJCsIr3UHtrQU3udXPC8/H/pZuEOagweQXUmABEiABEiABGwRoPDawuT5IAqv//C4Q5r/7NiTBEiABEiABEjANwIUXt94uRxN4fUf3rw982T+nvnSoUIH6Viho/8DsScJkAAJkAAJkAAJpEKAwuvgEaHwOoDHriRAAiRAAiRAAiSQQQQovA5AZwXh1bm2wJSWObwOsLMrCZAACZAACZAACfhEgMLrEy7Xg7OC8FrLlVF4HTws7EoCJEACJEACJJBpBCi8DtAHk/Aip3bo6qGKhrUOL4XXwQPCriRAAiRAAiRAAjcEAQqvg2kIJuFdd3yd9Frai8Lr4HlgVxIgARIgARIggRuTAIXXwbxQeB3AY1cSIAESIAESIAESyCACFF4HoLOS8OYLyydrHl7jgBa7kgAJkAAJkAAJkEDmEKDwOuCeFYRX5/Za83odIGNXEiABEiABEiABEshwAhReB8izgvDq3F4Kr4MHhV1JgARIgARIgAQylQCF1wH+YBXePrX6SN/afRUZCq+DB4RdSYAESIAESIAEbggCFF4H00DhdQCPXUmABEiABEiABEgggwhQeB2AzgrCGx0bLTvP7JT8OfJLlcJVHNBiVxIgARIgARIgARLIHAIUXhE5deqUjB49Wn777TcJCwuTJk2ayLBhw9TfvbVgEl6IbdTMKHW71pSGzHkseVYSIAESIAESIAESSDsCFF4R2bhxo+zbt0/at28v169fl969e0vz5s2lW7duWUZ4caMRn0dQeNPuZ4sjkQAJkAAJkAAJ3CAEKLweJmLq1Kly+PBhGTrU2Go3pRZMEV4K7w3yE8nLIAESIAESIAESSHMCFF4PSJ9//nkV4e3QoYP5bkxMTLIjIyIiZPfu3Wk+KZk1ICO8mUWe5yUBEiABEiABEkhPAllGeBG1nTt3bjKW06dPlwIFCpivr1q1SsaNGyfffPONZM+e3Xz9vvvuS9Z3x44dFN70fDo5NgmQAAmQAAmQAAmkAYEsI7x2WG3evFkGDhwokOOSJUum2oUpDaki4gEkQAIkQAIkQAIkkOkEKLyJU4Bo7YABA+TDDz+U8uXL25oYCq8tTDyIBEiABEiABEiABDKVAIVXRP755x9B3u57770nFSpUsD0hFF7bqHggCZAACZAACZAACWQaAQqviIrsLlq0SEJDQ82JyJkzpyDFwVsLNuFtOKOhXIy7yDq8mfbjyBOTAAmQAAmQAAmkBwEKrwOqwSa8PZf0lPUn1lN4HTwT7EoCJEACJEACJHDjEaDwOpgTCq8DeOxKAiRAAiRAAiRAAhlEgMLrADSF1wE8diUBEiABEiABEiCBDCJA4XUAmsLrAB67kgAJkAAJkAAJkEAGEaDwOgBN4XUAj11JgARIgARIgARIIIMIUHgdgKbwOoDHriRAAiRAAiRAAiSQQQQovA5AU3gdwGNXEiABEiABEiABEsggAhReB6CDVXjH3z1empdp7oAMu5IACZAACZAACZDAjUOAwutgLoJNeB2gYFcSIAESIAESIAESuGEJUHgdTA2F1wE8diUBEiABEiABEiCBDCJA4XUAmsLrAB67kgAJkAAJkAAJkEAGEaDwOgBN4XUA
j11JgARIgARIgARIIIMIUHgdgKbwOoDHriRAAiRAAiRAAiSQQQQovA5AU3gdwGNXEiABEiABEiABEsggAhReB6ApvA7gsSsJkAAJkAAJkAAJZBABCq8D0BReB/DYlQRIgARIgARIgAQyiACF1wFoCq8DeOxKAiRAAiRAAiRAAhlEgMLrADSF1wE8diUBEiABEiABEiCBDCJA4XUAmsLrAB67kgAJkAAJkAAJkEAGEaDwOgBN4XUAj11JgARIgARIgARIIIMIUHgdgKbwOoDHriRAAiRAAiRAAiSQQQQovA5AU3gdwGNXEiABEiABEiABEsggAhReB6ApvA7gsSsJkAAJkAAJkAAJZBABCq8D0BReB/DYlQRIgARIgARIgAQyiACF1wFoCC8bCZAACfhCoEGDBrJ27VpfuvBYEiABEpDdu3eTggMCFF4H8L788ktJSEiQxx57zMEo7JoagV27dsn48eNlwoQJqR3K9x0SeOCBB2TSpElSuHBhhyOxuzcC7777rlSpUkXatm1LUOlI4Pfff5elS5fKG2+8kY5n4dAgcM8998iPP/5IGCRwwxKg8DqYGgqvA3g+dKXw+gDL4aEUXocAbXan8NoE5fAwCq9DgD50p/D6AIuHZgoBCq8D7BReB/B86Erh9QGWw0MpvA4B2uxO4bUJyuFhFF6HAH3oTuH1ARYPzRQCFF4H2Cm8DuD50JXC6wMsh4dSeB0CtNmdwmsTlMPDKLwOAfrQncLrAywemikEKLwOsFN4HcDzoSuF1wdYDg+l8DoEaLM7hdcmKIeHUXgdAvShO4XXB1g8NFMIUHgzBTtPSgIkQAIkQAIkQAIkkFEEKLwZRZrnIQESIAESIAESIAESyBQCFN5Mwc6TkgAJkAAJkAAJkAAJZBQBCm9GkeZ5SIAESIAESIAESIAEMoUAhTdTsPOkJEACJEACJEACJEACGUWAwusH6evXr8vbb78tixYtkrCwMOnTp49069bNj5HYxZ3A+vXr5ZFHHpEcOXKYb61evVry588vZ86ckf/+97+yefNmtRPYiBEjpE6dOoToA4H9+/fL008/LQ899JD07NnT7OmNLXgPGjRITp48qXYHe+edd6RYsWI+nDXrHXrp0iV58cUX1Y1j5zrd+Hyn3bMQFxcnEydOlNmzZ8u1a9ekUqVK6ndCyZIl1UnmzJmjdmiMjY1Vu4Bht7Vs2bKl+l7aXWHwjIT/12Gny7Nnz0qRIkXktddek3r16qkbxE6jGzdulJCQEPVv/P4eOHCg+jt/dwTPMxAMd0Lh9WMWZ82aJQsXLpTJkyfL5cuXlTygzFCNGjX8GI1drASwDeiaNWtk2LBhycC89NJLcsstt0i/fv3UL1L8ieNz5cpFiDYIrF27Vv1Pv2LFinL77be7CG9KbPGBDrKA+WjSpIl8/vnnglJPH3/8sY0zZs1Djh07Jr1791aMjx8/7iK8fL7T7pk4d+6czJgxQwlXvnz55IMPPpDdu3fL+++/L/v27VOvf/311+rDGT581K5dW3r16uX1vbS7uuAa6aOPPpKOHTuq379//vmnPP/88+r3NCS3Xbt2ghKd7tuR40MIf3cE13MQ6HdD4fVjBh9//HHp0aOHNG7cWPWeNm2a4H9ygwcP9mM0drESwP/AEG189tlnXcAgql63bl1BtDd37tzqPUTWu3TpIs2aNSNEGwQgA4iUQwIKFixoCq83tojmvPXWW4IPeWg4tmHDhvLTTz+psdiSE7h48aLs3LlTRRbxAcEa4eXznX5PzLZt21RkEdFIBCOio6MFH+TQtm/frn4/z5s3z+t76Xd1wTUyPsz98ssv6vcI/j+4cuVKM8Kr73TTpk383RFc0x7wd0Ph9WMKmzdvLl988YX6tIuGH3b8+9NPP/VjNHaxEgDDqVOnSs6cOZXYdu3aVX1FhkgZIun4Javb//73PylUqJA8+eSThOgDAXwbcdNNN5nC640tomOI6I4ePdo8Q+fOndVXmjVr1vTh
rFnvUHw4Q+TLKrx8vtPvOZg+fbr8/fffKq3hlVdeUR+Q77//fnXCq1evqn9v2bLF63vpd3XBM/KePXtUWtTy5cvVTUF+S5Qoob7trF69uvpgER4eLvPnz+fvjuCZ9qC4EwqvH9MYFRUlCxYsULlMaPiKB7liiN6wOSOAr8HwP6c8efLIgQMHVBQXqQuVK1dWXxMvWbLEPAG+wkTEEe+z2SfgLrzI602JLf5HhsgZUiF0wwcQROAbNGhg/6RZ8EhPwsvnO30eBHxoe/TRR9W3bQhEvPDCC9KiRQtp27ateUKk8mDXRqQ3pPSezkNNn6sM/FHj4+PVt5vdu3dX6Qpo+EYjb968gvfwAW/u3Lkqyv7NN9/wd0fgT3lQ3QGF14/pxC/Lzz77TMqWLat645MuZBevsaUtAXz9rvN1O3XqJKtWrTJPgEhO0aJF5amnnkrbkwb5aO7Ce+LECUmJLYT3119/VQvVdOvQoYO8+eabUqtWrSAn5ez2PAmv+4h8vp0xRm8spIKAYUGrTjMbMmSIej6xVbaWMnxA27p1q3h7z/nVBO8ICC4gRQT/3/MWZLjzzjuV9CIQxN8dwfs8BOKdUXj9mDV8nfPggw+auaOffPKJQBpeffVVP0ZjF28E8DXl3r17FdvIyEhZsWKFFChQQHV54okn1DzoSANJ2iPgLrwJCQkpsi1VqpQSBOQ+oiGKg9XZmAekk7ClTMCO8PL5dvYEXbhwQaXmYF1FmzZtzMGmTJmififrdRX40IznGIuNvb3n7GqCtzd+R4AfIrn401urX7++LFu2TA4fPszfHcH7SATknVF4/Zg25Cah5I2u0oAowsiRI5U0sDkjgHxR5IZi1fWRI0fUqurhw4cryUJeHiK6/fv3V1FffA2PxVM4ls0+AXfhRc+U2CK1pHXr1jJ06FAVPcMiLDDHV5ds3gl4El4+32n31KD0Gz70IpXBmrqAM0C2Hn74YZk5c6ZZpQFly5CK4+29tLu64BoJv4MhvcjdtzZ8qPj3338lIiJCvY8cdUR1v/rqK5Vuxt8dwfUcBPrdUHj9nMExY8Yo6UXOF6ILXDjlJ0i3bih3hQWAaIjkIpqOcjhoWHWNVdgbNmxQ76FUlv4KM23OnjVG8SS83tju2LFDcT969KiUL19exo4dqxalsPkuvHy+0+6pQToIPoiFhoa6DIrcUZQgQx4pFlvGxMRI06ZN1WI2Xd/b23tpd4XBMRJKvLVs2TIZZ+RCoyQZ0hsOHTqkFhpjARsiwLpON393BMczECx3QeENlpnkfZAACZAACZAACZAACXgkQOHlg0ECJEACJEACJEACJBDUBCi8QT29vDkSIAESIAESIAESIAEKL58BEiABEiABEiABEiCBoCZA4Q3q6eXNkQAJkAAJkAAJkAAJUHj5DJAACZAACZAACZAACQQ1AQpvUE8vb44ESIAESIAESIAESIDCy2eABEiABEiABEiABEggqAlQeIN6enlzJEACJEACJEACJEACFF4+AyRAAiRAAiRAAiRAAkFNgMIb1NPLmyMBEiABEiABEiABEqDw8hkgARIgARIgARIgARIIagIU3qCeXt4cCZAACZAACZAACZAAhZfPAAmQAAmQAAmQAAmQQFAToPAG9fTy5kiABEiABEiABEiABCi8fAZIgARIgARIgARIgASCmgCFN6inlzdHAiRAAiRAAiRAAiRA4eUzQAIkQAIkQAIkQAIkENQEKLxBPb28ORIIDgKbN2+WIUOGyIEDB+S+++6Tt99+Ozhu7Aa6i5EjR8rly5dl+PDhN9BV8VJIgARIIG0IUHjThiNHIYGgI9C0aVMJDw+Xr776yuXe/vnnH+ncubNs3Lgxw+4Z52vRooX06tVLSVmhQoUcnfuDDz6QS5cuyaBBg9Q4+BNjV6pUKU3HdTRYBndOa+H97rvv5OLFi/Loo4+ad/LYY49J9+7dpXnz5hl8dzwdCZBAVidA4c3qTwDvnwRSIADhvXDhggwdOlT+
85//mEdlhvDWrl1bZs6cKVWrVk2T+frzzz/l6tWr0rhxY4mNjVUC9tlnnzkWXuu4aXKhGThIWgtvv379pG7dui7CO3v2bKlXr56UKVMmA++MpyIBEiABEQovnwISIAGPBCC8Tz31lLz//vuydOlSM6rqSXhnzZolkyZNkmPHjqmoMGSnbdu2tshCOMeOHSsLFixQgl2zZk157bXXlNzu3r1bBg8eLJs2bZL8+fNLtmzZ5Mcff0wW4cUYY8aMkYULF6qoYvny5eXLL7+UDRs2yJw5c6RLly7y5ptvqqjuH3/8IVruXn/9dbn//vtl+/bt5vhIl2jZsqUcP35c8P7q1aulSJEi8tBDD0nv3r0lJCREVqxY4XVcnRaAMfD3VatWSWhoqNx1111qTB2hxnXkypVLjhw5Iv/3f/8n8fHx0qBBA3WteN1Twz3h/T179kjZsmXlxRdfVMK+ZMkSGTFihPz666/qGnXDOV9++WVp06aNvPvuu/LDDz+oeSpVqpR6XUdbrcJ7+PBhufvuuxWX7Nmzm2NVrFhRfvrpJ3XeM2fOqPOtXbtWoqOjpVq1aurft912m7z66quKT1hYmOTMmVMxxjx27NhRnnjiCbn33nvVmLhnnHfbtm1SsGBBeeCBB+T5559XrNBwPKLCc+fOFVwT7qtbt25qDDTwGjVqlHz//ffqGiDSL7zwgtxzzz22nj0eRAIkkHUIUHizzlzzTknAJwIQ3k8//VQmT56spAdi1fIuRgAACeZJREFUguYuvBCsgQMHyrhx46RWrVqybt06JWGImCIym1qDqEJC8efNN9+sRAnyvHz5csmXL5/qXr16dcFX5CmlHEB6fv/9d3nrrbeUyO3atUvuvPNOJaYQzqJFi6oc4OLFi0vJkiVN4dVi6ml8SHKTJk2kZ8+e8u+//0qfPn3U3yG+dsdFKgauGRJ27do1JWfnz5+XKVOmqPsC06lTp8rEiROVeF65ckUJHoQNHzbcGyQT4gqxxP1BGJ955hnBBw5IaP369RX3O+64Q3X966+/pEePHkpKIdD44IIPEmCwaNEiJc6YLz2/OofXjvDGxcXJ4sWLVZQ8T548iv2JEyfkk08+UedGigik2ZrSYBXe06dPqzQVPDvIy4b047nB359++mlTeE+dOqU+vNx6661KevH+tGnT1LOGqD9k+KOPPlLPCsQZzxBYsJEACZCAlQCFl88DCZCARwIQXohE6dKllWQh0hsZGZlMeCE0kB6roL333nuyc+dOmTBhgle6iMzefvvtMmPGDCUwuj388MPSqlUrle+ZmvDqMZBrjLGsDWIKeUJUuFy5cuZb7l/fuwsvoqgQMUi3bhgf0dHp06cr4U1tXIgkIsIQcUQ50RB9joqKkm+++UaqVKmihBcL8iBuuuEDxtatWxVv9waekD4Ir27PPfecIPKKyCiiqJBPpKGg4TgI9ujRo5ONlZCQoD5ILFu2TM2xrxFe9wHBDGKPD0BoqQnvhx9+qPLAIei6Qcwh8BgLDYKMCHX//v3NYx5//HFp1qyZivRiLubNmydffPGF5M6dmz/JJEACJJAiAQovHw4SIAGPBCC8kJKIiAj1dTkkFmkHBw8edFm0hkgjorONGjUyx4EoIpoJ0fTW9u3bp9IH/v77b8mRI4d5KEQN0UZEDdG8RXhTGgP9IKZIIfjtt99cLiM14YWQQhrxlbxuEEREj/GVvp1xv/76axV9/Pbbb13O3aFDB/WVPCKVuI5z5865CCmimT///LMZBbZ2hlAiqorUDt2uX7+ucqzBbM2aNSpKihQKfP2PCDXmAZKNiCwiy7h+RJLx/o4dO9TcIgXEV+FFNPfjjz9W0oqxY2JilNBDWu0I74ABA5Ro43p1Q398aEEaSbFixcyUBqRE6Pbss8+qZxIfOJCH/d///leQO/3ggw/KI488oiK8bCRAAiTgToDCy2eCBEggVeHFARANiCcir506dTKrNGBh0vjx412EF1KFqGJqwosoMPI5
[... base64 PNG data truncated ...]"
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYA [... base64 PNG data truncated ...]
mPPja/zDyLTt2yUhIaC8n/OcLzwMxMQk+L+EOd+z/Z2TGIPfD+dn+wUxnL7qIwKFiUB2ix0Vpms5Geear8LLHN4JEyage/fuWLBgAfr162dydmfMmGEWo6Bouq2QkFuwbIT322+/RbVq1cxuQ0V47YpwzuOed955ESm8Nhpqc2b56JWz050VAeyj/DwXXlveq3FPtEjyRXHtYgJ2rFhirO3mVb4fsxHela919T3mjz0D2LUmpPDa66Y0UniZEhGcD8xD2dxgTgBifi9X8+K2nHzV+I+F2NGkB57Zv9qXUtB4MND5wYASVzVLNMX6n27yLWP71wi0iN5mLsH/heO9azApYT0GVGyBtw+uAasIxP8xyohmt1ebY3exKIxr9jy6t+lmxHPCwvW4v+ZQf14y1n9h+hvhjW1uJoI5F69Y/uc+fP7l5/iw9NNYXao4hlRPXxymTWpJTNu8Hrcn32sEj3mzv9z8i8nPde5j/Od/+COj/aYsN+fP8zPL/e705R1f3LSGWZWMf//3pjb+j5hzP9yvaUxjcOT7Ml+Zeb27o6tjX0pp36Q9Rn9rtkTvj3r7lyvmym31S52LbjuGGJ43/DYU1fan3RMAvrpsIS74pCs+rHw6HotNBtlvP3DUJ7xtH0Hbd2/3HZ9pFVf78o3Pff1SHMVfJqr61N79+LPSBThU8UycvekV/yIhr36zCb/Nm4zlpx8wXzZYuWJ54vloV+pbE4UtfyIV327ZhrGnt8bbJ/aDTyPW1T8HZfbvw53r5qFOii+yuAhNcHfNmnhgRxJuiP4WYB3q3z810junXDm8GVsBbRtehqcu/DeQeBBxf61A928fRHxqsukfdbwEkg43xy3J2/F/8b5IKxvzhim5zkbR7hV/BINqVvdPHqycHIVm++rj/CMpOAd/YV75EphWPQklThTH0aOn4Mrjm7Ck7Al/ZJsTBnscrY2tpf9GdFQUUk6kYjhi0fB4MQyO/hubih034t0kKQnHipVHSlQJFEtNwtHiFVE8uhxaxu3E0dS9WFi2LE4gChceKYHVpZOwpxhQPeUEeh8ui6rJUTgtPgp/l22ELeWjsLTUNiRFn0CxmCgcTTqO5OMnzGWVSo1B+RM1kRBVFpWSdqF0dDIqHj+AsqlHUCb1CPaXqo/lMelfGiyLo1HJKJ1aHMWioxAT7ZvEWe1EDBokHQngFfzDsZjyOBJdDgnHUjJsFxMTHTLym+UOs3lzW1QcjiLjsZzdSqMY6vJLrVqhJjBm8OxCff4n++TzVXg5Gax9+/YYPnw4WrVqhS+++MKILvNnn3vuORP5LZYWScgvMKFyeEeNGgWuClcUc3idE7ZMDl5QcxPhtcJrI512YQJGVJkPSNGz0dUM1QlcDKwzpSFkDvDzzX2PtTn55ofp/kfTtsSWM2+Wh/NP1OIPwcK7YxlW7fnRJzVNr8PK507D4Cpl0aZaa2Dr8pDCa3OXnTVug9MYmFNoRXjOhjkm1YGCzKhvcBtwKA4jW9yRtfAWm4UWW6ebrlZ4B73WxiyVPLXVPfjHr1OQkHIE5XaPwbL7L/eL8zuNxqJp+8v8EhogvJu+NuXEshJeSvKUFr9jZ+IvGFcsXZTGl2mCrr9+hs+Pn2PyfVccb4IzH/rG5G5aUWUkdeo3m/zC++THv/pTCqz8OlkYsb+4kf+lf777U0DerHnDUb/Y/Mx6wh8O9d0PtjFaG5RjbibRJSZi+YkzMarsQ5h/9EazzcboBjjtxGbMi74Il55YbATSlomzu3sp6lpcuPG59P1f9CBwcAveSCiOCX+nYnLiu2bfpgJCbN30sm8NOpkc1g7PLMLhxBQ8cvpTuHX7r/79XHJKbZO6QhmfdGpLsyrd1H1H0PbejcDkjuYLFyPZyavfQvGUeFyf9AhmlviXrz/vfbLgIiJsrfoh7rJnA6oqcFLpz3+vxbzvqiIqoTkGdzgVd9T4DaU/uCn9WnpNxEQc8qcm+WoTX2UW+uDvgqlrXkdM/IX4c2sN
DO54munHLzHtTqtiUlPYlm3ch/a1Y7Buwzz0Wf20bxvWfK7XNfhWNz/76wdzIREy46IiLpuprlGzrcuttZkIiIAIpBPIV+Ht1KkT7rzzTjA9gPmvb7zxhhFglgXr1q2byZlt2rRpvo/PwIEDUaFChYio0uCUNZbtCm7hCK/NlaVo8j8jG9m0S8jm9DFMlgs28ITtY23KzqZvjPCuK1HcV2KrUiPznz4F07Y1Ow8BiT4J4uSaiaXSo68rN3+BVYfWm4Um2g5ciJVja2BwrepmUo6J8KYJZdvWvkUJ2LgIA9M2uM2OIztMGgUn7LFMFOv+Mq3DplZwAtvdi+4252M51UqNQZOEwzhcuxXiEvbgxh1/otc5d2YQ3vplWuCX1QN8kc+6n2cQXn/lhZ4zMeibMWZxhQZJ9+Hj2272C+/HVW9Gg8vu80voexUHp1/fwa2+RTbSFp1ghNkZVeW1UkxZXuuMunGYvesRs3iEmUgYfyS9Li7ry6aeiTZP+CK4NppMgZ22dBPiElPw9QNdcN+7P2aI9jrvv2Dhde7HL8LO6gqM7j64DZs/GYcGK0f7dkXhvMcX6XfWsjWrtc3ob6LBnx0/B5fErDZR6RKntELJ7ctMKsIpUXuwtngLjKixz4iobTP21EXz+KXYVuI01E1Kq+fMnNOa52FP23+iwccZ050wcot/4pW9jnFVP8W18b684j3R1fFqjZJ4u1QUmCJg87CZJmIXBTHR6we3AR/eYcQ2LrWMqebgjzI7J96lRbVD/fJkKbO6lUr7J5KF2oYTRXn/BiyOEsZvYj7NIPeR544Mo7e6iIAIiEDeEchX4WWZL4otpZeS2aBBAzCaysguf/74449D1ufNu8v37Xnjxo2mDu+2bb4oESPRo0ePRom0la4yO35hnLRmS4llKG+VdpFW5uw125WxnBIaHOGlALEaAkXvqtOvyl/hZZRry1LYerZO6eY1mJSK5Y5E/wtHYmKlWP85Bgvv9n9VNo+aeS3l4/fgdySbCWVNmvfz3wa2niqjuJQEtlClyCjfs66c5c+b9kfCjxfDC1s3+h69s7YslxhOe1R+yWutsTPK9xi7S+1rMGfhub7asY0Wo/f610zpL6ZGsFGuzZLAt/6Guz9/HIt2fIAaKX3xxS2P+oV31Z4TKFmuEibtbIwlyU3wzwrj0oU3bgdWHd1hhLdRy+GI7fmYf0ncT+7qhLjEZL/wMqLHaO8NFxTDv3p2N4/NMba+n8mnZa5Cz/vfMD/zMf7ouWvR++xTTISWjWkMdjU15+vOz5Zdqte+FlJ402rtmm3Slpd+c/FPuHHxBb5uQTV6+QWOzUQFEw8ieVwzEy01YxbTDC06XuF/QsDXeuAl/F15LpJj15ptqian4uPt+1Eu9Qjeav0Wbtg51jdRjqvV1WwBXDQSmDkg8FeEQ7rtGx3GLkLduNX+CO1XlXqjepOq6LNrnr+vTRMx++cXNJs+wRz18S3Tj8EvejYqyi8AbDmIkgaerO8nf1m7EnrsHYqPXhMBESj8BPJVeO+9914jle+9956J5t53331o2LAhmEfbuHFjzJ0796QSZRoD6/Hyj5tWGIXXRnAzy60NLjUWLLfkYisnOCdv2coM/soIrYb6H5PmaYQ3TXjtUrfBwmtSIr56PX04LxyJ2XXP9JUrO/1K7Nj7my/CyyjnyN2mxq9NjbCdTHQwSCjaT29v8pbZbPko/tuZN2q/VFhONtXCpBAcPOSbKBW0cpydbMZ9XV7vZsxYcKZPeLunYND8wf4Ui3UbPsWzf77vK7k2cAVGf/0K3t34EiqldMFXt0xIX1GPEUPApBw8f/xajCr3jImEM9Uk+mgSfju6yVz7ifp34rx+jwRMOOOjahvhNftwVCkwO01bLY7//KzqQFzyjwnmZZa36vTMIlQoVcxEd8uXKoY1aRUbKM12QYbgz1jwErz2+KydO+P29iZKaZqt0pFWCoupD+f//LBZYc6ZYxvyM+xIiVhb7TI0Paej
7wsHZ+OXroWmB/6DYrGrUbq270uFXUBj+4mq+OW6pejerGbgObCGb1q1B//xHHm+9rX3Vv+F/3vvJ2wu5atXa+S5Ryes2/0DBi19yNxLA6qeg5E/L/TVGmZzRm0t68wW3Qh5sXpRBERABETAEshX4Y2Pj8exY8dQpYpvkgTFd+HChWjWrJlZ3axmzbT/TArJ+BRm4c0stzYnwtt9Vnf/4/wes3qYUcsN4bX75f6cC134bwtnSkOa8E6sGOubjMUFIyrW98u2kfPZPqEx7cKRWHlmN3+NXCTsw6rDPunLifBmlnbhfN3mMQavmmWE92iq73F1FsLbv9GteGVOQ99iCdcVDxDe2d89h48O/IL7Y2rhxhs+w+s/LMS4n0egzPFG+KDvC+B4mOjvX74qEHwczglmzAG1Ql+teCXsST5gJlLNqP4s7rv91pwJL3Mw05b7XXbKbWh/6zg/ZpsawReMsA9pb3I9nbm7FGHmttrGtAe/1ALgo/jrpywz21Ce/92nlU84OXFt1xqsP//fqFGvEa6fshxxu/7E8zdfgLZNfHmmmbbEg0h9rgWiuFAHZfXUjv4atd+Wuxj99w5CrSoJiK/+pNmFLdk2NaU7zhv2PzSrnRYBndTBVxuYAspob9okMtPJUZPWeR6M8m4/eNS85IxmM7rK+4YpRibvlVw3fe2fHGc6MMp7cKsv57ViemQ964vVuyIgAiIgApZAvgkvF3L48ccfzXK9XOihYkX3hcUL6nAVRuElS5v7GSrymhPhde7H/tspvEyP+OPAH+YRvC1u72Yss1yhjDsIJbxVqmNShVIYWqMD2ra+xb/4hck3fevm9MMGCe/ve37C4RPJWLplGyo8dtB1hDd4GWOb+zhx1XOY9Os0c7zMyrcZua5xjm953CyElyyfnemTG9bWffj9q0w5qJsa340PfpmE+OJJeK/mpWjS/Vks/GMzRiy7AtEojf91f9lcf9OjwDu70lYpA/yTnjJEsDdtxS14HK+OujdAeG2klsJdp2LpjBFeR1rDyoveQNuL0lcmtBFNnntmwhuc2mAWhAhqPAdOdguu4kAZ7jfFV2KNUWTDKET/kPcbc1+ZNkB5ZOoA6/kCeBTD8GZiR3O+B6o8biprDD6QgHsO7vVVo3jq8fTdOaLb5sXr305Pbcgkn9bJJDiarZQCN78ZtI0IiIAIhE8gX4R30aJFGDJkiP8sWYqMpb9at24d/pkXgJ6RKrz2Ub1TeC95/xIjCLYkGEXTViewaRFuhyxYeCl+nPTF45oImH0szQgdhXHLUjzcoAk+ikrA6CrtUKfN7X7hNcee3C390K36YWV7nxAHrILGR/+caDS2Pi6pWwc7i6UXjQ+V0uDMdXYujrFw9s14Zu93qFOsLKYN9E2cy/AlgsLb+GpfBM/mZ1K+rn/bv2AE+wUI79jLMHF8PRPFbliyBzYc89WQXdNqJNB6gIme3ra4h5lUxsgyJ9WxLu9tZ4zDPUt9s9qfrPIsHtt3Py6tXR9/lUwvw8SJUh2PTcDd13Yzj90pt0tHdjF9bE1dm4YQLGqMRm7evhNVOtyM8hV9Zf1sazFqgYnOOiej2f1xG77OFAfbMhNWVjpo+cRnZrOfH7/E/M2ori1rxp/PrFUe8+5Oy+N1e6OlbRc38WJU+Ps7U1M4LrWsmSRY+pQ38NPfa4EtTXH84NlIja3nZ2K6BVeLoOTaiZFZ5NMyL5mVLLjanJoIiIAIiED+Echz4U1OTkbHjh1Rr149PPzww+DPXLqXqQ3z589HTEzG1Wjy7/K9HamwCq/NMw0lom4ivDb/1ym8zkf5pMp9c4KcLc0VqgRaZvSDhddODuNEMkaLKyx92Zc3SeFdMcmIxqBGrbEqeT+m1uiG8ucOMRO62FglocLTddMPVb8DVl76RGjh5WSg1y7HoAZnmOoMtnFRCDRJX5qVr9uJa/x3wGpwY9Oiho5cy2CmJprc6QFTlcE0m5PKkmmzBxiptQxHvHbYPAbn4/6P
X2/se+/oaUDpjSa/9NmOk1GteVcjvLfMH2Tqxtooe7DwLjntXlNe6/IaZ2BLmfTrW7ZxL5of+58/t9ZGZHkOVlBtPi6F0y4bm92nh3JHKR3c8VRTn9e5P/6bwrt84z4TOXYeM9R+b3tjlYnyMq1hxcZ9eH/1X0Zyx/VpbUqfVShdDI9f0Sy7Uwr5vlOo7XmxKoTz9Qzn56yQYMYwrQpIWGegTiIgAiIgAnlNIM+Fd/Pmzbj44osxa9YstGzpm2lsVzdbvHgx6tQJXE4zry84N/dfWIXXOdksuKZlbgovpZCTw3Jai9cKb3A9Xb9cHjiULrxpE4YGNWnjK7FVoxva9njeRFXZTOk1K5R8ISvh7TXR5KQ6hbfRsSTMOmNgupw6biA7cc3/xcEpQQ7htZUxbFdTeorHap02uz8L4R03J8WfSrBhwYV4toyvAL5hceAQzrryByOTC37dhTs/exwlK3/rr4FshbfPmttxStwP+KvCWebvi2s0w64yvuoS3Oa6umMwbelm/35DCa9903XaQCYfNGdur7NOb3bCa9MBnBPhZt7ePj2n1uMH2xl55ipzNgJrXw8umQZntQhbPszjOai7CIiACIhA3hHIc+Fl3i4npH333XeoVKmSuRK72MP06dPRtm3hLSJeWIXXuVLYjU19Bfhtcyu8FEmbFsD81VARXsp0Bil0cS+HEl4rv4z2zq/eHRW+fs4X4bXC27wDVh3Zhqllm6PttWnF+O2x3Apv2v6cwutcBS341FlzlPV2r2p4FeqUq+Ovl2q2cwivs7Yxa/B+tnmTr9aqffRtz4+LACx+MCDC6xTeYl/1NavA2TZmdxyqXPu9EV5GUyf+MA2lasw1E/1YLi354NkY0vRhXLb6FjRK/Bn7q7ZF5b0rcVOFdvihim8yG4X3+QunYPwXf5hFIdjyS3iZHvH+qr9M6bJBHRpkGaENjsIy0stljHOrOUXcmbZhJ5qFPJ4dN1VOyK1h0H5EQAREIM8I5Lnwrl69Gtdffz0++eQTf7kvpjWwHi9XVzvnnHP8F8cqDdHR0Xl2sbm948IqvFktPuFGeBmxfaDtAwHCG1yJwEY97f5YAuypjk+5GoJQwsu81LfWvmVyeU0Ud/nUQOFt3TVgAYmAA+VQeG0+MPeRlfAGHCOoLq2p0XrHN2YTp/C2OZaCaTt2+PKFS6VN3LQrxzXuiYm7lwYI78yvShghpHA1WHNzgPC+sLE0fuz4llmdjML70rIFKFN/iv+0ju3piqGthqH7d4PRNHkN4qqfa3JVg4V3+aBZps91U5YZ6XVGOJ0iyG28RnhtbV3ui2LJxiWM251exZ/2kNlNYuv4ZifHrm6yoI1syoQ9L5uCwcjyX/sTcG2bugEVJEx3x7ihX9CXrHBOQn1EQAREQATyjEC+Ca+bK1i1ahViY335i4WhFVbhZWTS1qENllA3wsuxodA6I7yZ9XPmurqtx2uF1y5PzOMxF/eZ757BR39+hNE1O6PXstfNsqtYMdncKoMyE1776JmPnVnftGI9bL91vinb5VwRzaQZpEV4J9ZvjknRvlqoroXXTmKyZarYOS2v0ym8tsxVQM6nrTpRvwMW7vkeI2r4Jn8xX3ne6mgzsYuP1C89NBp9jvxo3mPJsZ6b2iH1ogf9wjth8c8o3+gJ/0fHCm+bZcPQ6cRKJFVtihJ712JY6QvwdU1fCsOx/efjj3teMf9mFJXNmaPrFF7nZLZwP592UQr29yrP4Z5DqH5OEXd9XnbceN/YfOzcPCntSwREQAREINcI5Lnw7t+/H99++62rE77kkkuyXd3M1Y7yaaPCKrxZLT6R28LLobAVHNyUJ2N5pg4zOpjFHJhuweWJ7SQ5f2S6fk8MWzzZt4zsoW3m70GnNgod4XWW/WK91DQRzVD6zIvwMrrLlbA4Sz9t4ps9Dv92Cq+pwVuyLjA07Vy4gRWnivWwMvFvsxIaG78gWBFjVHNw8kz0PDzbvEdxLrH9WpRvd5NJBbDb
VW76JJJTE3wymxbhjVo8BiOKp682d2fxK7D4lJ/MNiUP98Cqf/w7009MqHq6Xj5ezlq8rsXSywFd9nWWDHN9Xry3OOaM5qs2rkvS2kwEREAETg6BPBfek3NZ+XPUwiq8XOueVQyY68n6uCwlNvLckQZasPDaxROCc3TtcsI2VSEzUeY+s5okFzxSThlnnjCj0cwFZpUHWwqsTewZmPbjwvSu9TugRbRvWeg1Ow8BI9NrzwbUuc1OeLlC1k8zMLFxB0xK8u3PVYTXGd1lbV2bQpEW4Q2o6EDhjW3uq8Frm6PElV0i2Qqvc7WzcdU+9Qvv/fsO4JO9dyO1fkezqIMV3jNavYFdSb5lcSm8Axrfhtjl4wKFt8bdWFzmQ7NNteTLsejWMfkmvKyfe9kLvuVwXYtlPnyceV6s93tKpTL4T99W+XBEHUIEREAERCA/CUh4PdAurMLLS3ZGOJ3LDAeLqy25ZVc/Y+SVy6BSdJleYN93RjG5f1MOrIRvVaqHv3nYl4rARSAa9sqSeFbRZyvqtUtVxoLffI/2TXMKLyO1zhJRLiK8dvlYu7vZbfri0X3LzY8mIsvV21gzN7NmS5HZiWhBwuvkbRadaDUQuPSZ9L0xQvx8C5NyESy8zsUfHmiwAQ8lvmj6cT99Dk71TzCzwntem6+w9ohPpo/u6IOUQ+dgRMz7AcI7vsEzeDXqZR+6qH6Ye9NDmV4al+1lDjFbdpUUPHyU1FUEREAEREAE8pSAhNcD3sIsvIycsgWXDctMeIMnktmqCZkJrzNfN6tJcsH4sxJepzianFvbcll4V7YbjMG7v0gX3uCIbPBJBwvumLq+fGHHxDTLzwjveSMy5nymRXnXlSiOPnVqmSNYhrY01ri2cbh2zR3mvZTyp6Dhnmd9yw4/0MUf4e3W7nesOORb6S1hy+04nnAaro35CuOKpwv7+y0m44n4p802Pas+iWcuuzrTT4Ezt5ULMvz3pjYePjHqKgIiIAIiIAInh4CE1wP3wiy89rKDlxnOC+G1dWjdVGrITnjtohnvbd+JJkm+SVauI7y71vhFtMU7vuVk2YIjvJ6F17n0cVrpMcvZLDpx08fpJcnsSTDKO6O/7/WgCVCXTvjKVE/oUup3TEXapLT6HdDg9+GmN1MDrJhe1ykJn+59LEB420WvxcwS//Jf79ru09H3t1FmVbacCG+GWrQePjvqKgIiIAIiIAL5SUDC64G2hHdVpikNzghvdhLrHILstrX5wON370HXhKO+rq36oUWcbxKYifxymdeavkVOAnJ4+TPzeAfORYslw3JHeG0ViBrN0yeihRBec7DJHQFKt7MGr4v7z04cqxB1BD+XvM3X48KRaLDAd40UXrvN+P6N8egPvoitjfAGC++eaz9A+4+XIrr4fjzRdSD6nZX5Et/OCK+E18VgaRMREAEREIECSUDC62FYIlF4be6uxeZcntiZF+wUXpt762bFteyE158ewdzag2nLuV44Ei22Tk8XXqdQOnN4XQrv9g7/QI8dH5n9mRzerFIanPu3E9EyE94Qub1ubr+Aklml+vu69JqIS5fUNZFf1rMd//kf/hXZ7l1+ucmzzkx4mePcYtQCHE5MMX1tzdlQ5yLhdTNC2kYEREAERKCgE5DwehihoiC8NkUgeKEIi8Xm6FqZZdWGEV+O8FNzI7zcODh1IjPs2QmvrSHsr2fLHYUhvJesGo2dR3aa0whOaXDub/SefeiVUgxocpkvatxuaOCpuxVeuzBFGMvQBpTMssI7cC6uW1DML7lO4aXAOst/nRK1B9+UvDv9vEcdAqsSxNq849sAACAASURBVCUmo1mtCgF1d4PHhUsWD3lztXlZEV4PvyzUVQREQARE4KQSkPB6wF8UhDe4ZFh2ObyM3DpXVcup8DqrN4RCn53w+t8/mohpu3zL7NqIbMhle62QNu7pq5maltIwaN1Us2pbSOF1pDyYSWaJiemn6lwhja+GEt4Z/YDfPwWufxtocrmv
b6jtXN57Tnl1Vkqwq4O9cuM5mPrNJr/8Bguv6bOzR4Dwujx0gDhnFw12u09tJwIiIAIiIAL5TUDC64F4pAovkXEiGlcqY41cW36s/fT25lE6W/Cqam5r8bqp6GCjxfO3bUedlOOwk8zalKyOaetW+VZg2/2Lb2TrdwSWjPWtosZ82zQRHbR5VnjCG5x/u24uMHMAQKG2y8vaurrOFbh+fBuYPSxwO5f3HldAa/nEZ2ZrZ6UEm27AyOvyjfsyFV6zFO/q8yW8LnlrMxEQAREQgaJHQMLrYUyLgvAG18i1YhpcdsxNSoKz77QevtJYtuWm8Npz5uILN8YdxsoL7sTgbXPgF17n8r5pywUb4WVLk99Bx/7IUnhtykOGCG/wMrKh5Nb52ln9gc1LgU1fmUUtzHmEsQytLU3mTCvISnht/V5e8qOXN8UtXzgmpjnrFGdz/zujy4rwevhloa4iIAIiIAInlYCE1wP+oiC8wRHVvBJeexyu6tatfjd0rtvZrPIW3NxEeO2Ka7VTUrBg2w6s7PU8Bv/0fLrwlor1pS+wuRBeMzHNToBjn4FzMTtlH3bE78BVcx8xUWSWPjPpEPw71CppTpF1Ci9TGewKb9x3r4lA6wE5vutsFQYjrx1PNf1tbm/vs0/BXwcSAiK8fN9KshHVBVf6ot7OahIuzsK5MpqE1wUwbSICIiACIlAgCUh4PQyLhDcQXlYR3uDcYOfqbs69uBFebn/Ja62xM+q4WXEMPcYECq9zh0xvWDHZJ76lY4H5D5oyZoNKH/NHeEMJr6mHy8aKCwe3Av2mA5PTXnNGSLOL8K6YlC7faTLt33cO7r1Qk8ycyw5zVys27Q+oumCF9+fHL0GFmVeFFnYX5xCwn9LFXfTQJiIgAiIgAiJQsAhIeD2MR1EQXrsoxIAzB2DkuSNhpZXVGSb9NAlWTN2kNNhUg1Aya6OyFrdX4R37Wnu8HRWPKw/Ho1fHR33CG3sGpv24MHBEbWSWwntqR+C1y02UdlCtGunCm1QSw7avT++XWZ3cSR18UVLn+9kJL1MonG3Ez0DF+h7uuvSuTuH9bSerLqTg6we6oG6l0v6NuI0pO2ZLpQVHqF2ciRVe1vtVEwEREAEREIHCSEDC62HUioLwBldF8CK8NjobSmbtcXJLeFe+1hWDo/42JcWGtXvQm/CeqIBhW9ImufEEnQtXOO+PeQ+kR4ttHm52FRlsOgNTGWq2SF8Qw8N9Z7s6J7PZ1zKVUg/CS2lmy6peby5cjnYhAiIgAiIgAnlGQMLrAW2kCO8D5z6APh/3AfNvZ105K1NiWQlvXFIcKL2Lti7CR39+5I8cB+/MbUpDWMLLCWTjWwIV6+HhNlea82AbGiy8mU3qshUZKtYDRqzxnXqoRSZsCTJ7cTnMm83JLWmjr9kKr41EhxHhzcn5aFsREAEREAERKIgEJLweRiVShHdY62EYvGBwppJqEWYlvHab7Orsuhbe9/th8JFfTIS3bbt7TPrF0NOvwbAvxgeOqDOlgVHZtNXOJvYaY/rkSHi58fPNgUPb0mvsuhHePJTMHAtvq37A1ZM93PXqKgIiIAIiIAKFj4CE18OYFQXh5eU783NDpTS4FV7KLP+wPm+vhr1Ckt0evx09ZvUw2yzovSDDNnctugtfbvsSXNGta72umY/Ol2P8ywnbfGOzKtzsBwP7xNb1CaqtouBVeJdP9E18s3V3T7Lw2uoNriO8YZZF8/AxUVcREAEREAEROOkEJLwehkDCGx68rCbAua3XC7fCa0+xxxig3TB/hHZijwcw6fcZ6RHeXVuBY3G+rbOqU8slgp9v4duWE9A4CY5VHJx5v8EpDXkoma6FN7yhUi8REAEREAERKBIEJLwehlHCGx68YOFlVLhx5cZmxbZcE14b2bWnaCsrpEVkJ150ByZt+TRdeBGbXi83u4UZPrwjfREJW4UhuE9aJNkcIA+F1y4vnG2EN7yhUi8REAEREAERKBIEJLwehrGoCG/v
j3rjjwN/4L0r3sMz3z1jynU5y5K5TWlwi9IpvLaUGYX3Xx3+5T/+1O5TzbLFmbbMIrzzxviir87V1riTLIR3ao1uaLvzd/fC6yxF5kZ4w1xswg1Pu9qahNcNLW0jAiIgAiIQqQQkvB5GvqgIrzOqyklj+Sm89tgchvIlyqNxpcbm+G6Et/2mtxAfEw3WEH77t7eNpFPOTbNSGhzhTYvOTmxzDSbtW2XeDRBeNxUVbLUGp1RnFeHNrK6vh3vPdnUKb/lSxbBmVPdc2Kt2IQIiIAIiIAJFi4CE18N4SnjDgxdqklzwnrIV3nVzMWjxPVhVupSpHmEl3S+8VkqDhTdNhCe26olJcb7auwHC66aigs3RLQDC++o3mzB67lpzHeedWhnvDGkf3qColwiIgAiIgAgUYQISXg+DW1SE11ZGGN1hNOZsmJOvEd7zZ5yPw0mHTY1fplXYlq3wbv4ag+YPNsJbq2wt7DyyEzx/f3WI4IljNsqaJryz2/TFo/uWhye8nLg21rFaWqiosDOHd+QWoFRFD3da5l3tamsS3jzBq52KgAiIgAgUEQISXg8DWVSE11n7lhPIbEoBa++aCGj3qa7q8LpF6UyhsMewOcOuhTfxoE94D6UvCRwgyZkJb1rkd2WjCzE4eVO68KaW8K2i5ibCy15OoQ3Vx5YrM9secosmx9tJeHOMTB1EQAREQAQikICE1zHoJ06cQGpqKmJiYlzdChJeV5gybBRKeBmdfXTpo/5tOYGuSeUmWR7Amf9rxdw/0S0z4U17fWWDNmZpYtOPk9ZKVgc4Ac2t8NoFKLiDkyi82w4cRadnFpnrUEpDePejeomACIiACBR9AhLetDGm6D7yyCPmp6eeesrVyEt4XWFyJbw2imw3XnNz2tK9WRzCpmLYTVxFeEMJb92r0DaqbM6E1xnBzUp43Qp0eChNL7vamoTXA0R1FQEREAERKNIEJLwA5s2bh1GjRmH//v3o27evhDetSoJNN+CqZyO+HIHOdTvjhS4veP5AhIrwLu23FB1mdPDv243w2lSMkMLLF51pBzaHNy3/dmWpUhhcq7rpOrXVPWh7YHfOhNfW4uUOQi3Xa4VYwuv5ftEOREAEREAERMArAQkvgISEBMTFxWHcuHEoWbJkxAkv83YptxRaTiALzuENWLrXlv3ycOdZ4bUiXa54OSzrv8y/xDF3nevCy1XRKqZNNBsVC6fwvnfOQ2iyZ7NPeN0uEuEsexaqjxVeuwSxB17Zdb10wlf4bedhpTRkB0rvi4AIiIAIRCwBCa9j6B9//HGkpKRErPCyvNeOIzuwI34H5veejx6zehg6eSW8zsUtpvWY5l9lLU+E1zlxLEh411w4EWjQKWe/BJZPBOY/6OsTSnidi1N0TtsuZ0dwvbVdXlgpDa6RaUMREAEREIEII1CkhXflypX4/vvvQw5ppUqVTPqCs2UlvG+//XaG/TANYv369CoBhfXesRFeW8/WCqetl5vbwmtXV7MLRvC4FF77ep4L77SeWLl7tT+lISzhdU6KCyW8fH/ZRODMy4HWA/L01rDCe3fXM3DPxY3y9FjauQiIgAiIgAgURgJFWniXLFmCpUuXhhyXKlWqYMiQIa6Fd9q0aRn28/TTT0t4w7jrbe6tFWwrvM6cXDcpDbM3zA6o7JChz5i6vmWG2ZwR3g/vwPZf30WPunXMW2EJ766fgclpUWG3aRBhsHLTxa62JuF1Q0vbiIAIiIAIRCKBIi28OR1QpTT4ViyzEdbeH/U2i0Ewt/fLbV8GLt2bU7iO7TMT3oVbF5rJcfb42R3CRqbtdhmEN7NauD++Dcwehhan1gtfeI1Ex/oOnYdLB2fHgO9TeD9buwuDO56GPuec4qaLthEBERABERCBiCIg4QVw/Phx8+eJJ54wObyjR49GsWLFEB0dneXNUFTKkmWW0mAnl4VcutfDxyQz4bXnYSexZXeIsIX34BZgfMt04e3+NlCzZXaHy/i+jSCfZOHN+Ymr
hwiIgAiIgAhEFgEJL4AZM2bgscceCxh5piv06dNHwrt7FfJKeC1c5ggPC6P6Q9jCa8KizTExypfuMGzE1vA+9fMeAHatAS4dG54wh3dU9RIBERABERABEcghAQlvDoE5N1eENzx4wfVzT4rwOuvo5uHSv+ERUi8REAEREAEREIHcJCDh9UCzqAjvuv3r0OfjPmhcuTF+3/+7IcJ82LxOafAa4Y1List6sQpGYFdM9h0mWGrT8nhDvufhnlBXERABERABERCBgkdAwuthTIqK8BKBLUFmceSl8AanIoQb4Q0+7wyT1pyLQwQLr7PKgiK8Hj4F6ioCIiACIiACBZ+AhNfDGBV14c2t1INgxHkhvLXK1sJn134WeCgrvDWaA0NDl6fzMPzqKgIiIAIiIAIiUEgISHg9DJSENzx4wcLLJYa71usa1s5sZNrW8g3YiRXe+h2AQZ+GtX91EgEREAEREAERKPwEJLwexjDShHd0h9Ho1bCXB2K+rsHCO7X7VLSt2Tas/Up4w8KmTiIgAiIgAiIQUQQkvB6GO9KE14uYOjHnpvDaxTFCRnjt8r+K8Hq4y9VVBERABERABAo/AQmvhzGU8IYHLzeF11lJYlqPoOWfJbzhDZB6iYAIiIAIiEARIyDh9TCgRV1431z7Jp5d+ayfUG5FeG0ZNLtjL/vNUni5otoP04EzL9PCEB7uc3UVAREQAREQgcJOQMLrYQSLuvDmZiQ2GLOzDNp7V7yHJpWbhDUSWQpvWHtUJxEQAREQAREQgaJGQMLrYUSLkvDaXFiLgzVt80t4M9TPzcGYWOHtXLczXujyQg56alMREAEREAEREIFIISDh9TDSRUl4rTgWNuG1tYK9LF7h4RZQVxEQAREQAREQgUJAQMLrYZAkvOHDc6Y0eInwSnjDHwP1FAEREAEREIFIISDh9TDSRVV4yxUvh2X9lxWKlAYrvPe3vR83Nr3Rw2iqqwiIgAiIgAiIQFElIOH1MLJFVXhtTdvt8dvRY1YPPyEv1RSCMedWhNfD8KmrCIiACIiACIhAhBCQ8HoY6KIuvETjFNPcFN4sF4zwMCbqKgIiIAIiIAIiIALBBCS8Hu4JCW/48FROLHx26ikCIiACIiACIpAzAhLenPEK2FrCGz48CW/47NRTBERABERABEQgZwQkvDnjFdHCu7TfUlQoUcEDsfSuEt5cwaidiIAIiIAIiIAIuCAg4XUBKbNNilKE11Y74LXaSWv8d15NLpPwerjx1FUEREAEREAERCBHBCS8OcIVuHEkCK9zBTYv9XKDMVvhHXDmAIw8d6SHUVBXERABERABERABEciagITXwx0SCcLrXIEtN4X3rkV34cttX0IrpHm4AdVVBERABERABETAFQEJrytMoTeS8IYPTyukhc9OPUVABERABERABHJGQMKbM14BW0t4w4c3e8NszNkwB1c1vAq9GvYKf0fqKQIiIAIiIAIiIALZEJDwerhFJLwe4KmrCIiACIiACIiACOQTAQmvB9CRILw215aYcjOH1wN2dRUBERABERABERCBHBGQ8OYIV+DGkSC8znJlEl4PN4u6ioAIiIAIiIAInDQCEl4P6IuS8DKn9tGljxoazjq8El4PN4i6ioAIiIAIiIAIFAgCEl4Pw1CUhHflrpUYvGCwhNfD/aCuIiACIiACIiACBZOAhNfDuEh4PcBTVxEQAREQAREQARHIJwISXg+gI0l4yxUvh2X9l3mgpa4iIAIiIAIiIAIicHIISHg9cI8E4bW5vc68Xg/I1FUEREAEREAEREAE8p2AhNcD8kgQXpvbK+H1cKOoqwiIgAiIgAiIwEklIOH1gL+oCu/QVkMxrPUwQ0bC6+EGUVcREAEREAEREIECQUDC62EYJLwe4KmrCIiACIiACIiACOQTAQlvGuijR49i//79qFWrFqKjo13hjwThjUuKw+/7f0f5EuXRpHITV1y0kQiIgAiIgAiIgAgUJAISXgB33HEHFi5caMal
cuXK6N27N+6///5sx6koCS/FtsOMDuaanSkN2ULQBiIgAiIgAiIgAiJQwAlIeAFMmDABPXr0QP369fHtt99iyJAheP/999GqVassh68oCS8vtMXrLSS8BfwDq9MTAREQAREQARHIOQEJbwhmnTp1Qv/+/TF06FAJb87vKfUQAREQAREQAREQgQJFQMIbNBybN2/GxRdfjClTpqBz587+dxMTEzMMXIsWLbB+/foCNaBeTkYRXi/01FcEREAEREAERKCgEijSwrty5Up8//33IdlXqlQJffv2DXgvPj4e119/PcqXL4+33noLMTEx/vevuOKKDPtZt26dhLeg3tk6LxEQAREQAREQARFII1CkhXfJkiVYunRpyMGuUqWKydW1jVUahg8fjp07d2L69OmgEGfXlMObHSG9LwIiIAIiIAIiIAInn0CRFl63eOPi4jBs2DAkJCTg1VdfdSW73LeE1y1hbScCIiACIiACIiACJ49AxAsvJZdlyFJSUvDiiy+iXLlyZjSYzsCavFk1Ce/Ju3F1ZBEQAREQAREQARFwSyDihXfXrl1gVYbgxnq8K1asiCjhbT+9PeKT41WH1+2nR9uJgAiIgAiIgAgUCgIRL7xeRqmoRXgHzR+EVbtXSXi93BTqKwIiIAIiIAIiUOAISHg9DImE1wM8dRUBERABERABERCBfCIg4fUAWsLrAZ66ioAIiIAIiIAIiEA+EZDwegAt4fUAT11FQAREQAREQAREIJ8ISHg9gJbweoCnriIgAiIgAiIgAiKQTwQkvB5AS3g9wFNXERABERABERABEcgnAhJeD6AlvB7gqasIiIAIiIAIiIAI5BMBCa8H0EVVeMd3Ho+u9bp6IKOuIiACIiACIiACIlBwCEh4PYxFURNeDyjUVQREQAREQAREQAQKLAEJr4ehkfB6gKeuIiACIiACIiACIpBPBCS8HkBLeD3AU1cREAEREAEREAERyCcCEl4PoCW8HuCpqwiIgAiIgAiIgAjkEwEJrwfQEl4P8NRVBERABERABERABPKJgITXA2gJrwd46ioCIiACIiACIiAC+URAwusBtITXAzx1FQEREAEREAEREIF8IiDh9QBawusBnrqKgAiIgAiIgAiIQD4RkPB6AC3h9QBPXUVABERABERABEQgnwhIeD2AlvB6gKeuIiACIiACIiACIpBPBCS8HkBLeD3AU1cREAEREAEREAERyCcCEl4PoCW8HuCpqwiIgAiIgAiIgAjkEwEJrwfQEl4P8NRVBERABERABERABPKJgITXA2gJrwd46ioCIiACIiACIiAC+URAwusBtITXAzx1FQEREAEREAEREIF8IiDh9QBawusBnrqKgAiIgAiIgAiIQD4RkPB6AE3hVRMBERCBnBBo164dli9fnpMu2lYEREAEsH79elHwQEDC6wHem2++idTUVNx0000e9qKu2RH4448/MH78eEycODG7TfW+RwJ9+/bF5MmTUblyZY97UvesCDz33HNo0qQJevbsKVB5SODbb7/FggUL8MQTT+ThUbRrErj44ovx+eefC4YIFFgCEl4PQyPh9QAvB10lvDmA5XFTCa9HgC67S3hdgvK4mYTXI8AcdJfw5gCWNj0pBCS8HrBLeD3Ay0FXCW8OYHncVMLrEaDL7hJel6A8bibh9QgwB90lvDmApU1PCgEJrwfsEl4P8HLQVcKbA1geN5XwegTosruE1yUoj5tJeD0CzEF3CW8OYGnTk0JAwusBu4TXA7wcdJXw5gCWx00lvB4Buuwu4XUJyuNmEl6PAHPQXcKbA1ja9KQQkPCeFOw6qAiIgAiIgAiIgAiIQH4RkPDmF2kdRwREQAREQAREQARE4KQQkPCeFOw6qAiIgAiIgAiIgAiIQH4RkPDmF2kdRwREQAREQAREQARE4KQQkPB6wH748GEkJyerSL8HhuF0FfdwqPn6cKGU48ePo1ixYhl2cuLECfz999+oWrVqyPeTkpKwf/9+1KhR
A1FRUeGfRIT0TElJCckxu8vPbhyy6x9J7x86dAiJiYnmngzVsrtn9+zZg3LlyqF06dKRhC3H13r06FHz2a9Vqxaio6Nz3F+/s3OMTB3ygICENwyoR44cwT//+U8sXLjQ9G7durVZBaxatWph7E1dnAQuvfRSbNiwIQDKXXfdhTvvvBPi7v1emTNnDsaNG4evv/46YGdffvkl7rnnHsOYbfTo0bj++uv9kvzSSy/hhRdeMD9zFbZXXnnF3PdqoQls3boVXbt2xeLFi1GnTh3/Rk8//TSmTZsW0Onss8/GO++8Y17LahzEOp0ARbV///7YvHmzebFhw4a44447cNVVV7m6Z7ds2YJbb73V379Pnz5mNbbixYsLcxABcrX/1/Gz37t3b9x///3+rfQ7W7dMYSEg4Q1jpPifPf+DmjFjBsqUKWN+cZ5++ungf2Zq3gjwl+cVV1wRsORqbGwsKlWqZCRL3MPjy//gBw0ahG3btqFmzZoBwsvoTfv27XH33XfjxhtvxKJFizB8+HDzd926dfH999/juuuuM/d7y5YtzTLPH3/8MZYsWRJWtCe8Kyg8vShPP/74oznhYOF96qmnQBl+8MEH/RdUqlQpMybZjUPhIZD3Z7p792588MEHuPrqq1G2bFm89tprePXVV7Fs2TITrc3unh08eLCJ7D7zzDPYuXMnrrnmGiO8Vpjz/goKzxEmTJiAHj16oH79+mCZtyFDhuD9999Hq1atzEXod3bhGctIP1MJbxh3AH8p9uzZ03zw2ebNmwdGIVkvVo96wwDq6MJfnvzPiNIQ3MQ9fLZ8vL5371588cUX5ouDM8LLqOLtt9+OX3/9FSVKlDAHYU1Nyu9NN92EZ599FmvXrjVSwUbZ6NixIxgtbtq0afgnVUR7kg8livdwKOE9cOCAibIHt+zGoYjiypXL4he5Ll26mC9lbdq0yfKeZcSd2/DLM6PrbJRdjtnkyZNz5XyK8k46depkoutDhw71C69+ZxflES861ybhDWMs+Sh3zJgx5pstG0WhV69eWLVqFRiNVAufAJky8sJHlLVr18aVV15pIgts4h4+V9vzk08+wdixYwOEd+bMmSY69vnnn/sPwMeYp512mnl0OWLECBNhf/zxx/3vn3HGGZgyZQo6d+7s/aSK4B527doFikEo4Z0/f76JqJNpt27d0LZtW0Mgu3Eogphy7ZIYcWTUfMWKFSblJqt7lk8t+Htm6dKlqF69ujkHfpn78MMPzZc4tcwJMIWEX4adn339ztYdU1gISHhzOFKc9NOoUaOADzxzTvmh5yNeSppa+ASYJ8pJEeTMaCQfxfM/oXr16ol7+Fj9PUMJLyO+n376acB/9hQGfvH417/+ZSLuTZo0Ccjb45cP5vky/UQtI4HMhJf38qZNm1CyZEmsWbPGfMngPc/fH9mNgziHJsAna1whkCk7TMthy+qeZYSXKTrOAAW/bLz88ssZctvFPJ1AfHy8yesvX7483nrrLcTExJg39Ttbd0lhISDhDWOk+J89o2TMa2JThDcMiC66cIY1J/7cfPPNJk9a3F1Ay2aTcCO8jJo99thj/r0rwps16MyEN7jXfffdB6Y4MMKuCG/O7++//voL/fr1w3nnnWfyca2E8QtbZvesjfAyH9VONFaEN2v2zC9nXj/TPqZPn26eToRq+p2d83tYPfKPgIQ3DNbKJQ0DWphdOJmEj81ZpUHcw4To6BZKeEPljjIfcuDAgf4c3nXr1mHq1KlmT8rhzX4c3Aovc3lXr15tck+zG4fsjxpZW6xfvx433HCDSQthDq6z1B7zzjO7Z0Pl8I4aNQocM+XwZryH4uLiMGzYMCQkJJgvZpnJru2p39mR9TksTFcr4Q1jtJzVAjhD+JZbblGVhjA4Bndh+gLL33BCYJUqVcxjdkbAGFFgnqO4hw+ZKSKsGc0JlpQscmbqCCWB/5FxxvVDDz2UZZUGRiBZpeG5557D3LlzVaUhk+EgZ0bC+HSCKQsU
LFvuiuyZ78+8dAoZJwZywqAViqzGIfzRL3o9yY7pNPzDaK6tDcuqOYzs2ioNmd2z/DJXoUIFVWnI5tbg7waWIeOk1xdffNGkObExks6avPqdXfQ+W0X5iiS8YYwuc5lYs5QTUthatGiBSZMmZVr8PIxDRGQX/vJkxIaRFtsoYczNYxP38G8LRsP4RcLZGDG31QIowJyoZhsjXgMGDDA/UpZZmog5jmz8ksdor53hHv5ZFc2eTL2x9Yx5hRQwTqZiY/SLubu22XJYLE3GltU4FE1a4V0Vn1RQdIObvaezu2c3btxo0qRY3cGOC3PSbZWS8M6q6PWyTyqCr8ze0/qdXfTGvChfkYTXw+hylR/mLGnBCQ8Qg7ryPyqu6ENh4ATAUCuCiXvu8XbuiSuwMTLJmeuh/uPnilb79u0Le7WlvDnrwrdXrjrFe5ycQ63wld04FL4rPnlnnN09S6Fj1NJGLk/emRbeI+t3duEdu0g7cwlvpI24rlcEREAEREAEREAEIoyAhDfCBlyXKwIiIAIiIAIiIAKRRkDCG2kjrusVAREQAREQAREQgQgjIOGNsAHX5YqACIiACIiACIhApBGQ8EbaiOt6RUAEREAEREAERCDCCEh4I2zAdbkiIAIiIAIiIAIiEGkEJLyRNuK6XhEQAREQAREQARGIMAIS3ggbcF2uCIiACIiACIiACEQaAQlvpI24rlcEREAEREAEREAEIoyAhDfCBlyXKwIiIAIiIAIiCPgNEgAADhBJREFUIAKRRkDCG2kjrusVAREQAREQAREQgQgjIOGNsAHX5YqACIiACIiACIhApBGQ8EbaiOt6RUAEREAEREAERCDCCEh4I2zAdbkiIAIiIAIiIAIiEGkEJLyRNuK6XhEQAREQAREQARGIMAIS3ggbcF2uCDgJfPnllzh+/DiaNWuGWrVq+d/aunUr/vzzT3Tu3PmkAPvxxx/xwQcfYNmyZejZsyfuueeeXDmP77//Htu2bcNVV12VK/sL3sm8efNQoUIFdOjQIU/2n587nTFjBr7++mtMnDgxTw+bkJAAcmvRogUaNWqUp8fSzkVABCKXgIQ3csdeVy4COOOMMwyFCy64AK+++qqfyJtvvoknn3wS69evz3dKR44cQevWrY00XnjhhahUqRJ69eqVK+fxyCOP4J133vFf14svvoi33noLK1asyJX9d+rUCU2bNsUrr7ySK/s7mTt5/vnnzZcOSm9utVC8d+zYYcb5oYcewqBBg3LrUNqPCIiACAQQkPDqhhCBCCZA4W3YsCE2bNiAt99+G+eee66hcTKF97PPPsPw4cPx3XffGdnNzcZoYnJyMmJjY81uX3jhBXPduSW8cXFxiI6ORrly5XLztE/KvvJCeEPxPnHiBA4dOoQyZcqgZMmSJ+VadVAREIGiT0DCW/THWFcoApkSoPAysjZr1iyULl0a7777LqKiokIK75w5c/Df//4Xv//+Oxo3bozbbrstrNSAPXv2YMyYMVi6dCmOHTuGbt26YeTIkahatapJYbj//vuxa9cunH322ea8ue1pp52W4RqYisHH7oxCbty4EfXr18cll1xiZHn69OlYvnw57rrrLnMtTM8YMWKE+ZvHpXgtWbLEHGv//v3+YzGS3K9fPxw+fBjPPfccvvjiC3Mu7dq1M5zOPPNMcx7896mnnmoewZPL33//jZdffhn/+c9/UKdOHQwZMsRsl9W18n17noxm8zz/+usvXHfddRg8eDBq1KiR6bjx2t944w0zXvyywvHgdV966aWg1N9yyy0mKs592bZ582Y88MADuO+++9C2bVvD5pdffjEpHpUrVwaj03yvZs2apkuw8JIf+w0YMMC/z9GjR6NixYq48847zWtZ7TMz3ldffTVuvvlm/OMf/zDnwMbzGjt2rPkiUrduXVx55ZUYNmwYSpQo4edfpUoVUJY/+ugjFC9e3JzXjTfe6N+GaTEcZ6axUKRbtmxpGPHpgZoIiEDkEZDwRt6Y
64pFwE+AwvvYY4+hXr16uPXWW82j+C5dumQQ3o8//hj33nuvERJKFXMu+aibUnT55Ze7JsroKnNyKYKUMjamUlSrVg2ffvop+Hj76aefxqJFizBq1Cjzfo8ePUC5CW7PPvusEXCeL7dZt24dpk6datIV/v3vf2PKlCmmS5s2bYw89u/f38iufUxPUeSxeB32WBTaVq1aoU+fPjh48KARKMrg66+/bqSa25YvX96I/tq1a83+efyYmBizL34JIFP+O7trpaTZ86RkXn/99ShWrBjGjRtnhJnymVljP0amKXk8X7L75JNP8N577xmhoxyuWbPGSD0jzmyUcUo1v1Twyw23Oeuss4xQUvonTJhghP61114LKbyWM78k2HbDDTegevXq5ssBW1b7zIx3kyZNzDXwusmV+eNdu3ZFgwYNMHDgQMOZYs8vIkyzYbP8+aWIY88+TE3hvcT0HF7PeeedZ/6QUXx8PObPn2+eYNgvI65vWm0oAiJQJAhIeIvEMOoiRCA8AlZ4KXaMBvKR/Ny5c03k0ZnDe/HFF5tHzoxm2kbRZYT2888/d31wijKjgJRROyGOcksJYX4n5YXvcaIUI3SZNQrz+eefHyBB3Hb37t1Gbq0Qzpw5ExQq24KjlqEesS9cuBB33HGHXx7Zl1FtXi+juIwiU7gYbeQXBAqxbRRlK7xurpXn+f7772Px4sVGQtkoy5xMmBnXffv2mYjz//3f/+H22283fVJSUozY8/gPP/ywEXNGiSm43Jby3bFjR3PejE47G8fwwIEDmDZtmvnCwC8OFPhgVm6E1+43s32G4s2ItFN4n3jiCSOvq1at8qeePPPMM/jf//6Hb775xowvr4OiznuGTyTY+EWM1/r444+be4cseDy+btvRo0f9nF3ftNpQBESgSBCQ8BaJYdRFiEB4BJzCu3r1ahNlZKSN4muFl/LSvHnzDFFHG5389ddf/Y+RszsLCgolhMdiNQM25m9S1ijCfDTuRniZrkBJnzx5sokGBjeeG4WTMu1sboSXUjt+/Hgz+cw2phBQeu3EKgoXJc1GHEMJr5trDXWelE6mcWQ2YZC5zYxaUvgYbbaNkVBKKSWc53vRRReZCCfH036pIBPmbLPx37xWXpez2fEMR3iz26cb4WUkl/ccI/G2MVLNJxAUYV5TKP6MrrMx6p+UlGSeRjDSyy9rjGRfdtllqF27dna3qN4XAREoogQkvEV0YHVZIuCGgFN4uT0jrRSnm266CUwZoHTxcTCFgSkNQ4cO9e/2pZdeMo/Bf/75Z9dRM8oXhYw5mnaCUmJioilJxfxK5om6EV4rQM6Jds7r9SK89hwZUQxuzNtl+ocb4XVzraHOk7m5zI3NTHi/+uorkw5iU1Gc58hJfsxVZSNnngMF+cEHHzRfLJjzzGYjwNdcc42JkvOaGFFmFYushJfyyH3Z5kxpcLNPN8LLcypbtqyJTttmJZ/pFsx3DsWfUXmKPoWXjddLlswDtpMSM/uC5Oazom1EQAQKNwEJb+EeP529CHgiECy8fJx9xRVXmMf0jI5Z6WJUjdsywmYbRYl5rVYmKMZ8PM2+zEUN1fj4nsLkFFX2pzhxklLv3r1dCe+WLVvMZLfgXFcKDx/HuxVeSuGkSZMC0ic+/PBDM5mNebG2bJu9ltTUVPMI3Y3wurnWcITX5rgyuswxcDZ7fnyN6R1MY2DaA79EMGrNKCeblXF+uWEuMRsnLnLyYGbCyxQBTh5keoHzHmD9ZubwutlnKN7BKQ3MXWbqjPOLlI2WcxIhJydmJ7z2PrDnyXxsnv/pp59ungqoiYAIRB4BCW/kjbmuWAT8BIKFl29Y4eC/rfDaaCEnJTGHldFAPg7ntnYSEKOSjKhx8lRmCwgwVYKTiigtd999t5FHihgFlpFLpjm4ifDy3BjlZGoE/2aeJiswUIyYg+xWeG2uJ2Wbi2/wfFhlgfsrVaqUvxrDpk2bQBGmaDGFwo3wurnWcISX185x4Bg89dRT
Jh1k7969ZoIaz9852Y1pIkwzYMSUXyxsVJ05whRhCi4ncjHiznHgl5zMhNfKKuWW+cbkwT/8gsTX3OwzFG+mZjhzeG1qDceAecj8Esb0Co4P0z3YshNepnDwSxX7U9JZoYJPLZgWwUoVaiIgApFHQMIbeWOuKxaBAOHlJB9GWG2zEUSn8DInkpE9Cq1tnEHPiVO2VBTli4+csxJe9qX0MFeX5b7YWKGAomrLRfGRNGU6q0lr7EfJY5STQmcbH7lzwhujjYzQBufwUuoYybSLKTASSAGyk/FsWgXlmZUbmCtsG3N6yYCT4PjYnXnNWeXwurnWUOfppgYyH9ez6oJNUeCxGFlnmoON4vI1VqXgOAVHwjnJjVFsVt+wfZm2wgl7VniDWW3fvh2PPvqonx1Z8wuRlVU3+wzFm+fGNAxbpYHnw2oTzsl1TGMge1uqLRR/fgng/inmzEtmegwrQ9jrs+kYlH81ERCByCMg4Y28MdcVi0DYBDjLfefOnWYZYltVIJyd8dG7U3jtTPtw9sUJTqyDy9JmjMqG07i6Gx+tsxaw81yYX8yKEMyNDXcxidy81uBro2Ty2nndzmoRbhnwUT/l+ZRTTjGpIG4aUyV4PLt4R3AfN/vMjLdzX7w2Sjaj/uEuQMJ6yjwfXp+Xe8wNF20jAiJQsAlIeAv2+OjsREAEREAEREAEREAEPBKQ8HoEqO4iIAIiIAIiIAIiIAIFm4CEt2CPj85OBERABERABERABETAIwEJr0eA6i4CIiACIiACIiACIlCwCUh4C/b46OxEQAREQAREQAREQAQ8EpDwegSo7iIgAiIgAiIgAiIgAgWbgIS3YI+Pzk4EREAEREAEREAERMAjAQmvR4DqLgIiIAIiIAIiIAIiULAJSHgL9vjo7ERABERABERABERABDwSkPB6BKjuIiACIiACIiACIiACBZuAhLdgj4/OTgREQAREQAREQAREwCMBCa9HgOouAiIgAiIgAiIgAiJQsAlIeAv2+OjsREAEREAEREAEREAEPBKQ8HoEqO4iIAIiIAIiIAIiIAIFm4CEt2CPj85OBERABERABERABETAIwEJr0eA6i4CIiACIiACIiACIlCwCUh4C/b46OxEQAREQAREQAREQAQ8EpDwegSo7iIgAiIgAiIgAiIgAgWbgIS3YI+Pzk4EREAEREAEREAERMAjAQmvR4DqLgIiIAIiIAIiIAIiULAJSHgL9vjo7ERABERABERABERABDwSkPB6BKjuIiACIiACIiACIiACBZuAhLdgj4/OTgREQAREQAREQAREwCMBCa9HgOouAiIgAiIgAiIgAiJQsAlIeAv2+OjsREAEREAEREAEREAEPBKQ8HoEqO4iIAIiIAIiIAIiIAIFm4CEt2CPj85OBERABERABERABETAIwEJr0eA6i4CIiACIiACIiACIlCwCUh4C/b46OxEQAREQAREQAREQAQ8EpDwegSo7iIgAiIgAiIgAiIgAgWbgIS3YI+Pzk4EREAEREAEREAERMAjAQmvR4DqLgIiIAIiIAIiIAIiULAJ/D+92WNrlUOMCAAAAABJRU5ErkJggg=="
},
"metadata": {},
"output_type": "display_data"
@@ -247,7 +247,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" lower_bounds=np.arange(5) - 2,\n",
@@ -282,7 +282,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" constraints=[{\"loc\": [1, 3], \"type\": \"fixed\"}],\n",
@@ -308,7 +308,7 @@
{
"data": {
"text/plain": [
- "array([ 0.33334, 0.33333, 0.33333, -0. , 0. ])"
+ "array([ 0.33333, 0.33333, 0.33334, -0. , 0. ])"
]
},
"execution_count": 11,
@@ -318,7 +318,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.array([0.1, 0.5, 0.4, 4, 5]),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" constraints=[{\"loc\": [0, 1, 2], \"type\": \"probability\"}],\n",
@@ -363,7 +363,7 @@
{
"data": {
"text/plain": [
- "array([ 0., -0., -0., 0., -0.])"
+ "array([ 0., -0., -0., 0., 0.])"
]
},
"execution_count": 13,
@@ -373,7 +373,7 @@
],
"source": [
"res = om.maximize(\n",
- " criterion=upside_down_sphere,\n",
+ " fun=upside_down_sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_bfgs\",\n",
")\n",
@@ -423,10 +423,10 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
- " derivative=sphere_gradient,\n",
+ " jac=sphere_gradient,\n",
")\n",
"res.params.round(5)"
]
@@ -456,7 +456,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" numdiff_options={\"n_cores\": 6},\n",
@@ -490,7 +490,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(10),\n",
" algorithm=\"scipy_neldermead\",\n",
" soft_lower_bounds=np.full(10, -5),\n",
@@ -515,7 +515,7 @@
"outputs": [
{
"data": {
- "image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7fAAAgAElEQVR4XuzdCXhU1f3/8W9WIGGHIAEBJcq+G9lkESX9C1pBtNhSii2ubalaQSvuuLX4c6HaaqVqq6hQFcsmIigIoqBgkX0JiOxL2IUQkpD8n+/BO06GLDNzZiaTm/d9Hh+E3HPuOa9zCZ85OffcmMLCwkLhQAABBBBAAAEEEEDApQIxBF6XjizdQgABBBBAAAEEEDACBF5uBAQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggggAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggggAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggggAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggggAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggg
gAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwMs9gAACCCCAAAIIIOBqAQKvq4eXziGAAAIIIIAAAggQeLkHEEAAAQQQQAABBFwtQOB19fDSOQQQQAABBBBAAAECL/cAAggggAACCCCAgKsFCLyuHl46hwACCCCAAAIIIEDg5R5AAAEEEEAAAQQQcLUAgdfVw0vnEEAAAQQQQAABBAi83AMIIIAAAggggAACrhYg8Lp6eOkcAggggAACCCCAAIGXewABBBBAAAEEEEDA1QIEXlcPL51DAAEEEEAAAQQQIPByDyCAAAIIIIAAAgi4WoDA6+rhpXMIIIAAAggggAACBF7uAQQQQAABBBBAAAFXCxB4XT28dA4BBBBAAAEEEECAwGtxD1x44YWSmZlpUQNFEUAAAQQQQAABBMIt4KrAm5eXJy+99JK89957cvr0aWnRooU8+eSTkpqaahxXrVol9957r2RlZUmrVq3kmWeekQYNGpivTZ06VSZMmCC5ubmSkZEh48aNk7i4uFL9Cbzhvj2pHwEEEEAAAQQQsBdwVeA9cuSIvP322zJixAipXr26vPDCC2YG9vnnnzcBWIPsI488In369JHXX39dvvjiC3n55Zdl69atpsyUKVNMAB49erR06tRJRo4cSeC1v8eoAQEEEEAAAQQQKFcBVwVeX8l169bJPffcI7NmzZKVK1fK448/Lu+++645raCgQHr06CEff/yxTJ48WY4dOyZjxowxX1u/fr2MHTtWpk2bRuAt19uTiyOAAAIIIIAAAvYCrg68b731lqxdu9Ysa5g+fbqZ0R0/frxH7brrrpOHHnrIzOymp6fLkCFDzNdOnTplfr969WoCr/09Rg0IIIAAAggggEC5Crg28O7du1d+9atfyb///W9p3Lix/Oc//xGd8dW1uc4xfPhwGTVqlLzzzjvSv39/GThwoOdruj5306ZNEhMTY/5szZo1Zw3UNddcw0Nr5Xr7cnEEEEAAAQQQQKBsAVcG3sOHD8sNN9wgd999t/Tu3dsozJgxQxYuXGgeVHOOQYMGyaOPPmoCb8eOHWXo0KHmS8ePH5fu3bsXCbl33nnnWZoffPABgbfse4wzEEAAAQQQQACBchVwXeD9/vvv5Te/+Y3ceOONMmDAAA+uLm24//77Pety8/PzpWvXrjJ//nx5//33Zd++fWbdrh66m4OeO3PmzFIHh10ayvXe5eIIIIAAAggggIBfAq4KvCdOnJCbbrrJLGXwXp6gEvqQ2hVXXCEPPvigmfXVXRr0gbVJkybJzp07ZdiwYebhNWeXBt3STJc7lHYQeP26xzgJAQQQQAABBBAoVwFXBV59+EwDbWxsbBFUXb+r24xt2LDB7Nqwe/duSUtLk6efflqaNGliztWdHPSBtpycHOnbt6950C0xMZHAW663JxdHAAEEEEAAAQTsBVwVeO05AquBGd7AvDgbAQQQQAABBBAoDwECr4U6gdcCj6IIIIAAAggggECEBAi8FtAEXgs8iiKAAAIIIIAAAhESIPBaQBN4LfAoigACCCCAAAIIREiAwGsBTeC1wKMoAggggAACCCAQIQECrwU0gdcCj6IIIIAAAggggECEBAi8FtAaeF98+i9So1Yt6X5pf4uaKIoAAggggAACCCAQLgECr4WsE3i1ioxB11rURFEEEEAAAQQQQACBcAkQeC1kvQNvv4FXS3xCgkVtFEUAAQQQQAABBBAIhwCB10LVO/CmX9JH6tRPsaiNoggggAACCCCAAALhECDwWqgSeC3wKIoAAggggAACCERIgMBrAU3gtcCjKAIIIIAAAgggECEBAq8FNIHXAo+iCCCAAAIIIIBAhAQIvBbQBF4LPIoigAACCCCAAAIREiDwWkBr4H1j4kty/NhR4aE1C0iKIoAAAggggAACYRQg8FrgauCd/K9X5PDBAwReC0eKIoAAAggggAAC4RQg8FroauCdM3O6fLtxvTRv2VrSWrWx
qI2iCCCAAAIIIIAAAuEQIPBaqBJ4LfAoigACCCCAAAIIREiAwGsBTeC1wKMoAggggAACCCAQIQECrwW0d+BNSW0knbr2sKiNoggggAACCCCAAALhECDwWqh6B16tJmPQtRa1URQBBBBAAAEEEEAgHAIEXgtV38Db/dLLpUat2hY1UhQBBBBAAAEEEEAg1AIEXgtR38DLXrwWmBRFAAEEEEAAAQTCJEDgtYDVwPvVki9kw+qV5uUTHbv2kAapjSxqpCgCCCCAAAIIIIBAqAUIvBaiGngzMzNly4Z17MVr4UhRBBBAAAEEEEAgnAIEXgtdAq8FHkURQAABBBBAAIEICRB4LaAJvBZ4FEUAAQQQQAABBCIkQOC1gCbwWuBRFAEEEEAAAQQQiJAAgdcCmsBrgUdRBBBAAAEEEEAgQgIEXgtoJ/Du37NbVn61RFIapkqnbj0taqQoAggggAACCCCAQKgFCLwWok7gPXwgS5Z/vkjq1Ksv6b36WtRIUQQQQAABBBBAAIFQCxB4LUQJvBZ4FEUAAQQQQAABBCIkQOC1gPYNvDVq1ZL0S/pKfEKCRa0URQABBBBAAAEEEAilAIHXQtM38GpVvF7YApSiCCCAAAIIIIBAGAQIvBaoTuD9/ugRWfrpJ6amlu06SNO0Cy1qpSgCCCCAAAIIIIBAKAUIvBaaTuDVKuZNn2pqat6ytaS1amNRK0URQAABBBBAAAEEQilA4LXQJPBa4FEUAQQQQAABBBCIkACB1wKawGuBR1EEEEAAAQQQQCBCAgReC2jvwLtlwzr5duN6SW3STNp1SbeolaIIIIAAAggggAACoRQg8FpoegdeXj5hAUlRBBBAAAEEEEAgjAIEXgtcAq8FHkURQAABBBBAAIEICRB4LaAJvBZ4FEUAAQQQQAABBCIkQOC1gPYOvM5evPq2te6X9reolaIIIIAAAggggAACoRQg8FpoegdercbZizdj0LUWtVIUAQQQQAABBBBAIJQCBF4LTQKvBR5FEUAAAQQQQACBCAkQeC2gCbwWeBRFAAEEEEAAAQQiJEDgtYD2DbwLPpgu+fn50m/g1RKfkGBRM0URQAABBBBAAAEEQiVA4LWQ9A28yxcvlMMHD0j6JX2kTv0Ui5opigACCCCAAAIIIBAqAQKvhSSB1wKPoggggAACCCCAQIQECLwW0AReCzyKIoAAAggggAACERIg8FpAE3gt8CiKAAIIIIAAAghESIDAawFdUuDt2LWHNEhtZFEzRRFAAAEEEEAAAQRCJUDgtZAsKfA2b9la0lq1saiZoggggAACCCCAAAKhEiDwWkgSeC3wKIoAAggggAACCERIgMBrAV1S4E1t0kzadUm3qJmiCCCAAAIIIIAAAqESIPBaSJYUeOvUqy/pvfpa1ExRBBBAAAEEEEAAgVAJEHgtJAm8FngURQABBBBAAAEEIiRA4LWA9g28Wzask283rhdmeC1QKYoAAggggAACCIRYgMBrAeobeA8fyJLlny8i8FqYUhQBBBBAAAEEEAi1AIHXQtQ38Obn5cmC2TMkPiFB+g282qJmiiKAAAIIIIAAAgiESoDAayHpG3i1qnnTp5oaMwZda1EzRRFAAAEEEEAAAQRCJUDgtZAk8FrgURQBBBBAAAEEEIiQAIHXAprAa4FHUQQQQAABBBBAIEICBF4LaAKvBR5FEUAAAQQQQACBCAkQeC2gSwu83S+9XGrUqm1RO0URQAABBBBAAAEEQiFA4LVQLC3wpl/SR+rUT7GonaIIIIAAAggggAACoRBwZeBdsGCB3HXXXfLWW29JmzZtPE4jRoyQFStWSExMjPmz4cOHyz333GP+f+rUqTJhwgTJzc2VjIwMGTdunMTFxZVqTOANxS1IHQgggAACCCCAQHgFXBd4X331VZk/f75kZ2fLE088USTwXnnllTJp0iSpW7duEdWtW7eKhuEpU6ZIgwYNZPTo0dKpUycZOXIkgTe89x+1I4AAAggggAACYRdw
XeBdunSpdOnSRW644QZ58MEHiwTe3r17y6JFizwzvI7uxIkT5dixYzJmzBjzR+vXr5exY8fKtGnTAg68yxcvlMMHDwhLGsJ+73IBBBBAAAEEEEDALwHXBV6n19dff708/PDDRQJv586dpWHDhmb2t23btibUNmnSRO677z5JT0+XIUOGmOKnTp0yv1+9ejWB16/biJMQQAABBBBAAIHoFahUgff48eOSnJws+fn5ZmnD+++/L7NmzTLrffv37y8DBw70jJSuz920aZNnNvj3v//9WaM4d+5cyczMLPLnzPBG781OyxBAAAEEEECgcgpUqsDrO8Q9e/Y0ofeFF16Qjh07ytChQ80pGoy7d+8ua9as8RTR8Ot76JpgAm/l/ItDrxFAAAEEEECg4ghU6sDbrVs30Vla3aFh3759ZomDHqtWrZL7779fZs6cWepIFrdLAzO8Fefmp6UIIIAAAgggUDkEKk3g1UC7f/9+ad++vRQWFsorr7wiCxculDfffFN27twpw4YNk8mTJ3t2aWjRooWMGjWKwFs5/h7QSwQQQAABBBBwsUClCby7du2SO+64Q3bs2CFVqlQRfYBNZ3F1GzI9dC3v+PHjJScnR/r27StPPvmkJCYmBh14m7dsLWmtftwD2MX3EF1DAAEEEEAAAQSiWsC1gTcS6qUtaSDwRmIEuAYCCCCAAAIIIFC2AIG3bKMSzygt8NapV1/Se/W1qJ2iCCCAAAIIIIAAAqEQIPBaKBJ4LfAoigACCCCAAAIIREiAwGsBTeC1wKMoAggggAACCCAQIQECrwV0cYH3my+/kKy9e4QlDRawFEUAAQQQQAABBEIoQOC1wCwu8G7fkikb16wi8Fq4UhQBBBBAAAEEEAilAIHXQrO4wKvVzZs+1dSaMehai9opigACCCCAAAIIIBAKAQKvhSKB1wKPoggggAACCCCAQIQECLwW0AReCzyKIoAAAggggAACERIg8FpAE3gt8CiKAAIIIIAAAghESIDAawFN4LXAoygCCCCAAAIIIBAhAQKvBXRZgTf9kj5Sp36KxRUoigACCCCAAAIIIGArQOC1ECTwWuBRFAEEEEAAAQQQiJAAgdcCmsBrgUdRBBBAAAEEEEAgQgIEXgtoAq8FHkURQAABBBBAAIEICRB4LaAJvBZ4FEUAAQQQQAABBCIkQOC1gC4r8DZv2VrSWrWxuAJFEUAAAQQQQAABBGwFCLwWggReCzyKIoAAAggggAACERIg8FpAlxR4ly9eKIcPHhBmeC1wKYoAAggggAACCIRIgMBrAVlS4N2yYZ18u3E9gdfClqIIIIAAAggggECoBAi8FpIEXgs8iiKAAAIIIIAAAhESIPBaQBN4LfAoigACCCCAAAIIREiAwGsBTeC1wKMoAggggAACCCAQIQECrwV0SYF3+5ZM2bhmlafm1CbNpF2XdIsrURQBBBBAAAEEEEAgWAECb7ByIlJS4D18IEuWf77IU3OdevUlvVdfiytRFAEEEEAAAQQQQCBYAQJvsHIEXgs5iiKAAAIIIIAAApETIPBaWDPDa4FHUQQQQAABBBBAIEICIQ+8eXl5cuDAAUlNTY1QF8rvMv4GXm1hxqBry6+hXBkBBBBAAAEEEKjEAiELvEePHpWHH35Y5syZI4WFhbJx40bDOn36dNm2bZvcfvvtrmMuKfDm5+XJgtkzivS3V8YAqZaU5DoDOoQAAggggAACCES7QMgC75gxY+TUqVMm2A4ePFjWrl1r+q7B95ZbbpGFCxdGu0XA7Ssp8GpF86ZPLVJfSsNU6dStZ8DXoAACCCCAAAIIIICAnUDIAm96errMnTtX6tatK23btvUEXp357dGjh6xbt86upVFYOpDAy04NUTiANAkBBBBAAAEEKoVAyALvRRddJLNnz5ZzzjmnSOD96quv5K677pLFixe7DrS0wPvZ3A8l52S2p88EXtcNPx1CAAEEEEAAgQoiELLAO27cONm9e7c88sgj0r9/f1m2
bJl8/fXX5vcDBw6U0aNHVxAS/5tZWuBdvnihHD54gMDrPydnIoAAAggggAACYREIWeDNzc2VCRMmyKRJkyQnJ8c0NjExUX7zm9/InXfeKfHx8WHpQHlWSuAtT32ujQACCCCAAAII+CcQssDrXE63Jdu1a5ecPn1amjRpYkKvWw8Cr1tHln4hgAACCCCAgJsEQh543YRTVl8CCbxaF3vxliXK1xFAAAEEEEAAgdALhCzwlrVG95lnngl968u5RgJvOQ8Al0cAAQQQQAABBPwQCFngff7554tcTvfk3b59u8yfP9/szXvrrbf60ZyKdQqBt2KNF61FAAEEEEAAgcopELLAWxKf7tTwyiuvyEsvveQ6YX8Cb/OWreXbjetN39Mv6SN16qe4zoEOIYAAAggggAAC0SwQ9sCrne/Xr58sWLAgmh2CaluggVfDb1qrNkFdi0IIIIAAAggggAACwQmEPfDqm9Z++tOfyqJFi4JrYRSX8jfwbt+SKfn5+ULgjeLBpGkIIIAAAggg4FqBkAXeZ5999iwk3Y/3008/lZ49e5oXULjt8Dfwar91WQOB1213AP1BAAEEEEAAgYogELLAe88995zV3+TkZGnXrp0MHjxY4uLiKoJHQG0k8AbExckIIIAAAggggEC5CIQs8JZL68v5oqUF3t3bt8nJ7BPSILWR7N+zmxnech4rLo8AAggggAAClVfAKvBmZWX5LZeS4r7dCUoLvN4wWzasI/D6fadwIgIIIIAAAgggEFoBq8Crgc/fIzMz099TK8x5/gZeneFd+dUSSWmYKp269aww/aOhCCCAAAIIIICAGwSsAu+JEyf8NtD1vG47/A28hw9kyfLPF0mdevUlvVdftzHQHwQQQAABBBBAIKoFrAJvVPcsAo0j8EYAmUsggAACCCCAAAKWAiELvCdPnpS33npLNm3aJPpaYd/jr3/9q2VTo6+4v4E3Py9PFsyeIfEJCdJv4NXR1xFahAACCCCAAAIIuFggZIH3jjvukO3bt0tGRoa8+eab8otf/EK2bt1qXjjx+OOPyxVXXOE6Rn8Dr3Z83vSppv8Zg651nQMdQgABBBBAAAEEolkgZIG3S5cuMm/ePKlXr555s9rMmTNNv6dPn25C7zPPPBPNDkG1jcAbFBuFEEAAAQQQQACBiAqENPAuXrxYkpKS5Oqrr5Zp06ZJbGys6NvWunXrJitXroxoxyJxsUAC75IFH8vxY0d521okBoZrIIAAAggggAACXgIhC7wjRoyQm2++WXr37i2333679O/f3wRfDbq33HKLfPnll66DDyTwLl+8UA4fPCCpTZpJuy7prrOgQwgggAACCCCAQLQKhCzwrlmzRmrWrClNmzaVVatWyQ033GB+ry+n0AB82223RatB0O0KJvCyNVnQ3BREAAEEEEAAAQSCEghZ4PW9+t69e0VDcOPGjaV169ZBNS7aCwUSeJ23rRF4o31UaR8CCCCAAAIIuE0gZIF3wIABZgmDPrB27rnnus2p2P4EEnidl09UTUqSdp3TpU59971quVIMOp1EAAEEEEAAgQonELLAO3XqVPnggw9kyZIl0qFDBxN+NQTXrVu3wqH42+BgAq/W3bxla0lr1cbfy3AeAggggAACCCCAgIVAyAKv04YjR47I3LlzZfbs2bJ8+XLp2bOnCb9XXXWVRTOjsyiBNzrHhVYhgAACCCCAAALeAiEPvN6Vb9myRR577DH5/PPPJTMz03XyBF7XDSkdQgABBBBAAAEXCoQ88B46dEjmzJljZni/+eYb6dWrl5nhHThwoOv4CLyuG1I6hAACCCCAAAIuFAhZ4H333XfNGl7db7dTp06eNby1a9d2IduZLhF4XTu0dAwBBBBAAAEEXCQQssCrM7iDBg0yuzQ0atTIRUQldyWQwJuflycLZs8wlfHyiUpxe9BJBBBAAAEEEIgSgZAF3ijpT0SbEUjg1YbNmz7VtI+9eCM6TFwMAQQQQAABBCq5AIHX4gbQ
wPvi03/xO8ASeC2wKYoAAggggAACCAQpQOANEk6LBRp4t2/JlI1rVvkdkC2aRlEEEEAAAQQQQACBHwRcGXgXLFggd911l7z11lvSps2PL3hYtWqV3HvvvZKVlSWtWrWSZ555Rho0aGAo9MUZEyZMkNzcXMnIyJBx48ZJXFxcqTeKE3jjExKk38Cry7ypnLetsaShTCpOQAABBBBAAAEEQiYQ0sB76tQp2bFjhxw/fvysBurODZE4Xn31VZk/f75kZ2fLE0884Qm8p0+fNkH2kUcekT59+sjrr78uX3zxhbz88suydetWGTFihEyZMsUE4NGjR5udJkaOHOlX4NWTMgZdW2b3CLxlEnECAggggAACCCAQcoGQBd5PP/3UBEWdIa1atepZDV22bFnIG19chUuXLpUuXbrIDTfcIA8++KAn8K5cuVIef/xx0e3T9CgoKJAePXrIxx9/LJMnT5Zjx47JmDFjzNfWr18vY8eOlWnTphF4IzJqXAQBBBBAAAEEEAifQMgC74ABA+SPf/yj/OQnPwlfawOo+frrr5eHH37YE3inT59uZnTHjx/vqeW6666Thx56yMzspqeny5AhQ8zXdKZaf7969WoCbwDmnIoAAggggAACCESjQMgC76WXXio6yxsth2/g/c9//iPr1q0za3OdY/jw4TJq1Ch55513pH///kXeBqfrczdt2iQxMTHm9Llz557Vtd///vdmlwY9WNIQLSNPOxBAAAEEEEAAgaICIQu8V155pWiorF69elQY+wbeGTNmyMKFC82Das6hL8p49NFHTeDt2LGjDB061HxJ1yB3795d1qxZ4zn3qaeeOqtf//znPwm8UTHaNAIBBBBAAAEEEChZIGSBV2dA33zzTfntb38rTZs2lcTExCJXTUlJieg4+AbetWvXyv333+9Zl5ufny9du3Y1D7i9//77sm/fPrNuVw/dzUHPnTlzZqltdnZp0JMCmeGtmpQk7TqnS536kUaxpdUAACAASURBVDWJ6ABwMQQQQAABBBBAIEoEQhZ4O3ToICdPniyxW5mZmRHtsm/g1YfUrrjiCvMgW+/evc0uDfrA2qRJk2Tnzp0ybNgw8/Cas0tDixYtzHKH0g7vwJt+SZ8yA6yzS4PWqWE3rdWZLdN0mzIOBBBAAAEEEEAAgfAIhCzwnjhxotQWJicnh6cHJdTqG3j1tA0bNsg999wju3fvlrS0NHn66aelSZMmpoZZs2aZB9pycnKkb9++8uSTT541S+17KQ28r/39eck5mS3+BF4t/9ncD8353oc/s8MRxeNiCCCAAAIIIICAiwRCFnhdZOJ3VzTwTv7XK3L44AG/A29+Xp4smD2DwOu3MicigAACCCCAAAJ2AiENvN98841MnDhRdPlCYWGhXHDBBXLjjTfKxRdfbNfKKC0dTODVrsybPpXAG6VjSrMQQAABBBBAwH0CIQu8s2fPNutj9Y1lbdu2Ndt56S4Hb7zxhtkP9+qry371bkXjJfBWtBGjvQgggAACCCBQGQVCFnj1hRO6s4Guf/U+dG9eXQ9b3D62FR1cA++cmdPl243rpXnL1p6H0MrqFzO8ZQnxdQQQQAABBBBAIHQCIQu8rVu3luXLl4vvw2m6p61u/6UvfXDb4R14daeF9F5Fw35J/fXerUHPadr8AmnZvqPbeOgPAggggAACCCAQFQIhC7yXX365PPHEE+aFDd7H4sWL5bHHHpOPPvooKjocykZ4B974hATpN9C/ZRu+gTeQsBzK9lMXAggggAACCCBQGQRCFninTJkizz33nHlIrU2bNuahNZ3VffXVV81SB32rmdsO78CrfdPAq8G3rIPAW5YQX0cAAQQQQAABBEInELLAq01asGCBCbhbtmwRfZNZy5YtTQDu169f6FocRTX5Bl5/9+L1Dbw1atWS7pf2j6Ke0RQEEEAAAQQQQMA9AiENvO5h8a8noQq8erVeGQOkWlKSfxfmLAQQQAABBBBAAAG/BawDb1ZW
ltSqVUuOHj1a6kVTUlL8blRFOTGUgdff2eGKYkM7EUAAAQQQQACBaBGwDrw9e/aU4cOHm/W7pR36Mgq3Hb6Bt23ndGnUtFmZ3fRd0qAFCLxlsnECAggggAACCCAQlIB14NVtx5KSkuTkyZOlNsB3u7KgWhtlhXwDr7978RJ4o2wgaQ4CCCCAAAIIuFrAOvA6OvpGtWuuuUZq1KjhajDvzgUbePPz8uT7o0dk1/ZtsmfHNlMlM7yV5rahowgggAACCCAQYYGQBV59nbC+Ta1x48YR7kL5XS7YwOu0eMuGdeYtbQTe8htDrowAAggggAAC7hcIWeC98847pX379mYbsspyhDLwduzaQxqkNqosdPQTAQQQQAABBBCImEDIAu+KFSvkgQceEH3FcLdu3aRatWpFOnHVVVdFrFORupBv4A30jWneM7z+rv+NVN+4DgIIIIAAAggg4BaBkAXewYMHl2oybdo0t5h5+hHKwKu7O7Rs19GvN7W5DpIOIYAAAggggAACYRQIWeANYxujtmoNvKtWrpSsPbtk45pVEugMr3aMWd6oHV4ahgACCCCAAAIuEQhp4D116pR88803snfvXhk0aJAhys3NNb8mJia6hOzHbmjg1f2FnW3GCLyuG2I6hAACCCCAAAIuEAhZ4N28ebPcdNNNkp2dLceOHZMNGzYYnilTpshnn30mf//7313AVbQLTuA9mZ0ti+d9KFWTkqR3xoCA+skMb0BcnIwAAggggAACCAQsELLAq29b69u3r9x8882iW5StXbvWNGb79u3ys5/9TL788suAGxftBZzAq+2cN32qaW6g++lu35JplkPowYNr0T7itA8BBBBAAAEEKqJAyAJvhw4dZOnSpeata96B99ChQ9KrVy9Zt25dRfQptc2hCLzeb11LaZgqnbr1dJ0THUIAAQQQQAABBMpTIGSBt0+fPvKPf/xD2rRpUyTw6u4M+udz5swpz36G5dqhCLzaMJs1wGHpGJUigAACCCCAAAIuEghZ4H377bfltddekz/84Q8yduxYefHFF2X58uXy+uuvy+OPP+55iM1FdhLqwBufkCD9Bl7tJiL6ggACCCCAAAIIlLtAyAKv9mThwoUm9OoDbKdPn5YLLrhAbr31Vundu3e5dzQcDSgu8LZs10Gapl0Y0OW8lzVkDLo2oLKcjAACCCCAAAIIIFC6QEgDb2XDLi7wBvPgmXfg7ZUxQKolJVU2SvqLAAIIIIAAAgiETSBkgVd3Ynj33XfPaujJkyfl+uuvlxkzZoStE+VVcagCr7bf2eUhmBni8uo/10UAAQQQQAABBCqCQMgC78UXXyzLli07q89ZWVly+eWXy6pVZ7bectMRjsAbzAyxm0zpCwIIIIAAAgggEGoB68CrD6kVFhbK/Pnz5bLLLivSPl3Hu3r1aunSpYs8//zzoW57uddH4C33IaABCCCAAAIIIIBAmQLWgXf37t3y6aefyvjx4+V3v/tdkQvGxsbKueeeKxkZGRIfH19mYyraCd6B13ZrMWdJAzO8Fe0uoL0IIIAAAgggEO0C1oHX6eDUqVPl2msr1w4DBN5ov71pHwIIIIAAAgggIGIdeHWNbq1ateTo0aOleqakpLjOOxyBl7etue42oUMIIIAAAgggUM4C1oG3Z8+eMnz4cHnuuedK7UpmZmY5dzX0lw9H4K1Tr76k9+ob+sZSIwIIIIAAAgggUEkFrAPv8ePHJSkpSXT7sdKO5ORk1xETeF03pHQIAQQQQAABBFwoYB14HZM33nhDrrnmGqlRo4YLmYrvUnGBt2pSkvTOGBCwwWdzP5Sck9lSo1Yt6X5p/4DLUwABBBBAAAEEEECgeIGQBd62bdvK3LlzpXHjxpXGurjAq50P5vXAyxcvlMMHDxi7YMpXGnQ6igACCCCAAAIIBCgQssB75513Svv27eXGG28MsAkV93QCb8UdO1qOAAIIIIAAApVHIGSBd8WKFfLAAw9I69atpVu3blKtWrUi
ildddZXrVEsKvP0GXi3xCQkB9dd7hjf9kj5Sp777drUICISTEUAAAQQQQACBEAmELPAOHjy41CZNmzYtRE2OnmpKCrzBBNaNq1fK7u3fSX5+vgRTPnpUaAkCCCCAAAIIIBBdAiELvNHVrci0JpSBV1vszPISeCMzflwFAQQQQAABBCqHgHXg/eSTTyQ1NVXatGlTrNiePXvk2LFj0rJlS9eJegfeUARWAq/rbhE6hAACCCCAAAJRIGAdeIcMGSLDhg2T6667rtjuLF68WJ555hn573//GwXdDW0TQh14dVnD9m83S8t2HeRkdrb5/+YtW0taq+I/TIS2N9SGAAIIIIAAAgi4U8A68Hbu3Fnee+89SUtLK1Zox44dcvXVV4s+1Oa2I9SBd8uGdfLtxvUm5B4+kGW2KSPwuu2uoT8IIIAAAgggEGkB68DboUMHmT17tpx77rnFtn3v3r1y2WWXybp16yLdt7Bfj8AbdmIugAACCCCAAAIIWAtYB95BgwaZvXd1Fre446OPPpKnnnpKdK2v2w4NvC8+/RfPLKyzJKFt53Rp1LRZwN1lhjdgMgoggAACCCCAAAJlClgHXn2l8GuvvSZvvvnmWbO8+/fvl+HDh0tGRobcfffdZTamop3gG3i9A2sw6263b8mUjWtWSWqTZnL4YJbkZGezpKGi3RS0FwEEEEAAAQSiTsA68BYUFMgdd9whCxYsEH2AzdmNYfPmzfL++++btb2TJk2S5OTkqOu8bYOcwKsBtV2XdLENvLpud/nni85qVp169SW9V1/b5lIeAQQQQAABBBColALWgddR++CDD2TmzJmydetW8/IEXdPbv39/uf766yUxMdGVuE7gdQIpgdeVw0ynEEAAAQQQQKCCC4Qs8FZwh6Ca7wRefY2wvixi/57dZpeFYGdkmeENahgohAACCCCAAAIIlCpA4LW4QTTwTpzwtOd1wIcOZBF4LTwpigACCCCAAAIIhEOAwGuhqoF38r9eMfvl6gyvbeDVpsybPvWsFgU7Y2zRNYoigAACCCCAAAKuESDwWgylBt7//mey7NmxTXQrsvy8XLPLgk1AJfBaDAhFEUAAAQQQQACBYgQIvBa3hQbeOTOne96OVrd+itllgcBrgUpRBBBAAAEEEEAgxAIEXgtQ78Bbp36KtOucLp/N+1D0IbZ+A4t/EUdZl2OGtywhvo4AAggggAACCAQmQOANzKvI2d6BV7+g63idfXQzBl0bVM1LFnwsx48dLVLWZsY4qEZQCAEEEEAAAQQQcJEAgddiMH0Dr67jXbtiuakx2MC7fPFC8xCc90HgtRgkiiKAAAIIIIBApRcg8FrcAr6Bt3nL1mY9bygCb/WatYrM9OqSCQ3U1ZKSLFpMUQQQQAABBBBAoPIJEHgtxtw38DZtfoFs/3azqVHX8Opa3kCP3du3ycnsE3IyO9vs/uB96JIJDb4cCCCAAAIIIIAAAv4LEHj9tzrrTA28Cz/5WLZtyTSzsbr0wFmOYBtOndcUe19UZ3gbNW1m0WKKIoAAAggggAAClU+AwGsx5hp4MzMzxXklcLgDry6ZSGvVxqLFFEUAAQQQQAABBCqfAIHXYsx9A2/VpCTJyT4pIoVmxwab5QfFzfCmNmkm7bqkW7SYoggggAACCCCAQOUTIPBajLlv4NWqYkzcPbNFmU3g1TW83x89Iiu/WlJkqYSuE27ZvqNFqymKAAIIIIAAAghULgECr8V4O4FXg+nSTz8xNYUq8Gpd+Xl5pk6t39nfVx+E0zBdo1Zti5ZTFAEEEEAAAQQQqDwCBF6LsXYCr1bhLEEoLCyUmJgY6xle32at+d9yz64NOsOrM70cCCCAAAIIIIAAAmULEHjLNirxjOICb1x8vJzOz5eOXXtIg9RGFrUXLapLHPSlFDkns80Xul96ObO8IdOlIgQQQAABBBBwswCB12J0iwu8zgsjwrGjgvdb2GzXCFt0m6IIIIAAAggggECFEqhUgXfEiBGyYsUK
s+RAj+HDh8s999xj/n/q1KkyYcIEyc3NlYyMDBk3bpzExcWVOpjFBd6q1apJzsmTEo7Au3H1Ss+LLQi8FervGY1FAAEEEEAAgXIUqFSB98orr5RJkyZJ3bp1i5Bv3bpVNAxPmTJFGjRoIKNHj5ZOnTrJyJEjAw68iVWqSO6pU2EJvNoYZ5aXwFuOf2u4NAIIIIAAAghUKIFKFXh79+4tixYt8szwOiM1ceJEOXbsmIwZM8b80fr162Xs2LEybdo0vwOvrrFdPO9Dz/nhmOHVyr/58gvJ2rtHeOtahfp7RmMRQAABBBBAoBwFKlXg7dy5szRs2FCys7Olbdu2JtQ2adJE7rvvPklPT5chQ4aYoTh16pT5/erVq/0OvHrivOlTPeenNEyVTt16hnxond0g2I835LRUiAACCCCAAAIuFahUgff48eOSnJws+fn5ZmnD+++/L7NmzZK77rpL+vfvLwMHDvQMs67P3bRpk2c2ePDgwWfdAmvXrjWvFnYO78CrrxlO79U35LeN92uMw1F/yBtMhQgggAACCCCAQDkLVKrA62vds2dPE3pfeOEF6dixowwdOtScosG4e/fusmbNGk8R/TPfQ2eMiwu8+khc7TAFXn0ZxYLZM0xTMgZdW863D5dHAAEEEEAAAQSiX6BSB95u3brJ3LlzzQ4N+/btM0sc9Fi1apXcf//9MnPmzFJH0HuXBj3xxxneGKlTr15YZnj1Op/N/dDsx9srY4BUS0qK/ruMFiKAAAIIIIAAAuUoUGkCrwba/fv3S/v27UXfhvbKK6/IwoUL5c0335SdO3fKsGHDZPLkyZ5dGlq0aCGjRo0KKvBq/XXrp4Qt8Do7NTRq2sw8vMaBAAIIIIAAAgggULJApQm8u3btkjvuuEN27NghVapUEV2OoLO4ug2ZHrqWd/z48ZKTkyN9+/aVJ598UhITEwMKvM7rf8MdeJ0H17RxvHGNv94IIIAAAggggEDpApUm8IbjRvBd0uAEUQ28NWvXlu6X9g/HZWX/nt2y8qslpm724w0LMZUigAACCCCAgIsECLwWg+kbeJ0dFKSwUCQmJmwPlXmuIyIt23WQpmkXWvSCoggggAACCCCAgLsFCLwW41ti4JVCEYlM4GU/XosBpCgCCCCAAAIIVAoBAq/FMGvg/WjWDNGXTNSoVVu8Z1612nBuG7Zkwcdy/NhR0/pwvdXNgoaiCCCAAAIIIIBA1AgQeC2GQgPvi0//xdSg4TaSgdf7WqlNmkm7LuzWYDGUFEUAAQQQQAABFwsQeC0G1zvw6sNj8QkJsvTTTzw1hnOG1zvwhuutbhY0FEUAAQQQQAABBKJGgMBrMRQaeGe9/55s/3azOHvier9emMBrgUtRBBBAAAEEEEAgRAIEXgtIDbxzZk6XbzeuN7XonrgbV6+UwwcPiL5euH8YX/37/dEjRWaT9fqs5bUYTIoigAACCCCAgGsFCLwWQ+vs0qAhV2d5NXDqUgMn8F4S5lf/es8mE3gtBpKiCCCAAAIIIOBqAQKvxfA6gXf39m2ydsVy0YfHcrJPeAJvh649pEFqI4srlF7UN/DqsopGTc8TXdPLgQACCCCAAAIIIHBGgMBrcSc4gdd5gKxqUpIkxCfI98eOir5trVpyslRLSjYBNK1VG4srFV9Ur7th9UrP9mTOWeFcOxzyTlAhAggggAACCCAQZgECrwWw94snnNnWKlWryqmcHBN4Y2J0Ja+YwJveq6/FlUouunzxQjOj7Huwnjcs3FSKAAIIIIAAAhVQgMBrMWjegdeZ5Y2Pj5f8/PwigbdGrVrS/dL+Flcquej+Pbtl+5bMs0KvLm9o2a6j2SqNAwEEEEAAAQQQqMwCBF6L0fd9tXCRNbWFhSI/zPDqJfoNvDqs4bO4mV7dG7hO/RSLHlIUAQQQQAABBBCo+AIEXosx9A283q/7FZ/A2zHMD7AReC0GkqIIIIAAAggg4GoBAq/F8PoG3m++/EKy9u7RZwFFCguKzPDq
TGvd+ilm67JwHGv+t1z27Nh2VtXhDtrh6At1IoAAAggggAACoRQg8Fpo+gbeLRvWnXkJRaFWWnRJg3OZcO2g4Fy7arUkc6mck9menoXzoTkLPooigAACCCCAAAIRESDwWjD7Bl7nwTXf5Qzel9DtycIxy3syO9vsAVw1KVmy9uySjWtWeS4bzofmLPgoigACCCCAAAIIRESAwGvB7Bt48/PyZMHsGWXWGK5ZXt8Lez9EF6lrltl5TkAAAQQQQAABBCIsQOC1ANfA+8vX/ixd6zeTga0vNjV9NvdDz3ICfRFF9Ro1JSGxSpH1tZHaI1fX9epsr26Txo4NFgNNUQQQQAABBBCo0AIEXovh08B72WvjJFUS5ZHe15maijy4JoWiobdx0/Nk53db5VTOSc/V2nZOF90rN9yHs3sD63jDLU39CCCAAAIIIBCtAgRei5HRwHvVK49JdmyB3NIsXS5q2kKch8e837Sml6jX4Bw5uH+f52oahNMv6SvVks48ZBauwwngrOMNlzD1IoAAAggggEC0CxB4LUZIA+/If/+ffFeQLefHJsm9lwz2BF7fB9eanp8mdVIamAfLnAfKIrHMQB9mWzzvQ9NLZnktBpuiCCCAAAIIIFBhBQi8FkOngfeBKS/LF9l7TS3DUtvJufnxZmsy3ZksxqtuXb6gyxj0cB4m01lXPcL12mHn8gs+mG7W8eqscqeuPaRGrdoWvaYoAggggAACCCBQsQQIvBbj5Rt4W8XXkNs6Xu6ZUfWt2nlY7cd1vmfOiORrh5nltRhwiiKAAAIIIIBAhRQg8FoMm7Mt2cZ92+XZTYsluSBWnu37c88Mrm/Vzrrd3du/O/OCih+OcC9t2Lh6pWz/drO5mrahd8YAi15TFAEEEEAAAQQQqFgCBF6L8fLeh3fUorclL0bk/9IHyZdz55haNcjqoetoN67+xiwr0OPC1m0lc/1az5UjsWPDkgUfy/FjR8012ZPXYtApigACCCCAAAIVToDAazFk3oH34c/ek72Sa9bxJu89LIcPHjBvVNM3q+nx/dEjsvTTTzxXO6dRY9m3e5f5fST25XW2J9Pr9coYEPbdISxYKYoAAggggAACCIRUgMBrwekdeF//er55eK1zYl3pLjUla++es4KsvnpYty3TMOx96DKDRk2aecKxRZNKLOq9brhp8wukZfuO4bgMdSKAAAIIIIAAAlEnQOC1GBLvwPv19k0ycdtyqVMYL39q00+Wf76o2G3AvLcJK+7S4VpuoNdd+79lJmzz4JrFoFMUAQQQQAABBCqcAIHXYsi8A69Wc+tnb5vaHkjrLRuWfVVisPTdpUGkUAolxmxjFs7lBjrDrEE8PiHB7AzBgQACCCCAAAIIVAYBAq/FKPsGXmcd77W10yTm252m5uJmbEta2qDn6968zVu2kQapjSxaVnJRZw/gOvVTzBKK+Ph49uUNizSVIoAAAggggEC0CBB4LUbCN/A663jbx9aUlgdOmZq7X3p5sYEyPy9P9u/ZLWtXLPdpQYzUqFVTmqZdKHXqpYT84TLv3Rr0wixvsLgBKIoAAggggAACFUKAwGsxTL6B17OOtyBOLj8Sb2oua49dZ8b1x2b8+I62cIRR790anBll3RaNt69Z3AgURQABBBBAAIGoFiDwWgyPb+DVqpx1vKPqtpW9mzf7teWYzvbqzGvOyeyzWhPqh9h0lwjvl154X7CscG5BRVEEEEAAAQQQQKDcBAi8FvTFBV5nHe/AKqmStOeQX4FXm7Dmf8tlz45tP7RGH1/Tmd7wvCTCd5bXIWjUtJnobC8HAggggAACCCDgJgECr8VoFhd4nXW8bQqSpM2R06IPh9WtnyK6963ujlDSoQ+ybVi90rwNLTY2VkRipOD0aalRu5Y0bX6hHD6YZYrqA23VkpLM/+vMcGl1lnQtfQnG90ePFrN++Myrh9t1Tjft5kAAAQQQQAABBNwgQOC1GMXiAq+zjrd2QZz0/2Edr17Cn+UCutxg
+5ZMzyuIi2taSmojqVmrtvnSoQNZJvDWrVdfGjU9L+Dwqw/NafjV9bsrv1pS5HIaeLXuTl17WAhRFAEEEEAAAQQQKH8BAq/FGBQXeLU6Zx3vdYeqeGpv2a6D2XmhrOPsPXpFYmPjJLFKopw6efKHhQ5n16LbmaVf0jfg0OvUVHRJxY/16xvZqlZLMrPUwcwml9Vfvo4AAggggAACCIRbgMBrIVxW4O37fYLsjyuQ9UmnpbPUkNt6/7TMq+kM78Y1q4o9LyExUZqcn2a+pm9Oy8k+cebXHx520+UIGkx1drZaUrJnj11/lj7okgoNtNu2bDb16++9H6LTJRka2J3lFGV2hBMQQAABBBBAAIEoESDwWgxESYH3jwunSHZsQZGaG+fHy0P9hpZ5NQ2neuTl5UnWnl2iyxzy8/OLlNNlDXroubFxcXLs8CEpKCiQ0z7n+V5MA6339mM6K5yQkGhOc9bsOi+i0LrX6KuID2R5rm+WOHTrafbu5UAAAQQQQAABBCqKAIHXYqRKCryz1y+T6Qcyi9RcPz9WxnToH3RY3Lh6pXlRRXFblzkXSqpeQ/JyT0l+bq7ExcdL1aRk8xBcKI6YmBgpLDyzc0SVqlWlfoOGktq02Q9Vx0iNmjVZ8hAKaOpAAAEEEEAAgZALEHgtSEsKvBv3bZcXNi6WPN1d7IcjoUBk0JEqxb5q2J8mOMsSdK2tzvzqa4GPHT3qtZXZ2bXoDK4eurZXZ4wTEhLMQ2rOoQ+9OYfO5Oqh54UqJJfUL9+ZZuc8/XPngTz9M++dImrUrGUCtT/LM/zx5BwEEEAAAQQQqDwCBF6LsS4p8GqVzy2ZKRvyvy9Su67pHTbwZxZXPFPUO/RpAG7ctJlZqqAzwNu2ZJ4VWJ3gqwGyRs3aovvt+nM4yys0JGsQ1mPvjm2yd/cuiZEf9wo2X4jxSvf+VB7Cc3Ttsq5Ztjl0bXJZdZQU1H2vy5IPm5GgLAIIIIAAAqEXIPBamJYWeHWW9/3N/5Ocgnw5WJArebEizU7Fyq9b9PA7cAbbNH2QTdf+mvW3eblnrQE+81BbkqQ0bCQNflgPHMi1tG59OE5niPUahw8e8BR3tjNr2a5jiQ+4aZD2nml2Cmuo9v5zZ9ZZv65/7ruWOZA2l/e5GpbVXD+YOHsp6wcU58/Ku31cHwEEEEAAATcLEHgtRre0wOtd7atL58hXeYekVn6MDE1sLI2aNAt76NVQeTL7hMQnJMru7d+ZwJi1d0+R3urMaO+MARYCZ4rqzhK+yytst0nzt1Ea7k2o/2EG2t9yvudpPepV2lFSUPct4/0BoKT61P7MEpOjZumGvuFOwy9LNoIdQcohgAACCCBQsgCB1+Lu8Dfw6iWcvXn1/4fn1JF2XS4WZ12qRRP8LqpBSpc76KyiLoHYtX2bWfqgM4/6UgydedTQZ7PtmD5Yp7O0P74iWaT7pZcX2RnC7wa76EQnKKuvzop7+3h3Uz8k6DnO2/n0gxF7H7voRqArCCCAAALlJkDgtaAPJPA+sGCKZMWf2apM1/Km5MVKx649glpSYNFkzwyizvjq+l/nAbWmaRfI7u3bTPjVtawb16w0s46BHhrYli6Y51l+4LyqWHeMsAnTgbYjms9Xe7XWMFvczLvTdifs6rIT/U9ng5u3bB3NXaNtCCCAAAIIRKUAgddiWAIJvP9d8rEsycmSo/GFkn4iQc47FWuWNTjrOS2aEXRRDacbV39TZKmDsxOCVuo83KbrcQOZadSZXj3/243rPW1LbdJM2nUJPEAH3bkKVFBn3XV5w5kAfNSzBKW4Ncvea4Gdh+yK282iAnWfpiKAAAIIIBB2AQKvBXEggVcv87f578vqhBxpnR0nbXPizZVTGqaalzmU57FkwcelbkWmIatZ2oXmR+2BLMPQH+UvW7zQU3evjAHM8vo50I6dzuxqCoFQewAAIABJREFUIA7FVnHewdjPZoT0NH3QMRyHv7tnBHNtfa02P5kIRo4yCCCAQHQJ
EHgtxiPQwPvfhbNlTuwR0T15+x5PlNr5Z7byKo+lDd7d1plefbBNf9VD9/nV2UV965o+9Ob9sgtdZ9qxa0+/Q4D+yH7pp5+YepnlDf5mc8ZIZ4L1cNYDa4h0draIxB7KwffAfSW9g7aOg/6eddfuG2d6hAAC7hAg8FqMY6CBd+2WDfL87v+ZK7bPqyoXxdT0PDimM3ot23eMin8wnW3NdAmCtkvDsM4yem8Npq83btCwkV+7TXiHXmZ5LW44i6L+7jBhcYkyi3q/6KTMkwM4IZx9078Lpb3d0LeZuma9WfMLzvw0pFbtAHrBqQgggAAC4RQg8FroBhp49VLeuzVcXTdNEjfvLNICfVDM3xdDWDQ9qKIaLHQP3u3fbvaUP/MPey0T1Ev7B14fkNPdCZjlDYqeQlEo4ARtnVl3PhA6S0/070T3S/tHYatpEgIIIFA5BQi8FuMeTOCdvX6ZTD+Qaa7aM6mh3HDRZSZEej/g1bZLugmQ0XroCyE0wPrOfOmsr/NqYP0HX5dD7N+zS5o2v9B0ZfG8Dz3boOnXWBsZrSNMu4IV0D2pN65ZZYoHuvwn2GtSDgEEEECgbAECb9lGJZ4RTOD9evsmmbhtuamzc2JdubhRmlzU5Ewg9H54rCL86N83qJdGqT/qzc09JQX5pyUmJkZi4mKlWrUk6dq7X0A7QFgMF0URiIiAbjm3dsWZv+MaenUnlmDeaBiRxnIRBBBAoJIIEHgtBjqYwLvryAF5dPXcIle9pVm6XNS0hdmbVUOkzpzqUoGW7TpE/TpAXZ+rP8513nimb3OrU6++6Z8/rwOuVbee1G9wjnkhg+4BzIGAGwSK+zBYET7EusGePiCAAALFCRB4Le6LYAKvXs57lte5fK/qqXLZ+e0l4WSurPxqiWd2qEFqY6les5Y4T4FbNDciRTX8+s5m6Z/p8gUNxZlrV8mJ48dNW3Sm1/uI1OuIIwLBRSq9gIZeved936xH8K30twYACCBQDgIEXgv0UAReXdawIvdQkVak5MXIkbhCaZMTb349ElcgrU9Xkfr5cdKxRVtJadiowq5/1VlsfcVxSfvK6qx207QzSzw4EKjoAs4rvb3X6OvynpzsbPPWvLRWbSp6F2k/AgggUCEECLwWwxRs4D128oRkZu2SfSeOysDWF8vrX8+XZSf2mpYkFMZKduyZVxAXdySdjpGasfHSqGoNSUqoIvWrVpeNR/ZJ73NbFjm9emIVaXlOU4veha+oLnXQt7EdPnjgrIvExsVJ42bnS6v2HcPXAGpGIMICCz6Y7nndtnNp3bdX32IYrbuyRJiIyyGAAAJhFSDwWvAGG3iLu6SG4D3HDkpqzXry33VfmlO+zT4kp0+flsLCQpHCQjkaVyh5sRYN9ipapzBeqsacqSytej3za5v658qJ3JOSVq+RNK59Zh1uuI6ztzjT5Q2FnstpGGjVobOkntvE7AUcyKuNw9Vm6kUgWAHn5SDbtmwussSBbfqCFaUcAgggEJgAgTcwryJnhzLwltYMXQeoW4FtXP2NHJA8yYsplB1JMZJSrabsy8+WnIJ8OVV4WqrExHmq0d8fjjkddO/0bXDVY+Ilt7DAzChXjY2Xxkm1PMG4pIovrN9IalZL9vu62re1/1tmZns18hYU6tpep3ihJNeoJadP55s1zDobRvD1m5YTo1hg3vSpntbpw5rl/drnKKaiaQgggEBIBAi8FoyRCrxOE3Wm87O5sz0/GtXw1yztQjl29IgJhDVq1pKqScl+re/duG+7HD+VY6pek7VTTubnyp7c4yY42wRlfzi9Z5c90baw0MxkO//VyhepIkWns5NOiyQVxEpCYqL5Ly831/RZZ8Hj4uKkdvUa0rRGPfNn+hIM/ZCgvxKS/RkVzom0wPLFC80HPQ27nbr24D6N9ABwPQQQqFQCBF6L4Y504NWm6sb2u7ZvK/GhLyfc6a4IGvZ0xwR9yC3QY+HmVfLd0QPSLuVcTyjW/9FlFqUdBwtzJa/o
5guBXrrczk8uiDWz2c5RO76q1Kv642y1rpduUN2/18U6eyuXW2e4cNQLeO/XG81vWIx6SBqIAAII+CFA4PUDqaRTyiPwOm3R2d4z+9+eMGtcvV/369teDcE6i+S8BS0vL9eE4WpJyaa8/qqHzoxGYjbUe3a5JNtvD++T3fv2nFm//MORHVcoJ2J/+L2+vKKYwlnxJT/wZzHUESvqG7r9vbD3khN/y+h5GuI7pJ4f9jXbgbSpspyry3l06zLdtkz/3umODfrhNP+Hv5+VxYF+IoAAApEQIPBaKJdn4PVttvPj+7y8PMnJPmFmgbP27DKn5efnB9VL33WF+o+yE5q1Qufr8fHxYXtBhgYCDfQ6Y13azHZQHfQplB0ncsJrh4wTsSInYn4M0EUCdxkXPJDwY1APRdsiUUeqJIrOap+bXEfOr9NAAl2PHYk2uvEaxe3g4PSzY9cevKXNjYNOnxBAIOICBN4fyKdOnSoTJkyQ3NxcycjIkHHjxpl1oaUd0RR4S2unPiH+/dGjntlgZ1bYt4w/b0Yr6w7VUKyzx3p4B2TvPw92JlmD74bVKwOaATuVk2M+AOSeOiUnjn8vBaeLPsgXE3tmTXBScvWzXoRRVl+9v66zdfqGPNvDN3T7W58uIzlcwnZ2ap+QkGDWd/seukTlWEF+iVvh6cOL9WMTzU4e+mIUZ/cOfWNgjSrVAnpA0d++VLbz9O+d7+4NjoF+qNSH2vQc/UmMfqDVD38cCCCAAAKBCRB4RWTr1q0yYsQImTJlijRo0EBGjx4tnTp1kpEjR7oi8AZ2S/x4ts4a+4Y6Dct6aPh0tlrSf4RLepFEcdfWN6rpoW+Rc4KxhmStW4NZfEKihGPWWEOproHWazmvcHbap6FQHxzSgBHpbdB8nf0dL+2PMx7eZXRc9GvFjYna69fUQB90bNr8AvPBJOvwQdlyaI/sOnFEsk4eNw8w7pXcs5pyfuyZsLU/P8esd06tUl1a1W0kDWvUjtp9n/31LO/zzAfOvDxzb3r/ndIXVXgf+qBqoybNIrL8qLxNuD4CCCAQKgECr4hMnDhRjh07JmPGjDGu69evl7Fjx8q0adMqdeAN5ibzDcJOKDYzoNknin3ZRCDX0cBmArHP8goNBc5aZK1Pd3DwfcWx73X05Re+a5+1fp0Jbdc53QR6Z5cH7Zd3OA6kzeV5roZpZ62oPzPQGvjVICEhUVIapsqxwnzZfGC3fLh7fZm7d+jrsXUv52h+6Ul5jkUg19Z7z2xFuGaV+fDnuyxJx8j5wKj3/skTJ8z9rn83znxoTAjkcpyLAAIIuF6AwCsi9913n6Snp8uQIUPMgJ86dcr8fvXq1QTeEP8V0OB46ECWeeBOD/1H3fnHWf+hdo5AZ42tm1n0vRee6mJiYiQu/szODQUFBZKQmCDJ1WsW+2PlpORkiY2P3qBx7PBh0w8NvrqVW35uruTknDT98uf4PDlXkk/HSJXYOFlb5ezZX+869LwTcYWiSyLqFMZJrhRKosRI49hqcl7Vkne6SEpMlPNqpfjTHNeeo0updImNHt9u2iANzmloljzoh6/9P6zLL6vz9c45R+Li4s0e1lWrVjOn632sf5ZYtWqJxZOSkj33e1nXCNXXg13iFKrrUw8CCFQOAQKviNx1113Sv39/GThwoGfUdX3upk2bPOs6J02adNYd8eijj0pmZmbluFOipJfOj/+9f5zvPatcboE5Snwi1YzvqhRIcoHI/rgCORJfaF6Gkh0rog/2cSAQrQLXHaoSrU2jXQiUKZAx6Noyz+GEkgUIvCJy//33S8eOHWXo0KFG6vjx49K9e3dZs2aNR+6NN944S/Gxxx4j8Lrkb5ez5EK7o/uj1j+noSRWqSLHjx6Vvbt3mhm37BPHJT9flzbEmAfnfI/T+flFtlFzCc1Z3fDeKs73izqrG3+6UHYnFEheTIHkFBZK1R9enbcj/rQkFMaYcFzSoQ/fHY0nNLv13invfhF4y3sEuL6NAIHX
Rk+EwCsir732muzbt8+s29Vj1apVJgTPnDmzVN2KskuD3S1CaQQQQAABBBBAoGILEHhFZOfOnTJs2DCZPHmyZ5eGFi1ayKhRowi8Ffv+pvUIIIAAAggggAAzvM49MGvWLBk/frzk5ORI37595cknn5TExEQCL39JEEAAAQQQQACBCi7ADK/FALKkwQKPoggggAACCCCAQIQECLwW0AReCzyKIoAAAggggAACERIg8FpAE3gt8CiKAAIIIIAAAghESIDAawFN4LXAoygCCCCAAAIIIBAhAQKvBTSB1wKPoggggAACCCCAQIQECLwW0AReCzyKIoAAAggggAACERIg8FpAE3gt8CiKAAIIIIAAAghESIDAawFN4LXAoygCCCCAAAIIIBAhAQKvBTSB1wKPoggggAACCCCAQIQECLwW0AReCzyKIoAAAggggAACERIg8FpAE3gt8CiKAAIIIIAAAghESIDAawFN4LXAoygCCCCAAAIIIBAhAQKvBbQGXg4EEEAgEIHu3bvL0qVLAynCuQgggIBkZmaiYCFA4LXAmzRpkhQWFsqIESMsaqGoWwR++tOfyjvvvCPVqlVzS5foR5AC06dPl+3bt8sf/vCHIGugmJsEhg8fLk8//bQ0bNjQTd2iL0EIfPHFF/LRRx/JuHHjgihNERsBAq+FHoHXAs+FRQm8LhzUILtE4A0SzqXFCLwuHdggukXgDQItREUIvBaQBF4LPBcWJfC6cFCD7BKBN0g4lxYj8Lp0YIPoFoE3CLQQFSHwWkASeC3wXFiUwOvCQQ2ySwTeIOFcWozA69KBDaJbBN4g0EJUhMBrAUngtcBzYVECrwsHNcguEXiDhHNpMQKvSwc2iG4ReINAC1ERAm+IIKkGAQQQQAABBBBAIDoFCLzROS60CgEEEEAAAQQQQCBEAgTeEEFSDQIIIIAAAggggEB0ChB4o3NcaBUCCCCAAAIIIIBAiAQIvCGCpBoEEEAAAQQQQACB6BQg8AYxLgUFBfLEE0/IrFmzJCEhQX7729/KL3/5yyBqokg0C+hb9P7+97/Lv/71L/n666+LNHXq1KkyYcIEyc3NlYyMDPPWnLi4OCnt3uC+iebRLr1t+nf9xRdflMOHD0u9evXkoYcekq5du5pChw4dkrvvvltWrVoldevWlSeffFIuuugi8zX9s3vvvVeysrKkVatW8swzz0iDBg3K/FrFlXJ/y48dOyZ/+9vfZPbs2aaz5513njz++OPmVz1K+t5Q2tf43uCO+0bfpjd37lzzX1nfG4L9vuEOqfLpBYE3CPd3331XZs6cKRMnTpTs7Gz5+c9/Ls8++6y0a9cuiNooEo0CeXl58sc//tGEEx3rZcuWeZq5detW8zrpKVOmmK+PHj1aOnXqJCNHjpTS7g3um2gcaf/apB98Bg8eLI0bN5avvvpKbr/9dlmyZInExMTImDFjzJ/fcccdJuDqr/rqUP0wrB+GHnnkEenTp4+8/vrrolsSvfzyy3L69OkSv+ZfizirvAT0w8snn3xi7oeqVavKv//9b1m0aJG89tprUtr3hmC/b5RXP7luYAL6d18nQXbu3OkJvCV9b9D7JpjvG4G1iLN9BQi8QdwTN954o/z617+W3r17m9L6DW/Pnj0yduzYIGqjSLQK6D9ivXr1MrN1K1as8DRTP+joLI9+w9Jj/fr1ZuynTZsmpd0b3DfROtKBt6tz587y6aefSo0aNSQ9PV0+//xzqVatmqlIf+Lzs5/9zMwE68yfftDRQ2fxevToIR9//LF8++23JX5N6+SoOAKZmZkyatQo8yGntO8NwX7fqDgSlbel+pM+nfjSn/TpBIjO8Orf95K+N1x66aVBfd/ge4PdPUbgDcLv8ssvlzfeeMPM6uihwUh//8orrwRRG0WiWSA/P18uvvjiIoH3vvvuM9+shgwZYpp+6tQp8/vVq1dLafcG9000j7T/bdu8ebPceuutZpZv79695h86Db/O8X//939Su3Zt
M/uvM7rjx4/3fO26664zyyF0tq+kr3Xo0MH/xnBmuQroj6V1Bl+Xq/zud7+T0r43BPt9o1w7yMX9EnjqqackNTVVrrzySvP9QANvad8b9CVFwXzf4HuDX8NR4kkE3iD8LrnkEpkxY4aZwdFDf8SpP8p4++23g6iNItEsUFzgveuuu6R///4ycOBAT9MvvPBC2bRpk5kRLune4L6J5pH2r216P+hPd2644QazJOG7776T2267TebMmeOp4IUXXjCzOw0bNpR169aZWR/n0Ddu6Wzgtm3bSvxa9+7d/WsMZ5WbgP7Y+tprrzXrt/Unfbp2U9dvl/a9QWf+gvm+UW6d5MJ+CXzzzTfy3HPPmZ/06hp/J/CW9r1h0KBBQX3f4HuDX0NC4LVjKlpav2m9+uqr0qxZM/MFnenRsKt/xuEugeIC7/333y8dO3aUoUOHms4eP35c9BvRmjVrzD9oJd0b3DcV+97QEKvLWPTvva7T1WPfvn0m+CxevNjTOX1orX79+ibwLly40Dyo5hz6D92jjz5qAm9JX9N7i6NiCOhPd/SBxpdeesl86Hn44YdL/N4Q7PeNiiFROVup4/+LX/xCnn/+eTn33HPNByAn8Jb2vUG/DwTzfYPvDXb3GTO8QfjpjzOvv/56ueyyy0zpf/7zn+YfvgceeCCI2igSzQLFBV59OEXH21mzrQ8r6D9m+nBbafcG9000j3TpbdMdO3SMk5OTza/OoX+uS17mz58vNWvWNH980003me8PjRo1Mufq2m499F7SnR303F27dpX4NV0OwVGxBPSnN7o7g+7cUNL3hmC/b1QsicrVWn2YWR9Wjo+P93T8xIkT5vuEfhDSYFvc9wad/Ajm+wbfG+zuLwJvEH7Tp08339ycXRp0pu/Pf/6zuYE53CVQXODVH2cOGzZMJk+e7NmloUWLFuZH1aXdG9w3FffeeOyxx0TDra6/9T10babO6N55551mlwZd4qAPpiUlJckVV1whDz74oPmxt+7SoH8+adIks+ShpK9VXKXK0XJ94FADzTnnnGM6rA+r6TaVOmOvH2RK+t4Q7PeNyqHqjl56z/Bqj0r63lC9evUSv1ba9w13KJVfLwi8QdrrInUNvbotkT59f/PNNwdZE8WiWaC4wKvt1U/v+jBSTk6O9O3b1+y9mpiYaLpS2r3BfRPNo1182/QBs5/85CcSGxtb5ARdk3nLLbeYHTvuueces1ezzvLqQ0zODi4bNmwwX9u9e7ekpaWZtZ5NmjQx9ZT2tYqnVHlarA8basDV9Zo6s3fBBRfIn/70J2nZsqVBKO17Q7DfNyqPbsXuqW/gLe17Q7DfNyq2UPm2nsBbvv5cHQEEEEAAAQQQQCDMAgTeMANTPQIIIIAAAggggED5ChB4y9efqyOAAAIIIIAAAgiEWYDAG2ZgqkcAAQQQQAABBBAoXwECb/n6c3UEEEAAAQQQQACBMAsQeMMMTPUIIIAAAggggAAC5StA4C1ff66OAAIIIIAAAgggEGYBAm+YgakeAQQQQAABBBBAoHwFCLzl68/VEUAAAQQQQAABBMIsQOANMzDVI4AAAggggAACCJSvAIG3fP25OgIIIIAAAggggECYBQi8YQamegQQQAABBBBAAIHyFSDwlq8/V0cAAQQQQAABBBAIswCBN8zAVI8AAggggAACCCBQvgIE3vL15+oIIIAAAggggAACYRYg8IYZmOoRQAABBBBAAAEEyleAwFu+/lwdAQQQQAABBBBAIMwCBN4wA1M9AggggAACCCCAQPkKEHjL15+rI4CAHwKrVq2S+++/X7Zt2yY//elP5YknnvCjFKcEIvDnP/9ZsrOz5bHHHgukGOcigAACFUKAwFshholGIhB5gb59+0qTJk3kzTffLHLxLVu2yHXXXScrVqyIWKP0ev3795eRI0eaUFa7dm2ra7/wwgty4sQJuffee009+qvW3aJFi5DWa1VZhAuHOvD+97//lePHj8uvfvUrT09GjBgh
N9xwg1x++eUR7h2XQwCByi5A4K3sdwD9R6AEAQ2833//vTz44INyzTXXeM4qj8DbqVMnmTx5srRu3Tok4/XVV1/JqVOnpHfv3pKbm2sC2KuvvmodeL3rDUlDI1hJqAPvHXfcIenp6UUC73vvvSddu3aVpk2bRrBnXAoBBBAQIfByFyCAQLECGnhvueUWef755+Wjjz7yzKoWF3jfffdd+cc//iF79uwxs8IadgYOHOiXrAbOp59+WmbMmGECdocOHeShhx4y4TYzM1PGjh0rK1eulBo1akhcXJzMmzfvrBlereOpp56SmTNnmlnFtLQ0mTRpknz99dcydepU+dnPfiaPPvqomdX98ssvxQl3Dz/8sAwZMkTWr1/vqV+XS/zkJz+RvXv3in79888/l3r16snPf/5zue222yQmJkbmz59far3OsgCtQ/9/8eLFEhsbK5deeqmp05mh1nZUrVpVdu3aJf/73/8kPz9funfvbtqqf17coX3Sr2/evFmaNWsmo0ePNoF9zpw58uSTT8rChQtNG51Dr/mnP/1JBgwYIM8++6x8+OGHZpwaNWpk/tyZbfUOvDt37pR+/foZl/j4eE9dF154oXz88cfmuocOHTLXW7p0qRw7dkzatGljft+8eXN54IEHjE9CQoJUqVLFGOs4Dh48WG666Sa56qqrTJ3aZ73uunXrpFatWjJ06FC5/fbbjZUeer7OCr///vuibdJ+/fKXvzR16KFef/nLX+SDDz4wbdAgfdddd0lGRoZf9x4nIYBA5REg8FaesaanCAQkoIH3lVdekYkTJ5rQo8FED9/AqwHrnnvukeeee046duwoy5YtMyFMZ0x1ZrasQ4OqhlD9NSUlxQQlDc+ffPKJVK9e3RRv27at6I/IS1pyoKHniy++kMcff9wEuU2bNknPnj1NMNXAWb9+fbMG+JxzzpHU1FRP4HWCaXH1a0ju06eP/OY3v5H9+/fLb3/7W/P/Gnz9rVeXYmibNYSdPn3ahLOjR4/Ka6+9Zvqlpv/617/kpZdeMsHz5MmTJuBpYNMPG76HhkwNrhostX8aGH//+9+LfuDQENqtWzfj3qVLF1P0m2++kV//+tcmlGqA1g8u+kFCDWbNmmWCs46XM77OGl5/Am9eXp7Mnj3bzJInJSUZ+3379sk///lPc21dIqKh2XtJg3fgPXjwoFmmoveOrsvW0K/3jf7/rbfe6gm8Bw4cMB9ezj//fBN69ev//ve/zb2ms/4ahv/+97+be0WDs95DasGBAAIIeAsQeLkfEECgWAENvBokzj33XBOydKb34osvPivwaqDR0OMd0P7617/Kxo0b5cUXXyxVV2dmO3fuLG+//bYJMM4xbNgw+X//7/+Z9Z5lBV6nDl1rrHV5HxpMNTzprPB5553n+ZLvj+99A6/OomoQ09DtHFq/zo6+9dZbJvCWVa8GSZ0R1iCus5x66OzzJZdcIv/5z3+kVatWJvDqA3ka3JxDP2CsWbPGePse6qmhTwOvc/zhD38QnXnVmVGdRdXwqctQ9NDzNGCPHz/+rLoKCwvNB4m5c+eaMQ50hte3QjXTYK8fgPQoK/D+7W9/M+vANaA7hwZzDfBalx4akHWG+s477/Scc+ONN8pll11mZnp1LKZNmyZvvPGGVKtWjb/JCCCAQIkCBF5uDgQQKFZAA6+Gkvbt25sfl2uI1WUH27dvL/LQms406uxsr169PPVoUNTZTA2apR1bt241ywfWrl0riYmJnlM1qOlso84a6lHaDG9JdWg5Daa6hOCzzz4r0oyyAq8GUg2N+iN559CAqLPH+iN9f+qdMmWKmX185513ilx70KBB5kfyOlOp7Thy5EiRQKqzmQsWLPDMAnsX1kCps6q6tMM5CgoKzBprNVuyZImZJdUlFPrjf52h1nHQkK0zsjqzrO3XmWT9+oYNG8zY6hKQQAOvzua+/PLLJrRq3Tk5OSbQa2j1J/D+8Y9/NEFb2+scWl4/tOgykgYNGniW
NOiSCOcYNWqUuSf1A4euw7777rtF105ff/31Mnz4cDPDy4EAAgj4ChB4uScQQKDMwKsnaNDQ4Kkzr9dee61nlwZ9MGnChAlFAq+GKp1VLCvw6iywruf0Dby6jlZDmT+BV5dYXHHFFWfV4QReDXw6i+l9lBV4dcZZH7DSwFrcoYG3rHp11lGXDfgG3quvvlpuvvlmT+D13QqstMCra6OdtbfFtUvDr4ZcXauryxR0tlTDrwZkDcQ6m6xj1bBhQ9EA37JlSzNr7U/g1XDZrl07zxpeDe66dGLMmDGSnJxsQqfOMvsbeLVtut63rMDrvebXuQ+dwOsY6H2kS0N0yYZ+SNOAz4EAAgh4CxB4uR8QQMCvwKtrKfVHzDqbq2HF2ZZMZ9V0SYOz7lIr09lgfeBJ1+KWduisoC6T0JDnvd5XlzToOlZdM6tHaTO8GsScXRx81wz7E0y1fg1QGkydXSA0vGnQ0plhfZjK9/CnXq1D1/0Wt6RBlzDoQ17F7YxQWuDV7dQWLVpk1uyWdGid+jCXMwt83333mVM1aOsSEf2wosd3331njIub4T18+LDZTUHH2FlHrQ8Q6oOI+mGmbt26JuzqTLTO0uqhs+K6jtsJvOqns/66htg5vNfwal+0fmc9s56jZdVM1ybrDLTvQ24lBV6nfl0eoQFfwy8HAgggQODlHkAAgTIFvJc0OCfrg2Ovv/66eQGEE3h1+YKGKuehteXLl5u1nBp2NczqGlJ90Etnay+66KKzrqsBTctokNYfY+vMqq5V1Xpr1qxZZuDVEx555BGzk4POuurspf6oXnd70B/xlzUTq+V1jbIGQl2HrGFR19zqjgEadnWPXg14OpOsAV1DnD+BV2db9aER1ZE+AAAD3UlEQVQ1XavrPLSms976wUFnf/UINPDqQ2v6cJuum9YPBbqbgc7a6gNdOkurhz64pbPxGhj1g4fOyuqhQVIfXNNrZmVlGTMNmLpMxXeGV8/XmWIt84tf/MIsHdCZXA3HGnh1NwQdW53R1bHVGXq9BzQoO4FX69cH0XTGVS10ja13gNUdLDRAa71q7zy0pmOhH6j0KCvw6ocKHSs11rXc+hCiLkPhxSRl/vXmBAQqnQAzvJVuyOkwAv4JFBd4taQ+jKRh1/vFE9OnTzchdceOHWbGTwOXhhg9dK2nrjHVQKy7CPgeGjA1mGnQ1a2ldLZVt7VygpqeX9YuDbr8QYOtrm/VJQL6EJc+ZKaByJ/AqzO5ek3dOUBD6ZVXXmlCoQYnfQhLw5Rut6YPT+nyCX8Cr7ZbA+q4cePMrKwTInVNsQboYAKvltFgr23UX3XZgi5L0OUKF1xwgYdWQ6O6ei8p0bXO+iCeBmL9UKABVT+UqE9xgVcfutMAqR9YNFTqMgx9iFE/8OguCGrg7Mygs9XaTz1H19/qoQFWA7F++ND1tbrVnG+AXb16tTHWX/XDjc4+69peZ3a6rMCrbVALfZBPH9bTpQwatG1fTOLf3xDOQgCBiiRA4K1Io0VbEUAAAQQQQAABBAIWIPAGTEYBBBBAAAEEEEAAgYokQOCtSKNFWxFAAAEEEEAAAQQCFiDwBkxGAQQQQAABBBBAAIGKJEDgrUijRVsRQAABBBBAAAEEAhYg8AZMRgEEEEDg/7dbxyQAAAAIBPu3NoPjwwUQ5FwkQIAAAQIlAYe3tJauBAgQIECAAAECt4DDe5MJECBAgAABAgQIlAQc3tJauhIgQIAAAQIECNwCDu9NJkCAAAECBAgQIFAScHhLa+lKgAABAgQIECBwCzi8N5kAAQIECBAgQIBAScDhLa2lKwECBAgQIECAwC3g8N5kAgQIECBAgAABAiUBh7e0lq4ECBAgQIAAAQK3gMN7kwkQIECAAAECBAiUBBze0lq6EiBAgAABAgQI3AIO700mQIAAAQIECBAgUBJweEtr6UqAAAECBAgQIHALOLw3mQABAgQIECBAgEBJwOEtraUrAQIECBAgQIDALeDw3mQCBAgQ
IECAAAECJQGHt7SWrgQIECBAgAABAreAw3uTCRAgQIAAAQIECJQEHN7SWroSIECAAAECBAjcAg7vTSZAgAABAgQIECBQEnB4S2vpSoAAAQIECBAgcAs4vDeZAAECBAgQIECAQEnA4S2tpSsBAgQIECBAgMAt4PDeZAIECBAgQIAAAQIlAYe3tJauBAgQIECAAAECt8AAJsUdyMvz690AAAAASUVORK5CYII="
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7fAAAgAElEQVR4XuzdCXhU1f3/8W92CEsIEDYLKFGDiII0KgpWUehPcAFRilrqglT9u2utrXYRSx9X1GqLIrWWqj+FKmJqFRWRfdOUyiYggR8BIUDYwpIEEpL/8z14x8kwSWbmzExmbt73eXgUcs+557zOJXxy5txzE6qrq6uFAwEEEEAAAQQQQAABlwokEHhdOrJ0CwEEEEAAAQQQQMAIEHi5ERBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84hgAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84hgAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84hgAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84hgAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84h
gAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABAi/3AAIIIIAAAggggICrBQi8rh5eOocAAggggAACCCBA4OUeQAABBBBAAAEEEHC1AIHX1cNL5xBAAAEEEEAAAQQIvNwDCCCAAAIIIIAAAq4WIPC6enjpHAIIIIAAAggggACBl3sAAQQQQAABBBBAwNUCBF5XDy+dQwABBBBAAAEEECDwcg8ggAACCCCAAAIIuFqAwOvq4aVzCCCAAAIIIIAAAgRe7gEEEEAAAQQQQAABVwsQeF09vHQOAQQQQAABBBBAgMDLPYAAAggggAACCCDgagECr6uHl84hgAACCCCAAAIIEHi5BxBAAAEEEEAAAQRcLUDgdfXw0jkEEEAAAQQQQAABVwXekpISKS8vl/bt2/sd2QMHDkhFRYW0bt3a79eLi4ulefPm0rRp04DujFNOOUXWr18f0LmchAACCCCAAAIIINAwAq4IvBpUr7/+etm0aZNRPPnkk+X222+XoUOHmt8fOnRIfvGLX8isWbPM73v37i0vvfSSZGVlmd8XFhbKmDFjPOVHjBghjz32mKSkpNQ5KgTehrlpuSoCCCCAAAIIIBCMgCsC744dO+S9996Tq666Spo1ayaTJ0+Wv/3tb7J48WIzW/vKK6/I1KlT5e2335b09HQTbrOzs+Xxxx83VqNHjzYzu0899ZQUFRXJ8OHDTeB1AnNtoATeYG41zkUAAQQQQAABBBpGwBWB15duy5YtcvHFF5uAm5uba4LrkCFD5LbbbjOnzpgxQ+655x755ptvZP/+/eYcDcR9+vQxX9ewq8F34sSJzPA2zH3JVRFAAAEEEEAAgbAJuDLwvvvuu/Lwww/L0qVLzXpdXcLwxBNPyODBgw3c6tWrZdiwYZKfny+6HEL/fOHChdKuXTvzdZ0hnj59uuTl5RF4w3arURECCCCAAAIIINAwAq4LvDpr+5Of/ERuvvlmuffee6W6ulpOPfVUmTRpkgwYMMAoFxQUmJA7d+5c2b59u4wcOdKE34yMDPP1KVOmyIQJE2T+/PmeUVm1atVxI6RLKHhorWFuXK6KAAIIIIAAAggEKuCqwPvtt9/KddddJ+eee65Zj5uUlGQcdIb3ySeflEsvvdT83t8M76JFizwPsfmb4b3vvvuOM/3www8JvIHeaZyHAAIIIIAAAgg0kIBrAq/OtI4aNUoGDhxo1uAmJyd7SINdwzt27Fgz88sa3ga6K7ksAggggAACCCAQRgFXBN61a9fKFVdcYX7pTGxiYqIh0h0ZdA2v9y4NuovDLbfcUmOXhptuuklatmzJLg1hvLGoCgEEEEAAAQQQiBUBVwReXVrgb8mBzuyOHz9eDh48KPfff7/MmTPHuJ9xxhny8ssve15QsXHjRrNVme7uoIduSzZu3DhJTU2tc5zYlixWbmPagQACCCCAAAII1C7gisAb6ADrm9iOHDniWavrW06XMeh+vPorkIPAG4gS5yCAAAIIIIAAAg0r0KgCb7ipCbzhFqU+BBBAAAEEEEAg/AIEXgtTAq8FHkURQAABBBBAAIEoCRB4LaAJvBZ4FEUAAQQQQAABBKIkQOC1gCbwWuBRFAEEEEAAAQQQiJIAgdcCmsBrgUdRBBBAAAEEEEAgSgIEXgtoDbwvjX9SWmRkSN+LBlrURFEEEEAAAQQQQACBSAkQeC1kncCrVQwaerVFTRRFAAEEEEAAAQQQiJQAgddC1jvwDhhypSSnpFjURlEEEEAAAQQQQACBSAgQeC1UvQNvbr8fSWbbLIvaKIoAAggggAACCCAQCQECr4UqgdcCj6IIIIAAAggggECUBAi8FtAEXgs8iiKAAAIIIIAAAlESIPBaQBN4LfAoigACCCCAAAIIREmAwGsBTeC1wKMoAggggAACCCAQJQECrwW0Bt7XJ70sB/eXCA+tWUBSFAEEEEAAAQQQiKAAgdcCVwPv239/Vfbu3kXgtXCkKAIIIIAA
AgggEEkBAq+Frgbejz/Ik43r1ki3nNMku3sPi9ooigACCCCAAAIIIBAJAQKvhSqB1wKPoggggAACCCCAQJQECLwW0AReCzyKIoAAAggggAACURIg8FpAewferI6dpPc551nURlEEEEAAAQQQQACBSAgQeC1UvQOvVjNo6NUWtVEUAQQQQAABBBBAIBICBF4LVd/A2/eiS6RFRiuLGimKAAIIIIAAAgggEG4BAq+FqG/gZS9eC0yKIoAAAggggAACERIg8FrAauD9YvEiWbtyuXn5RK9zzpN2HTtZ1EhRBBBAAAEEEEAAgXALEHgtRDXwrl+/Xjas/Zq9eC0cKYoAAggggAACCERSgMBroUvgtcCjKAIIIIAAAgggECUBAq8FNIHXAo+iCCCAAAIIIIBAlAQIvBbQBF4LPIoigAACCCCAAAJREiDwWkATeC3wKIoAAggggAACCERJgMBrAe0E3p1F22T5F4slq0NH6X3u+RY1UhQBBBBAAAEEEEAg3AIEXgtRJ/Du3VUs+QvnSWabtpLb/0KLGimKAAIIIIAAAgggEG4BAq+FKIHXAo+iCCCAAAIIIIBAlAQIvBbQvoG3RUaG5Pa7UJJTUixqpSgCCCCAAAIIIIBAOAUIvBaavoFXq+L1whagFEUAAQQQQAABBCIgQOC1QHUC74GSfbJkzixTU07PM6VL9ikWtVIUAQQQQAABBBBAIJwCBF4LTSfwahUz86aZmrrlnCbZ3XtY1EpRBBBAAAEEEEAAgXAKEHgtNAm8FngURQABBBBAAAEEoiRA4LWAJvBa4FEUAQQQQAABBBCIkgCB1wLaO/BuWPu1bFy3Rjp27io9++Ra1EpRBBBAAAEEEEAAgXAKEHgtNL0DLy+fsICkKAIIIIAAAgggEEEBAq8FLoHXAo+iCCCAAAIIIIBAlAQIvBbQBF4LPIoigAACCCCAAAJREiDwWkB7B15nL15921rfiwZa1EpRBBBAAAEEEEAAgXAKEHgtNL0Dr1bj7MU7aOjVFrVSFAEEEEAAAQQQQCCcAgReC00CrwUeRRFAAAEEEEAAgSgJEHgtoAm8FngURQABBBBAAAEEoiRA4LWA9g28sz/Mk8rKShkw5EpJTkmxqJmiCCCAAAIIIIAAAuESIPBaSPoG3vwFc2Xv7l2S2+9Hktk2y6JmiiKAAAIIIIAAAgiES4DAayFJ4LXAoygCCCCAAAIIIBAlAQKvBTSB1wKPoggggAACCCCAQJQECLwW0AReCzyKIoAAAggggAACURIg8FpA1xZ4e51znrTr2MmiZooigAACCCCAAAIIhEuAwGshWVvg7ZZzmmR372FRM0URQAABBBBAAAEEwiVA4LWQJPBa4FEUAQQQQAABBBCIkgCB1wK6tsDbsXNX6dkn16JmiiKAAAIIIIAAAgiES4DAayFZW+DNbNNWcvtfaFEzRRFAAAEEEEAAAQTCJUDgtZAk8FrgURQBBBBAAAEEEIiSAIHXAto38G5Y+7VsXLdGmOG1QKUoAggggAACCCAQZgECrwWob+Ddu6tY8hfOI/BamFIUAQQQQAABBBAItwCB10LUN/BWVlTI7I/+JckpKTJgyJUWNVMUAQQQQAABBBBAIFwCBF4LSd/Aq1XNzJtmahw09GqLmimKAAIIIIAAAgggEC4BAq+FJIHXAo+iCCCAAAIIIIBAlAQIvBbQBF4LPIoigAACCCCAAAJREiDwWkATeC3wKIoAAggggAACCERJgMBrAV1X4O170SXSIqOVRe0URQABBBBAAAEEEAiHgOsCb2VlpSQmJppfwR7FxcXSvHlzadq0aUBF6wq8uf1+JJltswKqh5MQQAABBBBAAAEEIifgqsBbVlYmw4cPl9tvv12GDh3qUZs5c6bccccdxymuWrVK0tLSpLCwUMaMGSObNm0y54wYMUIee+wxSUlJqVOewBu5G5OaEUAAAQQQQACBcAm4JvA+9dRT8uqrrxqX8ePH1wi8n376qTz00EPy/vvv
13Dr2rWrJCQkyOjRo83MrtZRVFRkQrMGXu/Q7A+cwBuu25B6EEAAAQQQQACByAm4JvDu3btXDh8+bGZnH3zwweMC7+9+9ztZunTpcZIlJSWSm5srU6dOlT59+piva9jV4Dtx4sSgZ3jzF8yVvbt3CUsaInfTUjMCCCCAAAIIIBCMgGsCr9Ppiy++WO69997jAu+dd94pV111lTRp0kTOPvtsGTx4sCQnJ0tBQYH5/4ULF0q7du1MNZMnT5bp06dLXl4egTeYu4lzEUAAAQQQQACBGBRoFIF3xYoVMmPGDGnVqpVs3bpV3n77bRk1apQ8+uijsmzZMhk5cqTk5+dLRkaGGaIpU6bIhAkTZP78+Z4h08Dse+hSifXr19f4Y2Z4Y/Aup0kIIIAAAggg0KgFGkXg9R3hd955Rx555BFZs2aNeVBNZ3gXLVokWVnHdlXwN8P7zTffHHejXHbZZQTeRv3Xh84jgAACCCCAQDwINMrAO2/ePLnllltk5cqVZt2v7xresWPHyvbt21nDGw93MG1EAAEEEEAAAQTqEXBN4NX9d6uqquTSSy8VXX5wxRVXSGpqqun+m2++KTk5OdKzZ0/Rh9Tuv/9+s373jTfeMF+/6aabpGXLlmHZpYElDfydQwABBBBAAAEEYkvANYH3nnvuMet0vY9PPvlEunXrJs8884xMmjTJ86XevXvLc889J507dzZ/tnHjRrMP75YtW8zvdVuycePGeQJzbUPmb1syJ/B2yzlNsrv3iK3RpjUIIIAAAggggEAjFHBN4K1v7MrLy2Xnzp3SokULyczM9Hu6LmPQ/Xj1VyAHgTcQJc5BAAEEEEAAAQQaVqDRBN5IMNcVeDPbtJXc/hdG4rLUiQACCCCAAAIIIBCEAIE3CCzfUwm8FngURQABBBBAAAEEoiRA4LWAJvBa4FEUAQQQQAABBBCIkgCB1wLaX+D9aukiKd5eJCxpsIClKAIIIIAAAgggEEYBAq8Fpr/Au3nDelm3agWB18KVoggggAACCCCAQDgFCLwWmv4Cr1Y3M2+aqXXQ0KstaqcoAggggAACCCCAQDgECLwWigReCzyKIoAAAggggAACURIg8FpAE3gt8CiKAAIIIIAAAghESYDAawFN4LXAoygCCCCAAAIIIBAlAQKvBTSB1wKPoggggAACCCCAQJQECLwW0PUF3tx+P5LMtlkWV6AoAggggAACCCCAgK0AgddCkMBrgUdRBBBAAAEEEEAgSgIEXgtoAq8FHkURQAABBBBAAIEoCRB4LaAJvBZ4FEUAAQQQQAABBKIkQOC1gCbwWuBRFAEEEEAAAQQQiJIAgdcCur7A2y3nNMnu3sPiChRFAAEEEEAAAQQQsBUg8FoIEngt8CiKAAIIIIAAAghESYDAawFdW+DNXzBX9u7eJczwWuBSFAEEEEAAAQQQCJMAgdcCsrbAu2Ht17Jx3RoCr4UtRRFAAAEEEEAAgXAJEHgtJAm8FngURQABBBBAAAEEoiRA4LWAJvBa4FEUAQQQQAABBBCIkgCB1wKawGuBR1EEEEAAAQQQQCBKAhEPvGVlZZKcnCwpKSlR6lL0LlNb4N28Yb2sW7XC05COnbtKzz650WsYV0IAAQQQQAABBBDwCEQk8FZWVsrLL78sb775puzZs0fGjx8vQ4cOlVtuuUXS0tLkpZdecsUQ1BZ49+4qlvyF8zx9zGzTVnL7X+iKPtMJBBBAAAEEEEAg3gQiEng///xzue2222TEiBGyZMkSuffee03g/fjjj+Xuu++W/Px8ycjIiDer49pL4I37IaQDCCCAAAIIINAIBCISeHUmt0uXLvLoo4/K6NGjTdjVX9u3b5cLLrhA8vLypEeP+H8DGYG3EfwNoYsIIIAAAgggEPcCEQm8GmrvuusuGTlypN/AO2PGDDn55JPjHi/QwKsdHTT06rjvLx1AAAEEEEAAAQTiUSAigVfD7r59++T111+XMWPGeGZ4n3/+ebN+d9WqVWYtb7wf
tQXeyooKmf3Rv2p0r/+gwdI0PT3eu0z7EUAAAQQQQACBuBOISOBdu3atXHHFFXLiiSfK/v375ayzzpKjR4/KnDlz5MEHHzTre91w1BZ4tW8z86bV6GJWh47S+9zz3dBt+oAAAggggAACCMSVQEQCrwpo6NUZ3aVLl8qhQ4ckJydHbrjhBrnmmmskMTExrpBqa2wwgZedGlwx5HQCAQQQQAABBOJQIGKB19uiurpaEhIS4pCn7ibXFXjnfzpDystKPRUQeF03/HQIAQQQQAABBOJEICKBt6CgwCxlqO0488wzzcso4v2oK/DmL5gre3fvIvDG+yDTfgQQQAABBBCIe4GIBN7bb79dZs2aVSuO2/fh1Y4TeOP+7wYdQAABBBBAAAGXCEQk8G7bts2s2/U9fvWrX5n9eZ999llJSkqKe0JmeON+COkAAggggAACCDQCgYgE3trc5s6da7YpW7ZsmbRo0SLueYMJvNpZ9uKN+yGnAwgggAACCCAQhwJRDbyFhYUycOBAmTp1qvTp0ycOuWo2mcAb90NIBxBAAAEEEECgEQhEJPDu3LlTysrKavAdOHBA3njjDXnvvfekMa7hZYa3EfxtoosIIIAAAgggEJMCEQm8tT201qxZM7n33nvl5ptvjkmMYBsVyAxvt5zTZOO6Nabq3H4/ksy2WcFehvMRQAABBBBAAAEELAQiEnj1pRN79uyp0SwNu6effrortiNzOhZs4NXwm929h8VwURQBBBBAAAEEEEAgWIGIBN5gGxGv5wcaeDdvWC+VlZVC4I3XkabdCCCAAAIIIBDPAmELvPpAmq7TDeQ47bTTGs22ZBpy9dBlDQTeQO4OzkEAAQQQQAABBMIrELbAW9/LJryb3ZgeWiPwhveGpTYEEEAAAQQQQCBYgbAF3s2bNwc8w9u9e3fXz/Bu21woZaWHpF3HTrKzaBszvMHemZyPAAIIIIAAAgiESSBsgTdM7Ymraupaw+vdkQ1rvybwxtXI0lgEEEAAAQQQcJNAxALv/PnzZenSpX5fMfzQQw9J06ZN494x0MCrM7zLv1gsWR06Su9zz4/7ftMBBBBAAAEEEEAgngQiEng/+OADeeCBB0S3Ijt06JCceOKJkpaWJuvWrZPWrVvLrFmzpHnz5vHk5LetgQbevbuKJX/hPMls01Zy+18Y9/2mAwgggAACCCCAQDwJRCTwjho1ygTbcePGSW5ursyZM0dOOOEEefbZZ2XJkiXyzjvvxJNRrW0l8LpiGOkEAggggAACCLhcICKB9+KLLxbdteGaa66RnJwcE3B79+5tZngvv/xy+eSTT6Rbt25xTxto4K2sqJDZH/1LklNSZMCQK+O+33QAAQQQQAABBBCIJ4GIBN7BgwfL1VdfLWPGjJGhQ4fKZZddJrfeequsXr1ahg0b5gnA8QTlr62BBl4tOzNvmqli0NCr473btB8BBBBAAAEEEIgrgYgE3ttuu80gvPLKK/LnP/9ZXnzxRbn55ptl0aJFUlxcLAsXLnTFK4YJvHF1r9NYBBBAAAEEEGikAhEJvF9//bXs2LFDBgwYIEeOHJFHHnlE8vLyzHreu+66S/r16+cK7mAC7+LZn8nB/SW8bc0VI08nEEAAAQQQQCCeBCISeP0BVFVVSWJiYjzZ1NvWYAJv/oK5snf3LunYuav07JNbb92cgAACCCCAAAIIIBAegYgE3gcffFAyMjJkxIgRom9Vc+sRSuBlazK33g30CwEEEEAAAQRiVSAigfett96Sp59+2uzB26dPH7n++uvlxz/+sSteNuE9kMEEXudtawTeWP2rQLsQQAABBBBAwK0CEQm8ilVRUSHz5s0zOzLoiyb0JRTXXXed2aosOzvbFZ7BBF7n5RNN0tOl51m5ktk2yxUGdAIBBBBAAAEEEIh1gYgFXu+O6wNsU6dONTs26JGfn2+WPMT7EUrg1T53yzlNsrv3iPfu034EEEAAAQQQQCAuBCIeeFetWiXTpk2TN99804Bc
cMEFMmHCBFcsbyDwxsU9TiMRQAABBBBAoJELRCTw7tu3Tz788EN5++23zdvVdDnDjTfeaF5G0aVLF9eQE3hdM5R0BAEEEEAAAQRcLBCRwKuvFdZ1u7rf7rXXXiv6quHU1FTXMRJ4XTekdAgBBBBAAAEEXCgQkcD76aefSk5OjnTt2tWFZN93icDr6uGlcwgggAACCCDgEoGIBF6X2NTbjWACb2VFhcz+6F+mTl4+US8tJyCAAAIIIIAAAmETIPBaUAYTePUyM/OmmauxF68FOkURQAABBBBAAIEgBQi8QYJ5n66B96XxTwYcYAm8FtgURQABBBBAAAEEQhRwXeCtrKyUxMRE88v3OHDggHkhRuvWrf1yFRcXS/PmzQPeMi3YwLt5w3pZt2pFwAE5xDGlGAIIIIAAAggggICXgKsCb1lZmQwfPlx0l4ihQ4d6uqmvOP7FL35hdo7Qo3fv3vLSSy9JVtaxt50VFhbKmDFjZNOmTeb3I0aMkMcee0xSUlLqvFmcwJuckiIDhlxZ743lvG2NJQ31UnECAggggAACCCAQNgHXBN6nnnpKXn31VQMzfvz4GoH3lVdeMW96032B09PTTbjV1xs//vjj5vzRo0ebmV2to6ioyIRmDbzeodmfuBN49WuDhl5d76AQeOsl4gQEEEAAAQQQQCDsAhEJvOXl5TJ79mwzo7px48bjGv3666+bgBnOY+/evXL48GEzO/vggw/WCKsaXIcMGSK33XabueSMGTPknnvukW+++Ub2798vubm5JhD36dPHfF3DrgbfiRMn1tlEAm84R5C6EEAAAQQQQACByAhEJPDqTKvOlmqQ1Der+S4N+M1vfhPwOtlgu60vubj33ntrBF5dwvDEE0/I4MGDTXWrV6+WYcOGSX5+vui6Xf3zhQsXSrt27czXJ0+eLNOnT5e8vDwCb7ADwPkIIIAAAggggECMCUQk8Gro7Nu3r2fJQDT77Bt4q6ur5dRTT5VJk5P50NYAACAASURBVCbJgAEDTFMKCgpMyJ07d65s375dRo4cacJvRkaG+fqUKVNkwoQJMn/+fE/T9WUavsedd95pdmnQgyUN0RxlroUAAggggAACCAQuEJHAq8sKzj33XLO0INpHbTO8Tz75pFx66aWmOf5meBctWuR5iM3fDO/TTz99XFf++te/EnijPcBcDwEEEEAAAQQQCFIgIoH3zTfflNdee82slU1LSwuySXan+wu8wa7hHTt2rJn5jdQa3ibp6dLzrFzJbHtslwgOBBBAAAEEEEAAgcgJRCTw/uUvf5EXXnjBbP/lbP3l3YVnnnlGmjVrFtZe6f67VVVVZhZXlxpcccUVkpqaaq7hvUuDXveWW26psUvDTTfdJC1btrTapSG334/qDbDOLg3aJg272d17mPbpNmUcCCCAAAIIIIAAApERiFjgXb58ea0t/tOf/hT2wKu7LuiMsvfxySefSLdu3eTgwYNy//33y5w5c8yXzzjjDHn55Zelffv25ve6k4RuVbZlyxbze92WbNy4cZ7AXFtHdJeG1ya8KOVlpRJI4NV65n86w5zvfQSy/jcyw0+tCCCAAAIIIICA+wUiEnhjla2kpESOHDnid9ZZ26zLGHS7tEC3TNPA+/bfX5W9u3cFHHgrKypk9kf/IvDG6k1CuxBAAAEEEEDAdQIRC7y6J+4HH3wg69atk9LSUrM9me6MoP91yxFK4NW+z8ybRuB1y01APxBAAAEEEEAg5gUiEnh1b1vd6stZIqDrZvX1vnq8+OKLnv1wY16nngYSeON9BGk/AggggAACCDQGgYgE3kceeUQ++ugjs/dtr169zE4NGzZsEN3a6/PPP5cVK1ZE7MUT0Rw0Dbwff5AnG9etkW45p3keQquvDczw1ifE1xFAAAEEEEAAgfAJRCTwXnDBBXLllVfKL3/5yxotXbNmjfnz9957zzw4Fu+Hd+DVnRZy+18YUJe8d2vQAl26nSw5Z/QKqCwnIYAAAggggAAC
CAQnEJHAe/nll5styf74xz/WaM3SpUtl1KhRrgy8ySkpMmDIlQHp+wbeYMJyQBfgJAQQQAABBBBAAAGPQEQCr+6zq8sZNPDqG9cyMzPNq3v1db1bt26VBQsWSEpKStwPg/cMr3ZGA68G3/oOAm99QnwdAQQQQAABBBAIn0BEAm9ZWZl5+cP8+fNrtLR169by0ksvyQ9/+MPw9aABa/INvIHuxesbeFtkZEjfiwY2YE+4NAIIIIAAAggg4F6BiAReh2vZsmWebck6d+4s/fr1C/sLJxpyaMIVeLUP/QcNlqbp6Q3ZHa6NAAIIIIAAAgi4UiCigdeVYl6dCmfgDXR22O2m9A8BBBBAAAEEEAi3QNgCb0FBgeTl5cnNN98s//3vf2Xz5s21tvX66683W5XF++EbeE8/K1c6delab7d8lzRoAQJvvWycgAACCCCAAAIIhCQQtsA7e/ZsufXWW2XmzJny3HPPyYwZM2ptkD7AlpGREVKDY6mQb+ANdC9eAm8sjSJtQQABBBBAAAG3C4Qt8B49elT0dcJNmzaVhIQEt7uZ/oUaeCsrKuRAyT7ZurlQirYUmrqY4W0UtwydRAABBBBAAIEGEAhb4PVu+8MPPyw7duyQ1157rQG6FL1Lhhp4nRZuWPu1eUsbgTd6Y8aVEEAAAQQQQKDxCUQk8OqrhXUN75tvvulq0XAG3l7nnCftOnZytRedQwABBBBAAAEEGkIgIoF31qxZcvvtt5uXTbhhrW5tA+MbeIN9Y5r3DG+g638b4ibhmggggAACCCCAQDwLRCTwfv755/LAAw+Yt6z179//OJ+RI0dKampqPLuZtocz8OruDjk9ewX0pra4h6MDCCCAAAIIIIBAFAUiEnjvuusu+eSTT2rthltmfjXwrli+XIqLtsq6VSsk2BleBWKWN4p3O5dCAAEEEEAAgUYpEJHA21gkNfCuX79enG3GCLyNZeTpJwIIIIAAAgjEk0DEAm91dbX83//9nxQVFUl2drZ06NBBCgsLJT09XbKysuLJqNa2OoG3rLRUFsycIU3S0+WCQYOD6hszvEFxcTICCCCAAAIIIBC0QEQC78GDB+XnP/+5eWhNj/Hjx8vQoUPljjvuMCG4rpdSBN2DBizgBF5twsy8aaYlwe6nu3nDerMcQg8eXGvAweTSCCCAAAIIIOBagYgE3ilTpsjzzz8vuj3ZP/7xD7nxxhtN4F26dKmMGjVKFixYIO3bt4971HAEXu+3rmV16Ci9zz0/7l3oAAIIIIAAAgggEEsCEQm8l19+uQwePFjuvPNOGT16tAm7+mvPnj1m54Zp06bJmWeeGUsOIbUlHIFXL2yzBjikhlMIAQQQQAABBBBoRAIRCbwadocPH26WNXgH3oKCAhOEZ8+eLT/4wQ/injncgTc5JUUGDLky7l3oAAIIIIAAAgggEEsCEQm8Y8eOlXnz5slbb71lljXo7O6gQYPk/vvvl6+++koWLVokSUlJseQQUlv8Bd6cnmdKl+xTgqrPe1nDoKFXB1WWkxFAAAEEEEAAAQTqFohI4NWlCxpyt2/fbq7euXNns5zh0KFDMnHiRLnkkktcMS7+Am8oD555B97+gwZL0/R0V/jQCQQQQAABBBBAIBYEIhJ4tWNlZWWiD6+tXLlSDhw4ICeddJJcc801cuqpp8ZCv8PShnAFXm2Ms8tDKDPEYekMlSCAAAIIIIAAAi4ViEjg1d0YMjIypHv37jXYdu7cKYsXL5bLLrtMkpOT4540EoE3lBniuIekAwgggAACCCCAQAQFIhJ4b7/9dunZs6foK4a9j61bt8pFF11kXjvcrVu3CHYrOlUTeKPjzFUQQAABBBBAAAEbgagG3tWrV8uwYcPks88+k65du9q0OybKegde263FnCUNzPDGxNDSCAQQQAABBBBwkUBYA++DDz4oe/fulf/85z/SunVrs27XOY4cOSJLliyRHj16SF5enisICbyuGEY6gQACCCCAAAIuFwhr4NUt
yPbt22cCb8uWLUUDoXM0adLEvHRClzS44S1r2q9IBF7etubyv3F0DwEEEEAAAQSiLhDWwOu0Xt+k1qFDB+nXr1/UOxTNC0Yi8Ga2aSu5/S+MZje4FgIIIIAAAggg4GqBiAReV4t5dY7A21hGmn4igAACCCCAQDwLhC3w5ufny+OPPy4TJkyQ6dOny/Lly2t1ee6556RZs2bx7Gba7i/wNklPlwsGDQ66b/M/nSHlZaXSIiND+l40MOjyFEAAAQQQQAABBBDwLxC2wKvrdp988kl58cUX5f33368z8D777LOuDbzKHMrrgfMXzJW9u3eZUQqlPDc4AggggAACCCCAQIQDr3f1CxculP3798vgwcHPdMbTQPmb4Q01sBJ442nkaSsCCCCAAAIIxJNA2GZ4vTt9zz33yMGDB+W1116LJ4ug21pb4B0w5EpJTkkJqj7vwJvb70eS2TYrqPKcjAACCCCAAAIIIBDFGd4///nPZh3v559/7mr32gJvKIF13crlsm3zJqmsrJRQyrsams4hgAACCCCAAAIWAhGZ4d21a5cMHDhQXnjhBbnwQvdusRXOwKtj6MzyEngt7miKIoAAAggggAACPgIRCbz33XeffPjhh7Vi644OGRkZcT8Y3oE3HIGVwBv3twQdQAABBBBAAIEYFIhI4J05c6Zs3ry51u6OGjVK0tLSYpAjuCaFO/DqsobNGwskp+eZUlZaav6/W85pkt29R3AN42wEEEAAAQQQQAABj0BEAm9j8Q134N2w9mvZuG6NCbl7dxWbbcoIvI3lbqKfCCCAAAIIIBApgbAG3qlTp5r9dx988EFp3bp1jTavWbNG3njjDfmf//kf16zrJfBG6rakXgQQQAABBBBAIHwCYQu85eXl0rdvX7n44otF36Tme+juA8OGDZOkpCTJy8sLXw8asCYNvC+Nf9IzC+ssSTj9rFzp1KVr0C1jhjdoMgoggAACCCCAAAL1CoQt8C5ZskR+9rOfyccffyzZ2dl+LzxjxgzRPXoXLVokWVnxv8+sb+D1DqyhrLvdvGG9rFu1Qjp27ip7dxdLeWkpSxrqvYU5AQEEEEAAAQQQqFsgbIFX99196KGHZO3atWYW19+xadMmGTRokLz77rvSq1evuB8bJ/BqQO3ZJ1dsA6+u281fOO84l8w2bSW3v3u3d4v7G4EOIIAAAggggEBMC4Qt8H7yySdy11131Rl4N2zYIJdeeqnoTO/JJ58c0zCBNM4JvE4gJfAGosY5CCCAAAIIIIBAdAXCFngLCgpk8ODB8o9//EPOP/98v72YNGmSPPPMM7Jq1SrXbEuma3j1NcL6soidRdvMLguhzsgywxvdm5+rIYAAAggggEDjEAhb4K2qqpIbbrhB1q9fLxMmTJDc3FyPYHV1tfz73/+WBx54QEaMGCGPP/64K3R1hnfSn8Z7Xge8Z1cxgdcVI0snEEAAAQQQQMBNAmELvIpSWFgo+lKJ7du3S05Ojpx66qmiuzesXLnS/JkuY3jrrbckMzPTFYYaeN/++6tmv1yd4bUNvIoyM2/acTahzhi7AplOIIAAAggggAAClgJhDbzalrKyMpk8ebJ8+eWXsnr1aklNTZUePXpI//795dprr5WUlBTLJsdOcQ2806e+LUVbCkW3IqusOGJ2WbAJqATe2BlfWoIAAggggAAC7hAIe+B1B0tgvdDA+/EHeZ63o7Vum2V2WSDwBubHWQgggAACCCCAQDQECLwWyt6BN7NtlvQ8K1fmz5xhHmIbMOTKkGpmhjckNgohgAACCCCAAAK1ChB4LW4O78Cr1eg6Xmcf3UFDrw6p5sWzP5OD+0tqlLWZMQ6pERRCAAEEEEAAAQRcJEDgtRhM38Cr63hX/zff1Bhq4M1fMNc8BOd9EHgtBomiCCCAAAIIINDoBQi8FreAb+DtlnOaWc8bjsDbvGVGjZleXTKhgbpperpFiymKAAIIIIAAAgg0PgECr8WY+wbeLt1Ols0bC0yNuoZX
1/IGe2zbXChlpYekrLTU7P7gfeiSCQ2+HAgggAACCCCAAAKBCxB4A7c67kwNvHNnfSaFG9ab2VhdeuAsR7ANp85rir0vqjO8nbp0tWgxRRFAAAEEEEAAgcYnQOC1GHMNvPpmOeeVwJEOvLpkIrt7D4sWUxQBBBBAAAEEEGh8AgReizH3DbxN0tOlvLRMRKrNjg02yw/8zfB27NxVevb5/pXNFk2nKAIIIIAAAggg0GgECLwWQ+0beLWqBBN3j21RZhN4dQ3vgZJ9svyLxTWWSug64Zwzelm0mqIIIIAAAggggEDjEiDwWoy3E3g1mC6ZM8vUFK7Aq3VVVlSYOrV+Z39ffRBOw3SLjFYWLacoAggggAACCCDQeAQIvBZj7QRercJZglBdXS0JCQnWM7y+zVq1LN+za4PO8OpMLwcCCCCAAAIIIIBA/QIE3vqNaj3DX+BNSk6Wo5WV0uuc86Rdx04Wtdcsqksc9KUU5WWl5gt9L7qEWd6w6VIRAggggAACCLhZgMDrNbrFxcXSvHlzadq0aUBj7i/wOi+MiMSOCt5vYbNdIxxQBzkJAQQQQAABBBBwgUCjCLwzZ86UO+6447jhWrVqlaSlpUlhYaGMGTNGNm3aZM4ZMWKEPPbYY5JSz4sj/AXeJk2bSnlZmUQi8K5budzzYgsCrwv+9tEFBBBAAAEEEIiKQKMIvJ9++qk89NBD8v7779dA7dq1q1lvO3r0aDOz+9RTT0lRUZEMHz7cBN6hQ4fWOQj+Am9qWpocOXw4IoFXG+PM8hJ4o/L3g4sggAACCCCAgAsEGk3g/d3vfidLly49bshKSkokNzdXpk6dKn369DFf17CrwXfixIkBB15dY7tg5gzP+ZGY4dXKv1q6SIq3FwlvXXPB3z66gAACCCCAAAJREWg0gffOO++Uq666Spo0aSJnn322DB48WJKTk6WgoMD8/8KFC6Vdu3YGffLkyTJ9+nTJy8sLOPDqiTPzpnnOz+rQUXqfe37YB9HZDYL9eMNOS4UIIIAAAggg4FKBRhF4V6xYITNmzJBWrVrJ1q1b5e2335ZRo0bJo48+KsuWLZORI0dKfn6+ZGRkmGGeMmWKTJgwQebPn+8Z9mHDhh13C6xevdq8Wtg5vAOvvmY4t/+FYb9tvF9jHIn6w95gKkQAAQQQQAABBBpYoFEEXl/jd955Rx555BFZs2aNeVBNZ3gXLVokWVlZ5lR/M7wHDx48bqjOOussv4FXXz7RKkKBV19GMfujf5m2DBp6dQPfPlweAQQQQAABBBCIfYFGGXjnzZsnt9xyi6xcuVIOHz583BresWPHyvbt24Naw6tD/f0Mb4JktmkTkRlevc78T2eY/Xj7DxosTdPTY/8uo4UIIIAAAggggEADCjSKwPvmm29KTk6O9OzZU/Qhtfvvv9+s333jjTcM/U033SQtW7a02qXBO/Dq29Zat82KWOB1dmro1KWreXiNAwEEEEAAAQQQQKB2gUYReJ955hmZNGmSR6F3797y3HPPSefOnc2fbdy40ezDu2XLFvN73ZZs3LhxkpqaWue9470tmZ7ovP430oHXeXBNr8kb1/jrjQACCCCAAAII1C3QKAKvEpSXl8vOnTulRYsWkpmZ6VdFlzHofrz6K5DDN/A6QVQDb8tWraTvRQMDqSboc3YWbZPlXyw25diPN2g+CiCAAAIIIIBAIxNoNIE3EuPqG3idHRSkulokISFiD5V5riMiOT3PlC7Zp0Sie9SJAAIIIIAAAgi4QoDAazGMtQZeqRaR6ARe9uO1GECKIoAAAggggECjECDwWgyzBt5P/v0v0ZdMtMhoJd4zr1ptJLcNWzz7Mzm4v8S0PlJvdbOgoSgCCCCAAAIIIBAzAgRei6HQwPvS+CdNDRpuoxl4va/VsXNX6dmH3RoshpKiCCCAAAIIIOBiAQKvxeB6B159eCw5JUWWzJnlqTGSM7zegTdSb3WzoKEoAggggAACCCAQMwIEXouh0MD77/fe
lc0bC8TZE9f79cIEXgtciiKAAAIIIIAAAmESIPBaQGrg/fiDPNm4bo2pRffEXbdyuezdvUv09cIDI/jq3wMl+2rMJuv1WctrMZgURQABBBBAAAHXChB4LYbW2aVBQ67O8mrg1KUGTuDtF+FX/3rPJhN4LQaSoggggAACCCDgagECr8XwOoF32+ZCWf3ffNGHx8pLD3kC75nnnCftOnayuELdRX0Dry6r6NTlRNE1vRwIIIAAAggggAACxwQIvBZ3ghN4nQfImqSnS0pyihzYXyL6trWmzZpJ0/RmJoBmd+9hcSX/RfW6a1cu92xP5pwVybXDYe8EFSKAAAIIIIAAAhEWIPBaAHu/eMKZbU1r0kQOl5ebwJuQoCt5xQTe3P4XWlyp9qL5C+aaGWXfg/W8EeGmUgQQQAABBBCIQwECr8WgeQdeZ5Y3OTlZKisrawTeFhkZ0veigRZXqr3ozqJtsnnD+uNCry5vyOnZy2yVxoEAAggggAACCDRmAQKvxej7vlq4xpra6mqR72Z49RIDhlwZ0fDpb6ZX9wbObJtl0UOKIoAAAggggAAC8S9A4LUYQ9/A6/26X/EJvL0i/AAbgddiICmKAAIIIIAAAq4WIPBaDK9v4P1q6SIp3l6kzwKKVFfVmOHVmdbWbbPM1mWROFYty5eiLYXHVR3poB2JvlAnAggggAACCCAQTgECr4Wmb+DdsPbrYy+hqNZKay5pcC4TqR0UnGs3aZpuLlVeVurpWSQfmrPgoygCCCCAAAIIIBAVAQKvBbNv4HUeXPNdzuB9Cd2eLBKzvGWlpWYP4CbpzaS4aKusW7XCc9lIPjRnwUdRBBBAAAEEEEAgKgIEXgtm38BbWVEhsz/6V701RmqW1/fC3g/RReua9XaeExBAAAEEEEAAgSgLEHgtwDXw/vS1J+Sctl1lyGlnm5rmfzrDs5xAX0TRvEVLSUlNq7G+Nlp75Oq6Xp3t1W3S2LHBYqApigACCCCAAAJxLUDgtRg+DbwXv/aYdJRUGXvBNaamGg+uSbVo6D2hy4ny7ab/k8PlZZ6rnX5WruheuZE+nN0bWMcbaWnqRwABBBBAAIFYFSDwWoyMBt7LXx0npYlVcmvXXPlhl1PFeXjM+01reok27drL7p07PFfTIJzb70Jpmn7sIbNIHU4AZx1vpISpFwEEEEAAAQRiXYDAazFCGnhHT35GNlWVykmJ6fLrfsM8gdf3wbUuJ2VLZlY782CZ80BZNJYZ6MNsC2bOML1kltdisCmKAAIIIIAAAnErQOC1GDoNvL+d8oosKt1uarm+Y0/5QWWy2ZpMdyZL8Kpbly/oMgY9nIfJdNZVj0i9dti5/OwP88w6Xp1V7n3OedIio5VFrymKAAIIIIAAAgjElwCB12K8fANv9+QWcnuvSzwzqr5VOw+rfb/O99gZ0XztMLO8FgNOUQQQQAABBBCISwECr8WwOduSrduxWZ77ZoE0q0qU5y681jOD61u1s2532+ZNx15Q8d0R6aUN61Yul80bC8zVtA0XDBps0WuKIoAAAggggAAC8SVA4LUYL+99eO+a95ZUJIg8kztUln76salVg6weuo523cqvzLICPU457XRZv2a158rR2LFh8ezP5OD+EnNN9uS1GHSKIoAAAggggEDcCRB4LYbMO/A+Ov9d2S5HzDreZtv3yt7du8wb1fTNanocKNknS+bM8lytfacTZMe2reb30diX19meTK/Xf9DgiO8OYcFKUQQQQAABBBBAIKwCBF4LTu/A+4//fG4eXjsrtbX0lZZSvL3ouCCrrx7Wbcs0DHsfusygU+eunnBs0aRai3qvG+7S7WTJOaNXJC5DnQgggAACCCCAQMwJEHgthsQ78P5n8zcyqTBfMquT5Vc9Bkj+wnl+twHz3ibM36UjtdxAr7t62ZcmbPPgmsWgUxQBBBBAAAEE4k6AwGsxZN6BV6u5bf5bprbfZl8ga7/8otZg6btLg0i1VEuC2cYskssNdIZZg3hy
SorZGYIDAQQQQAABBBBoDAIEXotR9g28zjreq1tlS8LGb03N/mZsa1vaoOfr3rzdcnpIu46dLFpWe1FnD+DMtllmCUVycjL78kZEmkoRQAABBBBAIFYECLwWI+EbeJ11vGcktpScXYdNzX0vusRvoKysqJCdRdtk9X/zfVqQIC0yWkqX7FMks01W2B8u896tQS/M8gaLG4CiCCCAAAIIIBAXAgRei2HyDbyedbxVSXLJvmRTc3177Dozrt834/t3tEUijHrv1uDMKOu2aLx9zeJGoCgCCCCAAAIIxLQAgddieHwDr1blrOO9q/Xpsr2gIKAtx3S2V2dey8tKj2tNuB9i010ivF964X3B+sK5BRVFEUAAAQQQQACBBhMg8FrQ+wu8zjreIWkdJb1oT0CBV5uwalm+FG0p/K41+viazvRG5iURvrO8DkGnLl1FZ3s5EEAAAQQQQAABNwkQeC1G01/gddbx9qhKlx77joo+HNa6bZbo3re6O0Jthz7ItnblcvM2tMTERBFJkKqjR6VFqwzp0u0U2bu72BTVB9qapqeb/9eZ4brqrO1a+hKMAyUlftYPH3v1cM+zck27ORBAAAEEEEAAATcIEHgtRtFf4HXW8baqSpKB363j1UsEslxAlxts3rDe8wpif03L6thJWma0Ml/as6vYBN7WbdpKpy4nBh1+9aE5Db+6fnf5F4trXE4Dr9bd+5zzLIQoigACCCCAAAIINLwAgddiDPwFXq3OWcd7zZ40T+05Pc80Oy/Udxy/R69IYmKSpKalyuGysu8WOhxfi25nltvvwqBDr1NTzSUV39evb2Rr0jTdzFKHMptcX3/5OgIIIIAAAgggEGkBAq+FcH2B98IDKbIzqUrWpB+Vs6SF3H7BFfVeTWd4161a4fe8lNRU6XxStvmavjmtvPTQsf9+97CbLkfQYKqzs03Tm3n22A1k6YMuqdBAW7ihwNSvv/d+iE6XZGhgd5ZT1NsRTkAAAQQQQAABBGJEgMBrMRC1Bd77506R0sSqGjWfUJksvx/wk3qvpuFUj4qKCiku2iq6zKGysrJGOV3WoIeem5iUJPv37pGqqio56nOe78U00HpvP6azwikpqeY0Z82u8yIKrXuVvop4V7Hn+maJw7nnm717ORBAAAEEEEAAgXgRIPBajFRtgfejNV9K3q71NWpuW5koD545MOSwuG7lcvOiCn9blzkXSm/eQiqOHJbKI0ckKTlZmqQ3Mw/BheNISEiQ6upjO0ekNWkibdt1kI5dun5XdYK0aNmSJQ/hgKYOBBBAAAEEEAi7AIHXgrS2wLtux2b587oFUqG7i313pFSJDN2X5vdVw4E0wVmWoGttdeZXXwu8v6TEayuz42vRGVw9dG2vzhinpKSYh9ScQx96cw6dydVDzwtXSK6tX74zzc55+ufOA3n6Z947RbRomWECdSDLMwLx5BwEEEAAAQQQaDwCBF6Lsa4t8GqVzy/+QNZWHqhRu67pvX7ICIsrHivqHfo0AJ/QpatZqqAzwIUb1h8XWJ3gqwGyRctWovvtBnI4yys0JGsQ1mP7lkLZvm2rJMj3ewWbLyR4pftAKg/jObp2Wdcs2xy6Nrm+OmoL6r7XZcmHzUhQFgEEEEAAgfALEHgtTOsKvDrL+17BMimvqpTdVUekIlGk6+FEuenU8wIOnKE2TR9k07W/Zv1txZHj1gAfe6gtXbI6dJJ2360HDuZaWrc+HKczxHqNvbt3eYo725nl9OxV6wNuGqS9Z5qdwhqqvf/cmXXWr+uf+65lDqbNDX2uhmU11x9MnL2U9QcU588aun1cHwEEEEAAATcLEHgtRreuwOtd7d+WfCxfVOyRjMoE+UnqCdKpc9eIh14NlWWlhyQ5JVW2bd5kAmPx9qIavdWZ0QsGDbYQOFZUd5bwXV5hu01aoI3ScG9C/Xcz9UgS5gAAIABJREFU0IGW8z1P61Gvuo7agrpvGe8fAGqrT+2PLTEpMUs39A13Gn5ZshHq
CFIOAQQQQACB2gUIvBZ3R6CBVy/h7M2r/z+qPFN69jlbnHWpFk0IuKgGKV3uoLOKugRi6+ZCs/RBZx71pRg686ihz2bbMX2wTmdpv39Fskjfiy6psTNEwA120YlOUFZfnRX39vHupv6QoOc4b+fTH4zY+9hFNwJdQQABBBBoMAECrwV9MIH3t7OnSHHysa3KdC1vVkWi9DrnvJCWFFg02TODqDO+uv7XeUCtS/bJsm1zoQm/upZ13arlZtYx2EMD25LZMz3LD5xXFeuOETZhOth2xPL5aq/WGmb9zbw7bXfCri470V86G9wt57RY7hptQwABBBBAICYFCLwWwxJM4J2++DNZXF4sJcnVknsoRU48nGiWNTjrOS2aEXJRDafrVn5VY6mDsxOCVuo83KbrcYOZadSZXj1/47o1nrZ17NxVevYJPkCH3Lk4Kqiz7rq84VgALvEsQfG3Ztl7LbDzkJ2/3SziqPs0FQEEEEAAgYgLEHgtiIMJvHqZv3z+nqxMKZfTSpPk9PJkc+WsDh3Nyxwa8lg8+7M6tyLTkNU1+xTzUXswyzD0o/wvF8z11N1/0GBmeQMcaMdOZ3Y1EIdjqzjvYBxgM8J6mj7oGIkj0N0zQrm2vlabTyZCkaMMAgggEFsCBF6L8Qg28E6f+5F8nLhPdE/eCw+mSqvKY1t5NcTSBu9u60yvPtim/9VD9/nV2UV965o+9Ob9sgtdZ9rrnPMDDgH6kf2SObNMvczyhn6zOWOkM8F6OOuBNUQ6O1tEYw/l0HvgvpLeQVvHQX/Pumv3jTM9QgABdwgQeC3GMdjAu3rDWnlx2zJzxTMqmsgPE1p6HhzTGb2cM3rFxD+YzrZmugRB26VhWGcZvbcG09cbt+vQKaDdJrxDL7O8FjecRdFAd5iwuES9Rb1fdFLvyUGcEMm+6d+Fut5u6NtMXbPetdvJxz4NyWgVRC84FQEEEEAgkgIEXgvdYAOvXsp7t4YrW2dLasG3NVqgD4oF+mIIi6aHVFSDhe7Bu3ljgaf8sX/YM0xQr+sfeH1ATncnYJY3JHoKxaCAE7R1Zt35gdBZeqJ/J/peNDAGW02TEEAAgcYpQOC1GPdQAu9Ha76UvF3rzVXPT+8gN/7wYhMivR/wOr1PrgmQsXroCyE0wPrOfOmsr/NqYP0HX5dD7CzaKl26nWK6smDmDM82aPo11kbG6gjTrlAFdE/qdatWmOLBLv8J9ZqUQwABBBCoX4DAW79RrWeEEnj/s/kbmVSYb+o8K7W1nN0pW37Y+Vgg9H54LB4++vcN6nVR6ke9R44clqrKo5KQkCAJSYnStGm6nHPBgKB2gLAYLooiEBUB3XJu9X+P/R3X0Ks7sYTyRsOoNJaLIIAAAo1EgMBrMdChBN6t+3bJH1Z+WuOqt3bNlR92OdXszaohUmdOdalATs8zY34doK7P1Y9znTee6dvcMtu0Nf0L5HXAGa3bSNt27c0LGXQPYA4E3CDg74fBePgh1g329AEBBBDwJ0DgtbgvQgm8ejnvWV7n8v2bd5SLTzpDUsqOyPIvFntmh9p1PEGat8wQ5ylwi+ZGpaiGX9/ZLP0zXb6goXj96hVy6OBB0xad6fU+ovU64qhAcJFGL6ChV+953zfrEXwb/a0BAAIINIAAgdcCPRyBV5c1/PfInhqtyKpIkH1J1dKjPNn8d19SlZx2NE3aViZJr1NPl6wOneJ2/avOYusrjmvbV1ZntbtkH1viwYFAvAs4r/T2XqOvy3vKS0vNW/Oyu/eI9y7SfgQQQCAuBAi8FsMUauDdX3ZI1hdvlR2HSmTIaWfLP/7zuXx5aLtpSUp1opQmHnsFsb8j/WiCtExMlk5NWkh6Spq0bdJc1u3bIRf8IKfG6c1T0ySnfReL3kWuqC510Lex7d2967iLJCYlyQldT5LuZ/SKXAOoGYEoC8z+MM/zum3n0rpvr77FMFZ3ZYkyEZdDAAEEIipA4LXgDTXw+rukhuCi/bulY8s2Mv3rpeaU
jaV75OjRo1JdXS1SXS0lSdVSkWjRYK+imdXJ0iThWGXZzduY//Zo+wM5dKRMstt0khNaHVuHG6nj+C3OdHlDtedyGga6n3mWdPxBZ7MXcDCvNo5Um6kXgVAFnJeDFG4oqLHEgW36QhWlHAIIIBCcAIE3OK8aZ4cz8NbVDF0HqFuBrVv5leySCqlIqJYt6QmS1bSl7KgslfKqSjlcfVTSEpI81ejv9yYcDbl3+ja45gnJcqS6yswoN0lMlhPSMzzBuLaKT2nbSVo2bRbwdbVvq5d9aWZ7NfJWVevaXqd4tTRrkSFHj1aaNcw6G0bwDZiWE2NYYGbeNE/r9GHNhn7tcwxT0TQEEEAgLAIEXgvGaAVep4k60zn/0488H41q+OuafYrsL9lnAmGLlhnSJL1ZQOt71+3YLAcPl5uqVxV/K2WVR6ToyEETnG2CciCc3rPLnmhbXW1msp1fGZUiaVJzOjv9qEh6VaKkpKaaXxVHjpg+6yx4UlKStGreQrq0aGP+TF+CoT8k6H8JyYGMCudEWyB/wVzzg56G3d7nnMd9Gu0B4HoIINCoBAi8FsMd7cCrTdWN7bduLqz1oS8n3OmuCBr2dMcEfcgt2GNuwQrZVLJLemb9wBOK9X90mUVdx+7qI1JRc/OFYC/dYOc3q0o0s9nO0Sq5ibRp8v1sta6Xbtc8sNfFOnsrN1hnuHDMC3jv1xvLb1iMeUgaiAACCAQgQOANAKm2Uxoi8Dpt0dneY/vfHjJrXL1f9+vbXg3BOovkvAWtouKICcNN05uZ8vpfPXRmNBqzod6zy7XZbty7Q7btKDq2fvm7ozSpWg4lfvd7fXmFn8LFybU/8Gcx1FEr6hu6A72w95KTQMvoeRriz+x4UsTXbAfTpsZyri7n0a3LdNsy/XunOzboD6eV3/39bCwO9BMBBBCIhgCB10u5uLhYmjdvLk2bNg3IviEDr28DnY/vKyoqpLz0kJkFLi7aak6rrKwMqD++J/muK9R/lJ3QrOc6X09OTo7YCzI0EGig1xnruma2Q+qgT6HSJJFDXjtkHEoUOZTwfYCuEbjrueCulO+DejjaFo06Okqq6Kz2D5plykmZ7STY9djRaKMbr+FvBwenn73OOY+3tLlx0OkTAghEXYDAKyKFhYUyZswY2bRpkxmAESNGyGOPPSYpKSl1DkgsBd66GqpPiB8oKfHMBjuzwr5lAnkzWn13qIZinT3Wwzsge/95qDPJGnzXrlwe1AzY4fJy8wPAkcOH5dDBA1J1tOaDfAmJx9YEpzdrftyLMOrrq/fXdbZO35Bne/iG7kDr02Uke2vZzk7t9V7W9d2+hy5R2V9VWetWePrwYtvEVLOTh74Yxdm9Q98Y2CKtaVAPKAbal8Z2nv698929wTHQHyr1oTY9Rz+J0R9o9Yc/DgQQQACB4AQIvCIyevRoM7P71FNPSVFRkQwfPtwE3qFDh7oi8AZ3S3x/ts4a+4Y6Dct6aPh0tlrSf4Rre5GEv2vrG9X00LfIOcFYQ7LWrcEsOSVVIjFrrKFU10DrtZxXODvt01CoDw5pwIj2Nmi+zoGOl/bHGQ/vMjou+jV/Y6L2+jU10Acdu3Q72fxgUrx3t2zYUyRbD+2T4rKD5gHG7XLkuKaclHgsbO2sLDfrnTumNZfurTtJhxatYnbf50A9G/o88wNnRYW5N73/TumLKrwPfVC1U+euUVl+1NAmXB8BBBAIl0CjD7wlJSWSm5srU6dOlT59+hhXDbsafCdOnNioA28oN5lvEHZCsZkBLT3k92UTwVxHA5sJxD7LKzQUOGuRtT7dwcH3Fce+19GXX/iufdb6dSa051m5JtA7uzxov7zDcTBtbshzNUw7a0UDmYHWwK8GKSmpktWho+yvrpSCXdtkxrY19e7eoa/H1r2cY/mlJw05FsFcW+89sxXhqhXmhz/fZUk6Rs4PjHrvlx06ZO53/btx7IfGuj+dCqYtnIsAAgi4QaDRB96CggIZPHiwLFy4UNq1
a2fGdPLkyTJ9+nTJy8sj8Ib5LtfguGdXsXngTg/9R935x9l7CUmws8bWzaz53gtPdQkJCZKUfGznhqqqKklJTZFmzVv6/Vg5vVkzSUyO3aCxf+9e0w8NvrqVW+WRI1JeXmb6FcixsNkRaXY0QdISk2R12vGzv9516HmHkqpFl0RkVifJEamWVEmQExKbyolNat/pIj01VU7MyAqkOa49R7fY0yU2emz8Zq20a9/BLHnQH752frcuv77Ot2nfXpKSks0e1k2aHHsmQe9j/bPUJk1qLZ6e3sxzv9d3jXB9PdQlTuG6PvUggEDjEGj0gXfZsmUycuRIyc/Pl4zvPmqfMmWKTJgwQebPn++5C954443j7og//OEPsn79+sZxp8RIL52P/70/zveeVXaaGfXAHCM+0WrGprQqaVYlsjOpSvYlV5uXoZQmiuiDfRwIxKrANXvSYrVptAuBegUGDb263nM4oXaBRh94nRneRYsWSVbWsZklfzO8r7/++nGK48aNI/C65G+Xs+RCu6P7o7Zt30FS09LkYEmJbN/2rZlxKz10UCordWlDgnlwzvc4WllZYxs1l9Ac1w3vreJ8v6izuslHq2VbSpVUJFRJeXW1NPnu1Xlbko9KSnWCCce1HfrwXUkyodmt905D94vA29AjwPVtBAi8NnoijT7w+lvDO3bsWNm+fTtreO3uLUojgAACCCCAAAIxIdDoA6+Owk033SQtW7Zkl4aYuCVpBAIIIIAAAgggEF4BAq8+GLJxo9mHd8uWLUZXtyXT5Qqpqal1asfLPrzhvWWoDQEEEEAAAQQQiC8BAq/XeOkyBt2PV38FchB4A1HiHAQQQAABBBBAoGEFCLwW/gReCzyKIoAAAggggAACURIg8FpAE3gt8CiKAAIIIIAAAghESYDAawFN4LXAoygCCCCAAAIIIBAlAQKvBTSB1wKPoggggAACCCCAQJQECLwW0AReCzyKIoAAAggggAACURIg8FpAE3gt8CiKAAIIIIAAAghESYDAawFN4LXAoygCCCCAAAIIIBAlAQKvBTSB1wKPoggggAACCCCAQJQECLwW0AReCzyKIoAAAggggAACURIg8FpAE3gt8CiKAAIIIIAAAghESYDAawFN4LXAoygCCCCAAAIIIBAlAQKvBbQGXg4EEEAgGIG+ffvKkiVLginCuQgggICsX78eBQsBAq8F3htvvCHV1dVyww03WNRCUbcIXHHFFfLPf/5TmjZt6pYu0Y8QBfLy8mTz5s1y9913h1gDxdwkMGrUKBk/frx06NDBTd2iLyEILFq0SD755BN57LHHQihNERsBAq+FHoHXAs+FRQm8LhzUELtE4A0RzqXFCLwuHdgQukXgDQEtTEUIvBaQBF4LPBcWJfC6cFBD7BKBN0Q4lxYj8Lp0YEPoFoE3BLQwFSHwWkASeC3wXFiUwOvCQQ2xSwTeEOFcWozA69KBDaFbBN4Q0MJUhMBrAUngtcBzYVECrwsHNcQuEXhDhHNpMQKvSwc2hG4ReENAC1MRAm+YIKkGAQQQQAABBBBAIDYFCLyxOS60CgEEEEAAAQQQQCBMAgTeMEFSDQIIIIAAAggggEBsChB4Y3NcaBUCCCCAAAIIIIBAmAQIvBaQBw4ckIqKCmndurVFLRSNVQF9qcjRo0clOTnZbxOLi4ulefPmfl80ceTIEdmzZ4+0b99eEhISjivPvROro+6/XSUlJVJeXm7G099R33jXda9UVVXJzp07pW3btrXea/Gl5e7W6veFvXv3ysGDB839kJaWdlyHbcab7w3uun/q+95Q13jzvSG89wKBNwTPQ4cOyS9+8QuZNWuWKd27d2956aWXJCsrK4TaKBKrAvqkvb4daf78+TWaWFhYKGPGjJFNmzaZPx8xYoR5a05KSop5895f/vIXefHFF83X9IehV155xdwjenDvxOpo+2+XBpfrr7/eM9Ynn3yy3H777TJ06FBToL7xrute0fKzZ8+W+++/39wX
eowbN06uvfba+EJqRK396quv5LbbbjM/zOrRrFkz+e1vfyvXXHON+b3NePO9IX5vJA21N954o5SWlor+uxHI94b6xpvvDeG/Hwi8IZhqgJk6daq8/fbbkp6ebsJPdna2PP744yHURpFYE9B/tG6++WbZsmWLeRWob+AdPXq0mdl96qmnpKioSIYPH24Cr4agZcuWyciRI829ceaZZ8qf/vQn+eCDD2Tu3LmSmJhowi/3TqyNeO3t2bFjh7z33nty1VVXmXAzefJk+dvf/iaLFy82M/v1jXdd90pZWZmcd955cu+998rPfvYz+fzzz+XOO+80/+3cuXP8IDWilmrg/eabb2TgwIHSokULmTBhgvm1atUqM9NrM958b4jPG0l/6P31r39tvk/06NHDE3jr+95Q13jzvSEy9wKBNwRXDTZDhgwxP+nrMWPGDLnnnnvMN0J/H1+HcAmKNKBAZWWl7Nq1Sz777DMTUL0Dr360nZuba0Jrnz59TCs17GrwnThxojz99NPy9ddfm2Ckhwam/v37m2+C+s2Qe6cBBzYMl9Yfgi6++GLzA43eB3WN9wknnFDnvaIzOLfeequsXr1aUlNTTesGDRpkwu8NN9wQhtZSRaQFpkyZIs8//7wsWLDAzO7V9b2hvvHme0OkRysy9ev3/X//+99y5ZVXyocffugJvDb/FsyZM4fvDREYLgJvCKj68fQTTzwhgwcPNqX1H6xhw4ZJfn6+ZGRkhFAjRWJRQL95PfnkkzUCb0FBgRn3hQsXSrt27UyzNdxOnz7dfKO77777JDMzUx599FFPl0455RSZNGmSDBgwwCxt4N6JxdEOrE3vvvuuPPzww7J06VKzXKWu8dZZ2rruFQ1LOls8c+ZMz8V1uUS3bt3koYceCqxBnNUgAl9++aW8//77Mm/ePDNW+tKZ+r431DfefG9okKG0uujHH39svtfr9379gUbH2FnSYPNvgU6i8b3Bamj8FibwBmmqH1+ceuqpngCjxZ1vdPqxdadOnYKskdNjVcBf4HU+pvL+4Ua/yenHmjoTrB9pdu/evUZg0X/IdG3m5Zdfzr0Tq4MdQLv0E5yf/OQnZrmLLkPQo67x1hleXd5S272inx589NFHnn8gtT79R1KXy/zxj38MoEWc0lACGmp0Vm/FihVmGYrOyNf3vaGu8dbvD/y70lCjGdp1dex13F9//XWzfE0/9fEOvDb/FugyOL43hDYudZUi8IZgqgFGZ/4uvfRSU5oZ3hAQ46BIXTO8+npI5yFF3xlenfn7/e9/7+mh7wwv904cDL5PE7/99lu57rrr5NxzzzVrt5OSkswZGlBrG29nhre2e6W+Gb/4U2p8LdaZXn2oUR9g1geXdEY/1PHm35X4un/Gjh1rPunTT+700KVsupZbfyi+++675Xe/+12t3xucT/tq+7eAGd7I3AsE3hBcWWsVAlocFvEXeP2t4dVvfNu3b/es4V27dq289tprpses4Y3Dgfdp8vr162XUqFHmQSVdr+29TZ2u06ttvP2t4fW+V/yt6dT1wTfddBNreOPkttFdPM4//3yzpl8fXPZdwxvMePPvSpwM+nfN1E90NeQ6hz7QqL+cv7/6qV+o/xb4W8PL9wb7+4PAG4Kh99OV+uT2Lbfcwi4NITjGahFdtqL7K+tP2botmc7e6A4LTtDRb2gtW7asc5cGnb3Tj7mee+4589Gnv10auHdi9Q74vl36D5auz9RfOpur94EeujuLzuw6H2PXNt513Sv6kFOvXr3kkUceYZeG2L8VTAunTZtmntM4++yzzb2g3x90eYMuZ9JdG2zGm39X4uQmqKWZvksa6vveUNd4870hMvcCgTcEV91wXPfO1J/C9DjjjDPk5ZdfrnVT+hAuQZEGFNAZPd2Fw/vQ2Rf9x02PjRs3mq3o9Il9PXRbMl2Dp0/aa1h+4YUXzJpePTTU6myvs6MD904DDmwIl9ZZfg26vodzP9Q33nXdK1qn/jClD6o5h84I
/vSnPw2hpRSJhoD+YKMfVTuHbluoS1x0lre+7w31jTffG6IxgpG7hm/gre97Q33jzfeG8I8VgdfCVD/e1nVbvHDCAjGOi+oyBn3ASH/5HvpWrt27d0vHjh09s4Le53DvxPHA+2l6feNd172ib/PTbe101w9nezJ36birN862hdorHTNn1t+7lzbjzfcGd90v9X1vqGu8+d4Q3nuBwBteT2pDAAEEEEAAAQQQiDEBAm+MDQjNQQABBBBAAAEEEAivAIE3vJ7UhgACCCCAAAIIIBBjAgTeGBsQmoMAAggggAACCCAQXgECb3g9qQ0BBBBAAAEEEEAgxgQIvDE2IDQHAQQQQAABBBBAILwCBN7welIbAggggAACCCCAQIwJEHhjbEBoDgIIIIAAAggggEB4BQi84fWkNgQQQAABBBBAAIEYEyDwxtiA0BwEEEAAAQQQQACB8AoQeMPrSW0IIIAAAggggAACMSZA4I2xAaE5CCCAAAIIIIAAAuEVIPCG15PaEEAAAQQQQAABBGJMgMAbYwNCcxBAAAEEEEAAAQTCK0DgDa8ntSGAAAIIIIAAAgjEmACBN8YGhOYgEE2B2bNny9GjR+X000+Xjh07ei69efNm2bBhgwwYMCCazfFc66uvvpL33ntPFi9eLEOGDJH7778/LO1YtmyZbNmyRYYOHRqW+nwrmTFjhrRs2VL69esXkfqjWenbb78t8+fPl5deeimily0tLRV1O+OMM+TUU0+N6LWoHAEEGq8Agbfxjj09R0BOOeUUo/CjH/1I/va3v3lE3njjDfnDH/4g69evj7rSoUOHpHfv3iY0XnjhhZKZmSnDhg0LSzt++9vfytSpUz39+vOf/yxvvvmmLF26NCz1X3DBBdKjRw955ZVXwlJfQ1by/PPPmx86NPSG6/DnvW3bNjPOjzzyiNx8883huhT1IIAAAjUECLzcEAg0YgENvCeffLIUFBTI//7v/8o555xjNBoy8H766ady5513yhdffGHCbjgPnU2sqKiQjIwMU+2LL75o+h2uwLt//35JTEyU5s2bh7PZDVJXJAKvP++qqiopKSmR9PR0SUtLa5C+clEEEHC/AIHX/WNMDxGoVUADr86sTZs2TZo2bSr//Oc/JSEhwW/gzcvLk7/+9a+ybt06ycnJkZ///OchLQ0oLi6WJ554QhYuXCiHDx+WgQMHyq9//Wtp27atWcLw0EMPyfbt26VPnz6m3Xput27djuuDLsXQj911FnLjxo3StWtX+fGPf2zC8ltvvSVLliyRe+65x/RFl2fcd9995r96XQ1ec+fONdfas2eP51o6k3zdddfJgQMH5LnnnpPPPvvMtKVv377G6bTTTjPt0P8/6aSTzEfw6rJz506ZMGGCPPvss3LCCSfIbbfdZs6rq6/6daedOput7fz2229l5MiRMnr0aGnfvn2t46Z9f/3118146Q8rOh7a78GDB4uG+ltuucXMimtdzrFp0yb51a9+JQ8++KCcffbZxmbVqlVmiUfr1q1FZ6f1ax06dDBFfAOv+mm5n/70p546x40bJ61atZK7777b/FldddbmfdVVV8mNN94od911l2mDHtquJ5980vwg0rlzZ7nyyivljjvukNTUVI9/mzZtRMPyv/71L0lJSTHt+tnPfuY5R5fF6DjrMhYN0meeeaYx0k8POBBAoPEJEHgb35jTYwQ8Ahp4f//730uXLl1kzJgx5qP4iy+++LjA+8EHH8gDDzxgAomGKl1zqR91ayi6/PLLAxbV2VVdk6tBUEOZHrqUIisrSz766CPRj7cff/xx+fzzz2Xs2LHm65deeqlouPE9nn76aRPAtb16ztq1a+W1114zyxWeeeYZmTRpkimSm5trwuP1119vwq7zMb0GRb2W9sO5lgbaXr16yYgRI2Tfvn0mQGkY/Mc//mFCtZ7bokULE/S//vprU79ePykpydSlPwSoqf5/fX3VkOa0U0PmtddeK8nJyTJ+/HgTmDV81nZo
OZ2Z1pCn7VW7Dz/8UN555x0T6DQcrly50oR6nXHWQ8O4hmr9oUJ/uNFzzjrrLBMoNfS/8MILJtBPnjzZb+B1nPWHBOcYNWqUtGvXzvxwoEddddbm3b17d9MH7be66vrxSy65RE488US56aabjLMGe/1BRJfZ6OH46w9FOvZaRpem6L2ky3O0P+eee675pUYHDx6Ujz/+2HyC4fwwEvBNy4kIIOAKAQKvK4aRTiAQmoATeDXY6WygfiT/73//28w8eq/hHTRokPnIWWcznUODrs7Qzpw5M+CLa1DWWUANo84DcRpuNYTo+k4NL/o1fVBKZ+hqOzQwn3/++TVCkJ67Y8cOE26dQDhlyhTRQOUcvrOW/j5inzVrltx+++2e8KhldVZb+6uzuDqLrIFLZxv1BwQNxM6hQdkJvIH0Vdv57rvvypw5c0wI1UPDsj5MWJvr7t27zYzzL3/5S7n11ltNmcrKShPs9fq/+c1vTDDXWWINuHquhu/+/fubduvstPehY7h37175+9//bn5g0B8cNMD7WgUSeJ16a6vTn7fOSHsH3scee8yE1/z8fM/Sk6eeekpeffVVWbBggRlf7YcGdb1n9BMJPfQHMe3ro48+au4dtdDr6Z87R1lZmcc54JuWExFAwBUCBF5XDCOdQCA0Ae/A+5///MfMMupMmwZfJ/BqeOnZs+dxs47O7OTq1as9HyPX1woNKBpC9Fq6m4Eeun5Tw5oGYf1oPJDAq8sVNKRPnDjRzAb6Hto2DZwapr2PQAKvhto//elP5uEz59AlBBp6nQerNHBpSHNmHP0F3kD66q+dGjp1GUdtDwzq2madtdTAp7PNzqEzoRpKNYRrey+66CIzw6nj6fxQoSZUlwBDAAAG6ElEQVS6ZlsP/X/tq/bL+3DGM5TAW1+dgQRencnVe05n4p1DZ6r1EwgNwtonf/46u66HzvofOXLEfBqhM736w5rOZF922WXSqVOn+m5Rvo4AAi4VIPC6dGDpFgKBCHgHXj1fZ1o1ON1www2iSwY0dOnHwRoYdEnD//t//89T7V/+8hfzMfiKFSsCnjXT8KWBTNdoOg8olZeXmy2pdH2lrhMNJPA6Acj7QTvv/toEXqeNOqPoe+i6XV3+EUjgDaSv/tqpa3N1bWxtgXfevHlmOYizFMW7jfqQn65V1UOdtQ0akB9++GHzg4WuedbDmQEePny4mSXXPumMsu5iUVfg1fCodTmH95KGQOoMJPBqm5o1a2Zmp53DCfm63ELXO/vz11l5DfoaePXQ/qqlrgN2Hkqs7QekQP6ucA4CCMS3AIE3vseP1iNgJeAbePXj7CuuuMJ8TK+zY07o0lk1PVdn2JxDg5Kua3XChAZj/Xhay+paVH+Hfnyvgck7qGp5DU76kNLVV18dUOAtLCw0D7v5rnXVwKMfxwcaeDUUvvzyyzWWT0yfPt08zKbrYp1t25y+VFdXm4/QAwm8gfQ1lMDrrHHV2WUdA+/DaZ/+mS7v0GUMuuxBf4jQWWud5dTDCeP6w42uJdZDH1zUhwdrC7y6REAfHtTlBd73gO7frGt4A6nTn7fvkgZdu6xLZ7x/kHJmy/UhQn04sb7A69wHTjt1Pba2Pzs723wqwIEAAo1PgMDb+MacHiPgEfANvPoFJ3Do/zuB15kt1IeSdA2rzgbqx+F6rvMQkM5K6oyaPjxV2wsEdKmEPlSkoeXee+814VGDmAZYnbnUZQ6BzPBq23SWU5dG6H91nabuwKDBSNcgBxp4nbWeGrb15RvaHt1lQetr0uT/t3fHOJXkUBhGO5xFsFVC1kmESFjE6ENy6/UTzZTeneBKnJI6Qbiwjzv4y3Xt+uf3aQyvr6+/CsIFrUoorgTeK2N9JPA29uahOXh5efksB/n4+PjcoFb/bze7VSZSmUErpj1YnFX1aoQLwgXcNnK14t489JDzt8B7wmrhtnrjPPrXA1I/u3LPr7wrzbit4T2lNc1Bdcg9hFVe0fxU7tH1X4G3Eo4e
qmpfSO+Eit5aVBbRSRUuAgR+noDA+/Pm3IgJ/BF42+TTCuu5zgribeCtJrKVvQLtudpB38apc1RU4atXzt8F3toWeqrV7bivrk4oKKie46J6JV2Y/m7TWu0Kea1yFujO1Sv3Nry12tgK7X0Nb6GulczzMYVWAgtAZzPeKasoPHdyQ7XC56qmN4M2wfXavbrm72p4r4z1q35eOQO51/WdunBKFPpbraxX5nBWcftZp1I0T/cr4W1yaxW70zdO28pW2rB3Au+91dvb26/n5+ffdln3QHTC6pV7fuVd3yrDOKc01J9Om7jdXFcZQ/bnqLav/HsI6P4F8+qSK4/pZIgzvlOOUfh3ESDw8wQE3p8350ZM4GGBdrm/v79/fob4nCrwyM169X4beM9O+0fu1QanzsHtaLNWZR+5+rpbr9Y7C/i2L9UXdyJEtbGPfkzi/xzr/dgKmY29cd+eFnHVoFf9heenp6fPUpArV6US/b3z8Y77Nlfu+Tfv23s1tkJ2q/6PfoCk85TrT+Ob/B+74uJ3CBDYLSDw7p4fvSNAgAABAgQIEBgKCLxDQM0JECBAgAABAgR2Cwi8u+dH7wgQIECAAAECBIYCAu8QUHMCBAgQIECAAIHdAgLv7vnROwIECBAgQIAAgaGAwDsE1JwAAQIECBAgQGC3gMC7e370jgABAgQIECBAYCgg8A4BNSdAgAABAgQIENgtIPDunh+9I0CAAAECBAgQGAoIvENAzQkQIECAAAECBHYLCLy750fvCBAgQIAAAQIEhgIC7xBQcwIECBAgQIAAgd0CAu/u+dE7AgQIECBAgACBoYDAOwTUnAABAgQIECBAYLeAwLt7fvSOAAECBAgQIEBgKCDwDgE1J0CAAAECBAgQ2C0g8O6eH70jQIAAAQIECBAYCgi8Q0DNCRAgQIAAAQIEdgsIvLvnR+8IECBAgAABAgSGAgLvEFBzAgQIECBAgACB3QIC7+750TsCBAgQIECAAIGhgMA7BNScAAECBAgQIEBgt4DAu3t+9I4AAQIECBAgQGAoIPAOATUnQIAAAQIECBDYLSDw7p4fvSNAgAABAgQIEBgKCLxDQM0JECBAgAABAgR2Cwi8u+dH7wgQIECAAAECBIYCAu8QUHMCBAgQIECAAIHdAgLv7vnROwIECBAgQIAAgaGAwDsE1JwAAQIECBAgQGC3gMC7e370jgABAgQIECBAYCgg8A4BNSdAgAABAgQIENgt8C+a8hf1DkIipAAAAABJRU5ErkJggg=="
},
"metadata": {},
"output_type": "display_data"
@@ -565,7 +565,7 @@
{
"data": {
"text/plain": [
- "array([ 0., 0., -0., 0., -0.])"
+ "array([-0., 0., 0., 0., -0.])"
]
},
"execution_count": 20,
@@ -575,7 +575,7 @@
],
"source": [
"res = om.minimize(\n",
- " criterion=general_sphere,\n",
+ " fun=general_sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"pounders\",\n",
")\n",
@@ -598,7 +598,7 @@
"outputs": [],
"source": [
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" logging=\"my_log.db\",\n",
@@ -670,12 +670,12 @@
],
"source": [
"algo_options = {\n",
- " \"convergence.relative_criterion_tolerance\": 1e-9,\n",
- " \"stopping.max_iterations\": 100_000,\n",
+ " \"convergence.ftol_rel\": 1e-9,\n",
+ " \"stopping.maxiter\": 100_000,\n",
"}\n",
"\n",
"res = om.minimize(\n",
- " criterion=sphere,\n",
+ " fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
" algo_options=algo_options,\n",
@@ -700,7 +700,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.10.8"
+ "version": "3.10.14"
},
"vscode": {
"interpreter": {
diff --git a/pyproject.toml b/pyproject.toml
index 357cf0460..4d211ac7e 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -161,6 +161,7 @@ module = [
"optimagic.optimization.process_multistart_sample",
"optimagic.optimization.process_results",
"optimagic.optimization.tiktak",
+ "optimagic.optimization.scipy_aliases",
"optimagic.optimizers._pounders",
"optimagic.optimizers._pounders.pounders_auxiliary",
@@ -223,6 +224,7 @@ module = [
"optimagic.decorators",
"optimagic.exceptions",
"optimagic.utilities",
+ "optimagic.deprecations",
"estimagic",
"estimagic.examples",
diff --git a/src/estimagic/estimate_ml.py b/src/estimagic/estimate_ml.py
index 714ca76b0..81fde607f 100644
--- a/src/estimagic/estimate_ml.py
+++ b/src/estimagic/estimate_ml.py
@@ -1,7 +1,7 @@
import warnings
from dataclasses import dataclass, field
from functools import cached_property
-from typing import Any, Dict, Union
+from typing import Any, Dict
import numpy as np
import pandas as pd
@@ -166,8 +166,8 @@ def estimate_ml(
opt_res = None
else:
opt_res = maximize(
- criterion=loglike,
- criterion_kwargs=loglike_kwargs,
+ fun=loglike,
+ fun_kwargs=loglike_kwargs,
params=params,
lower_bounds=lower_bounds,
upper_bounds=upper_bounds,
@@ -359,14 +359,14 @@ class LikelihoodResult:
_free_estimates: FreeParams
_converter: Converter
_has_constraints: bool
- _optimize_result: Union[OptimizeResult, None] = None
+ _optimize_result: OptimizeResult | None = None
_jacobian: Any = None
- _no_jacobian_reason: Union[str, None] = None
+ _no_jacobian_reason: str | None = None
_hessian: Any = None
- _no_hessian_reason: Union[str, None] = None
- _internal_jacobian: Union[np.ndarray, None] = None
- _internal_hessian: Union[np.ndarray, None] = None
- _design_info: Union[pd.DataFrame, None] = None
+ _no_hessian_reason: str | None = None
+ _internal_jacobian: np.ndarray | None = None
+ _internal_hessian: np.ndarray | None = None
+ _design_info: pd.DataFrame | None = None
_cache: Dict = field(default_factory=dict)
def __post_init__(self):
diff --git a/src/estimagic/estimate_msm.py b/src/estimagic/estimate_msm.py
index 1c807118c..562dc6c47 100644
--- a/src/estimagic/estimate_msm.py
+++ b/src/estimagic/estimate_msm.py
@@ -372,9 +372,9 @@ def get_msm_optimization_functions(
jacobian_kwargs (dict): Additional keyword arguments for jacobian.
Returns:
- dict: Dictionary containing at least the entry "criterion". If enough inputs
- are provided it also contains the entries "derivative" and
- "criterion_and_derivative". All values are functions that take params
+ dict: Dictionary containing at least the entry "fun". If enough inputs
+ are provided it also contains the entries "jac" and
+ "fun_and_jac". All values are functions that take params
as only argument.
"""
@@ -400,7 +400,7 @@ def get_msm_optimization_functions(
registry=registry,
)
- out = {"criterion": criterion}
+ out = {"fun": criterion}
if _jacobian is not None:
raise NotImplementedError(
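Per the updated docstring of `get_msm_optimization_functions`, the returned dictionary now uses the keys `fun`, `jac` and `fun_and_jac`. A sketch of that output shape (an illustrative helper, not the real estimagic code):

```python
def make_optimization_functions(fun, jac=None):
    # dictionary shape described in the docstring above: always contains
    # "fun"; "jac" and "fun_and_jac" only when a derivative is available
    out = {"fun": fun}
    if jac is not None:
        out["jac"] = jac
        out["fun_and_jac"] = lambda params: (fun(params), jac(params))
    return out

funcs = make_optimization_functions(lambda p: p**2, jac=lambda p: 2 * p)
```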
diff --git a/src/optimagic/benchmarking/cartis_roberts.py b/src/optimagic/benchmarking/cartis_roberts.py
index d54102c90..6a827ba05 100644
--- a/src/optimagic/benchmarking/cartis_roberts.py
+++ b/src/optimagic/benchmarking/cartis_roberts.py
@@ -4849,7 +4849,7 @@ def get_start_points_methanl8():
CARTIS_ROBERTS_PROBLEMS = {
"arglale": {
# arglale is the same as linear_full_rank with specific settings
- "criterion": partial(linear_full_rank, dim_out=400),
+ "fun": partial(linear_full_rank, dim_out=400),
"start_x": [1] * 100,
"solution_x": [-0.99999952] * 100,
"start_criterion": 700,
@@ -4857,98 +4857,98 @@ def get_start_points_methanl8():
},
"arglble": {
# arglble is the same as linear_rank_one with specific settings
- "criterion": partial(linear_rank_one, dim_out=400),
+ "fun": partial(linear_rank_one, dim_out=400),
"start_x": [1] * 100,
"solution_x": solution_x_arglble,
"start_criterion": 5.460944e14,
"solution_criterion": 99.62547,
},
"argtrig": {
- "criterion": argtrig,
+ "fun": argtrig,
"start_x": [1 / 100] * 100,
"solution_x": [0] * 100,
"start_criterion": 32.99641,
"solution_criterion": 0,
},
"artif": {
- "criterion": artif,
+ "fun": artif,
"start_x": [1] * 100,
"solution_x": None,
"start_criterion": 36.59115,
"solution_criterion": 0,
},
"arwhdne": {
- "criterion": arwhdne,
+ "fun": arwhdne,
"start_x": [1] * 100,
"solution_x": [0.706011] * 99 + [0],
"start_criterion": 495,
"solution_criterion": 27.66203,
},
"bdvalues": {
- "criterion": bdvalues,
+ "fun": bdvalues,
"start_x": get_start_points_bdvalues(100, 1000),
"solution_x": solution_x_bdvalues,
"start_criterion": 1.943417e7,
"solution_criterion": 0,
},
"bratu_2d": {
- "criterion": partial(bratu_2d, alpha=4),
+ "fun": partial(bratu_2d, alpha=4),
"start_x": [0] * 64,
"solution_x": solution_x_bratu_2d,
"start_criterion": 0.1560738,
"solution_criterion": 0,
},
"bratu_2d_t": {
- "criterion": partial(bratu_2d, alpha=6.80812),
+ "fun": partial(bratu_2d, alpha=6.80812),
"start_x": [0] * 64,
"solution_x": solution_x_bratu_2d_t,
"start_criterion": 0.4521311,
"solution_criterion": 1.8534736e-05,
},
"bratu_3d": {
- "criterion": partial(bratu_3d, alpha=6.80812),
+ "fun": partial(bratu_3d, alpha=6.80812),
"start_x": [0] * 27,
"solution_x": solution_x_bratu_3d,
"start_criterion": 4.888529,
"solution_criterion": 0,
},
"brownale": {
- "criterion": brown_almost_linear,
+ "fun": brown_almost_linear,
"start_x": [0.5] * 100,
"solution_x": [1] * 100,
"start_criterion": 2.524757e5,
"solution_criterion": 0,
},
"broydn_3d": {
- "criterion": broydn_3d,
+ "fun": broydn_3d,
"start_x": [-1] * 100,
"solution_x": solution_x_broydn_3d,
"start_criterion": 111,
"solution_criterion": 0,
},
"cbratu_2d": {
- "criterion": cbratu_2d,
+ "fun": cbratu_2d,
"start_x": [0] * (2 * 5 * 5),
"solution_x": solution_x_cbratu_2d,
"start_criterion": 0.4822531,
"solution_criterion": 0,
},
"broydn_bd": {
- "criterion": broydn_bd,
+ "fun": broydn_bd,
"start_x": [1] * 100,
"solution_x": solution_x_broydn_bd,
"start_criterion": 2404,
"solution_criterion": 0,
},
"chandheq": {
- "criterion": chandheq,
+ "fun": chandheq,
"start_x": (np.arange(1, 101) / 100).tolist(),
"solution_x": None,
"start_criterion": 6.923365,
"solution_criterion": 0,
},
"chemrcta": {
- "criterion": chemrcta,
+ "fun": chemrcta,
"start_x": [1] * 100,
"solution_x": None,
"start_criterion": 3.0935,
@@ -4956,7 +4956,7 @@ def get_start_points_methanl8():
"lower_bounds": np.concatenate([np.zeros(50), 1e-6 * np.ones(50)]),
},
"chemrctb": {
- "criterion": chemrctb,
+ "fun": chemrctb,
"start_x": [1] * 100,
"solution_x": solution_x_chemrctb,
"start_criterion": 1.446513,
@@ -4964,35 +4964,35 @@ def get_start_points_methanl8():
"lower_bounds": 1e-6 * np.ones(100),
},
"chnrsbne": {
- "criterion": chnrsbne,
+ "fun": chnrsbne,
"start_x": [-1] * 50,
"solution_x": [1] * 50,
"start_criterion": 7635.84,
"solution_criterion": 0,
},
"drcavty1": {
- "criterion": partial(drcavty, r=500),
+ "fun": partial(drcavty, r=500),
"start_x": [0] * 100,
"solution_x": None,
"start_criterion": 0.4513889,
"solution_criterion": 0,
},
"drcavty2": {
- "criterion": partial(drcavty, r=1000),
+ "fun": partial(drcavty, r=1000),
"start_x": [0] * 100,
"solution_x": solution_x_drcavty2,
"start_criterion": 0.4513889,
"solution_criterion": 3.988378e-4,
},
"drcavty3": {
- "criterion": partial(drcavty, r=4500),
+ "fun": partial(drcavty, r=4500),
"start_x": [0] * 100,
"solution_x": solution_x_drcavty3,
"start_criterion": 0.4513889,
"solution_criterion": 0,
},
"eigena": {
- "criterion": partial(eigen, param=np.diag(np.arange(1, 11))),
+ "fun": partial(eigen, param=np.diag(np.arange(1, 11))),
"start_x": [1] * 10 + np.eye(10).flatten().tolist(),
"solution_x": [*np.arange(1, 11).tolist(), 1] + ([0] * 10 + [1]) * 9,
"start_criterion": 285,
@@ -5000,7 +5000,7 @@ def get_start_points_methanl8():
"lower_bounds": np.zeros(110),
},
"eigenb": {
- "criterion": partial(
+ "fun": partial(
eigen, param=np.diag(2 * np.ones(10)) + np.diag(-np.ones(9), k=1)
),
"start_x": [1] * 10 + np.eye(10).flatten().tolist(),
@@ -5011,7 +5011,7 @@ def get_start_points_methanl8():
# according to table 3 in their paper, the minimum is at 0.
},
"flosp2hh": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([1, 0, -1], dtype=np.int64),
b=np.array([1, 0, -1], dtype=np.int64),
@@ -5023,7 +5023,7 @@ def get_start_points_methanl8():
"solution_criterion": 1 / 3,
},
"flosp2hl": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([1, 0, -1], dtype=np.float64),
b=np.array([1, 0, -1], dtype=np.float64),
@@ -5035,7 +5035,7 @@ def get_start_points_methanl8():
"solution_criterion": 1 / 3,
},
"flosp2hm": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([1, 0, -1], dtype=np.float64),
b=np.array([1, 0, -1], dtype=np.float64),
@@ -5047,7 +5047,7 @@ def get_start_points_methanl8():
"solution_criterion": 1 / 3,
},
"flosp2th": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([0, 1, 0], dtype=np.float64),
b=np.array([0, 1, 1], dtype=np.float64),
@@ -5059,7 +5059,7 @@ def get_start_points_methanl8():
"solution_criterion": 0,
},
"flosp2tl": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([0, 1, 0], dtype=np.float64),
b=np.array([0, 1, 1], dtype=np.float64),
@@ -5071,7 +5071,7 @@ def get_start_points_methanl8():
"solution_criterion": 0,
},
"flosp2tm": {
- "criterion": partial(
+ "fun": partial(
flosp2,
a=np.array([0, 1, 0], dtype=np.float64),
b=np.array([0, 1, 1], dtype=np.float64),
@@ -5083,49 +5083,49 @@ def get_start_points_methanl8():
"solution_criterion": 0,
},
"freurone": {
- "criterion": freurone,
+ "fun": freurone,
"start_x": [0.5, -2] + [0] * 98,
"solution_x": solution_x_freurone,
"start_criterion": 9.95565e4,
"solution_criterion": 1.196458e4,
},
"hatfldg": {
- "criterion": hatfldg,
+ "fun": hatfldg,
"start_x": [1] * 25,
"solution_x": [0] * 11 + [-1, 1] + [0] * 12,
"start_criterion": 27,
"solution_criterion": 0,
},
"hydcar20": {
- "criterion": partial(hydcar, n=20, m=3, k=9),
+ "fun": partial(hydcar, n=20, m=3, k=9),
"start_x": get_start_points_hydcar20(),
"solution_x": solution_x_hydcar20,
"start_criterion": 1341.663,
"solution_criterion": 0,
},
"hydcar6": {
- "criterion": partial(hydcar, n=6, m=3, k=2),
+ "fun": partial(hydcar, n=6, m=3, k=2),
"start_x": get_start_points_hydcar6(),
"solution_x": solution_x_hydcar6,
"start_criterion": 704.1073,
"solution_criterion": 0,
},
"integreq": {
- "criterion": integreq,
+ "fun": integreq,
"start_x": (np.arange(1, 101) / 101 * (np.arange(1, 101) / 101 - 1)).tolist(),
"solution_x": solution_x_integreq,
"start_criterion": 0.5730503,
"solution_criterion": 0,
},
"luksan11": {
- "criterion": luksan11,
+ "fun": luksan11,
"start_x": [-0.8] * 100,
"solution_x": [1] * 100,
"start_criterion": 626.0640,
"solution_criterion": 0,
},
"luksan12": {
- "criterion": luksan12,
+ "fun": luksan12,
"start_x": [-1] * 98,
"solution_x": None,
"start_criterion": 3.2160e4,
@@ -5136,7 +5136,7 @@ def get_start_points_methanl8():
# That is why we disable this test function for the time being.
},
"luksan13": {
- "criterion": luksan13,
+ "fun": luksan13,
"start_x": [-1] * 98,
"solution_x": solution_x_luksan13,
"start_criterion": 6.4352e4,
@@ -5145,112 +5145,112 @@ def get_start_points_methanl8():
# according to table 3 in their paper, the minimum is at 25188.86
},
"luksan14": {
- "criterion": luksan14,
+ "fun": luksan14,
"start_x": [-1] * 98,
"solution_x": solution_x_luksan14,
"start_criterion": 2.6880e4,
"solution_criterion": 123.9235,
},
"luksan15": {
- "criterion": luksan15,
+ "fun": luksan15,
"start_x": [-0.8, 1.2, -1.2, 0.8] * 25,
"solution_x": solution_x_luksan15,
"start_criterion": 2.701585e4,
"solution_criterion": 3.569697,
},
"luksan16": {
- "criterion": luksan16,
+ "fun": luksan16,
"start_x": [-0.8, 1.2, -1.2, 0.8] * 25,
"solution_x": solution_x_luksan16,
"start_criterion": 1.306848e4,
"solution_criterion": 3.569697,
},
"luksan17": {
- "criterion": luksan17,
+ "fun": luksan17,
"start_x": [-0.8, 1.2, -1.2, 0.8] * 25,
"solution_x": None, # multiple argmins
"start_criterion": 1.687370e6,
"solution_criterion": 0.4931613,
},
"luksan21": {
- "criterion": luksan21,
+ "fun": luksan21,
"start_x": [ih * (ih - 1) for ih in np.arange(1, 101) * (1 / 101)],
"solution_x": solution_x_luksan21,
"start_criterion": 99.98751,
"solution_criterion": 0,
},
"luksan22": {
- "criterion": luksan22,
+ "fun": luksan22,
"start_x": [-1.2 if i % 2 == 0 else 1 for i in range(100)],
"solution_x": solution_x_luksan22,
"start_criterion": 2.487686e4,
"solution_criterion": 872.9230,
},
"methanb8": {
- "criterion": methane,
+ "fun": methane,
"start_x": get_start_points_methanb8(),
"solution_x": solution_x_methane,
"start_criterion": 1.043105,
"solution_criterion": 0,
},
"methanl8": {
- "criterion": methane,
+ "fun": methane,
"start_x": get_start_points_methanl8(),
"solution_x": solution_x_methane,
"start_criterion": 4345.100,
"solution_criterion": 0,
},
"morebvne": {
- "criterion": morebvne,
+ "fun": morebvne,
"start_x": [t * (t - 1) for t in np.arange(1, 101) * (1 / 101)],
"solution_x": solution_x_morebvne,
"start_criterion": 3.633100e-4,
"solution_criterion": 0,
},
"msqrta": {
- "criterion": msqrta,
+ "fun": msqrta,
"start_x": get_start_points_msqrta(10),
"solution_x": solution_x_msqrta,
"start_criterion": 212.7162,
"solution_criterion": 0,
},
"msqrtb": {
- "criterion": msqrta,
+ "fun": msqrta,
"start_x": get_start_points_msqrta(10, flag=2),
"solution_x": solution_x_msqrtb,
"start_criterion": 205.0753,
"solution_criterion": 0,
},
"oscigrne": {
- "criterion": oscigrne,
+ "fun": oscigrne,
"start_x": [-2] + [1] * 99,
"solution_x": solution_x_oscigrne,
"start_criterion": 6.120720e8,
"solution_criterion": 0,
},
"penalty_1": {
- "criterion": penalty_1,
+ "fun": penalty_1,
"start_x": list(range(1, 101)),
"solution_x": None,
"start_criterion": 1.144806e11,
"solution_criterion": 9.025000e-9,
},
"penalty_2": {
- "criterion": penalty_2,
+ "fun": penalty_2,
"start_x": [0.5] * 100,
"solution_x": solution_x_penalty2,
"start_criterion": 1.591383e6,
"solution_criterion": 0.9809377,
},
"powellse": {
- "criterion": powell_singular,
+ "fun": powell_singular,
"start_x": [3.0, -1.0, 0.0, 1] * 25,
"solution_x": [0] * 100,
"start_criterion": 41875,
"solution_criterion": 0,
},
"qr3d": {
- "criterion": partial(qr3d, m=5),
+ "fun": partial(qr3d, m=5),
"start_x": get_start_points_qr3d(5),
"solution_x": solution_x_qr3d,
"start_criterion": 1.2,
@@ -5259,7 +5259,7 @@ def get_start_points_methanl8():
+ [0 if i == j else -np.inf for i in range(5) for j in range(5)],
},
"qr3dbd": {
- "criterion": partial(qr3dbd, m=5),
+ "fun": partial(qr3dbd, m=5),
"start_x": get_start_points_qr3dbd(5),
"solution_x": solution_x_qr3dbd,
"start_criterion": 1.2,
@@ -5268,21 +5268,21 @@ def get_start_points_methanl8():
+ [0 if i == j else -np.inf for i in range(5) for j in range(5)],
},
"spmsqrt": {
- "criterion": spmsqrt,
+ "fun": spmsqrt,
"start_x": get_start_points_spmsqrt(34),
"solution_x": solution_x_spmsqrt,
"start_criterion": 74.33542,
"solution_criterion": 0,
},
"semicn2u": {
- "criterion": semicon2,
+ "fun": semicon2,
"start_x": [0] * 100,
"solution_x": solution_x_semicon2,
"start_criterion": 2.025037e4,
"solution_criterion": 0,
},
"semicon2": {
- "criterion": semicon2,
+ "fun": semicon2,
"start_x": [0] * 100,
"solution_x": solution_x_semicon2,
"start_criterion": 2.025037e4,
@@ -5291,28 +5291,28 @@ def get_start_points_methanl8():
"upper_bounds": 0.2 * 700 * np.ones(100),
},
"vardimne": {
- "criterion": vardimne,
+ "fun": vardimne,
"start_x": [1 - i / 100 for i in range(1, 101)],
"solution_x": [1] * 100,
"start_criterion": 1.310584e14,
"solution_criterion": 0,
},
"watsonne": {
- "criterion": watson,
+ "fun": watson,
"start_x": [0] * 31,
"solution_x": solution_x_watson,
"start_criterion": 30,
"solution_criterion": 0,
},
"yatpsq_1": {
- "criterion": partial(yatpsq_1, dim_in=10),
+ "fun": partial(yatpsq_1, dim_in=10),
"start_x": [6] * 100 + [0] * 20,
"solution_x": solution_x_yatpsq_1,
"start_criterion": 2.073643e6,
"solution_criterion": 0,
},
"yatpsq_2": {
- "criterion": partial(yatpsq_2, dim_in=10),
+ "fun": partial(yatpsq_2, dim_in=10),
"start_x": [10] * 100 + [0] * 20,
"solution_x": solution_x_yatpsq_2,
"start_criterion": 1.831687e5,
diff --git a/src/optimagic/benchmarking/get_benchmark_problems.py b/src/optimagic/benchmarking/get_benchmark_problems.py
index 713092484..5473e6204 100644
--- a/src/optimagic/benchmarking/get_benchmark_problems.py
+++ b/src/optimagic/benchmarking/get_benchmark_problems.py
@@ -116,7 +116,7 @@ def get_benchmark_problems(
problems[prob_name] = {
"inputs": inputs,
- "noise_free_criterion": specification["criterion"],
+ "noise_free_fun": specification["fun"],
"solution": _create_problem_solution(
specification, scaling_options=scaling_options
),
@@ -192,9 +192,9 @@ def _get_raw_problems(name):
raw_problems[k] = v
if k in subset_add_steps:
problem = v.copy()
- raw_func = problem["criterion"]
+ raw_func = problem["fun"]
- problem["criterion"] = partial(_step_func, raw_func=raw_func)
+ problem["fun"] = partial(_step_func, raw_func=raw_func)
raw_problems[f"{k}_with_steps"] = problem
for k, v in CARTIS_ROBERTS_PROBLEMS.items():
@@ -223,14 +223,14 @@ def _create_problem_inputs(
_criterion = partial(
_internal_criterion_template,
- criterion=specification["criterion"],
+ criterion=specification["fun"],
additive_options=additive_options,
multiplicative_options=multiplicative_options,
scaling_factor=scaling_factor,
rng=rng,
)
- inputs = {"criterion": _criterion, "params": _x}
+ inputs = {"fun": _criterion, "params": _x}
return inputs
diff --git a/src/optimagic/benchmarking/more_wild.py b/src/optimagic/benchmarking/more_wild.py
index 250c155b6..b842840d2 100644
--- a/src/optimagic/benchmarking/more_wild.py
+++ b/src/optimagic/benchmarking/more_wild.py
@@ -762,21 +762,21 @@ def get_start_points_mancino(n, a=1):
MORE_WILD_PROBLEMS = {
"linear_full_rank_good_start": {
- "criterion": partial(linear_full_rank, dim_out=45),
+ "fun": partial(linear_full_rank, dim_out=45),
"start_x": [1] * 9,
"solution_x": linear_full_rank_solution_x,
"start_criterion": 72,
"solution_criterion": 36,
},
"linear_full_rank_bad_start": {
- "criterion": partial(linear_full_rank, dim_out=45),
+ "fun": partial(linear_full_rank, dim_out=45),
"start_x": [10] * 9,
"solution_x": linear_full_rank_solution_x,
"start_criterion": 1125,
"solution_criterion": 36,
},
"linear_rank_one_good_start": {
- "criterion": partial(linear_rank_one, dim_out=35),
+ "fun": partial(linear_rank_one, dim_out=35),
"start_x": [1] * 7,
# no unique solution
"solution_x": None,
@@ -784,7 +784,7 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 8.380281690143324,
},
"linear_rank_one_bad_start": {
- "criterion": partial(linear_rank_one, dim_out=35),
+ "fun": partial(linear_rank_one, dim_out=35),
"start_x": [10] * 7,
# no unique solution
"solution_x": None,
@@ -792,7 +792,7 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 8.380282,
},
"linear_rank_one_zero_columns_rows_good_start": {
- "criterion": partial(linear_rank_one_zero_columns_rows, dim_out=35),
+ "fun": partial(linear_rank_one_zero_columns_rows, dim_out=35),
"start_x": [1] * 7,
# no unique solution
"solution_x": None,
@@ -800,7 +800,7 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 9.880597014926506,
},
"linear_rank_one_zero_columns_rows_bad_start": {
- "criterion": partial(linear_rank_one_zero_columns_rows, dim_out=35),
+ "fun": partial(linear_rank_one_zero_columns_rows, dim_out=35),
"start_x": [10] * 7,
# no unique solution
"solution_x": None,
@@ -808,77 +808,77 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 9.880597014926506,
},
"rosenbrock_good_start": {
- "criterion": rosenbrock,
+ "fun": rosenbrock,
"start_x": [-1.2, 1],
"solution_x": [1, 1],
"start_criterion": 24.2,
"solution_criterion": 0,
},
"rosenbrock_bad_start": {
- "criterion": rosenbrock,
+ "fun": rosenbrock,
"start_x": [-12, 10],
"solution_x": [1, 1],
"start_criterion": 1.795769e6,
"solution_criterion": 0,
},
"helical_valley_good_start": {
- "criterion": helical_valley,
+ "fun": helical_valley,
"start_x": [-1, 0, 0],
"solution_x": [1, 0, 0],
"start_criterion": 2500,
"solution_criterion": 0,
},
"helical_valley_bad_start": {
- "criterion": helical_valley,
+ "fun": helical_valley,
"start_x": [-10, 0, 0],
"solution_x": [1, 0, 0],
"start_criterion": 10600,
"solution_criterion": 0,
},
"powell_singular_good_start": {
- "criterion": powell_singular,
+ "fun": powell_singular,
"start_x": [3, -1, 0, 1],
"solution_x": [0] * 4,
"start_criterion": 215,
"solution_criterion": 0,
},
"powell_singular_bad_start": {
- "criterion": powell_singular,
+ "fun": powell_singular,
"start_x": [30, -10, 0, 10],
"solution_x": [0] * 4,
"start_criterion": 1.615400e6,
"solution_criterion": 0,
},
"freudenstein_roth_good_start": {
- "criterion": freudenstein_roth,
+ "fun": freudenstein_roth,
"start_x": [0.5, -2],
"solution_x": freudenstein_roth_solution_x,
"start_criterion": 400.5,
"solution_criterion": 48.984253679240013,
},
"freudenstein_roth_bad_start": {
- "criterion": freudenstein_roth,
+ "fun": freudenstein_roth,
"start_x": [5, -20],
"solution_x": freudenstein_roth_solution_x,
"start_criterion": 1.545754e8,
"solution_criterion": 48.984253679240013,
},
"bard_good_start": {
- "criterion": partial(bard, y=y_vec),
+ "fun": partial(bard, y=y_vec),
"start_x": [1] * 3,
"solution_x": bard_solution_x,
"start_criterion": 41.68170,
"solution_criterion": 0.00821487730657897,
},
"bard_bad_start": {
- "criterion": partial(bard, y=y_vec),
+ "fun": partial(bard, y=y_vec),
"start_x": [10] * 3,
"solution_x": bard_solution_x,
"start_criterion": 1306.234,
"solution_criterion": 0.00821487730657897,
},
"kowalik_osborne": {
- "criterion": partial(
+ "fun": partial(
kowalik_osborne,
y1=v_vec,
y2=y2_vec,
@@ -889,245 +889,245 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 0.00030750560384924,
},
"meyer": {
- "criterion": partial(meyer, y=y3_vec),
+ "fun": partial(meyer, y=y3_vec),
"start_x": [0.02, 4000, 250],
"solution_x": meyer_solution_x,
"start_criterion": 1.693608e9,
"solution_criterion": 87.945855170395831,
},
"watson_6_good_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [0.5] * 6,
"solution_x": watson_6_solution_x,
"start_criterion": 16.43083,
"solution_criterion": 0.00228767005355236,
},
"watson_6_bad_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [5] * 6,
"solution_x": watson_6_solution_x,
"start_criterion": 2.323367e6,
"solution_criterion": 0.00228767005355236,
},
"watson_9_good_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [0.5] * 9,
"solution_x": watson_9_solution_x,
"start_criterion": 26.90417,
"solution_criterion": 1.399760e-6,
},
"watson_9_bad_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [5] * 9,
"solution_x": watson_9_solution_x,
"start_criterion": 8.158877e6,
"solution_criterion": 1.399760e-6,
},
"watson_12_good_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [0.5] * 12,
"solution_x": watson_12_solution_x,
"start_criterion": 73.67821,
"solution_criterion": 4.722381e-10,
},
"watson_12_bad_start": {
- "criterion": watson,
+ "fun": watson,
"start_x": [5] * 12,
"solution_x": watson_12_solution_x,
"start_criterion": 2.059384e7,
"solution_criterion": 4.722381e-10,
},
"box_3d": {
- "criterion": partial(box_3d, dim_out=10),
+ "fun": partial(box_3d, dim_out=10),
"start_x": [0, 10, 20],
"solution_x": [1, 10, 1],
"start_criterion": 1031.154,
"solution_criterion": 0,
},
"jennrich_sampson": {
- "criterion": partial(jennrich_sampson, dim_out=10),
+ "fun": partial(jennrich_sampson, dim_out=10),
"start_x": [0.3, 0.4],
"solution_x": [0.2578252135686162] * 2,
"start_criterion": 4171.306,
"solution_criterion": 124.3621823556148,
},
"brown_dennis_good_start": {
- "criterion": partial(brown_dennis, dim_out=20),
+ "fun": partial(brown_dennis, dim_out=20),
"start_x": [25, 5, -5, -1],
"solution_x": brown_dennis_solution_x,
"start_criterion": 7.926693e6,
"solution_criterion": 85822.20162635,
},
"brown_dennis_bad_start": {
- "criterion": partial(brown_dennis, dim_out=20),
+ "fun": partial(brown_dennis, dim_out=20),
"start_x": [250, 50, -50, -10],
"solution_x": brown_dennis_solution_x,
"start_criterion": 3.081064e11,
"solution_criterion": 85822.20162635,
},
"chebyquad_6": {
- "criterion": partial(chebyquad, dim_out=6),
+ "fun": partial(chebyquad, dim_out=6),
"start_x": [i / 7 for i in range(1, 7)],
"solution_x": chebyquad_6_solution_x,
"start_criterion": 4.642817e-2,
"solution_criterion": 0,
},
"chebyquad_7": {
- "criterion": partial(chebyquad, dim_out=7),
+ "fun": partial(chebyquad, dim_out=7),
"start_x": [i / 8 for i in range(1, 8)],
"solution_x": chebyquad_7_solution_x,
"start_criterion": 3.377064e-2,
"solution_criterion": 0,
},
"chebyquad_8": {
- "criterion": partial(chebyquad, dim_out=8),
+ "fun": partial(chebyquad, dim_out=8),
"start_x": [i / 9 for i in range(1, 9)],
"solution_x": chebyquad_8_solution_x,
"start_criterion": 3.861770e-2,
"solution_criterion": 0.003516873725677,
},
"chebyquad_9": {
- "criterion": partial(chebyquad, dim_out=9),
+ "fun": partial(chebyquad, dim_out=9),
"start_x": [i / 10 for i in range(1, 10)],
"solution_x": chebyquad_9_solution_x,
"start_criterion": 2.888298e-2,
"solution_criterion": 0,
},
"chebyquad_10": {
- "criterion": partial(chebyquad, dim_out=10),
+ "fun": partial(chebyquad, dim_out=10),
"start_x": [i / 11 for i in range(1, 11)],
"solution_x": chebyquad_10_solution_x,
"start_criterion": 3.376327e-2,
"solution_criterion": 0.00477271369637536,
},
"chebyquad_11": {
- "criterion": partial(chebyquad, dim_out=11),
+ "fun": partial(chebyquad, dim_out=11),
"start_x": [i / 12 for i in range(1, 12)],
"solution_x": chebyquad_11_solution_x,
"start_criterion": 2.674060e-2,
"solution_criterion": 0.00279976155186576,
},
"brown_almost_linear": {
- "criterion": brown_almost_linear,
+ "fun": brown_almost_linear,
"start_x": [0.5] * 10,
"solution_x": [1] * 10,
"start_criterion": 273.2480,
"solution_criterion": 0,
},
"osborne_one": {
- "criterion": partial(osborne_one, y=y4_vec),
+ "fun": partial(osborne_one, y=y4_vec),
"start_x": [0.5, 1.5, 1, 0.01, 0.02],
"solution_x": osborne_one_solution_x,
"start_criterion": 16.17411,
"solution_criterion": 0.00005464894697483,
},
"osborne_two_good_start": {
- "criterion": partial(osborne_two, y=y5_vec),
+ "fun": partial(osborne_two, y=y5_vec),
"start_x": [1.3, 0.65, 0.65, 0.7, 0.6, 3, 5, 7, 2, 4.5, 5.5],
"solution_x": osborne_two_solution_x,
"start_criterion": 2.093420,
"solution_criterion": 0.0401377362935477,
},
"osborne_two_bad_start": {
- "criterion": partial(osborne_two, y=y5_vec),
+ "fun": partial(osborne_two, y=y5_vec),
"start_x": [13, 6.5, 6.5, 7, 6, 30, 50, 70, 20, 45, 55],
"solution_x": osborne_two_solution_x,
"start_criterion": 199.6847,
"solution_criterion": 0.0401377362935477,
},
"bdqrtic_8": {
- "criterion": bdqrtic,
+ "fun": bdqrtic,
"start_x": [1] * 8,
"solution_x": bdqrtic_8_solution_x,
"start_criterion": 904,
"solution_criterion": 10.2389734213174,
},
"bdqrtic_10": {
- "criterion": bdqrtic,
+ "fun": bdqrtic,
"start_x": [1] * 10,
"solution_x": bdqrtic_10_solution_x,
"start_criterion": 1356,
"solution_criterion": 18.28116175359353,
},
"bdqrtic_11": {
- "criterion": bdqrtic,
+ "fun": bdqrtic,
"start_x": [1] * 11,
"solution_x": bdqrtic_11_solution_x,
"start_criterion": 1582,
"solution_criterion": 22.260591734883817,
},
"bdqrtic_12": {
- "criterion": bdqrtic,
+ "fun": bdqrtic,
"start_x": [1] * 12,
"solution_x": bdqrtic_12_solution_x,
"start_criterion": 1808,
"solution_criterion": 26.2727663967939,
},
"cube_5": {
- "criterion": cube,
+ "fun": cube,
"start_x": [0.5] * 5,
"solution_x": [1] * 5,
"start_criterion": 56.5,
"solution_criterion": 0,
},
"cube_6": {
- "criterion": cube,
+ "fun": cube,
"start_x": [0.5] * 6,
"solution_x": [1] * 6,
"start_criterion": 70.5625,
"solution_criterion": 0,
},
"cube_8": {
- "criterion": cube,
+ "fun": cube,
"start_x": [0.5] * 8,
"solution_x": [1] * 8,
"start_criterion": 98.6875,
"solution_criterion": 0,
},
"mancino_5_good_start": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(5),
"solution_x": mancino_5_solution_x,
"start_criterion": 2.539084e9,
"solution_criterion": 0,
},
"mancino_5_bad_start": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(5, 10),
"solution_x": mancino_5_solution_x,
"start_criterion": 6.873795e12,
"solution_criterion": 0,
},
"mancino_8": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(8),
"solution_x": mancino_8_solution_x,
"start_criterion": 3.367961e9,
"solution_criterion": 0,
},
"mancino_10": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(10),
"solution_x": mancino_10_solution_x,
"start_criterion": 3.735127e9,
"solution_criterion": 0,
},
"mancino_12_good_start": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(12),
"solution_x": mancino_12_solution_x,
"start_criterion": 3.991072e9,
"solution_criterion": 0,
},
"mancino_12_bad_start": {
- "criterion": mancino,
+ "fun": mancino,
"start_x": get_start_points_mancino(12, 10),
"solution_x": mancino_12_solution_x,
"start_criterion": 1.130015e13,
"solution_criterion": 0,
},
"heart_eight_good_start": {
- "criterion": partial(
+ "fun": partial(
heart_eight,
y=np.array([-0.69, -0.044, -1.57, -1.31, -2.65, 2, -12.6, 9.48]),
),
@@ -1137,7 +1137,7 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 0,
},
"heart_eight_bad_start": {
- "criterion": partial(
+ "fun": partial(
heart_eight,
y=np.array([-0.69, -0.044, -1.57, -1.31, -2.65, 2, -12.6, 9.48]),
),
@@ -1147,7 +1147,7 @@ def get_start_points_mancino(n, a=1):
"solution_criterion": 0,
},
"brown_almost_linear_medium": {
- "criterion": brown_almost_linear,
+ "fun": brown_almost_linear,
"start_x": [0.5] * 100,
"solution_x": [1] * 100,
"start_criterion": 2.524757e5,
diff --git a/src/optimagic/benchmarking/run_benchmark.py b/src/optimagic/benchmarking/run_benchmark.py
index 128c02d0a..06cfccabd 100644
--- a/src/optimagic/benchmarking/run_benchmark.py
+++ b/src/optimagic/benchmarking/run_benchmark.py
@@ -39,7 +39,7 @@ def run_benchmark(
dictionary that maps a name for optimizer settings
(e.g. ``"lbfgsb_strict_criterion"``) to a dictionary of keyword arguments
for ``minimize`` (e.g. ``{"algorithm": "scipy_lbfgsb",
- "algo_options": {"convergence.relative_criterion_tolerance": 1e-12}}``).
+ "algo_options": {"convergence.ftol_rel": 1e-12}}``).
Alternatively, the values can just be an algorithm which is then benchmarked
at default settings.
batch_evaluator (str or callable): See :ref:`batch_evaluators`.
@@ -115,12 +115,12 @@ def _process_optimize_options(raw_options, max_evals, disable_convergence):
default_algo_options = {}
if max_evals is not None:
- default_algo_options["stopping.max_criterion_evaluations"] = max_evals
- default_algo_options["stopping.max_iterations"] = max_evals
+ default_algo_options["stopping.maxfun"] = max_evals
+ default_algo_options["stopping.maxiter"] = max_evals
if disable_convergence:
- default_algo_options["convergence.relative_criterion_tolerance"] = 1e-14
- default_algo_options["convergence.relative_params_tolerance"] = 1e-14
- default_algo_options["convergence.relative_gradient_tolerance"] = 1e-14
+ default_algo_options["convergence.ftol_rel"] = 1e-14
+ default_algo_options["convergence.xtol_rel"] = 1e-14
+ default_algo_options["convergence.gtol_rel"] = 1e-14
out_options = {}
for name, _option in dict_options.items():
@@ -181,7 +181,7 @@ def _process_one_result(optimize_result, problem):
"""
_registry = get_registry(extended=True)
- _criterion = problem["noise_free_criterion"]
+ _criterion = problem["noise_free_fun"]
_start_x = problem["inputs"]["params"]
_start_crit_value = _criterion(_start_x)
if isinstance(_start_crit_value, np.ndarray):
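The renamed defaults in `_process_optimize_options` can be sketched in isolation; this condensed reconstruction keeps only the option-building step shown in the hunk above (the surrounding processing is omitted):

```python
def build_default_algo_options(max_evals=None, disable_convergence=False):
    # Mirrors the renamed defaults from _process_optimize_options: scipy-style
    # maxfun/maxiter for stopping and ftol/xtol/gtol for convergence.
    options = {}
    if max_evals is not None:
        options["stopping.maxfun"] = max_evals
        options["stopping.maxiter"] = max_evals
    if disable_convergence:
        options["convergence.ftol_rel"] = 1e-14
        options["convergence.xtol_rel"] = 1e-14
        options["convergence.gtol_rel"] = 1e-14
    return options


opts = build_default_algo_options(max_evals=500, disable_convergence=True)
assert opts["stopping.maxfun"] == 500
assert opts["convergence.gtol_rel"] == 1e-14
```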
diff --git a/src/optimagic/deprecations.py b/src/optimagic/deprecations.py
new file mode 100644
index 000000000..1d5defe8e
--- /dev/null
+++ b/src/optimagic/deprecations.py
@@ -0,0 +1,97 @@
+import warnings
+
+
+def throw_criterion_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `criterion` argument has been "
+ "renamed to `fun`. Please use `fun` instead of `criterion`. Using `criterion` "
+ " will become an error in optimagic version 0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def throw_criterion_kwargs_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `criterion_kwargs` argument has "
+ "been renamed to `fun_kwargs`. Please use `fun_kwargs` instead of "
+ "`criterion_kwargs`. Using `criterion_kwargs` will become an error in "
+ "optimagic version 0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def throw_derivative_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `derivative` argument has been "
+ "renamed to `jac`. Please use `jac` instead of `derivative`. Using `derivative`"
+ " will become an error in optimagic version 0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def throw_derivative_kwargs_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `derivative_kwargs` argument has "
+ "been renamed to `jac_kwargs`. Please use `jac_kwargs` instead of "
+ "`derivative_kwargs`. Using `derivative_kwargs` will become an error in "
+ "optimagic version 0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def throw_criterion_and_derivative_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `criterion_and_derivative` "
+ "argument has been renamed to `fun_and_jac`. Please use `fun_and_jac` "
+ "instead of `criterion_and_derivative`. Using `criterion_and_derivative` "
+ "will become an error in optimagic version 0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def throw_criterion_and_derivative_kwargs_future_warning():
+ msg = (
+ "To align optimagic with scipy.optimize, the `criterion_and_derivative_kwargs` "
+ "argument has been renamed to `fun_and_jac_kwargs`. Please use "
+ "`fun_and_jac_kwargs` instead of `criterion_and_derivative_kwargs`. Using "
+ "`criterion_and_derivative_kwargs` will become an error in optimagic version "
+ "0.6.0 and later."
+ )
+ warnings.warn(msg, FutureWarning)
+
+
+def replace_and_warn_about_deprecated_algo_options(algo_options):
+ if not isinstance(algo_options, dict):
+ return algo_options
+
+ algo_options = {k.replace(".", "_"): v for k, v in algo_options.items()}
+
+ replacements = {
+ "stopping_max_criterion_evaluations": "stopping_maxfun",
+ "stopping_max_iterations": "stopping_maxiter",
+ "convergence_absolute_criterion_tolerance": "convergence_ftol_abs",
+ "convergence_relative_criterion_tolerance": "convergence_ftol_rel",
+ "convergence_scaled_criterion_tolerance": "convergence_ftol_scaled",
+ "convergence_absolute_params_tolerance": "convergence_xtol_abs",
+ "convergence_relative_params_tolerance": "convergence_xtol_rel",
+ "convergence_absolute_gradient_tolerance": "convergence_gtol_abs",
+ "convergence_relative_gradient_tolerance": "convergence_gtol_rel",
+ "convergence_scaled_gradient_tolerance": "convergence_gtol_scaled",
+ }
+
+ present = sorted(set(algo_options) & set(replacements))
+ if present:
+ msg = (
+ "The following keys in `algo_options` are deprecated and will be removed "
+ "in optimagic version 0.6.0 and later. Please replace them as follows:\n"
+ )
+ for k in present:
+ msg += f" {k} -> {replacements[k]}\n"
+
+ warnings.warn(msg, FutureWarning)
+
+ out = {k: v for k, v in algo_options.items() if k not in present}
+ for k in present:
+ out[replacements[k]] = algo_options[k]
+
+ return out
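The replacement helper can be exercised without installing optimagic; below is a condensed, self-contained copy of its logic (only two of the renamed keys are included for brevity):

```python
import warnings


def replace_deprecated_algo_options(algo_options):
    # Condensed copy of replace_and_warn_about_deprecated_algo_options:
    # normalize dotted keys, then swap deprecated names for the new ones.
    if not isinstance(algo_options, dict):
        return algo_options
    algo_options = {k.replace(".", "_"): v for k, v in algo_options.items()}
    replacements = {
        "stopping_max_criterion_evaluations": "stopping_maxfun",
        "convergence_relative_criterion_tolerance": "convergence_ftol_rel",
    }
    present = sorted(set(algo_options) & set(replacements))
    if present:
        warnings.warn(
            "deprecated algo_options: " + ", ".join(present), FutureWarning
        )
    out = {k: v for k, v in algo_options.items() if k not in present}
    for k in present:
        out[replacements[k]] = algo_options[k]
    return out


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    new = replace_deprecated_algo_options(
        {"stopping.max_criterion_evaluations": 1000, "convergence.ftol_rel": 1e-8}
    )

# The deprecated key is renamed, the already-new key only loses its dot,
# and exactly one FutureWarning is emitted.
assert new == {"stopping_maxfun": 1000, "convergence_ftol_rel": 1e-8}
assert len(caught) == 1 and issubclass(caught[0].category, FutureWarning)
```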
diff --git a/src/optimagic/exceptions.py b/src/optimagic/exceptions.py
index 180b130ea..6d63a7de0 100644
--- a/src/optimagic/exceptions.py
+++ b/src/optimagic/exceptions.py
@@ -23,6 +23,14 @@ class UserFunctionRuntimeError(OptimagicError):
"""Exception that is raised when user provided functions raise errors."""
+class MissingInputError(OptimagicError):
+ """Exception for missing user provided input."""
+
+
+class AliasError(OptimagicError):
+ """Exception for aliasing errors."""
+
+
class InvalidKwargsError(OptimagicError):
"""Exception for invalid user provided keyword arguments."""
diff --git a/src/optimagic/optimization/algo_options.py b/src/optimagic/optimization/algo_options.py
index 24d0cb9e4..425caebbb 100644
--- a/src/optimagic/optimization/algo_options.py
+++ b/src/optimagic/optimization/algo_options.py
@@ -1,20 +1,7 @@
import numpy as np
-"""
-The order is the following:
-
-1. Convergence and Stopping Criteria
-2. Trust Region Parameters
-3. Other Numerical Algorithm Group Tuning Parameters
-
-"""
-"""
-=====================================================================================
-1. Stopping Criteria
-=====================================================================================
-"""
-CONVERGENCE_RELATIVE_CRITERION_TOLERANCE = 2e-9
+CONVERGENCE_FTOL_REL = 2e-9
"""float: Stop when the relative improvement between two iterations is below this.
The exact definition of relative improvement depends on the optimizer and should
@@ -24,14 +11,14 @@
"""
-CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE = 0
+CONVERGENCE_FTOL_ABS = 0
"""float: Stop when the absolute improvement between two iterations is below this.
Disabled by default because it is very problem specific.
"""
-CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE = 1e-5
+CONVERGENCE_GTOL_ABS = 1e-5
"""float: Stop when the gradient are smaller than this.
For some algorithms this criterion refers to all entries, for others to some norm.
@@ -43,7 +30,7 @@
"""
-CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE = 1e-8
+CONVERGENCE_GTOL_REL = 1e-8
"""float: Stop when the gradient, divided by the absolute value of the criterion
function, is smaller than this. For some algorithms this criterion refers to
all entries, for others to some norm. For bound constrained optimizers this
@@ -52,7 +39,7 @@
"""
-CONVERGENCE_SCALED_GRADIENT_TOLERANCE = 1e-8
+CONVERGENCE_GTOL_SCALED = 1e-8
"""float: Stop when all entries (or for some algorithms the norm) of the gradient,
divided by the norm of the gradient at start parameters is smaller than this.
For bound constrained optimizers this typically refers to a projected gradient.
@@ -61,7 +48,7 @@
"""
-CONVERGENCE_RELATIVE_PARAMS_TOLERANCE = 1e-5
+CONVERGENCE_XTOL_REL = 1e-5
"""float: Stop when the relative change in parameters is smaller than this.
The exact definition of relative change and whether this refers to the maximum
change or the average change depends on the algorithm and should be documented
@@ -69,7 +56,7 @@
"""
-CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE = 0
+CONVERGENCE_XTOL_ABS = 0
"""float: Stop when the absolute change in parameters between two iterations is smaller
than this. Whether this refers to the maximum change or the average change depends
on the algorithm and should be documented there.
@@ -79,22 +66,8 @@
"""
-CONVERGENCE_NOISE_CORRECTED_CRITERION_TOLERANCE = 1.0
-"""float: Stop when the evaluations on the set of interpolation points all fall within
- this factor of the noise level. The default is 1, i.e. when all evaluations are
- within the noise level. If you want to not use this criterion but still flag your
- criterion function as noisy, set this tolerance to 0.0.
- .. warning::
- Very small values, as in most other tolerances don't make sense here.
-
-"""
-
-CONVERGENCE_MINIMAL_TRUSTREGION_RADIUS_TOLERANCE = 1e-8
-"""float: Stop when the lower trust region radius falls below this value."""
-
-
-STOPPING_MAX_CRITERION_EVALUATIONS = 1_000_000
+STOPPING_MAXFUN = 1_000_000
"""int:
If the maximum number of function evaluations is reached, the optimization stops
but we do not count this as successful convergence. The function evaluations used
@@ -103,7 +76,7 @@
"""
-STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL = 1_000
+STOPPING_MAXFUN_GLOBAL = 1_000
"""int:
If the maximum number of function evaluations is reached, the optimization stops
but we do not count this as successful convergence. The function evaluations used
@@ -113,7 +86,7 @@
"""
-STOPPING_MAX_ITERATIONS = 1_000_000
+STOPPING_MAXITER = 1_000_000
"""int:
If the maximum number of iterations is reached, the
optimization stops, but we do not count this as successful convergence.
@@ -123,49 +96,19 @@
"""
-CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE = 1e-08
+CONVERGENCE_SECOND_BEST_FTOL_ABS = 1e-08
"""float: absolute criterion tolerance optimagic requires if no other stopping
criterion apart from max iterations etc. is available
this is taken from scipy (SLSQP's value, smaller than Nelder-Mead).
"""
-CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE = 1e-08
+CONVERGENCE_SECOND_BEST_XTOL_ABS = 1e-08
"""float: The absolute parameter tolerance optimagic requires if no other stopping
criterion apart from max iterations etc. is available. This is taken from pybobyqa.
"""
-CONVERGENCE_SLOW_PROGRESS = {
- "threshold_to_characterize_as_slow": 1e-8,
- "max_insufficient_improvements": None,
- "comparison_period": 5,
-}
-"""dict: Specification of when to terminate or reset the optimization because of only
- slow improvements. This is similar to an absolute criterion tolerance only that
- instead of a single improvement the average over several iterations must be small.
-
- Possible entries are:
- threshold_to_characterize_as_slow (float): Threshold whether an improvement
- is insufficient. Note: the improvement is divided by the
- ``comparison_period``.
- So this is the required average improvement per iteration over the
- comparison period.
- max_insufficient_improvements (int): Number of consecutive
- insufficient improvements before termination (or reset). Default is
- ``20 * len(x)``.
- comparison_period (int):
- How many iterations to go back to calculate the improvement.
- For example 5 would mean that each criterion evaluation is compared to the
- criterion value from 5 iterations before.
-
-"""
-
-"""
-=====================================================================================
-2. Other Common Tuning Parameters for Optimization Algorithms
-=====================================================================================
-"""
MAX_LINE_SEARCH_STEPS = 20
"""int: Inspired by scipy L-BFGS-B."""
@@ -173,13 +116,6 @@
LIMITED_MEMORY_STORAGE_LENGTH = 10
"""int: Taken from scipy L-BFGS-B."""
-THRESHOLD_FOR_SAFETY_STEP = 0.5
-r"""float: Threshold for when to call the safety step (:math:`\gamma_s`).
-
- :math:`\text{proposed step} \leq \text{threshold_for_safety_step} \cdot
- \text{current_lower_trustregion_radius}`.
-
-"""
CONSTRAINTS_ABSOLUTE_TOLERANCE = 1e-5
"""float: Allowed tolerance of the equality and inequality constraints for values to be
@@ -187,272 +123,6 @@
"""
-"""
--------------------------
-Trust Region Parameters
--------------------------
-"""
-
-TRUSTREGION_THRESHOLD_SUCCESSFUL = 0.1
-"""float: Share of the predicted improvement that has to be achieved for a trust
- region iteration to count as successful.
-
-"""
-
-TRUSTREGION_THRESHOLD_VERY_SUCCESSFUL = 0.7
-"""float: Share of predicted improvement that has to be achieved for a trust region
- iteration to count as very successful.``criterion_noisy``
-
-"""
-
-TRUSTREGION_SHRINKING_FACTOR_NOT_SUCCESSFUL = None
-"""float: Ratio by which to shrink the upper trust region radius when realized
- improvement does not match the ``threshold_successful``. The default is 0.98
- if the criterion is noisy and 0.5 else.
-
-"""
-
-TRUSTREGION_EXPANSION_FACTOR_SUCCESSFUL = 2.0
-r"""float: Ratio by which to expand the upper trust region radius :math:`\Delta_k`
- in very successful iterations (:math:`\gamma_{inc}` in the notation of the paper).
-
-"""
-
-TRUSTREGION_EXPANSION_FACTOR_VERY_SUCCESSFUL = 4.0
-r"""float: Ratio of the proposed step ($\|s_k\|$) by which to expand the upper trust
- region radius (:math:`\Delta_k`) in very successful iterations
- (:math:`\overline{\gamma}_{inc}` in the notation of the paper).
-
-"""
-
-TRUSTREGION_SHRINKING_FACTOR_LOWER_RADIUS = None
-r"""float: Ratio by which to shrink the lower trust region radius (:math:`\rho_k`)
- (:math:`\alpha_1` in the notation of the paper). Default is 0.9 if
- the criterion is noisy and 0.1 else.
-
-"""
-
-TRUSTREGION_SHRINKING_FACTOR_UPPER_RADIUS = None
-r"""float: Ratio of the current lower trust region (:math:`\rho_k`) by which to shrink
- the upper trust region radius (:math:`\Delta_k`) when the lower one is shrunk
- (:math:`\alpha_2` in the notation of the paper). Default is 0.95 if the
- criterion is noisy and 0.5 else."""
-
-"""
----------------------------------------------
-Numerical Algorithm Group Tuning Parameters
----------------------------------------------
-"""
-
-INITIAL_DIRECTIONS = "coordinate"
-"""string: How to draw the initial directions. Possible values are "coordinate" for
- coordinate directions (the default) or "random".
-
-"""
-
-RANDOM_DIRECTIONS_ORTHOGONAL = True
-"""bool: Whether to make randomly drawn initial directions orthogonal."""
-
-
-INTERPOLATION_ROUNDING_ERROR = 0.1
-r"""float: Internally, all the NAG algorithms store interpolation points with respect
- to a base point :math:`x_b`; that is, we store :math:`\{y_t-x_b\}`,
- which reduces the risk of roundoff errors. We shift :math:`x_b` to :math:`x_k` when
- :math:`\text{proposed step} \leq \text{interpolation_rounding_error} \cdot
- \|x_k-x_b\|`.
-
-"""
-
-CLIP_CRITERION_IF_OVERFLOWING = True
-"""bool: Whether to clip the criterion to avoid ``OverflowError``."""
-
-
-TRUSTREGION_PRECONDITION_INTERPOLATION = True
-"""bool: whether to scale the interpolation linear system to improve conditioning."""
-
-
-RESET_OPTIONS = {
- "use_resets": None,
- "minimal_trustregion_radius_tolerance_scaling_at_reset": 1.0,
- "reset_type": "soft",
- "move_center_at_soft_reset": True,
- "reuse_criterion_value_at_hard_reset": True,
- "max_iterations_without_new_best_after_soft_reset": None,
- "auto_detect": True,
- "auto_detect_history": 30,
- "auto_detect_min_jacobian_increase": 0.015,
- "auto_detect_min_correlations": 0.1,
- "points_to_replace_at_soft_reset": 3,
- "max_consecutive_unsuccessful_resets": 10,
- # just bobyqa
- "max_unsuccessful_resets": None,
- "trust_region_scaling_at_unsuccessful_reset": None,
- # just dfols
- "max_interpolation_points": None,
- "n_extra_interpolation_points_per_soft_reset": 0,
- "n_extra_interpolation_points_per_hard_reset": 0,
- "n_additional_extra_points_to_replace_per_reset": 0,
-}
-r"""dict: Options for reseting the optimization.
-
- Possible entries are:
-
- use_resets (bool): Whether to do resets when the lower trust
- region radius (:math:`\rho_k`) reaches the stopping criterion
- (:math:`\rho_{end}`), or (optionally) when all interpolation points are
- within noise level. Default is ``True`` if the criterion is noisy.
- minimal_trustregion_radius_tolerance_scaling_at_reset (float): Factor with
- which the trust region stopping criterion is multiplied at each reset.
-
- reset_type (str): Whether to use "soft" or "hard" resets. Default is "soft".
-
- move_center_at_soft_reset (bool): Whether to move the trust region center
- ($x_k$) to the best new point evaluated in stead of keeping it constant.
- points_to_replace_at_soft_reset (int): Number of interpolation points to move
- at each soft reset.
- reuse_criterion_value_at_hard_reset (bool): Whether or not to recycle the
- criterion value at the best iterate found when performing a hard reset.
- This saves one criterion evaluation.
- max_iterations_without_new_best_after_soft_reset (int):
- The maximum number of successful steps in a given run where the new
- criterion value is worse than the best value found in previous runs before
- terminating. Default is ``max_criterion_evaluations``.
- auto_detect (bool): Whether or not to
- automatically determine when to reset. This is an additional condition
- and resets can still be triggered by small upper trust region radius, etc.
- There are two criteria used: upper trust region radius shrinkage
- (no increases over the history, more decreases than no changes) and
- changes in the model Jacobian (consistently increasing trend as measured
- by slope and correlation coefficient of the line of best fit).
- auto_detect_history (int):
- How many iterations of model changes and trust region radii to store.
- auto_detect_min_jacobian_increase (float):
- Minimum rate of increase of the Jacobian over past iterations to cause a
- reset.
- auto_detect_min_correlations (float):
- Minimum correlation of the Jacobian data set required to cause a reset.
- max_consecutive_unsuccessful_resets (int): maximum number of consecutive
- unsuccessful resets allowed (i.e. resets which did not outperform the
- best known value from earlier runs).
-
- Only used when using nag_bobyqa:
-
- max_unsuccessful_resets (int):
- number of total unsuccessful resets allowed.
- Default is 20 if ``seek_global_optimum`` and else unrestricted.
- trust_region_scaling_at_unsuccessful_reset (float): Factor by which to
- expand the initial lower trust region radius (:math:`\rho_{beg}`) after
- unsuccessful resets. Default is 1.1 if ``seek_global_optimum`` else 1.
-
- Only used when using nag_dfols:
-
- max_interpolation_points (int): Maximum allowed value of the number of
- interpolation points. This is useful if the number of interpolation points
- increases with each reset, e.g. when
- ``n_extra_interpolation_points_per_soft_reset > 0``. The default is
- ``n_interpolation_points``.
- n_extra_interpolation_points_per_soft_reset (int): Number of points to add to
- the interpolation set with each soft reset.
- n_extra_interpolation_points_per_hard_reset (int): Number of points to add to
- the interpolation set with each hard reset.
- n_additional_extra_points_to_replace_per_reset (int): This parameter modifies
- ``n_extra_points_to_replace_successful``. With each reset
- ``n_extra_points_to_replace_successful`` is increased by this number.
-
-"""
-
-
-TRUSTREGION_FAST_START_OPTIONS = {
- "min_inital_points": None,
- "method": "auto",
- "scale_of_trustregion_step_perturbation": None,
- "scale_of_jacobian_components_perturbation": 1e-2,
- # the following will be growing.full_rank.min_sing_val
- # but it not supported yet by DF-OLS.
- "floor_of_jacobian_singular_values": 1,
- "jacobian_max_condition_number": 1e8,
- "geometry_improving_steps": False,
- "safety_steps": True,
- "shrink_upper_radius_in_safety_steps": False,
- "full_geometry_improving_step": False,
- "reset_trustregion_radius_after_fast_start": False,
- "reset_min_trustregion_radius_after_fast_start": False,
- "shrinking_factor_not_successful": None,
- "n_extra_search_directions_per_iteration": 0,
-}
-r"""dict: Options to start the optimization while building the full trust region model.
-
- To activate this, set the number of interpolation points at which to evaluate the
- criterion before doing the first step, `min_initial_points`, to something smaller
- than the number of parameters.
-
- The following options can be specified:
-
- min_initial_points (int): Number of initial interpolation
- points in addition to the start point. This should only be changed to
- a value less than ``len(x)``, and only if the default setup cost
- of ``len(x) + 1`` evaluations of the criterion is impractical.
- If this is set to be less than the default, the input value of
- ``n_interpolation_points`` should be set to ``len(x)``.
- If the default is used, all the other parameters have no effect.
- Default is ``n_interpolation_points - 1``.
- If the default setup costs of the evaluations are very large, DF-OLS
- can start with less than ``len(x)`` interpolation points and add points
- to the trust region model with every iteration.
- method ("jacobian", "trustregion" or "auto"):
- When there are less interpolation points than ``len(x)`` the model is
- underdetermined. This can be fixed in two ways:
- If "jacobian", the interpolated Jacobian is perturbed to have full
- rank, allowing the trust region step to include components in the full
- search space. This is the default if
- ``len(x) \geq number of root contributions``.
- If "trustregion_step", the trust region step is perturbed by an
- orthogonal direction not yet searched. It is the default if
- ``len(x) < number of root contributions``.
- scale_of_trustregion_step_perturbation (float):
- When adding new search directions, the length of the step is the trust
- region radius multiplied by this value. The default is 0.1 if
- ``method == "trustregion"`` else 1.
- scale_of_jacobian_components_perturbation (float): Magnitude of extra
- components added to the Jacobian. Default is 1e-2.
- floor_of_jacobian_singular_values (float): Floor singular
- values of the Jacobian at this factor of the last non zero value.
- As of version 1.2.1 this option is not yet supported by DF-OLS!
- scale_of_jacobian_singular_value_floor (float):
- Floor singular values of the Jacobian at this factor of the last nonzero
- value.
- jacobian_max_condition_number (float): Cap on the condition number
- of Jacobian after applying floors to singular values
- (effectively another floor on the smallest singular value, since the
- largest singular value is fixed).
- geometry_improving_steps (bool): Whether to do geometry-improving steps in the
- trust region algorithm, as per the usual algorithm during the fast start.
- safety_steps (bool):
- Whether to perform safety steps.
- shrink_upper_radius_in_safety_steps (bool): During the fast start whether to
- shrink the upper trust region radius in safety steps.
- full_geometry_improving_step (bool): During the fast start whether to do a
- full geometry-improving step within safety steps (the same as the post fast
- start phase of the algorithm). Since this involves reducing the upper trust
- region radius, this can only be `True` if
- `shrink_upper_radius_in_safety_steps == False`.
- reset_trustregion_radius_after_fast_start (bool):
- Whether or not to reset the upper trust region radius to its initial value
- at the end of the fast start phase.
- reset_min_trustregion_radius_after_fast_start (bool):
- Whether or not to reset the minimum trust region radius
- (:math:`\rho_k`) to its initial value at the end of the fast start phase.
- shrinking_factor_not_successful (float):
- Ratio by which to shrink the trust region radius when realized
- improvement does not match the ``threshold_for_successful_iteration``
- during the fast start phase. By default it is the same as
- ``reduction_when_not_successful``.
- n_extra_search_directions_per_iteration (int): Number of new search
- directions to add with each iteration where we do not have a full set
- of search directions. This approach is not recommended! Default is 0.
-
-"""
-
def get_population_size(population_size, x, lower_bound=10):
"""Default population size for genetic algorithms."""
diff --git a/src/optimagic/optimization/optimize.py b/src/optimagic/optimization/optimize.py
index a7bd9c5ca..9140984f3 100644
--- a/src/optimagic/optimization/optimize.py
+++ b/src/optimagic/optimization/optimize.py
@@ -3,7 +3,12 @@
from pathlib import Path
from optimagic.batch_evaluators import process_batch_evaluator
-from optimagic.exceptions import InvalidFunctionError, InvalidKwargsError
+from optimagic.exceptions import (
+ InvalidFunctionError,
+ InvalidKwargsError,
+ MissingInputError,
+ AliasError,
+)
from optimagic.logging.create_tables import (
make_optimization_iteration_table,
make_optimization_problem_table,
@@ -29,25 +34,34 @@
get_converter,
)
from optimagic.parameters.nonlinear_constraints import process_nonlinear_constraints
-from optimagic.shared.process_user_function import process_func_of_params
+from optimagic.shared.process_user_function import (
+ process_func_of_params,
+ get_kwargs_from_args,
+)
+from optimagic.optimization.scipy_aliases import (
+ map_method_to_algorithm,
+ split_fun_and_jac,
+)
+from optimagic import deprecations
+from optimagic.deprecations import replace_and_warn_about_deprecated_algo_options
def maximize(
- criterion,
- params,
- algorithm,
+ fun=None,
+ params=None,
+ algorithm=None,
*,
lower_bounds=None,
upper_bounds=None,
soft_lower_bounds=None,
soft_upper_bounds=None,
- criterion_kwargs=None,
+ fun_kwargs=None,
constraints=None,
algo_options=None,
- derivative=None,
- derivative_kwargs=None,
- criterion_and_derivative=None,
- criterion_and_derivative_kwargs=None,
+ jac=None,
+ jac_kwargs=None,
+ fun_and_jac=None,
+ fun_and_jac_kwargs=None,
numdiff_options=None,
logging=False,
log_options=None,
@@ -59,166 +73,42 @@ def maximize(
multistart_options=None,
collect_history=True,
skip_checks=False,
+ # scipy aliases
+ x0=None,
+ method=None,
+ args=None,
+ # scipy arguments that are not yet supported
+ hess=None,
+ hessp=None,
+ callback=None,
+ # scipy arguments that will never be supported
+ options=None,
+ tol=None,
+ # deprecated arguments
+ criterion=None,
+ criterion_kwargs=None,
+ derivative=None,
+ derivative_kwargs=None,
+ criterion_and_derivative=None,
+ criterion_and_derivative_kwargs=None,
):
- """Maximize criterion using algorithm subject to constraints.
-
- Args:
- criterion (callable): A function that takes a params as first argument and
- returns a scalar (if only scalar algorithms will be used) or a dictionary
- that contains at the entries "value" (a scalar float), "contributions" (a
- pytree containing the summands that make up the criterion value) or
- "root_contributions" (a pytree containing the residuals of a least-squares
- problem) and any number of additional entries. The additional dict entries
- will be stored in a database if logging is used.
- params (pandas): A pytree containing the parameters with respect to which the
- criterion is optimized. Examples are a numpy array, a pandas Series,
- a DataFrame with "value" column, a float and any kind of (nested) dictionary
- or list containing these elements. See :ref:`params` for examples.
- algorithm (str or callable): Specifies the optimization algorithm. For built-in
- algorithms this is a string with the name of the algorithm. Otherwise it can
- be a callable with the optimagic algorithm interface. See :ref:`algorithms`.
- lower_bounds (pytree): A pytree with the same structure as params with lower
- bounds for the parameters. Can be ``-np.inf`` for parameters with no lower
- bound.
- upper_bounds (pytree): As lower_bounds. Can be ``np.inf`` for parameters with
- no upper bound.
- soft_lower_bounds (pytree): As lower bounds but the bounds are not imposed
- during optimization and just used to sample start values if multistart
- optimization is performed.
- soft_upper_bounds (pytree): As soft_lower_bounds.
- criterion_kwargs (dict): Additional keyword arguments for criterion
- constraints (list, dict): List with constraint dictionaries or single dict.
- See :ref:`constraints`.
- algo_options (dict): Algorithm specific configuration of the optimization. See
- :ref:`list_of_algorithms` for supported options of each algorithm.
- derivative (callable): Function that calculates the first derivative
- of criterion. For most algorithm, this is the gradient of the scalar
- output (or "value" entry of the dict). However some algorithms (e.g. bhhh)
- require the jacobian of the "contributions" entry of the dict. You will get
- an error if you provide the wrong type of derivative.
- derivative_kwargs (dict): Additional keyword arguments for derivative.
- criterion_and_derivative (callable): Function that returns criterion
- and derivative as a tuple. This can be used to exploit synergies in the
- evaluation of both functions. The first element of the tuple has to be
- exactly the same as the output of criterion. The second has to be exactly
- the same as the output of derivative.
- criterion_and_derivative_kwargs (dict): Additional keyword arguments for
- criterion and derivative.
- numdiff_options (dict): Keyword arguments for the calculation of numerical
- derivatives. See :ref:`first_derivative` for details. Note that the default
- method is changed to "forward" for speed reasons.
- logging (pathlib.Path, str or False): Path to sqlite3 file (which typically has
- the file extension ``.db``. If the file does not exist, it will be created.
- When doing parallel optimizations and logging is provided, you have to
- provide a different path for each optimization you are running. You can
- disable logging completely by setting it to False, but we highly recommend
- not to do so.
- log_options (dict): Additional keyword arguments to configure the logging.
- - "fast_logging": A boolean that determines if "unsafe" settings are used
- to speed up write processes to the database. This should only be used for
- very short running criterion functions where the main purpose of the log
- is monitoring and it would not be catastrophic to get a
- corrupted database in case of a sudden system shutdown. If one evaluation
- of the criterion function (and gradient if applicable) takes more than
- 100 ms, the logging overhead is negligible.
- - "if_table_exists": (str) One of "extend", "replace", "raise". What to
- do if the tables we want to write to already exist. Default "extend".
- - "if_database_exists": (str): One of "extend", "replace", "raise". What to
- do if the database we want to write to already exists. Default "extend".
- error_handling (str): Either "raise" or "continue". Note that "continue" does
- not absolutely guarantee that no error is raised but we try to handle as
- many errors as possible in that case without aborting the optimization.
- error_penalty (dict): Dict with the entries "constant" (float) and "slope"
- (float). If the criterion or gradient raise an error and error_handling is
- "continue", return ``constant + slope * norm(params - start_params)`` where
- ``norm`` is the euclidean distance as criterion value and adjust the
- derivative accordingly. This is meant to guide the optimizer back into a
- valid region of parameter space (in direction of the start parameters).
- Note that the constant has to be high enough to ensure that the penalty is
- actually a bad function value. The default constant is f0 + abs(f0) + 100
- for minimizations and f0 - abs(f0) - 100 for maximizations, where
- f0 is the criterion value at start parameters. The default slope is 0.1.
- scaling (bool): If True, the parameter vector is rescaled internally for
- better performance with scale sensitive optimizers.
- scaling_options (dict or None): Options to configure the internal scaling ot
- the parameter vector. See :ref:`scaling` for details and recommendations.
- multistart (bool): Whether to do the optimization from multiple starting points.
- Requires the params to have the columns ``"soft_lower_bound"`` and
- ``"soft_upper_bounds"`` with finite values for all parameters, unless
- the standard bounds are already finite for all parameters.
- multistart_options (dict): Options to configure the optimization from multiple
- starting values. The dictionary has the following entries
- (all of which are optional):
- - n_samples (int): Number of sampled points on which to do one function
- evaluation. Default is 10 * n_params.
- - sample (pandas.DataFrame or numpy.ndarray) A user definde sample.
- If this is provided, n_samples, sampling_method and sampling_distribution
- are not used.
- - share_optimizations (float): Share of sampled points that is used to
- construct a starting point for a local optimization. Default 0.1.
- - sampling_distribution (str): One rof "uniform", "triangle". Default is
- "uniform" as in the original tiktak algorithm.
- - sampling_method (str): One of "random", "sobol", "halton", "hammersley",
- "korobov", "latin_hypercube" or a numpy array or DataFrame with custom
- points. Default is sobol for problems with up to 30 parameters and random
- for problems with more than 30 parameters.
- - mixing_weight_method (str or callable): Specifies how much weight is put
- on the currently best point when calculating a new starting point for a
- local optimization out of the currently best point and the next random
- starting point. Either "tiktak" or "linear" or a callable that takes the
- arguments ``iteration``, ``n_iterations``, ``min_weight``, ``max_weight``.
- Default "tiktak".
- - mixing_weight_bounds (tuple): A tuple consisting of a lower and upper
- bound on mixing weights. Default (0.1, 0.995).
- - convergence_max_discoveries (int): The multistart optimization converges
- if the currently best local optimum has been discovered independently in
- ``convergence_max_discoveries`` many local optimizations. Default 2.
- - convergence.relative_params_tolerance (float): Determines the maximum
- relative distance two parameter vectors can have to be considered equal
- for convergence purposes.
- - n_cores (int): Number cores used to evaluate the criterion function in
- parallel during exploration stages and number of parallel local
- optimization in optimization stages. Default 1.
- - batch_evaluator (str or callable): See :ref:`batch_evaluators` for
- details. Default "joblib".
- - batch_size (int): If n_cores is larger than one, several starting points
- for local optimizations are created with the same weight and from the same
- currently best point. The ``batch_size`` argument is a way to reproduce
- this behavior on a small machine where less cores are available. By
- default the batch_size is equal to ``n_cores``. It can never be smaller
- than ``n_cores``.
- - seed (int): Random seed for the creation of starting values. Default None.
- - exploration_error_handling (str): One of "raise" or "continue". Default
- is continue, which means that failed function evaluations are simply
- discarded from the sample.
- - optimization_error_handling (str): One of "raise" or "continue". Default
- is continue, which means that failed optimizations are simply discarded.
- collect_history (bool): Whether the history of parameters and criterion values
- should be collected and returned as part of the result. Default True.
- skip_checks (bool): Whether checks on the inputs are skipped. This makes the
- optimization faster, especially for very fast criterion functions. Default
- False.
-
- Returns:
- OptimizeResult: The optmization result.
-
- """
+ """Maximize criterion using algorithm subject to constraints."""
return _optimize(
direction="maximize",
- criterion=criterion,
+ fun=fun,
params=params,
algorithm=algorithm,
lower_bounds=lower_bounds,
upper_bounds=upper_bounds,
soft_lower_bounds=soft_lower_bounds,
soft_upper_bounds=soft_upper_bounds,
- criterion_kwargs=criterion_kwargs,
+ fun_kwargs=fun_kwargs,
constraints=constraints,
algo_options=algo_options,
- derivative=derivative,
- derivative_kwargs=derivative_kwargs,
- criterion_and_derivative=criterion_and_derivative,
- criterion_and_derivative_kwargs=criterion_and_derivative_kwargs,
+ jac=jac,
+ jac_kwargs=jac_kwargs,
+ fun_and_jac=fun_and_jac,
+ fun_and_jac_kwargs=fun_and_jac_kwargs,
numdiff_options=numdiff_options,
logging=logging,
log_options=log_options,
@@ -230,25 +120,43 @@ def maximize(
multistart_options=multistart_options,
collect_history=collect_history,
skip_checks=skip_checks,
+ # scipy aliases
+ x0=x0,
+ method=method,
+ args=args,
+ # scipy arguments that are not yet supported
+ hess=hess,
+ hessp=hessp,
+ callback=callback,
+ # scipy arguments that will never be supported
+ options=options,
+ tol=tol,
+ # deprecated arguments
+ criterion=criterion,
+ criterion_kwargs=criterion_kwargs,
+ derivative=derivative,
+ derivative_kwargs=derivative_kwargs,
+ criterion_and_derivative=criterion_and_derivative,
+ criterion_and_derivative_kwargs=criterion_and_derivative_kwargs,
)
def minimize(
- criterion,
- params,
- algorithm,
+ fun=None,
+ params=None,
+ algorithm=None,
*,
lower_bounds=None,
upper_bounds=None,
soft_lower_bounds=None,
soft_upper_bounds=None,
- criterion_kwargs=None,
+ fun_kwargs=None,
constraints=None,
algo_options=None,
- derivative=None,
- derivative_kwargs=None,
- criterion_and_derivative=None,
- criterion_and_derivative_kwargs=None,
+ jac=None,
+ jac_kwargs=None,
+ fun_and_jac=None,
+ fun_and_jac_kwargs=None,
numdiff_options=None,
logging=False,
log_options=None,
@@ -260,166 +168,43 @@ def minimize(
multistart_options=None,
collect_history=True,
skip_checks=False,
+ # scipy aliases
+ x0=None,
+ method=None,
+ args=None,
+ # scipy arguments that are not yet supported
+ hess=None,
+ hessp=None,
+ callback=None,
+ # scipy arguments that will never be supported
+ options=None,
+ tol=None,
+ # deprecated arguments
+ criterion=None,
+ criterion_kwargs=None,
+ derivative=None,
+ derivative_kwargs=None,
+ criterion_and_derivative=None,
+ criterion_and_derivative_kwargs=None,
):
- """Minimize criterion using algorithm subject to constraints.
-
- Args:
- criterion (callable): A function that takes a params as first argument and
- returns a scalar (if only scalar algorithms will be used) or a dictionary
- that contains at the entries "value" (a scalar float), "contributions" (a
- pytree containing the summands that make up the criterion value) or
- "root_contributions" (a pytree containing the residuals of a least-squares
- problem) and any number of additional entries. The additional dict entries
- will be stored in a database if logging is used.
- params (pandas): A pytree containing the parameters with respect to which the
- criterion is optimized. Examples are a numpy array, a pandas Series,
- a DataFrame with "value" column, a float and any kind of (nested) dictionary
- or list containing these elements. See :ref:`params` for examples.
- algorithm (str or callable): Specifies the optimization algorithm. For built-in
- algorithms this is a string with the name of the algorithm. Otherwise it can
- be a callable with the optimagic algorithm interface. See :ref:`algorithms`.
- lower_bounds (pytree): A pytree with the same structure as params with lower
- bounds for the parameters. Can be ``-np.inf`` for parameters with no lower
- bound.
- upper_bounds (pytree): As lower_bounds. Can be ``np.inf`` for parameters with
- no upper bound.
- soft_lower_bounds (pytree): As lower bounds but the bounds are not imposed
- during optimization and just used to sample start values if multistart
- optimization is performed.
- soft_upper_bounds (pytree): As soft_lower_bounds.
- criterion_kwargs (dict): Additional keyword arguments for criterion
- constraints (list, dict): List with constraint dictionaries or single dict.
- See :ref:`constraints`.
- algo_options (dict): Algorithm specific configuration of the optimization. See
- :ref:`list_of_algorithms` for supported options of each algorithm.
- derivative (callable): Function that calculates the first derivative
- of criterion. For most algorithm, this is the gradient of the scalar
- output (or "value" entry of the dict). However some algorithms (e.g. bhhh)
- require the jacobian of the "contributions" entry of the dict. You will get
- an error if you provide the wrong type of derivative.
- derivative_kwargs (dict): Additional keyword arguments for derivative.
- criterion_and_derivative (callable): Function that returns criterion
- and derivative as a tuple. This can be used to exploit synergies in the
- evaluation of both functions. The first element of the tuple has to be
- exactly the same as the output of criterion. The second has to be exactly
- the same as the output of derivative.
- criterion_and_derivative_kwargs (dict): Additional keyword arguments for
- criterion and derivative.
- numdiff_options (dict): Keyword arguments for the calculation of numerical
- derivatives. See :ref:`first_derivative` for details. Note that the default
- method is changed to "forward" for speed reasons.
- logging (pathlib.Path, str or False): Path to sqlite3 file (which typically has
- the file extension ``.db``. If the file does not exist, it will be created.
- When doing parallel optimizations and logging is provided, you have to
- provide a different path for each optimization you are running. You can
- disable logging completely by setting it to False, but we highly recommend
- not to do so.
- log_options (dict): Additional keyword arguments to configure the logging.
- - "fast_logging": A boolean that determines if "unsafe" settings are used
- to speed up write processes to the database. This should only be used for
- very short running criterion functions where the main purpose of the log
- is monitoring and it would not be catastrophic to get a
- corrupted database in case of a sudden system shutdown. If one evaluation
- of the criterion function (and gradient if applicable) takes more than
- 100 ms, the logging overhead is negligible.
- - "if_table_exists": (str) One of "extend", "replace", "raise". What to
- do if the tables we want to write to already exist. Default "extend".
- - "if_database_exists": (str): One of "extend", "replace", "raise". What to
- do if the database we want to write to already exists. Default "extend".
- error_handling (str): Either "raise" or "continue". Note that "continue" does
- not absolutely guarantee that no error is raised but we try to handle as
- many errors as possible in that case without aborting the optimization.
- error_penalty (dict): Dict with the entries "constant" (float) and "slope"
- (float). If the criterion or gradient raise an error and error_handling is
- "continue", return ``constant + slope * norm(params - start_params)`` where
- ``norm`` is the euclidean distance as criterion value and adjust the
- derivative accordingly. This is meant to guide the optimizer back into a
- valid region of parameter space (in direction of the start parameters).
- Note that the constant has to be high enough to ensure that the penalty is
- actually a bad function value. The default constant is f0 + abs(f0) + 100
- for minimizations and f0 - abs(f0) - 100 for maximizations, where
- f0 is the criterion value at start parameters. The default slope is 0.1.
- scaling (bool): If True, the parameter vector is rescaled internally for
- better performance with scale sensitive optimizers.
- scaling_options (dict or None): Options to configure the internal scaling ot
- the parameter vector. See :ref:`scaling` for details and recommendations.
- multistart (bool): Whether to do the optimization from multiple starting points.
- Requires the params to have the columns ``"soft_lower_bound"`` and
- ``"soft_upper_bounds"`` with finite values for all parameters, unless
- the standard bounds are already finite for all parameters.
- multistart_options (dict): Options to configure the optimization from multiple
- starting values. The dictionary has the following entries
- (all of which are optional):
- - n_samples (int): Number of sampled points on which to do one function
- evaluation. Default is 10 * n_params.
- - sample (pandas.DataFrame or numpy.ndarray) A user definde sample.
- If this is provided, n_samples, sampling_method and sampling_distribution
- are not used.
- - share_optimizations (float): Share of sampled points that is used to
- construct a starting point for a local optimization. Default 0.1.
- - sampling_distribution (str): One rof "uniform", "triangle". Default is
- "uniform" as in the original tiktak algorithm.
- - sampling_method (str): One of "random", "sobol", "halton", "hammersley",
- "korobov", "latin_hypercube" or a numpy array or DataFrame with custom
- points. Default is sobol for problems with up to 30 parameters and random
- for problems with more than 30 parameters.
- - mixing_weight_method (str or callable): Specifies how much weight is put
- on the currently best point when calculating a new starting point for a
- local optimization out of the currently best point and the next random
- starting point. Either "tiktak" or "linear" or a callable that takes the
- arguments ``iteration``, ``n_iterations``, ``min_weight``, ``max_weight``.
- Default "tiktak".
- - mixing_weight_bounds (tuple): A tuple consisting of a lower and upper
- bound on mixing weights. Default (0.1, 0.995).
- - convergence_max_discoveries (int): The multistart optimization converges
- if the currently best local optimum has been discovered independently in
- ``convergence_max_discoveries`` many local optimizations. Default 2.
- - convergence.relative_params_tolerance (float): Determines the maximum
- relative distance two parameter vectors can have to be considered equal
- for convergence purposes.
- - n_cores (int): Number cores used to evaluate the criterion function in
- parallel during exploration stages and number of parallel local
- optimization in optimization stages. Default 1.
- - batch_evaluator (str or callable): See :ref:`batch_evaluators` for
- details. Default "joblib".
- - batch_size (int): If n_cores is larger than one, several starting points
- for local optimizations are created with the same weight and from the same
- currently best point. The ``batch_size`` argument is a way to reproduce
- this behavior on a small machine where less cores are available. By
- default the batch_size is equal to ``n_cores``. It can never be smaller
- than ``n_cores``.
- - seed (int): Random seed for the creation of starting values. Default None.
- - exploration_error_handling (str): One of "raise" or "continue". Default
- is continue, which means that failed function evaluations are simply
- discarded from the sample.
- - optimization_error_handling (str): One of "raise" or "continue". Default
- is continue, which means that failed optimizations are simply discarded.
- collect_history (bool): Whether the history of parameters and criterion values
- should be collected and returned as part of the result. Default True.
- skip_checks (bool): Whether checks on the inputs are skipped. This makes the
- optimization faster, especially for very fast criterion functions. Default
- False.
-
- Returns:
- OptimizeResult: The optmization result.
+ """Minimize criterion using algorithm subject to constraints."""
- """
return _optimize(
direction="minimize",
- criterion=criterion,
+ fun=fun,
params=params,
algorithm=algorithm,
lower_bounds=lower_bounds,
upper_bounds=upper_bounds,
soft_lower_bounds=soft_lower_bounds,
soft_upper_bounds=soft_upper_bounds,
- criterion_kwargs=criterion_kwargs,
+ fun_kwargs=fun_kwargs,
constraints=constraints,
algo_options=algo_options,
- derivative=derivative,
- derivative_kwargs=derivative_kwargs,
- criterion_and_derivative=criterion_and_derivative,
- criterion_and_derivative_kwargs=criterion_and_derivative_kwargs,
+ jac=jac,
+ jac_kwargs=jac_kwargs,
+ fun_and_jac=fun_and_jac,
+ fun_and_jac_kwargs=fun_and_jac_kwargs,
numdiff_options=numdiff_options,
logging=logging,
log_options=log_options,
@@ -431,26 +216,44 @@ def minimize(
multistart_options=multistart_options,
collect_history=collect_history,
skip_checks=skip_checks,
+ # scipy aliases
+ x0=x0,
+ method=method,
+ args=args,
+ # scipy arguments that are not yet supported
+ hess=hess,
+ hessp=hessp,
+ callback=callback,
+ # scipy arguments that will never be supported
+ options=options,
+ tol=tol,
+ # deprecated arguments
+ criterion=criterion,
+ criterion_kwargs=criterion_kwargs,
+ derivative=derivative,
+ derivative_kwargs=derivative_kwargs,
+ criterion_and_derivative=criterion_and_derivative,
+ criterion_and_derivative_kwargs=criterion_and_derivative_kwargs,
)
def _optimize(
direction,
- criterion,
+ fun,
params,
algorithm,
*,
- lower_bounds=None,
- upper_bounds=None,
- soft_lower_bounds=None,
- soft_upper_bounds=None,
- criterion_kwargs,
+ lower_bounds,
+ upper_bounds,
+ soft_lower_bounds,
+ soft_upper_bounds,
+ fun_kwargs,
constraints,
algo_options,
- derivative,
- derivative_kwargs,
- criterion_and_derivative,
- criterion_and_derivative_kwargs,
+ jac,
+ jac_kwargs,
+ fun_and_jac,
+ fun_and_jac_kwargs,
numdiff_options,
logging,
log_options,
@@ -462,6 +265,24 @@ def _optimize(
multistart_options,
collect_history,
skip_checks,
+ # scipy aliases
+ x0,
+ method,
+ args,
+ # scipy arguments that are not yet supported
+ hess,
+ hessp,
+ callback,
+ # scipy arguments that will never be supported
+ options,
+ tol,
+ # deprecated arguments
+ criterion,
+ criterion_kwargs,
+ derivative,
+ derivative_kwargs,
+ criterion_and_derivative,
+ criterion_and_derivative_kwargs,
):
"""Minimize or maximize criterion using algorithm subject to constraints.
@@ -471,14 +292,173 @@ def _optimize(
Returns are the same as in maximize and minimize.
"""
+ # ==================================================================================
+ # error handling needed as long as fun is an optional argument (i.e. until
+ # criterion is fully removed).
+ # ==================================================================================
+
+ if fun is None and criterion is None:
+ msg = (
+ "Missing objective function. Please provide an objective function as the "
+ "first positional argument or as the keyword argument `fun`."
+ )
+ raise MissingInputError(msg)
+
+ if params is None and x0 is None:
+ msg = (
+ "Missing start parameters. Please provide start parameters as the second "
+ "positional argument or as the keyword argument `params`."
+ )
+ raise MissingInputError(msg)
+
+ if algorithm is None and method is None:
+ msg = (
+ "Missing algorithm. Please provide an algorithm as the third positional "
+ "argument or as the keyword argument `algorithm`."
+ )
+ raise MissingInputError(msg)
+
+ # ==================================================================================
+ # deprecations
+ # ==================================================================================
+
+ if criterion is not None:
+ deprecations.throw_criterion_future_warning()
+ fun = criterion if fun is None else fun
+
+ if criterion_kwargs is not None:
+ deprecations.throw_criterion_kwargs_future_warning()
+ fun_kwargs = criterion_kwargs if fun_kwargs is None else fun_kwargs
+
+ if derivative is not None:
+ deprecations.throw_derivative_future_warning()
+ jac = derivative if jac is None else jac
+
+ if derivative_kwargs is not None:
+ deprecations.throw_derivative_kwargs_future_warning()
+ jac_kwargs = derivative_kwargs if jac_kwargs is None else jac_kwargs
+
+ if criterion_and_derivative is not None:
+ deprecations.throw_criterion_and_derivative_future_warning()
+ fun_and_jac = criterion_and_derivative if fun_and_jac is None else fun_and_jac
+
+ if criterion_and_derivative_kwargs is not None:
+ deprecations.throw_criterion_and_derivative_kwargs_future_warning()
+ fun_and_jac_kwargs = (
+ criterion_and_derivative_kwargs
+ if fun_and_jac_kwargs is None
+ else fun_and_jac_kwargs
+ )
+
+ algo_options = replace_and_warn_about_deprecated_algo_options(algo_options)
+
+ # ==================================================================================
+ # handle scipy aliases
+ # ==================================================================================
+
+ if x0 is not None:
+ if params is not None:
+ msg = (
+ "x0 is an alias for params (for better compatibility with scipy). "
+ "Do not use both x0 and params."
+ )
+ raise AliasError(msg)
+ else:
+ params = x0
+
+ if method is not None:
+ if algorithm is not None:
+ msg = (
+ "method is an alias for algorithm to select the scipy optimizers under "
+ "their original name. Do not use both method and algorithm."
+ )
+ raise AliasError(msg)
+ else:
+ algorithm = map_method_to_algorithm(method)
+
+ if args is not None:
+ if (
+ fun_kwargs is not None
+ or jac_kwargs is not None
+ or fun_and_jac_kwargs is not None
+ ):
+ msg = (
+ "args is an alternative to fun_kwargs, jac_kwargs and "
+ "fun_and_jac_kwargs that optimagic supports for compatibility "
+ "with scipy. Do not use args in conjunction with any of the other "
+ "arguments."
+ )
+ raise AliasError(msg)
+ else:
+ kwargs = get_kwargs_from_args(args, fun, offset=1)
+ fun_kwargs, jac_kwargs, fun_and_jac_kwargs = kwargs, kwargs, kwargs
+
+ # jac is not an alias but we need to handle the case where `jac=True`, i.e. fun is
+ # actually fun_and_jac. This is not recommended in optimagic because then optimizers
+ # cannot evaluate fun in isolation but we can easily support it for compatibility.
+ if jac is True:
+ jac = None
+ if fun_and_jac is None:
+ fun_and_jac = fun
+ fun = split_fun_and_jac(fun_and_jac, target="fun")
+
+ # ==================================================================================
+ # Handle scipy arguments that are not yet implemented
+ # ==================================================================================
+
+ if hess is not None:
+ msg = (
+ "The hess argument is not yet supported in optimagic. Creat an issue on "
+ "https://github.com/OpenSourceEconomics/optimagic/ if you have urgent need "
+ "for this feature."
+ )
+ raise NotImplementedError(msg)
+
+ if hessp is not None:
+ msg = (
+ "The hessp argument is not yet supported in optimagic. Creat an issue on "
+ "https://github.com/OpenSourceEconomics/optimagic/ if you have urgent need "
+ "for this feature."
+ )
+ raise NotImplementedError(msg)
+
+ if callback is not None:
+ msg = (
+ "The callback argument is not yet supported in optimagic. Creat an issue "
+ "on https://github.com/OpenSourceEconomics/optimagic/ if you have urgent "
+ "need for this feature."
+ )
+ raise NotImplementedError(msg)
+
+ # ==================================================================================
+ # Handle scipy arguments that will never be supported
+ # ==================================================================================
+
+ if options is not None:
+ # TODO: Add link to a how-to guide or tutorial for this
+ msg = (
+ "The options argument is not supported in optimagic. Please use the "
+ "algo_options argument instead."
+ )
+ raise NotImplementedError(msg)
+
+ if tol is not None:
+ # TODO: Add link to a how-to guide or tutorial for this
+ msg = (
+ "The tol argument is not supported in optimagic. Please use "
+ "algo_options or configured algorithms instead to set convergence criteria "
+ "for your optimizer."
+ )
+ raise NotImplementedError(msg)
+
# ==================================================================================
# Set default values and check options
# ==================================================================================
- criterion_kwargs = _setdefault(criterion_kwargs, {})
+ fun_kwargs = _setdefault(fun_kwargs, {})
constraints = _setdefault(constraints, [])
algo_options = _setdefault(algo_options, {})
- derivative_kwargs = _setdefault(derivative_kwargs, {})
- criterion_and_derivative_kwargs = _setdefault(criterion_and_derivative_kwargs, {})
+ jac_kwargs = _setdefault(jac_kwargs, {})
+ fun_and_jac_kwargs = _setdefault(fun_and_jac_kwargs, {})
numdiff_options = _setdefault(numdiff_options, {})
log_options = _setdefault(log_options, {})
scaling_options = _setdefault(scaling_options, {})
@@ -490,16 +470,16 @@ def _optimize(
if not skip_checks:
check_optimize_kwargs(
direction=direction,
- criterion=criterion,
- criterion_kwargs=criterion_kwargs,
+ criterion=fun,
+ criterion_kwargs=fun_kwargs,
params=params,
algorithm=algorithm,
constraints=constraints,
algo_options=algo_options,
- derivative=derivative,
- derivative_kwargs=derivative_kwargs,
- criterion_and_derivative=criterion_and_derivative,
- criterion_and_derivative_kwargs=criterion_and_derivative_kwargs,
+ derivative=jac,
+ derivative_kwargs=jac_kwargs,
+ criterion_and_derivative=fun_and_jac,
+ criterion_and_derivative_kwargs=fun_and_jac_kwargs,
numdiff_options=numdiff_options,
logging=logging,
log_options=log_options,
@@ -548,14 +528,14 @@ def _optimize(
problem_data = {
"direction": direction,
# "criterion"-criterion,
- "criterion_kwargs": criterion_kwargs,
+ "criterion_kwargs": fun_kwargs,
"algorithm": algorithm,
"constraints": constraints,
"algo_options": algo_options,
# "derivative"-derivative,
- "derivative_kwargs": derivative_kwargs,
+ "derivative_kwargs": jac_kwargs,
# "criterion_and_derivative"-criterion_and_derivative,
- "criterion_and_derivative_kwargs": criterion_and_derivative_kwargs,
+ "criterion_and_derivative_kwargs": fun_and_jac_kwargs,
"numdiff_options": numdiff_options,
"log_options": log_options,
"error_handling": error_handling,
@@ -566,30 +546,28 @@ def _optimize(
# ==================================================================================
# partial the kwargs into corresponding functions
# ==================================================================================
- criterion = process_func_of_params(
- func=criterion,
- kwargs=criterion_kwargs,
+ fun = process_func_of_params(
+ func=fun,
+ kwargs=fun_kwargs,
name="criterion",
skip_checks=skip_checks,
)
- if isinstance(derivative, dict):
- derivative = derivative.get(algo_info.primary_criterion_entry)
- if derivative is not None:
- derivative = process_func_of_params(
- func=derivative,
- kwargs=derivative_kwargs,
+ if isinstance(jac, dict):
+ jac = jac.get(algo_info.primary_criterion_entry)
+ if jac is not None:
+ jac = process_func_of_params(
+ func=jac,
+ kwargs=jac_kwargs,
name="derivative",
skip_checks=skip_checks,
)
- if isinstance(criterion_and_derivative, dict):
- criterion_and_derivative = criterion_and_derivative.get(
- algo_info.primary_criterion_entry
- )
+ if isinstance(fun_and_jac, dict):
+ fun_and_jac = fun_and_jac.get(algo_info.primary_criterion_entry)
- if criterion_and_derivative is not None:
- criterion_and_derivative = process_func_of_params(
- func=criterion_and_derivative,
- kwargs=criterion_and_derivative_kwargs,
+ if fun_and_jac is not None:
+ fun_and_jac = process_func_of_params(
+ func=fun_and_jac,
+ kwargs=fun_and_jac_kwargs,
name="criterion_and_derivative",
skip_checks=skip_checks,
)
@@ -598,7 +576,7 @@ def _optimize(
# Do first evaluation of user provided functions
# ==================================================================================
try:
- first_crit_eval = criterion(params)
+ first_crit_eval = fun(params)
except (KeyboardInterrupt, SystemExit):
raise
except Exception as e:
@@ -606,27 +584,27 @@ def _optimize(
raise InvalidFunctionError(msg) from e
# do first derivative evaluation (if given)
- if derivative is not None:
+ if jac is not None:
try:
- first_deriv_eval = derivative(params)
+ first_deriv_eval = jac(params)
except (KeyboardInterrupt, SystemExit):
raise
except Exception as e:
msg = "Error while evaluating derivative at start params."
raise InvalidFunctionError(msg) from e
- if criterion_and_derivative is not None:
+ if fun_and_jac is not None:
try:
- first_crit_and_deriv_eval = criterion_and_derivative(params)
+ first_crit_and_deriv_eval = fun_and_jac(params)
except (KeyboardInterrupt, SystemExit):
raise
except Exception as e:
msg = "Error while evaluating criterion_and_derivative at start params."
raise InvalidFunctionError(msg) from e
- if derivative is not None:
+ if jac is not None:
used_deriv = first_deriv_eval
- elif criterion_and_derivative is not None:
+ elif fun_and_jac is not None:
used_deriv = first_crit_and_deriv_eval[1]
else:
used_deriv = None
@@ -714,10 +692,10 @@ def _optimize(
# ==================================================================================
to_partial = {
"direction": direction,
- "criterion": criterion,
+ "criterion": fun,
"converter": converter,
- "derivative": derivative,
- "criterion_and_derivative": criterion_and_derivative,
+ "derivative": jac,
+ "criterion_and_derivative": fun_and_jac,
"numdiff_options": numdiff_options,
"logging": logging,
"database": database,
@@ -783,7 +761,7 @@ def _optimize(
)
fixed_result_kwargs = {
- "start_criterion": _scalar_start_criterion,
+ "start_fun": _scalar_start_criterion,
"start_params": params,
"algorithm": algo_info.name,
"direction": direction,
diff --git a/src/optimagic/optimization/optimize_result.py b/src/optimagic/optimization/optimize_result.py
index 3f9a4a0fc..5eff28347 100644
--- a/src/optimagic/optimization/optimize_result.py
+++ b/src/optimagic/optimization/optimize_result.py
@@ -1,11 +1,13 @@
from dataclasses import dataclass, field
-from typing import Any, Dict, Union
+from typing import Any, Dict
import numpy as np
import pandas as pd
from optimagic.utilities import to_pickle
from optimagic.shared.compat import pd_df_map
+import warnings
+from optimagic.typing import PyTree
@dataclass
@@ -15,50 +17,121 @@ class OptimizeResult:
**Attributes**
Attributes:
- params (Any): The optimal parameters.
- criterion (float): The optimal criterion value.
- start_criterion (float): The criterion value at the start parameters.
- start_params (Any): The start parameters.
- algorithm (str): The algorithm used for the optimization.
- direction (str): Maximize or minimize.
- n_free (int): Number of free parameters.
- message (Union[str, None] = None): Message returned by the underlying algorithm.
- success (Union[bool, None] = None): Whether the optimization was successful.
- n_criterion_evaluations (Union[int, None] = None): Number of criterion
- evaluations.
- n_derivative_evaluations (Union[int, None] = None): Number of
- derivative evaluations.
- n_iterations (Union[int, None] = None): Number of iterations until termination.
- history (Union[Dict, None] = None): Optimization history.
- convergence_report (Union[Dict, None] = None): The convergence report.
- multistart_info (Union[Dict, None] = None): Multistart information.
- algorithm_output (Dict = field(default_factory=dict)): Additional algorithm
- specific information.
+ params: The optimal parameters.
+ fun: The optimal criterion value.
+ start_fun: The criterion value at the start parameters.
+ start_params: The start parameters.
+ algorithm: The algorithm used for the optimization.
+ direction: Maximize or minimize.
+ n_free: Number of free parameters.
+ message: Message returned by the underlying algorithm.
+ success: Whether the optimization was successful.
+ n_fun_evals: Number of criterion evaluations.
+ n_jac_evals: Number of derivative evaluations.
+ n_iterations: Number of iterations until termination.
+ history: Optimization history.
+ convergence_report: The convergence report.
+ multistart_info: Multistart information.
+ algorithm_output: Additional algorithm specific information.
"""
params: Any
- criterion: float
- start_criterion: float
+ fun: float
+ start_fun: float
start_params: Any
algorithm: str
direction: str
n_free: int
- message: Union[str, None] = None
- success: Union[bool, None] = None
- n_criterion_evaluations: Union[int, None] = None
- n_derivative_evaluations: Union[int, None] = None
- n_iterations: Union[int, None] = None
+ message: str | None = None
+ success: bool | None = None
+ n_fun_evals: int | None = None
+ n_jac_evals: int | None = None
+ n_hess_evals: int | None = None
+ n_iterations: int | None = None
+ status: int | None = None
+ jac: PyTree | None = None
+ hess: PyTree | None = None
+ hess_inv: PyTree | None = None
+ max_constraint_violation: float | None = None
- history: Union[Dict, None] = None
+ history: Dict | None = None
- convergence_report: Union[Dict, None] = None
+ convergence_report: Dict | None = None
- multistart_info: Union[Dict, None] = None
+ multistart_info: Dict | None = None
algorithm_output: Dict = field(default_factory=dict)
- def __repr__(self):
+ # ==================================================================================
+ # Deprecations
+ # ==================================================================================
+
+ @property
+ def criterion(self) -> float:
+ msg = "The criterion attribute is deprecated. Use the fun attribute instead."
+ warnings.warn(msg, FutureWarning)
+ return self.fun
+
+ @property
+ def start_criterion(self) -> float:
+ msg = (
+ "The start_criterion attribute is deprecated. Use the start_fun attribute "
+ "instead."
+ )
+ warnings.warn(msg, FutureWarning)
+ return self.start_fun
+
+ @property
+ def n_criterion_evaluations(self) -> int | None:
+ msg = (
+ "The n_criterion_evaluations attribute is deprecated. Use the n_fun_evals "
+ "attribute instead."
+ )
+ warnings.warn(msg, FutureWarning)
+ return self.n_fun_evals
+
+ @property
+ def n_derivative_evaluations(self) -> int | None:
+ msg = (
+ "The n_derivative_evaluations attribute is deprecated. Use the n_jac_evals "
+ "attribute instead."
+ )
+ warnings.warn(msg, FutureWarning)
+ return self.n_jac_evals
+
+ # ==================================================================================
+ # Scipy aliases
+ # ==================================================================================
+
+ @property
+ def x(self) -> PyTree:
+ return self.params
+
+ @property
+ def x0(self) -> PyTree:
+ return self.start_params
+
+ @property
+ def nfev(self) -> int | None:
+ return self.n_fun_evals
+
+ @property
+ def nit(self) -> int | None:
+ return self.n_iterations
+
+ @property
+ def njev(self) -> int | None:
+ return self.n_jac_evals
+
+ @property
+ def nhev(self) -> int | None:
+ return self.n_hess_evals
+
+ def __getitem__(self, key):
+ return getattr(self, key)
+
+ def __repr__(self) -> str:
first_line = (
f"{self.direction.title()} with {self.n_free} free parameters terminated"
)
@@ -68,8 +141,8 @@ def __repr__(self):
first_line += f" {snippet}"
counters = [
- ("criterion evaluations", self.n_criterion_evaluations),
- ("derivative evaluations", self.n_derivative_evaluations),
+ ("criterion evaluations", self.n_fun_evals),
+ ("derivative evaluations", self.n_jac_evals),
("iterations", self.n_iterations),
]
@@ -93,10 +166,10 @@ def __repr__(self):
else:
message = None
- if self.start_criterion is not None and self.criterion is not None:
+ if self.start_fun is not None and self.fun is not None:
improvement = (
- f"The value of criterion improved from {self.start_criterion} to "
- f"{self.criterion}."
+ f"The value of criterion improved from {self.start_fun} to "
+ f"{self.fun}."
)
else:
improvement = None
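The deprecation pattern introduced above (the old attribute becomes a read-only property that warns and forwards to the new field) can be sketched in isolation. This is an illustrative miniature, not the actual optimagic class; only `fun` and `criterion` are taken from the diff.

```python
import warnings
from dataclasses import dataclass


@dataclass
class Result:
    fun: float

    @property
    def criterion(self) -> float:
        # Forward to the new name, but warn so downstream code can migrate.
        warnings.warn(
            "The criterion attribute is deprecated. Use the fun attribute instead.",
            FutureWarning,
        )
        return self.fun


res = Result(fun=1.5)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    value = res.criterion  # still works, but emits a FutureWarning
```

Because the property only forwards, old code keeps working unchanged while the warning points users at the new name.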
diff --git a/src/optimagic/optimization/process_results.py b/src/optimagic/optimization/process_results.py
index 7c02a6053..cdea70133 100644
--- a/src/optimagic/optimization/process_results.py
+++ b/src/optimagic/optimization/process_results.py
@@ -37,7 +37,7 @@ def process_internal_optimizer_result(
skip_checks=skip_checks,
)
- crit_hist = [opt.criterion for opt in info["local_optima"]]
+ crit_hist = [opt.fun for opt in info["local_optima"]]
params_hist = [opt.params for opt in info["local_optima"]]
time_hist = [np.nan for opt in info["local_optima"]]
hist = {"criterion": crit_hist, "params": params_hist, "runtime": time_hist}
@@ -54,11 +54,11 @@ def process_internal_optimizer_result(
[opt.n_iterations for opt in info["local_optima"]]
)
- res.n_criterion_evaluations = _sum_or_none(
- [opt.n_criterion_evaluations for opt in info["local_optima"]]
+ res.n_fun_evals = _sum_or_none(
+ [opt.n_fun_evals for opt in info["local_optima"]]
)
- res.n_derivative_evaluations = _sum_or_none(
- [opt.n_derivative_evaluations for opt in info["local_optima"]]
+ res.n_jac_evals = _sum_or_none(
+ [opt.n_jac_evals for opt in info["local_optima"]]
)
res.multistart_info = info
@@ -78,8 +78,8 @@ def _process_one_result(res, converter, primary_key, fixed_kwargs, skip_checks):
_criterion = -_criterion
optional_entries = [
- "n_criterion_evaluations",
- "n_derivative_evaluations",
+ "n_fun_evals",
+ "n_jac_evals",
"n_iterations",
"success",
"message",
@@ -107,7 +107,7 @@ def _process_one_result(res, converter, primary_key, fixed_kwargs, skip_checks):
out = OptimizeResult(
params=_params,
- criterion=_criterion,
+ fun=_criterion,
**fixed_kwargs,
**optional_kwargs,
algorithm_output=algo_output,
@@ -126,7 +126,7 @@ def _process_multistart_info(info, converter, primary_key, fixed_kwargs, skip_ch
for res, start in zip(info["local_optima"], starts):
kwargs = fixed_kwargs.copy()
kwargs["start_params"] = start
- kwargs["start_criterion"] = None
+ kwargs["start_fun"] = None
processed = _process_one_result(
res,
converter=converter,
@@ -155,7 +155,7 @@ def _process_multistart_info(info, converter, primary_key, fixed_kwargs, skip_ch
def _dummy_result_from_traceback(candidate, fixed_kwargs): # noqa: ARG001
out = OptimizeResult(
params=None,
- criterion=None,
+ fun=None,
**fixed_kwargs,
)
return out
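The hunks above aggregate per-optimum counters with a `_sum_or_none` helper whose body is not shown in this diff. A plausible minimal implementation (an assumption about its behavior, not the actual optimagic source) returns `None` as soon as any counter is missing, so an undefined count never silently becomes zero:

```python
def sum_or_none(values):
    # Aggregate per-optimum counters; if any local optimum did not report a
    # count, the aggregate is undefined and we return None instead of a sum.
    if any(v is None for v in values):
        return None
    return sum(values)


total = sum_or_none([3, 5, 2])       # all counts known
missing = sum_or_none([3, None, 2])  # one optimizer did not report
```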
diff --git a/src/optimagic/optimization/scipy_aliases.py b/src/optimagic/optimization/scipy_aliases.py
new file mode 100644
index 000000000..4473ebec8
--- /dev/null
+++ b/src/optimagic/optimization/scipy_aliases.py
@@ -0,0 +1,66 @@
+from optimagic.utilities import propose_alternatives
+import functools
+from optimagic.exceptions import InvalidFunctionError
+
+
+def map_method_to_algorithm(method):
+ implemented = {
+ "Nelder-Mead": "scipy_neldermead",
+ "Powell": "scipy_powell",
+ "CG": "scipy_conjugate_gradient",
+ "BFGS": "scipy_bfgs",
+ "Newton-CG": "scipy_newton_cg",
+ "L-BFGS-B": "scipy_lbfgsb",
+ "TNC": "scipy_truncated_newton",
+ "COBYLA": "scipy_cobyla",
+ "SLSQP": "scipy_slsqp",
+ "trust-constr": "scipy_trust_constr",
+ }
+
+ not_implemented = {
+ "dogleg": "scipy_dogleg",
+ "trust-ncg": "scipy_trust_ncg",
+ "trust-exact": "scipy_trust_exact",
+ "trust-krylov": "scipy_trust_krylov",
+ "COBYQA": "scipy_cobyqa",
+ }
+
+ if method in implemented:
+ algo = implemented[method]
+ elif method in not_implemented:
+ msg = (
+ f"The method {method} is not yet wrapped in optimagic. Create an issue on "
+ "https://github.com/OpenSourceEconomics/optimagic/ if you have urgent need "
+ "for this method."
+ )
+ raise NotImplementedError(msg)
+ else:
+ alt = propose_alternatives(method, list(implemented) + list(not_implemented))
+ msg = (
+ "method is an alias for algorithm to select the scipy optimizers under "
+ f"their original name. {method} is not a valid scipy algorithm name. "
+ f"Did you mean {alt}?"
+ )
+ raise ValueError(msg)
+ return algo
+
+
+def split_fun_and_jac(fun_and_jac, target="fun"):
+
+ index = 0 if target == "fun" else 1
+
+ @functools.wraps(fun_and_jac)
+ def fun(*args, **kwargs):
+ raw = fun_and_jac(*args, **kwargs)
+ try:
+ out = raw[index]
+ except TypeError as e:
+ msg = (
+ "If you set `jac=True`, `fun` needs to return a tuple where the first "
+ "entry is the value of your objective function and the second entry "
+ "is its derivative."
+ )
+ raise InvalidFunctionError(msg) from e
+ return out
+
+ return fun
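The `split_fun_and_jac` helper above can be exercised like this. The snippet re-declares a minimal version of the wrapper so it runs standalone; the sphere function is an illustrative example, not part of the diff.

```python
import functools


def split_fun_and_jac(fun_and_jac, target="fun"):
    # Mirror of the helper above: wrap a combined callable and expose only
    # one component of the (value, derivative) tuple it returns.
    index = 0 if target == "fun" else 1

    @functools.wraps(fun_and_jac)
    def fun(*args, **kwargs):
        return fun_and_jac(*args, **kwargs)[index]

    return fun


def sphere_and_gradient(x):
    # Combined objective: value and derivative of sum(x**2).
    return sum(xi**2 for xi in x), [2 * xi for xi in x]


fun = split_fun_and_jac(sphere_and_gradient, target="fun")
jac = split_fun_and_jac(sphere_and_gradient, target="jac")
```

This is what makes scipy's `jac=True` convention work: the user supplies one function, and the optimizer internally gets two.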
diff --git a/src/optimagic/optimizers/_pounders/bntr.py b/src/optimagic/optimizers/_pounders/bntr.py
index cfe1f6d6f..14d428dc6 100644
--- a/src/optimagic/optimizers/_pounders/bntr.py
+++ b/src/optimagic/optimizers/_pounders/bntr.py
@@ -1,7 +1,7 @@
"""Auxiliary functions for the quadratic BNTR trust-region subsolver."""
from functools import reduce
-from typing import NamedTuple, Union
+from typing import NamedTuple
import numpy as np
from optimagic.optimizers._pounders._conjugate_gradient import (
@@ -16,11 +16,11 @@
class ActiveBounds(NamedTuple):
- lower: Union[np.ndarray, None] = None
- upper: Union[np.ndarray, None] = None
- fixed: Union[np.ndarray, None] = None
- active: Union[np.ndarray, None] = None
- inactive: Union[np.ndarray, None] = None
+ lower: np.ndarray | None = None
+ upper: np.ndarray | None = None
+ fixed: np.ndarray | None = None
+ active: np.ndarray | None = None
+ inactive: np.ndarray | None = None
def bntr(
diff --git a/src/optimagic/optimizers/_pounders/gqtpar.py b/src/optimagic/optimizers/_pounders/gqtpar.py
index d658ba402..a31e59b6a 100644
--- a/src/optimagic/optimizers/_pounders/gqtpar.py
+++ b/src/optimagic/optimizers/_pounders/gqtpar.py
@@ -1,6 +1,6 @@
"""Auxiliary functions for the quadratic GQTPAR trust-region subsolver."""
-from typing import NamedTuple, Union
+from typing import NamedTuple
import numpy as np
from scipy.linalg import cho_solve, solve_triangular
@@ -9,15 +9,15 @@
class HessianInfo(NamedTuple):
- hessian_plus_lambda: Union[np.ndarray, None] = None # shape (n_params, n_params)
- upper_triangular: Union[np.ndarray, None] = None # shape (n_params, n_params)
+ hessian_plus_lambda: np.ndarray | None = None # shape (n_params, n_params)
+ upper_triangular: np.ndarray | None = None # shape (n_params, n_params)
already_factorized: bool = False
class DampingFactors(NamedTuple):
- candidate: Union[float, None] = None
- lower_bound: Union[float, None] = None
- upper_bound: Union[float, None] = None
+ candidate: float | None = None
+ lower_bound: float | None = None
+ upper_bound: float | None = None
def gqtpar(model, x_candidate, *, k_easy=0.1, k_hard=0.2, maxiter=200):
diff --git a/src/optimagic/optimizers/_pounders/linear_subsolvers.py b/src/optimagic/optimizers/_pounders/linear_subsolvers.py
index be33fb4a5..dba336b1b 100644
--- a/src/optimagic/optimizers/_pounders/linear_subsolvers.py
+++ b/src/optimagic/optimizers/_pounders/linear_subsolvers.py
@@ -1,13 +1,13 @@
"""Collection of linear trust-region subsolvers."""
-from typing import NamedTuple, Union
+from typing import NamedTuple
import numpy as np
class LinearModel(NamedTuple):
- intercept: Union[float, None] = None
- linear_terms: Union[np.ndarray, None] = None # shape (n_params, n_params)
+ intercept: float | None = None
+ linear_terms: np.ndarray | None = None # shape (n_params, n_params)
def minimize_trsbox_linear(
diff --git a/src/optimagic/optimizers/_pounders/pounders_auxiliary.py b/src/optimagic/optimizers/_pounders/pounders_auxiliary.py
index d598d3c25..643c7dbc6 100644
--- a/src/optimagic/optimizers/_pounders/pounders_auxiliary.py
+++ b/src/optimagic/optimizers/_pounders/pounders_auxiliary.py
@@ -1,6 +1,6 @@
"""Auxiliary functions for the pounders algorithm."""
-from typing import NamedTuple, Union
+from typing import NamedTuple
import numpy as np
from scipy.linalg import qr_multiply
@@ -14,16 +14,14 @@
class ResidualModel(NamedTuple):
- intercepts: Union[np.ndarray, None] = None # shape (n_residuals,)
- linear_terms: Union[np.ndarray, None] = None # shape (n_residuals, n_params)
- square_terms: Union[np.ndarray, None] = (
- None # shape (n_residuals, n_params, n_params)
- )
+ intercepts: np.ndarray | None = None # shape (n_residuals,)
+ linear_terms: np.ndarray | None = None # shape (n_residuals, n_params)
+ square_terms: np.ndarray | None = None # shape (n_residuals, n_params, n_params)
class MainModel(NamedTuple):
- linear_terms: Union[np.ndarray, None] = None # shape (n_params,)
- square_terms: Union[np.ndarray, None] = None # shape (n_params, n_params)
+ linear_terms: np.ndarray | None = None # shape (n_params,)
+ square_terms: np.ndarray | None = None # shape (n_params, n_params)
def create_initial_residual_model(history, accepted_index, delta):
diff --git a/src/optimagic/optimizers/bhhh.py b/src/optimagic/optimizers/bhhh.py
index df8cc0a8c..10927ee66 100644
--- a/src/optimagic/optimizers/bhhh.py
+++ b/src/optimagic/optimizers/bhhh.py
@@ -15,8 +15,8 @@ def bhhh(
criterion_and_derivative,
x,
*,
- convergence_absolute_gradient_tolerance=1e-8,
- stopping_max_iterations=200,
+ convergence_gtol_abs=1e-8,
+ stopping_maxiter=200,
):
"""Minimize a likelihood function using the BHHH algorithm.
@@ -27,8 +27,8 @@ def bhhh(
result_dict = bhhh_internal(
criterion_and_derivative,
x=x,
- convergence_absolute_gradient_tolerance=convergence_absolute_gradient_tolerance,
- stopping_max_iterations=stopping_max_iterations,
+ convergence_gtol_abs=convergence_gtol_abs,
+ stopping_maxiter=stopping_maxiter,
)
return result_dict
@@ -37,8 +37,8 @@ def bhhh(
def bhhh_internal(
criterion_and_derivative,
x,
- convergence_absolute_gradient_tolerance,
- stopping_max_iterations,
+ convergence_gtol_abs,
+ stopping_maxiter,
):
"""Minimize a likelihood function using the BHHH algorithm.
@@ -72,7 +72,7 @@ def bhhh_internal(
step_size = initial_step_size
niter = 1
- while niter < stopping_max_iterations:
+ while niter < stopping_maxiter:
niter += 1
x_candidate = x_accepted + step_size * direction
@@ -114,7 +114,7 @@ def bhhh_internal(
# Reset stepsize
step_size = initial_step_size
- if gtol < convergence_absolute_gradient_tolerance:
+ if gtol < convergence_gtol_abs:
break
result_dict = {
diff --git a/src/optimagic/optimizers/fides.py b/src/optimagic/optimizers/fides.py
index feb292fbd..3ebd34642 100644
--- a/src/optimagic/optimizers/fides.py
+++ b/src/optimagic/optimizers/fides.py
@@ -8,12 +8,12 @@
from optimagic.decorators import mark_minimizer
from optimagic.exceptions import NotInstalledError
from optimagic.optimization.algo_options import (
- CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- STOPPING_MAX_ITERATIONS,
+ CONVERGENCE_FTOL_ABS,
+ CONVERGENCE_GTOL_ABS,
+ CONVERGENCE_XTOL_ABS,
+ CONVERGENCE_FTOL_REL,
+ CONVERGENCE_GTOL_REL,
+ STOPPING_MAXITER,
)
if IS_FIDES_INSTALLED:
@@ -33,12 +33,12 @@ def fides(
upper_bounds,
*,
hessian_update_strategy="bfgs",
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- convergence_relative_gradient_tolerance=CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
+ convergence_gtol_rel=CONVERGENCE_GTOL_REL,
+ stopping_maxiter=STOPPING_MAXITER,
stopping_max_seconds=np.inf,
trustregion_initial_radius=1.0,
trustregion_stepback_strategy="truncate",
@@ -64,19 +64,19 @@ def fides(
fides_options = {
"delta_init": trustregion_initial_radius,
"eta": trustregion_increase_threshold,
- "fatol": convergence_absolute_criterion_tolerance,
- "frtol": convergence_relative_criterion_tolerance,
+ "fatol": convergence_ftol_abs,
+ "frtol": convergence_ftol_rel,
"gamma1": trustregion_decrease_factor,
"gamma2": trustregion_increase_factor,
- "gatol": convergence_absolute_gradient_tolerance,
- "grtol": convergence_relative_gradient_tolerance,
- "maxiter": stopping_max_iterations,
+ "gatol": convergence_gtol_abs,
+ "grtol": convergence_gtol_rel,
+ "maxiter": stopping_maxiter,
"maxtime": stopping_max_seconds,
"mu": trustregion_decrease_threshold,
"stepback_strategy": trustregion_stepback_strategy,
"subspace_solver": trustregion_subspace_dimension,
"theta_max": trustregion_max_stepback_fraction,
- "xtol": convergence_absolute_params_tolerance,
+ "xtol": convergence_xtol_abs,
}
hessian_instance = _create_hessian_updater_from_user_input(hessian_update_strategy)
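The renames in this hunk follow one consistent scheme: criterion → ftol, gradient → gtol, params → xtol, max_iterations → maxiter. Collected from the names visible in this changeset (not an exhaustive migration table), old user-supplied option names can be rewritten mechanically:

```python
# Old optimagic option names mapped to their scipy-style replacements,
# as they appear in this changeset.
OPTION_RENAMES = {
    "convergence_absolute_criterion_tolerance": "convergence_ftol_abs",
    "convergence_relative_criterion_tolerance": "convergence_ftol_rel",
    "convergence_absolute_params_tolerance": "convergence_xtol_abs",
    "convergence_absolute_gradient_tolerance": "convergence_gtol_abs",
    "convergence_relative_gradient_tolerance": "convergence_gtol_rel",
    "stopping_max_iterations": "stopping_maxiter",
}


def migrate_options(options):
    # Rewrite a user-supplied options dict to the new names, leaving
    # unknown keys untouched.
    return {OPTION_RENAMES.get(k, k): v for k, v in options.items()}
```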
diff --git a/src/optimagic/optimizers/ipopt.py b/src/optimagic/optimizers/ipopt.py
index ef24c33ce..f80f50733 100644
--- a/src/optimagic/optimizers/ipopt.py
+++ b/src/optimagic/optimizers/ipopt.py
@@ -6,8 +6,8 @@
from optimagic.decorators import mark_minimizer
from optimagic.exceptions import NotInstalledError
from optimagic.optimization.algo_options import (
- CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- STOPPING_MAX_ITERATIONS,
+ CONVERGENCE_FTOL_REL,
+ STOPPING_MAXITER,
)
from optimagic.optimizers.scipy_optimizers import process_scipy_result
@@ -31,14 +31,14 @@ def ipopt(
# nonlinear constraints
nonlinear_constraints=(),
# convergence criteria
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
dual_inf_tol=1.0,
constr_viol_tol=0.0001,
compl_inf_tol=0.0001,
s_max=100.0,
mu_target=0.0,
# stopping criteria
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ stopping_maxiter=STOPPING_MAXITER,
stopping_max_wall_time_seconds=1e20,
stopping_max_cpu_time=1e20,
# acceptable heuristic
@@ -225,7 +225,7 @@ def ipopt(
"The 'ipopt' algorithm requires the cyipopt package to be installed. "
"You can it with: `conda install -c conda-forge cyipopt`."
)
- if acceptable_tol <= convergence_relative_criterion_tolerance:
+ if acceptable_tol <= convergence_ftol_rel:
raise ValueError(
"The acceptable tolerance must be larger than the desired tolerance."
)
@@ -246,9 +246,7 @@ def ipopt(
# The default value is actually 1e2*tol, where tol is the general
# termination tolerance.
if resto_failure_feasibility_threshold is None:
- resto_failure_feasibility_threshold = (
- 1e2 * convergence_relative_criterion_tolerance
- )
+ resto_failure_feasibility_threshold = 1e2 * convergence_ftol_rel
# convert None to str none section
linear_solver_options_with_none = [
@@ -333,7 +331,7 @@ def ipopt(
# disable derivative checker
"derivative_test": "none",
"s_max": float(s_max),
- "max_iter": stopping_max_iterations,
+ "max_iter": stopping_maxiter,
"max_wall_time": float(stopping_max_wall_time_seconds),
"max_cpu_time": stopping_max_cpu_time,
"dual_inf_tol": dual_inf_tol,
@@ -498,7 +496,7 @@ def ipopt(
bounds=_get_scipy_bounds(lower_bounds, upper_bounds),
jac=derivative,
constraints=nonlinear_constraints,
- tol=convergence_relative_criterion_tolerance,
+ tol=convergence_ftol_rel,
options=options,
)
diff --git a/src/optimagic/optimizers/nag_optimizers.py b/src/optimagic/optimizers/nag_optimizers.py
index ff869e24d..e6728899d 100644
--- a/src/optimagic/optimizers/nag_optimizers.py
+++ b/src/optimagic/optimizers/nag_optimizers.py
@@ -15,27 +15,8 @@
from optimagic.config import IS_DFOLS_INSTALLED, IS_PYBOBYQA_INSTALLED
from optimagic.decorators import mark_minimizer
from optimagic.exceptions import NotInstalledError
-from optimagic.optimization.algo_options import (
- CLIP_CRITERION_IF_OVERFLOWING,
- CONVERGENCE_MINIMAL_TRUSTREGION_RADIUS_TOLERANCE,
- CONVERGENCE_NOISE_CORRECTED_CRITERION_TOLERANCE,
- CONVERGENCE_SLOW_PROGRESS,
- INITIAL_DIRECTIONS,
- INTERPOLATION_ROUNDING_ERROR,
- RANDOM_DIRECTIONS_ORTHOGONAL,
- RESET_OPTIONS,
- STOPPING_MAX_CRITERION_EVALUATIONS,
- THRESHOLD_FOR_SAFETY_STEP,
- TRUSTREGION_EXPANSION_FACTOR_SUCCESSFUL,
- TRUSTREGION_EXPANSION_FACTOR_VERY_SUCCESSFUL,
- TRUSTREGION_FAST_START_OPTIONS,
- TRUSTREGION_PRECONDITION_INTERPOLATION,
- TRUSTREGION_SHRINKING_FACTOR_LOWER_RADIUS,
- TRUSTREGION_SHRINKING_FACTOR_NOT_SUCCESSFUL,
- TRUSTREGION_SHRINKING_FACTOR_UPPER_RADIUS,
- TRUSTREGION_THRESHOLD_SUCCESSFUL,
- TRUSTREGION_THRESHOLD_VERY_SUCCESSFUL,
-)
+from optimagic.optimization.algo_options import STOPPING_MAXFUN
+
from optimagic.utilities import calculate_trustregion_initial_radius
if IS_PYBOBYQA_INSTALLED:
@@ -45,6 +26,310 @@
import dfols
+CONVERGENCE_MINIMAL_TRUSTREGION_RADIUS_TOLERANCE = 1e-8
+"""float: Stop when the lower trust region radius falls below this value."""
+
+CONVERGENCE_SLOW_PROGRESS = {
+ "threshold_to_characterize_as_slow": 1e-8,
+ "max_insufficient_improvements": None,
+ "comparison_period": 5,
+}
+"""dict: Specification of when to terminate or reset the optimization because of only
+ slow improvements. This is similar to an absolute criterion tolerance, except that
+ instead of a single improvement the average improvement over several iterations
+ must be small.
+
+ Possible entries are:
+ threshold_to_characterize_as_slow (float): Threshold below which an
+ improvement counts as insufficient. Note: the improvement is divided by the
+ ``comparison_period``.
+ So this is the required average improvement per iteration over the
+ comparison period.
+ max_insufficient_improvements (int): Number of consecutive
+ insufficient improvements before termination (or reset). Default is
+ ``20 * len(x)``.
+ comparison_period (int):
+ How many iterations to go back to calculate the improvement.
+ For example 5 would mean that each criterion evaluation is compared to the
+ criterion value from 5 iterations before.
+
+"""
+
+THRESHOLD_FOR_SAFETY_STEP = 0.5
+r"""float: Threshold for when to call the safety step (:math:`\gamma_s`).
+
+ :math:`\text{proposed step} \leq \text{threshold_for_safety_step} \cdot
+ \text{current_lower_trustregion_radius}`.
+
+"""
+
+CONVERGENCE_NOISE_CORRECTED_FTOL = 1.0
+"""float: Stop when the evaluations on the set of interpolation points all fall within
+ this factor of the noise level. The default is 1, i.e. when all evaluations are
+ within the noise level. If you do not want to use this criterion but still want
+ to flag your criterion function as noisy, set this tolerance to 0.0.
+
+ .. warning::
+ Unlike for most other tolerances, very small values do not make sense here.
+
+"""
+
+
+TRUSTREGION_THRESHOLD_SUCCESSFUL = 0.1
+"""float: Share of the predicted improvement that has to be achieved for a trust
+ region iteration to count as successful.
+
+"""
+
+TRUSTREGION_THRESHOLD_VERY_SUCCESSFUL = 0.7
+"""float: Share of predicted improvement that has to be achieved for a trust region
+ iteration to count as very successful.
+
+"""
+
+TRUSTREGION_SHRINKING_FACTOR_NOT_SUCCESSFUL = None
+"""float: Ratio by which to shrink the upper trust region radius when realized
+ improvement does not match the ``threshold_successful``. The default is 0.98
+ if the criterion is noisy and 0.5 else.
+
+"""
+
+TRUSTREGION_EXPANSION_FACTOR_SUCCESSFUL = 2.0
+r"""float: Ratio by which to expand the upper trust region radius :math:`\Delta_k`
+ in very successful iterations (:math:`\gamma_{inc}` in the notation of the paper).
+
+"""
+
+TRUSTREGION_EXPANSION_FACTOR_VERY_SUCCESSFUL = 4.0
+r"""float: Ratio of the proposed step ($\|s_k\|$) by which to expand the upper trust
+ region radius (:math:`\Delta_k`) in very successful iterations
+ (:math:`\overline{\gamma}_{inc}` in the notation of the paper).
+
+"""
+
+TRUSTREGION_SHRINKING_FACTOR_LOWER_RADIUS = None
+r"""float: Ratio by which to shrink the lower trust region radius (:math:`\rho_k`)
+ (:math:`\alpha_1` in the notation of the paper). Default is 0.9 if
+ the criterion is noisy and 0.1 else.
+
+"""
+
+TRUSTREGION_SHRINKING_FACTOR_UPPER_RADIUS = None
+r"""float: Ratio of the current lower trust region (:math:`\rho_k`) by which to shrink
+ the upper trust region radius (:math:`\Delta_k`) when the lower one is shrunk
+ (:math:`\alpha_2` in the notation of the paper). Default is 0.95 if the
+ criterion is noisy and 0.5 else."""
+
+
+INITIAL_DIRECTIONS = "coordinate"
+"""string: How to draw the initial directions. Possible values are "coordinate" for
+ coordinate directions (the default) or "random".
+
+"""
+
+RANDOM_DIRECTIONS_ORTHOGONAL = True
+"""bool: Whether to make randomly drawn initial directions orthogonal."""
+
+
+INTERPOLATION_ROUNDING_ERROR = 0.1
+r"""float: Internally, all the NAG algorithms store interpolation points with respect
+ to a base point :math:`x_b`; that is, we store :math:`\{y_t-x_b\}`,
+ which reduces the risk of roundoff errors. We shift :math:`x_b` to :math:`x_k` when
+ :math:`\text{proposed step} \leq \text{interpolation_rounding_error} \cdot
+ \|x_k-x_b\|`.
+
+"""
+
+CLIP_CRITERION_IF_OVERFLOWING = True
+"""bool: Whether to clip the criterion to avoid ``OverflowError``."""
+
+
+TRUSTREGION_PRECONDITION_INTERPOLATION = True
+"""bool: whether to scale the interpolation linear system to improve conditioning."""
+
+
+RESET_OPTIONS = {
+ "use_resets": None,
+ "minimal_trustregion_radius_tolerance_scaling_at_reset": 1.0,
+ "reset_type": "soft",
+ "move_center_at_soft_reset": True,
+ "reuse_criterion_value_at_hard_reset": True,
+ "max_iterations_without_new_best_after_soft_reset": None,
+ "auto_detect": True,
+ "auto_detect_history": 30,
+ "auto_detect_min_jacobian_increase": 0.015,
+ "auto_detect_min_correlations": 0.1,
+ "points_to_replace_at_soft_reset": 3,
+ "max_consecutive_unsuccessful_resets": 10,
+ # just bobyqa
+ "max_unsuccessful_resets": None,
+ "trust_region_scaling_at_unsuccessful_reset": None,
+ # just dfols
+ "max_interpolation_points": None,
+ "n_extra_interpolation_points_per_soft_reset": 0,
+ "n_extra_interpolation_points_per_hard_reset": 0,
+ "n_additional_extra_points_to_replace_per_reset": 0,
+}
+r"""dict: Options for reseting the optimization.
+
+ Possible entries are:
+
+ use_resets (bool): Whether to do resets when the lower trust
+ region radius (:math:`\rho_k`) reaches the stopping criterion
+ (:math:`\rho_{end}`), or (optionally) when all interpolation points are
+ within noise level. Default is ``True`` if the criterion is noisy.
+ minimal_trustregion_radius_tolerance_scaling_at_reset (float): Factor with
+ which the trust region stopping criterion is multiplied at each reset.
+
+ reset_type (str): Whether to use "soft" or "hard" resets. Default is "soft".
+
+ move_center_at_soft_reset (bool): Whether to move the trust region center
+ ($x_k$) to the best new point evaluated instead of keeping it constant.
+ points_to_replace_at_soft_reset (int): Number of interpolation points to move
+ at each soft reset.
+ reuse_criterion_value_at_hard_reset (bool): Whether or not to recycle the
+ criterion value at the best iterate found when performing a hard reset.
+ This saves one criterion evaluation.
+ max_iterations_without_new_best_after_soft_reset (int):
+ The maximum number of successful steps in a given run where the new
+ criterion value is worse than the best value found in previous runs before
+ terminating. Default is ``max_criterion_evaluations``.
+ auto_detect (bool): Whether or not to
+ automatically determine when to reset. This is an additional condition
+ and resets can still be triggered by small upper trust region radius, etc.
+ There are two criteria used: upper trust region radius shrinkage
+ (no increases over the history, more decreases than no changes) and
+ changes in the model Jacobian (consistently increasing trend as measured
+ by slope and correlation coefficient of the line of best fit).
+ auto_detect_history (int):
+ How many iterations of model changes and trust region radii to store.
+ auto_detect_min_jacobian_increase (float):
+ Minimum rate of increase of the Jacobian over past iterations to cause a
+ reset.
+ auto_detect_min_correlations (float):
+ Minimum correlation of the Jacobian data set required to cause a reset.
+ max_consecutive_unsuccessful_resets (int): maximum number of consecutive
+ unsuccessful resets allowed (i.e. resets which did not outperform the
+ best known value from earlier runs).
+
+ Only used when using nag_bobyqa:
+
+ max_unsuccessful_resets (int):
+ number of total unsuccessful resets allowed.
+ Default is 20 if ``seek_global_optimum`` else unrestricted.
+ trust_region_scaling_at_unsuccessful_reset (float): Factor by which to
+ expand the initial lower trust region radius (:math:`\rho_{beg}`) after
+ unsuccessful resets. Default is 1.1 if ``seek_global_optimum`` else 1.
+
+ Only used when using nag_dfols:
+
+ max_interpolation_points (int): Maximum allowed value of the number of
+ interpolation points. This is useful if the number of interpolation points
+ increases with each reset, e.g. when
+ ``n_extra_interpolation_points_per_soft_reset > 0``. The default is
+ ``n_interpolation_points``.
+ n_extra_interpolation_points_per_soft_reset (int): Number of points to add to
+ the interpolation set with each soft reset.
+ n_extra_interpolation_points_per_hard_reset (int): Number of points to add to
+ the interpolation set with each hard reset.
+ n_additional_extra_points_to_replace_per_reset (int): This parameter modifies
+ ``n_extra_points_to_replace_successful``. With each reset
+ ``n_extra_points_to_replace_successful`` is increased by this number.
+
+"""
+
+
+TRUSTREGION_FAST_START_OPTIONS = {
+ "min_inital_points": None,
+ "method": "auto",
+ "scale_of_trustregion_step_perturbation": None,
+ "scale_of_jacobian_components_perturbation": 1e-2,
+ # the following will be growing.full_rank.min_sing_val
+ # but it is not yet supported by DF-OLS.
+ "floor_of_jacobian_singular_values": 1,
+ "jacobian_max_condition_number": 1e8,
+ "geometry_improving_steps": False,
+ "safety_steps": True,
+ "shrink_upper_radius_in_safety_steps": False,
+ "full_geometry_improving_step": False,
+ "reset_trustregion_radius_after_fast_start": False,
+ "reset_min_trustregion_radius_after_fast_start": False,
+ "shrinking_factor_not_successful": None,
+ "n_extra_search_directions_per_iteration": 0,
+}
+r"""dict: Options to start the optimization while building the full trust region model.
+
+ To activate this, set the number of interpolation points at which to evaluate the
+ criterion before doing the first step, `min_initial_points`, to something smaller
+ than the number of parameters.
+
+ The following options can be specified:
+
+ min_initial_points (int): Number of initial interpolation
+ points in addition to the start point. This should only be changed to
+ a value less than ``len(x)``, and only if the default setup cost
+ of ``len(x) + 1`` evaluations of the criterion is impractical.
+ If this is set to be less than the default, the input value of
+ ``n_interpolation_points`` should be set to ``len(x)``.
+ If the default is used, all the other parameters have no effect.
+ Default is ``n_interpolation_points - 1``.
+ If the default setup costs of the evaluations are very large, DF-OLS
+ can start with less than ``len(x)`` interpolation points and add points
+ to the trust region model with every iteration.
+ method ("jacobian", "trustregion" or "auto"):
+ When there are less interpolation points than ``len(x)`` the model is
+ underdetermined. This can be fixed in two ways:
+ If "jacobian", the interpolated Jacobian is perturbed to have full
+ rank, allowing the trust region step to include components in the full
+ search space. This is the default if
+ ``len(x) \geq number of root contributions``.
+ If "trustregion_step", the trust region step is perturbed by an
+ orthogonal direction not yet searched. It is the default if
+ ``len(x) < number of root contributions``.
+ scale_of_trustregion_step_perturbation (float):
+ When adding new search directions, the length of the step is the trust
+ region radius multiplied by this value. The default is 0.1 if
+ ``method == "trustregion"`` else 1.
+ scale_of_jacobian_components_perturbation (float): Magnitude of extra
+ components added to the Jacobian. Default is 1e-2.
+ floor_of_jacobian_singular_values (float): Floor singular
+ values of the Jacobian at this factor of the last nonzero value.
+ As of version 1.2.1 this option is not yet supported by DF-OLS!
+ jacobian_max_condition_number (float): Cap on the condition number
+ of Jacobian after applying floors to singular values
+ (effectively another floor on the smallest singular value, since the
+ largest singular value is fixed).
+ geometry_improving_steps (bool): Whether to perform geometry-improving steps
+ in the trust region algorithm during the fast start, as in the usual
+ algorithm.
+ safety_steps (bool):
+ Whether to perform safety steps.
+ shrink_upper_radius_in_safety_steps (bool): Whether to shrink the upper
+ trust region radius in safety steps during the fast start.
+ full_geometry_improving_step (bool): Whether to do a full
+ geometry-improving step within safety steps during the fast start (the
+ same as in the post fast start phase of the algorithm). Since this
+ involves reducing the upper trust region radius, it can only be ``True``
+ if ``shrink_upper_radius_in_safety_steps == False``.
+ reset_trustregion_radius_after_fast_start (bool):
+ Whether or not to reset the upper trust region radius to its initial value
+ at the end of the fast start phase.
+ reset_min_trustregion_radius_after_fast_start (bool):
+ Whether or not to reset the minimum trust region radius
+ (:math:`\rho_k`) to its initial value at the end of the fast start phase.
+ shrinking_factor_not_successful (float):
+ Ratio by which to shrink the trust region radius when the realized
+ improvement falls short of ``threshold_for_successful_iteration``
+ during the fast start phase. By default it is the same as
+ ``reduction_when_not_successful``.
+ n_extra_search_directions_per_iteration (int): Number of new search
+ directions to add in each iteration in which we do not have a full set
+ of search directions. This approach is not recommended! Default is 0.
+
+"""
+
+
@mark_minimizer(
name="nag_dfols",
primary_criterion_entry="root_contributions",
@@ -59,8 +344,8 @@ def nag_dfols(
*,
clip_criterion_if_overflowing=CLIP_CRITERION_IF_OVERFLOWING,
convergence_minimal_trustregion_radius_tolerance=CONVERGENCE_MINIMAL_TRUSTREGION_RADIUS_TOLERANCE, # noqa: E501
- convergence_noise_corrected_criterion_tolerance=CONVERGENCE_NOISE_CORRECTED_CRITERION_TOLERANCE, # noqa: E501
- convergence_scaled_criterion_tolerance=0.0,
+ convergence_noise_corrected_criterion_tolerance=CONVERGENCE_NOISE_CORRECTED_FTOL, # noqa: E501
+ convergence_ftol_scaled=0.0,
convergence_slow_progress=None,
initial_directions=INITIAL_DIRECTIONS,
interpolation_rounding_error=INTERPOLATION_ROUNDING_ERROR,
@@ -68,7 +353,7 @@ def nag_dfols(
noise_multiplicative_level=None,
noise_n_evals_per_point=None,
random_directions_orthogonal=RANDOM_DIRECTIONS_ORTHOGONAL,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ stopping_maxfun=STOPPING_MAXFUN,
threshold_for_safety_step=THRESHOLD_FOR_SAFETY_STEP,
trustregion_expansion_factor_successful=TRUSTREGION_EXPANSION_FACTOR_SUCCESSFUL,
trustregion_expansion_factor_very_successful=TRUSTREGION_EXPANSION_FACTOR_VERY_SUCCESSFUL, # noqa: E501
@@ -186,7 +471,7 @@ def nag_dfols(
"n_extra_interpolation_points_per_hard_reset"
]
- trustregion_reset_options["n_extra_interpolation_points_per_soft_reset"],
- "model.rel_tol": convergence_scaled_criterion_tolerance,
+ "model.rel_tol": convergence_ftol_scaled,
"regression.num_extra_steps": trustregion_n_extra_points_to_replace_successful,
"regression.momentum_extra_steps": trustregion_use_momentum,
"regression.increase_num_extra_steps_with_restart": trustregion_reset_options[
@@ -226,7 +511,7 @@ def nag_dfols(
criterion,
x0=x,
bounds=(lower_bounds, upper_bounds),
- maxfun=stopping_max_criterion_evaluations,
+ maxfun=stopping_maxfun,
rhobeg=trustregion_initial_radius,
npt=trustregion_n_interpolation_points,
rhoend=convergence_minimal_trustregion_radius_tolerance,
@@ -255,7 +540,7 @@ def nag_pybobyqa(
clip_criterion_if_overflowing=CLIP_CRITERION_IF_OVERFLOWING,
convergence_criterion_value=None,
convergence_minimal_trustregion_radius_tolerance=CONVERGENCE_MINIMAL_TRUSTREGION_RADIUS_TOLERANCE, # noqa: E501
- convergence_noise_corrected_criterion_tolerance=CONVERGENCE_NOISE_CORRECTED_CRITERION_TOLERANCE, # noqa: E501
+ convergence_noise_corrected_criterion_tolerance=CONVERGENCE_NOISE_CORRECTED_FTOL, # noqa: E501
convergence_slow_progress=None,
initial_directions=INITIAL_DIRECTIONS,
interpolation_rounding_error=INTERPOLATION_ROUNDING_ERROR,
@@ -264,7 +549,7 @@ def nag_pybobyqa(
noise_n_evals_per_point=None,
random_directions_orthogonal=RANDOM_DIRECTIONS_ORTHOGONAL,
seek_global_optimum=False,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ stopping_max_criterion_evaluations=STOPPING_MAXFUN,
threshold_for_safety_step=THRESHOLD_FOR_SAFETY_STEP,
trustregion_expansion_factor_successful=TRUSTREGION_EXPANSION_FACTOR_SUCCESSFUL,
trustregion_expansion_factor_very_successful=TRUSTREGION_EXPANSION_FACTOR_VERY_SUCCESSFUL, # noqa: E501
@@ -377,7 +662,7 @@ def _process_nag_result(nag_result_obj, len_x):
"""
processed = {
"solution_criterion": nag_result_obj.f,
- "n_criterion_evaluations": nag_result_obj.nx,
+ "n_fun_evals": nag_result_obj.nx,
"message": nag_result_obj.msg,
"success": nag_result_obj.flag == nag_result_obj.EXIT_SUCCESS,
"reached_convergence_criterion": None,
diff --git a/src/optimagic/optimizers/neldermead.py b/src/optimagic/optimizers/neldermead.py
index 7bb2f6e97..446a74663 100644
--- a/src/optimagic/optimizers/neldermead.py
+++ b/src/optimagic/optimizers/neldermead.py
@@ -5,9 +5,9 @@
from optimagic.batch_evaluators import process_batch_evaluator
from optimagic.decorators import mark_minimizer
from optimagic.optimization.algo_options import (
- CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE,
- CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE,
- STOPPING_MAX_ITERATIONS,
+ CONVERGENCE_SECOND_BEST_FTOL_ABS,
+ CONVERGENCE_SECOND_BEST_XTOL_ABS,
+ STOPPING_MAXITER,
)
@@ -25,9 +25,9 @@ def neldermead_parallel(
init_simplex_method="gao_han",
n_cores=1,
adaptive=True,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
- convergence_absolute_criterion_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE, # noqa: E501
- convergence_absolute_params_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE, # noqa: E501
+ stopping_maxiter=STOPPING_MAXITER,
+ convergence_ftol_abs=CONVERGENCE_SECOND_BEST_FTOL_ABS, # noqa: E501
+ convergence_xtol_abs=CONVERGENCE_SECOND_BEST_XTOL_ABS, # noqa: E501
batch_evaluator="joblib",
):
"""Parallel Nelder-Mead algorithm following Lee D., Wiswall M., A parallel
@@ -52,15 +52,14 @@ def neldermead_parallel(
for simplex size.
The default is True.
- stopping_max_iterations (int): Maximum number of algorithm iterations.
+ stopping_maxiter (int): Maximum number of algorithm iterations.
The default is STOPPING_MAX_ITERATIONS.
- convergence_absolute_criterion_tolerance (float): maximal difference between
+ convergence_ftol_abs (float): maximal difference between
function value evaluated on simplex points.
- The default is CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE.
- convergence_absolute_params_tolerance (float): maximal distance between points in
- the simplex. The default is CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE.
+ convergence_xtol_abs (float): maximal distance between points in
+ the simplex.
batch_evaluator (string or callable): See :ref:`batch_evaluators` for
details. Default "joblib".
@@ -232,16 +231,14 @@ def func_parallel(args):
# termination criteria
if (
- np.max(np.abs(f_s[0, :] - f_s[1:, :]))
- <= convergence_absolute_criterion_tolerance
- and np.max(np.abs(s[0, :] - s[1:,]))
- <= convergence_absolute_params_tolerance
+ np.max(np.abs(f_s[0, :] - f_s[1:, :])) <= convergence_ftol_abs
+ and np.max(np.abs(s[0, :] - s[1:,])) <= convergence_xtol_abs
):
optimal = True
converge = True
reason_to_stop = "Termination codition satisfied"
elif (
- iterations >= stopping_max_iterations
+ iterations >= stopping_maxiter
): # if maximum amount of iteration is exceeded
optimal = True
converge = False
diff --git a/src/optimagic/optimizers/nlopt_optimizers.py b/src/optimagic/optimizers/nlopt_optimizers.py
index 46001dd66..2896d31a0 100644
--- a/src/optimagic/optimizers/nlopt_optimizers.py
+++ b/src/optimagic/optimizers/nlopt_optimizers.py
@@ -9,12 +9,12 @@
from optimagic.config import IS_NLOPT_INSTALLED
from optimagic.decorators import mark_minimizer
from optimagic.optimization.algo_options import (
- CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- STOPPING_MAX_CRITERION_EVALUATIONS,
- STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ CONVERGENCE_FTOL_ABS,
+ CONVERGENCE_XTOL_ABS,
+ CONVERGENCE_FTOL_REL,
+ CONVERGENCE_XTOL_REL,
+ STOPPING_MAXFUN,
+ STOPPING_MAXFUN_GLOBAL,
)
from optimagic.parameters.nonlinear_constraints import (
equality_as_inequality_constraints,
@@ -36,11 +36,11 @@ def nlopt_bobyqa(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the BOBYQA algorithm.
@@ -55,11 +55,11 @@ def nlopt_bobyqa(
upper_bounds,
derivative=None,
algorithm=nlopt.LN_BOBYQA,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -77,11 +77,11 @@ def nlopt_neldermead(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=0,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=0,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the Nelder-Mead simplex algorithm.
@@ -97,11 +97,11 @@ def nlopt_neldermead(
upper_bounds,
algorithm=nlopt.LN_NELDERMEAD,
derivative=None,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -117,11 +117,11 @@ def nlopt_praxis(
criterion,
x,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using principal-axis method.
@@ -136,11 +136,11 @@ def nlopt_praxis(
upper_bounds=None,
algorithm=nlopt.LN_PRAXIS,
derivative=None,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -159,11 +159,11 @@ def nlopt_cobyla(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the cobyla method.
@@ -180,11 +180,11 @@ def nlopt_cobyla(
algorithm=nlopt.LN_COBYLA,
derivative=None,
nonlinear_constraints=nonlinear_constraints,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -202,11 +202,11 @@ def nlopt_sbplx(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the "Subplex" algorithm.
@@ -221,11 +221,11 @@ def nlopt_sbplx(
lower_bounds,
upper_bounds,
algorithm=nlopt.LN_SBPLX,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -243,11 +243,11 @@ def nlopt_newuoa(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the NEWUOA algorithm.
@@ -266,11 +266,11 @@ def nlopt_newuoa(
lower_bounds,
upper_bounds,
algorithm=algo,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -289,11 +289,11 @@ def nlopt_tnewton(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the "TNEWTON" algorithm.
@@ -309,11 +309,11 @@ def nlopt_tnewton(
upper_bounds,
algorithm=nlopt.LD_TNEWTON,
derivative=derivative,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -332,11 +332,11 @@ def nlopt_lbfgs(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the "LBFGS" algorithm.
@@ -352,11 +352,11 @@ def nlopt_lbfgs(
upper_bounds,
algorithm=nlopt.LD_TNEWTON,
derivative=derivative,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -375,11 +375,11 @@ def nlopt_ccsaq(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using CCSAQ algorithm.
@@ -394,11 +394,11 @@ def nlopt_ccsaq(
upper_bounds,
algorithm=nlopt.LD_CCSAQ,
derivative=derivative,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -418,11 +418,11 @@ def nlopt_mma(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Minimize a scalar function using the method of moving asymptotes (MMA).
@@ -441,11 +441,11 @@ def nlopt_mma(
algorithm=nlopt.LD_MMA,
derivative=derivative,
nonlinear_constraints=nonlinear_constraints,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -464,11 +464,11 @@ def nlopt_var(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
rank_1_update=True,
):
"""Minimize a scalar function limited memory switching variable-metric method.
@@ -488,11 +488,11 @@ def nlopt_var(
upper_bounds,
algorithm=algo,
derivative=derivative,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -512,11 +512,11 @@ def nlopt_slsqp(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
):
"""Optimize a scalar function based on SLSQP method.
@@ -532,11 +532,11 @@ def nlopt_slsqp(
algorithm=nlopt.LD_SLSQP,
derivative=derivative,
nonlinear_constraints=nonlinear_constraints,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
return out
@@ -554,11 +554,11 @@ def nlopt_direct(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
locally_biased=False,
random_search=False,
unscaled_bounds=False,
@@ -587,11 +587,11 @@ def nlopt_direct(
lower_bounds,
upper_bounds,
algorithm=algo,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
# this is a global optimizer
@@ -612,11 +612,11 @@ def nlopt_esch(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
):
"""Optimize a scalar function using the ESCH algorithm.
@@ -630,11 +630,11 @@ def nlopt_esch(
lower_bounds,
upper_bounds,
algorithm=nlopt.GN_ESCH,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
# this is a global optimizer
@@ -656,11 +656,11 @@ def nlopt_isres(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
):
"""Optimize a scalar function using the ISRES algorithm.
@@ -675,11 +675,11 @@ def nlopt_isres(
upper_bounds,
algorithm=nlopt.GN_ISRES,
nonlinear_constraints=nonlinear_constraints,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
)
# this is a global optimizer
@@ -700,11 +700,11 @@ def nlopt_crs2_lm(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
population_size=None,
):
"""Optimize a scalar function using the CRS2_LM algorithm.
@@ -721,11 +721,11 @@ def nlopt_crs2_lm(
lower_bounds,
upper_bounds,
algorithm=nlopt.GN_CRS2_LM,
- convergence_xtol_rel=convergence_relative_params_tolerance,
- convergence_xtol_abs=convergence_absolute_params_tolerance,
- convergence_ftol_rel=convergence_relative_criterion_tolerance,
- convergence_ftol_abs=convergence_absolute_criterion_tolerance,
- stopping_max_eval=stopping_max_criterion_evaluations,
+ convergence_xtol_rel=convergence_xtol_rel,
+ convergence_xtol_abs=convergence_xtol_abs,
+ convergence_ftol_rel=convergence_ftol_rel,
+ convergence_ftol_abs=convergence_ftol_abs,
+ stopping_max_eval=stopping_maxfun,
population_size=population_size,
)
@@ -823,12 +823,12 @@ def _process_nlopt_results(nlopt_obj, solution_x):
"Optimizer stopped because maximum value of criterion function was reached"
),
3: (
- "Optimizer stopped because convergence_relative_criterion_tolerance or "
- "convergence_absolute_criterion_tolerance was reached"
+ "Optimizer stopped because convergence_ftol_rel or "
+ "convergence_ftol_abs was reached"
),
4: (
- "Optimizer stopped because convergence_relative_params_tolerance or "
- "convergence_absolute_params_tolerance was reached"
+ "Optimizer stopped because convergence_xtol_rel or "
+ "convergence_xtol_abs was reached"
),
5: "Optimizer stopped because max_criterion_evaluations was reached",
6: "Optimizer stopped because max running time was reached",
@@ -843,8 +843,8 @@ def _process_nlopt_results(nlopt_obj, solution_x):
"solution_criterion": nlopt_obj.last_optimum_value(),
"solution_derivative": None,
"solution_hessian": None,
- "n_criterion_evaluations": nlopt_obj.get_numevals(),
- "n_derivative_evaluations": None,
+ "n_fun_evals": nlopt_obj.get_numevals(),
+ "n_jac_evals": None,
"n_iterations": None,
"success": nlopt_obj.last_optimize_result() in [1, 2, 3, 4],
"message": messages[nlopt_obj.last_optimize_result()],
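The nlopt hunks above rename the result keys (`n_criterion_evaluations` → `n_fun_evals`, `n_derivative_evaluations` → `n_jac_evals`) and rewrite the convergence messages in terms of the new option names. A standalone sketch of how the renamed fields fit together; `summarize` is a hypothetical helper written for illustration, not part of the diff, and the code-1/2 behavior is inferred from the `success` line above:

```python
# Messages for nlopt return codes 3-6 are copied from the hunk above;
# summarize() itself is a hypothetical illustration, not optimagic code.
MESSAGES = {
    3: (
        "Optimizer stopped because convergence_ftol_rel or "
        "convergence_ftol_abs was reached"
    ),
    4: (
        "Optimizer stopped because convergence_xtol_rel or "
        "convergence_xtol_abs was reached"
    ),
    5: "Optimizer stopped because max_criterion_evaluations was reached",
    6: "Optimizer stopped because max running time was reached",
}


def summarize(code, n_evals):
    """Build the renamed result fields for a given nlopt return code."""
    return {
        "n_fun_evals": n_evals,  # was: n_criterion_evaluations
        "n_jac_evals": None,     # was: n_derivative_evaluations
        "success": code in [1, 2, 3, 4],
        "message": MESSAGES.get(code, "unknown return code"),
    }
```

For example, code 3 (an ftol-based stop) counts as success, while code 5 (evaluation budget exhausted) does not.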
diff --git a/src/optimagic/optimizers/pounders.py b/src/optimagic/optimizers/pounders.py
index 18f8ed8f4..37a767400 100644
--- a/src/optimagic/optimizers/pounders.py
+++ b/src/optimagic/optimizers/pounders.py
@@ -37,11 +37,11 @@ def pounders(
x,
lower_bounds,
upper_bounds,
- convergence_absolute_gradient_tolerance=1e-8,
- convergence_relative_gradient_tolerance=1e-8,
- convergence_scaled_gradient_tolerance=False,
+ convergence_gtol_abs=1e-8,
+ convergence_gtol_rel=1e-8,
+ convergence_gtol_scaled=False,
max_interpolation_points=None,
- stopping_max_iterations=2_000,
+ stopping_maxiter=2_000,
trustregion_initial_radius=0.1,
trustregion_minimal_radius=1e-6,
trustregion_maximal_radius=1e6,
@@ -97,11 +97,11 @@ def pounders(
x0=x,
lower_bounds=lower_bounds,
upper_bounds=upper_bounds,
- gtol_abs=convergence_absolute_gradient_tolerance,
- gtol_rel=convergence_relative_gradient_tolerance,
- gtol_scaled=convergence_scaled_gradient_tolerance,
+ gtol_abs=convergence_gtol_abs,
+ gtol_rel=convergence_gtol_rel,
+ gtol_scaled=convergence_gtol_scaled,
maxinterp=max_interpolation_points,
- maxiter=stopping_max_iterations,
+ maxiter=stopping_maxiter,
delta=trustregion_initial_radius,
delta_min=trustregion_minimal_radius,
delta_max=trustregion_maximal_radius,
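The pounders hunk follows the same mechanical renaming as the rest of the diff. A hypothetical shim that captures the pattern; the mapping entries are taken verbatim from the hunks above, but `translate_options` itself is not part of optimagic:

```python
# Old-name -> new-name pairs, as introduced by this diff for pounders.
# translate_options is an illustrative sketch, not optimagic code.
RENAMES = {
    "convergence_absolute_gradient_tolerance": "convergence_gtol_abs",
    "convergence_relative_gradient_tolerance": "convergence_gtol_rel",
    "convergence_scaled_gradient_tolerance": "convergence_gtol_scaled",
    "stopping_max_iterations": "stopping_maxiter",
    "stopping_max_criterion_evaluations": "stopping_maxfun",
}


def translate_options(options):
    """Map old-style option names to their scipy-aligned replacements.

    Names that were not renamed pass through unchanged.
    """
    return {RENAMES.get(key, key): value for key, value in options.items()}
```

Old option dicts translate one-to-one, e.g. `{"stopping_max_iterations": 2_000}` becomes `{"stopping_maxiter": 2_000}`.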
diff --git a/src/optimagic/optimizers/pygmo_optimizers.py b/src/optimagic/optimizers/pygmo_optimizers.py
index 754a8559f..96287e789 100644
--- a/src/optimagic/optimizers/pygmo_optimizers.py
+++ b/src/optimagic/optimizers/pygmo_optimizers.py
@@ -10,8 +10,8 @@
from optimagic.decorators import mark_minimizer
from optimagic.exceptions import NotInstalledError
from optimagic.optimization.algo_options import (
- CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ CONVERGENCE_XTOL_REL,
+ STOPPING_MAXFUN_GLOBAL,
get_population_size,
)
@@ -39,7 +39,7 @@ def pygmo_gaco(
n_cores=1,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
kernel_size=63,
speed_parameter_q=1.0,
oracle=0.0,
@@ -47,7 +47,7 @@ def pygmo_gaco(
threshold=1,
speed_of_std_values_convergence=7,
stopping_max_n_without_improvements=100000,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
focus=0.0,
cache=False,
):
@@ -71,7 +71,7 @@ def pygmo_gaco(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"ker": kernel_size,
"q": speed_parameter_q,
"oracle": oracle,
@@ -79,7 +79,7 @@ def pygmo_gaco(
"threshold": threshold,
"n_gen_mark": int(speed_of_std_values_convergence),
"impstop": stopping_max_n_without_improvements,
- "evalstop": stopping_max_criterion_evaluations,
+ "evalstop": stopping_maxfun,
"focus": focus,
"memory": cache,
}
@@ -116,7 +116,7 @@ def pygmo_bee_colony(
lower_bounds,
upper_bounds,
*,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
seed=None,
discard_start_params=False,
max_n_trials=1,
@@ -137,7 +137,7 @@ def pygmo_bee_colony(
discard_start_params=discard_start_params,
algo_specific_options={
"limit": max_n_trials,
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
},
)
@@ -168,12 +168,12 @@ def pygmo_de(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
weight_coefficient=0.8,
crossover_probability=0.9,
mutation_variant="rand/1/exp",
convergence_criterion_tolerance=1e-6,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
+ convergence_relative_params_tolerance=CONVERGENCE_XTOL_REL,
):
"""Minimize a scalar function using the differential evolution algorithm.
@@ -203,7 +203,7 @@ def pygmo_de(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"F": weight_coefficient,
"CR": crossover_probability,
"variant": mutation_variant,
@@ -244,7 +244,7 @@ def pygmo_sea(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=10_000, # Each generation will compute the objective once
+ stopping_maxiter=10_000, # Each generation will compute the objective once
):
r"""Minimize a scalar function using the (N+1)-ES simple evolutionary algorithm.
@@ -261,7 +261,7 @@ def pygmo_sea(
population_size=population_size,
seed=seed,
discard_start_params=discard_start_params,
- algo_specific_options={"gen": int(stopping_max_iterations)},
+ algo_specific_options={"gen": int(stopping_maxiter)},
)
res = _minimize_pygmo(
@@ -291,7 +291,7 @@ def pygmo_sga(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
crossover_probability=0.9,
crossover_strategy="exponential",
eta_c=None,
@@ -364,7 +364,7 @@ def pygmo_sga(
param_s = 2
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"cr": crossover_probability,
"eta_c": eta_c,
"m": mutation_probability,
@@ -409,7 +409,7 @@ def pygmo_sade(
seed=None,
discard_start_params=False,
jde=True,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
mutation_variant="rand/1/exp",
keep_adapted_params=False,
ftol=1e-6,
@@ -451,7 +451,7 @@ def pygmo_sade(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"variant": mutation_variant,
"variant_adptv": 1 if jde else 2,
"ftol": ftol,
@@ -492,7 +492,7 @@ def pygmo_cmaes(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
backward_horizon=None,
variance_loss_compensation=None,
learning_rate_rank_one_update=None,
@@ -515,7 +515,7 @@ def pygmo_cmaes(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"cc": _replace_none(var=backward_horizon, none_value=-1.0),
"cs": _replace_none(var=variance_loss_compensation, none_value=-1.0),
"c1": _replace_none(var=learning_rate_rank_one_update, none_value=-1.0),
@@ -621,7 +621,7 @@ def pygmo_pso(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
omega=0.7298,
force_of_previous_best=2.05,
force_of_best_in_neighborhood=2.05,
@@ -667,7 +667,7 @@ def pygmo_pso(
}
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"omega": omega,
"eta1": force_of_previous_best,
"eta2": force_of_best_in_neighborhood,
@@ -715,7 +715,7 @@ def pygmo_pso_gen(
n_cores=1,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
omega=0.7298,
force_of_previous_best=2.05,
force_of_best_in_neighborhood=2.05,
@@ -762,7 +762,7 @@ def pygmo_pso_gen(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"omega": omega,
"eta1": force_of_previous_best,
"eta2": force_of_best_in_neighborhood,
@@ -865,7 +865,7 @@ def pygmo_xnes(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
learning_rate_mean_update=1.0,
learning_rate_step_size_update=None,
learning_rate_cov_matrix_update=None,
@@ -887,7 +887,7 @@ def pygmo_xnes(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"eta_mu": _replace_none(learning_rate_mean_update, -1),
"eta_sigma": _replace_none(learning_rate_step_size_update, -1),
"eta_b": _replace_none(learning_rate_cov_matrix_update, -1),
@@ -931,7 +931,7 @@ def pygmo_gwo(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
):
"""Minimize a scalar function using the Grey Wolf Optimizer.
@@ -948,7 +948,7 @@ def pygmo_gwo(
population_size=population_size,
seed=seed,
discard_start_params=discard_start_params,
- algo_specific_options={"gen": int(stopping_max_iterations)},
+ algo_specific_options={"gen": int(stopping_maxiter)},
)
res = _minimize_pygmo(
@@ -978,7 +978,7 @@ def pygmo_compass_search(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
start_range=0.1,
stop_range=0.01,
reduction_coeff=0.5,
@@ -1000,7 +1000,7 @@ def pygmo_compass_search(
population_size = 100
algo_specific_options = {
- "max_fevals": stopping_max_criterion_evaluations,
+ "max_fevals": stopping_maxfun,
"start_range": start_range,
"stop_range": stop_range,
"reduction_coeff": reduction_coeff,
@@ -1039,7 +1039,7 @@ def pygmo_ihs(
population_size=None,
seed=None,
discard_start_params=False,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
choose_from_memory_probability=0.85,
min_pitch_adjustment_rate=0.35,
max_pitch_adjustment_rate=0.99,
@@ -1062,7 +1062,7 @@ def pygmo_ihs(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"phmcr": choose_from_memory_probability,
"ppar_min": min_pitch_adjustment_rate,
"ppar_max": max_pitch_adjustment_rate,
@@ -1104,7 +1104,7 @@ def pygmo_de1220(
seed=None,
discard_start_params=False,
jde=True,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS_GENETIC,
+ stopping_maxiter=STOPPING_MAX_ITERATIONS_GENETIC,
allowed_variants=None,
keep_adapted_params=False,
ftol=1e-6,
@@ -1151,7 +1151,7 @@ def pygmo_de1220(
)
algo_specific_options = {
- "gen": int(stopping_max_iterations),
+ "gen": int(stopping_maxiter),
"variant_adptv": 1 if jde else 2,
"ftol": ftol,
"xtol": xtol,
@@ -1326,8 +1326,8 @@ def _process_pygmo_results(evolved):
"solution_criterion": evolved.champion_f[0],
"solution_derivative": None,
"solution_hessian": None,
- "n_criterion_evaluations": evolved.problem.get_fevals(),
- "n_derivative_evaluations": evolved.problem.get_gevals(),
+ "n_fun_evals": evolved.problem.get_fevals(),
+ "n_jac_evals": evolved.problem.get_gevals(),
"n_iterations": None,
"success": True,
"reached_convergence_criterion": "Number of generations reached.",
diff --git a/src/optimagic/optimizers/scipy_optimizers.py b/src/optimagic/optimizers/scipy_optimizers.py
index 608e04146..86ab4f1ab 100644
--- a/src/optimagic/optimizers/scipy_optimizers.py
+++ b/src/optimagic/optimizers/scipy_optimizers.py
@@ -42,19 +42,19 @@
from optimagic.batch_evaluators import process_batch_evaluator
from optimagic.decorators import mark_minimizer
from optimagic.optimization.algo_options import (
- CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE,
- CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE,
+ CONVERGENCE_FTOL_ABS,
+ CONVERGENCE_GTOL_ABS,
+ CONVERGENCE_XTOL_ABS,
+ CONVERGENCE_FTOL_REL,
+ CONVERGENCE_GTOL_REL,
+ CONVERGENCE_XTOL_REL,
+ CONVERGENCE_SECOND_BEST_FTOL_ABS,
+ CONVERGENCE_SECOND_BEST_XTOL_ABS,
LIMITED_MEMORY_STORAGE_LENGTH,
MAX_LINE_SEARCH_STEPS,
- STOPPING_MAX_CRITERION_EVALUATIONS,
- STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
- STOPPING_MAX_ITERATIONS,
+ STOPPING_MAXFUN,
+ STOPPING_MAXFUN_GLOBAL,
+ STOPPING_MAXITER,
)
from optimagic.parameters.nonlinear_constraints import (
equality_as_inequality_constraints,
@@ -70,10 +70,10 @@ def scipy_lbfgsb(
lower_bounds,
upper_bounds,
*,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
+ stopping_maxfun=STOPPING_MAXFUN,
+ stopping_maxiter=STOPPING_MAXITER,
limited_memory_storage_length=LIMITED_MEMORY_STORAGE_LENGTH,
max_line_search_steps=MAX_LINE_SEARCH_STEPS,
):
@@ -85,10 +85,10 @@ def scipy_lbfgsb(
"""
options = {
"maxcor": limited_memory_storage_length,
- "ftol": convergence_relative_criterion_tolerance,
- "gtol": convergence_absolute_gradient_tolerance,
- "maxfun": stopping_max_criterion_evaluations,
- "maxiter": stopping_max_iterations,
+ "ftol": convergence_ftol_rel,
+ "gtol": convergence_gtol_abs,
+ "maxfun": stopping_maxfun,
+ "maxiter": stopping_maxiter,
"maxls": max_line_search_steps,
}
res = scipy.optimize.minimize(
@@ -112,8 +112,8 @@ def scipy_slsqp(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_absolute_criterion_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE, # noqa: E501
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_ftol_abs=CONVERGENCE_SECOND_BEST_FTOL_ABS, # noqa: E501
+ stopping_maxiter=STOPPING_MAXITER,
):
"""Minimize a scalar function of one or more variables using the SLSQP algorithm.
@@ -122,8 +122,8 @@ def scipy_slsqp(
"""
options = {
- "maxiter": stopping_max_iterations,
- "ftol": convergence_absolute_criterion_tolerance,
+ "maxiter": stopping_maxiter,
+ "ftol": convergence_ftol_abs,
}
res = scipy.optimize.minimize(
@@ -146,10 +146,10 @@ def scipy_neldermead(
lower_bounds,
upper_bounds,
*,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
- convergence_absolute_criterion_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE, # noqa: E501
- convergence_absolute_params_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_PARAMS_TOLERANCE, # noqa: E501
+ stopping_maxiter=STOPPING_MAXITER,
+ stopping_maxfun=STOPPING_MAXFUN,
+ convergence_ftol_abs=CONVERGENCE_SECOND_BEST_FTOL_ABS, # noqa: E501
+ convergence_xtol_abs=CONVERGENCE_SECOND_BEST_XTOL_ABS, # noqa: E501
adaptive=False,
):
"""Minimize a scalar function using the Nelder-Mead algorithm.
@@ -159,12 +159,12 @@ def scipy_neldermead(
"""
options = {
- "maxiter": stopping_max_iterations,
- "maxfev": stopping_max_criterion_evaluations,
+ "maxiter": stopping_maxiter,
+ "maxfev": stopping_maxfun,
# both tolerances seem to have to be fulfilled for Nelder-Mead to converge.
# if not both are specified it does not converge in our tests.
- "xatol": convergence_absolute_params_tolerance,
- "fatol": convergence_absolute_criterion_tolerance,
+ "xatol": convergence_xtol_abs,
+ "fatol": convergence_ftol_abs,
"adaptive": adaptive,
}
@@ -186,10 +186,10 @@ def scipy_powell(
lower_bounds,
upper_bounds,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ stopping_maxfun=STOPPING_MAXFUN,
+ stopping_maxiter=STOPPING_MAXITER,
):
"""Minimize a scalar function using the modified Powell method.
@@ -198,10 +198,10 @@ def scipy_powell(
"""
options = {
- "xtol": convergence_relative_params_tolerance,
- "ftol": convergence_relative_criterion_tolerance,
- "maxfev": stopping_max_criterion_evaluations,
- "maxiter": stopping_max_iterations,
+ "xtol": convergence_xtol_rel,
+ "ftol": convergence_ftol_rel,
+ "maxfev": stopping_maxfun,
+ "maxiter": stopping_maxiter,
}
res = scipy.optimize.minimize(
@@ -220,8 +220,8 @@ def scipy_bfgs(
criterion_and_derivative,
x,
*,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
+ stopping_maxiter=STOPPING_MAXITER,
norm=np.inf,
):
"""Minimize a scalar function of one or more variables using the BFGS algorithm.
@@ -231,8 +231,8 @@ def scipy_bfgs(
"""
options = {
- "gtol": convergence_absolute_gradient_tolerance,
- "maxiter": stopping_max_iterations,
+ "gtol": convergence_gtol_abs,
+ "maxiter": stopping_maxiter,
"norm": norm,
}
@@ -252,8 +252,8 @@ def scipy_conjugate_gradient(
criterion_and_derivative,
x,
*,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
+ stopping_maxiter=STOPPING_MAXITER,
norm=np.inf,
):
"""Minimize a function using a nonlinear conjugate gradient algorithm.
@@ -263,8 +263,8 @@ def scipy_conjugate_gradient(
"""
options = {
- "gtol": convergence_absolute_gradient_tolerance,
- "maxiter": stopping_max_iterations,
+ "gtol": convergence_gtol_abs,
+ "maxiter": stopping_maxiter,
"norm": norm,
}
@@ -284,8 +284,8 @@ def scipy_newton_cg(
criterion_and_derivative,
x,
*,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ stopping_maxiter=STOPPING_MAXITER,
):
"""Minimize a scalar function using Newton's conjugate gradient algorithm.
@@ -294,8 +294,8 @@ def scipy_newton_cg(
"""
options = {
- "xtol": convergence_relative_params_tolerance,
- "maxiter": stopping_max_iterations,
+ "xtol": convergence_xtol_rel,
+ "maxiter": stopping_maxiter,
}
res = scipy.optimize.minimize(
@@ -315,8 +315,8 @@ def scipy_cobyla(
x,
*,
nonlinear_constraints=(),
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
+ stopping_maxiter=STOPPING_MAXITER,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
trustregion_initial_radius=None,
):
"""Minimize a scalar function of one or more variables using the COBYLA algorithm.
@@ -328,7 +328,7 @@ def scipy_cobyla(
if trustregion_initial_radius is None:
trustregion_initial_radius = calculate_trustregion_initial_radius(x)
- options = {"maxiter": stopping_max_iterations, "rhobeg": trustregion_initial_radius}
+ options = {"maxiter": stopping_maxiter, "rhobeg": trustregion_initial_radius}
# cannot handle equality constraints
nonlinear_constraints = equality_as_inequality_constraints(nonlinear_constraints)
@@ -339,7 +339,7 @@ def scipy_cobyla(
method="COBYLA",
constraints=nonlinear_constraints,
options=options,
- tol=convergence_relative_params_tolerance,
+ tol=convergence_xtol_rel,
)
return process_scipy_result(res)
@@ -352,10 +352,10 @@ def scipy_truncated_newton(
lower_bounds,
upper_bounds,
*,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
- convergence_absolute_criterion_tolerance=CONVERGENCE_ABSOLUTE_CRITERION_TOLERANCE,
- convergence_absolute_params_tolerance=CONVERGENCE_ABSOLUTE_PARAMS_TOLERANCE,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
+ stopping_maxfun=STOPPING_MAXFUN,
+ convergence_ftol_abs=CONVERGENCE_FTOL_ABS,
+ convergence_xtol_abs=CONVERGENCE_XTOL_ABS,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
func_min_estimate=0,
max_hess_evaluations_per_iteration=-1,
max_step_for_line_search=0,
@@ -372,12 +372,12 @@ def scipy_truncated_newton(
options = {
# scipy/optimize/tnc/tnc.c::809 and 844 show that ftol is the
# absolute criterion tolerance
- "ftol": convergence_absolute_criterion_tolerance,
+ "ftol": convergence_ftol_abs,
# scipy/optimize/tnc/tnc.c::856 shows that xtol is the absolute parameter
# tolerance
- "xtol": convergence_absolute_params_tolerance,
- "gtol": convergence_absolute_gradient_tolerance,
- "maxfun": stopping_max_criterion_evaluations,
+ "xtol": convergence_xtol_abs,
+ "gtol": convergence_gtol_abs,
+ "maxfun": stopping_maxfun,
"maxCGit": max_hess_evaluations_per_iteration,
"stepmx": max_step_for_line_search,
"minfev": func_min_estimate,
@@ -406,9 +406,9 @@ def scipy_trust_constr(
upper_bounds,
*,
nonlinear_constraints=(),
- convergence_absolute_gradient_tolerance=1e-08,
- convergence_relative_params_tolerance=CONVERGENCE_RELATIVE_PARAMS_TOLERANCE,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ convergence_gtol_abs=1e-08,
+ convergence_xtol_rel=CONVERGENCE_XTOL_REL,
+ stopping_maxiter=STOPPING_MAXITER,
trustregion_initial_radius=None,
):
"""Minimize a scalar function of one or more variables subject to constraints.
@@ -421,9 +421,9 @@ def scipy_trust_constr(
trustregion_initial_radius = calculate_trustregion_initial_radius(x)
options = {
- "gtol": convergence_absolute_gradient_tolerance,
- "maxiter": stopping_max_iterations,
- "xtol": convergence_relative_params_tolerance,
+ "gtol": convergence_gtol_abs,
+ "maxiter": stopping_maxiter,
+ "xtol": convergence_xtol_rel,
"initial_tr_radius": trustregion_initial_radius,
}
@@ -451,8 +451,8 @@ def process_scipy_result(scipy_results_obj):
"solution_criterion": raw_res.get("fun"),
"solution_derivative": raw_res.get("jac"),
"solution_hessian": raw_res.get("hess"),
- "n_criterion_evaluations": raw_res.get("nfev"),
- "n_derivative_evaluations": raw_res.get("njac") or raw_res.get("njev"),
+ "n_fun_evals": raw_res.get("nfev"),
+ "n_jac_evals": raw_res.get("njac") or raw_res.get("njev"),
"n_iterations": raw_res.get("nit"),
"success": raw_res.get("success"),
"reached_convergence_criterion": None,
@@ -492,9 +492,9 @@ def _scipy_least_squares(
lower_bounds,
upper_bounds,
*,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_relative_gradient_tolerance=CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_gtol_rel=CONVERGENCE_GTOL_REL,
+ stopping_maxfun=STOPPING_MAXFUN,
relative_step_size_diff_approx=None,
tr_solver=None,
tr_solver_options=None,
@@ -516,9 +516,9 @@ def _scipy_least_squares(
jac=derivative,
# Don't use get_scipy_bounds, b.c. least_squares uses np.inf
bounds=(lower_bounds, upper_bounds),
- max_nfev=stopping_max_criterion_evaluations,
- ftol=convergence_relative_criterion_tolerance,
- gtol=convergence_relative_gradient_tolerance,
+ max_nfev=stopping_maxfun,
+ ftol=convergence_ftol_rel,
+ gtol=convergence_gtol_rel,
method=method,
diff_step=relative_step_size_diff_approx,
tr_solver=tr_solver,
@@ -547,9 +547,9 @@ def scipy_ls_lm(
derivative,
x,
*,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- convergence_relative_gradient_tolerance=CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ convergence_gtol_rel=CONVERGENCE_GTOL_REL,
+ stopping_maxfun=STOPPING_MAXFUN,
relative_step_size_diff_approx=None,
tr_solver=None,
tr_solver_options=None,
@@ -563,9 +563,9 @@ def scipy_ls_lm(
fun=criterion,
x0=x,
jac=derivative,
- max_nfev=stopping_max_criterion_evaluations,
- ftol=convergence_relative_criterion_tolerance,
- gtol=convergence_relative_gradient_tolerance,
+ max_nfev=stopping_maxfun,
+ ftol=convergence_ftol_rel,
+ gtol=convergence_gtol_rel,
method="lm",
diff_step=relative_step_size_diff_approx,
tr_solver=tr_solver,
@@ -669,7 +669,7 @@ def scipy_brute(
out = {
"solution_x": res[0],
"solution_criterion": res[1],
- "n_criterion_evaluations": res[2].size,
+ "n_fun_evals": res[2].size,
"n_iterations": res[2].size,
"success": True,
"message": "brute force optimization terminated successfully",
@@ -689,15 +689,15 @@ def scipy_differential_evolution(
nonlinear_constraints,
*,
strategy="best1bin",
- stopping_max_iterations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ stopping_maxiter=STOPPING_MAXFUN_GLOBAL,
population_size_multiplier=15,
- convergence_relative_criterion_tolerance=0.01,
+ convergence_ftol_rel=0.01,
mutation_constant=(0.5, 1),
recombination_constant=0.7,
seed=None,
polish=True,
sampling_method="latinhypercube",
- convergence_absolute_criterion_tolerance=CONVERGENCE_SECOND_BEST_ABSOLUTE_CRITERION_TOLERANCE, # noqa: E501
+ convergence_ftol_abs=CONVERGENCE_SECOND_BEST_FTOL_ABS, # noqa: E501
n_cores=1,
batch_evaluator="joblib",
):
@@ -712,15 +712,15 @@ def scipy_differential_evolution(
func=criterion,
bounds=_get_scipy_bounds(lower_bounds, upper_bounds),
strategy=strategy,
- maxiter=stopping_max_iterations,
+ maxiter=stopping_maxiter,
popsize=population_size_multiplier,
- tol=convergence_relative_criterion_tolerance,
+ tol=convergence_ftol_rel,
mutation=mutation_constant,
recombination=recombination_constant,
seed=seed,
polish=polish,
init=sampling_method,
- atol=convergence_absolute_criterion_tolerance,
+ atol=convergence_ftol_abs,
updating="deferred",
workers=workers,
constraints=_get_scipy_constraints(nonlinear_constraints),
@@ -746,8 +746,8 @@ def scipy_shgo(
max_sampling_evaluations=None,
convergence_minimum_criterion_value=None,
convergence_minimum_criterion_tolerance=1e-4,
- stopping_max_iterations=None,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ stopping_maxiter=None,
+ stopping_maxfun=STOPPING_MAXFUN_GLOBAL,
stopping_max_processing_time=None,
minimum_homology_group_rank_differential=None,
symmetry=False,
@@ -783,8 +783,8 @@ def scipy_shgo(
"maxfev": max_sampling_evaluations,
"f_min": convergence_minimum_criterion_value,
"f_tol": convergence_minimum_criterion_tolerance,
- "maxiter": stopping_max_iterations,
- "maxev": stopping_max_criterion_evaluations,
+ "maxiter": stopping_maxiter,
+ "maxev": stopping_maxfun,
"maxtime": stopping_max_processing_time,
"minhgrd": minimum_homology_group_rank_differential,
"symmetry": symmetry,
@@ -819,14 +819,14 @@ def scipy_dual_annealing(
upper_bounds,
x,
*,
- stopping_max_iterations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ stopping_maxiter=STOPPING_MAXFUN_GLOBAL,
local_algorithm="L-BFGS-B",
local_algo_options=None,
initial_temperature=5230.0,
restart_temperature_ratio=2e-05,
visit=2.62,
accept=-5.0,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
+ stopping_maxfun=STOPPING_MAXFUN,
seed=None,
no_local_search=False,
):
@@ -848,13 +848,13 @@ def scipy_dual_annealing(
res = scipy.optimize.dual_annealing(
func=criterion,
bounds=_get_scipy_bounds(lower_bounds, upper_bounds),
- maxiter=stopping_max_iterations,
+ maxiter=stopping_maxiter,
minimizer_kwargs=minimizer_kwargs,
initial_temp=initial_temperature,
restart_temp_ratio=restart_temperature_ratio,
visit=visit,
accept=accept,
- maxfun=stopping_max_criterion_evaluations,
+ maxfun=stopping_maxfun,
seed=seed,
no_local_search=no_local_search,
x0=x,
@@ -870,9 +870,9 @@ def scipy_direct(
upper_bounds,
x, # noqa: ARG001
*,
- convergence_relative_criterion_tolerance=CONVERGENCE_RELATIVE_CRITERION_TOLERANCE,
- stopping_max_criterion_evaluations=STOPPING_MAX_CRITERION_EVALUATIONS,
- stopping_max_iterations=STOPPING_MAX_CRITERION_EVALUATIONS_GLOBAL,
+ convergence_ftol_rel=CONVERGENCE_FTOL_REL,
+ stopping_maxfun=STOPPING_MAXFUN,
+ stopping_maxiter=STOPPING_MAXFUN_GLOBAL,
locally_biased=True,
convergence_minimum_criterion_value=-np.inf,
convergence_minimum_criterion_tolerance=1e-4,
@@ -889,9 +889,9 @@ def scipy_direct(
res = scipy.optimize.direct(
func=criterion,
bounds=_get_scipy_bounds(lower_bounds, upper_bounds),
- eps=convergence_relative_criterion_tolerance,
- maxfun=stopping_max_criterion_evaluations,
- maxiter=stopping_max_iterations,
+ eps=convergence_ftol_rel,
+ maxfun=stopping_maxfun,
+ maxiter=stopping_maxiter,
locally_biased=locally_biased,
f_min=convergence_minimum_criterion_value,
f_min_rtol=convergence_minimum_criterion_tolerance,
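Throughout `scipy_optimizers.py`, the renamed keyword arguments ultimately feed scipy's short option keys. A standalone sketch of the L-BFGS-B mapping from the first hunk of this file; the default values here are placeholders (the real defaults live in `optimagic.optimization.algo_options`), and the function is an illustration rather than optimagic code:

```python
def lbfgsb_options(
    convergence_ftol_rel=2e-9,   # placeholder default, not optimagic's
    convergence_gtol_abs=1e-5,   # placeholder default, not optimagic's
    stopping_maxfun=1_000_000,
    stopping_maxiter=1_000_000,
):
    """Translate scipy-aligned optimagic names into scipy's option keys.

    Mirrors the options dict built in scipy_lbfgsb above; the result
    would be passed as options= to scipy.optimize.minimize(method="L-BFGS-B").
    """
    return {
        "ftol": convergence_ftol_rel,
        "gtol": convergence_gtol_abs,
        "maxfun": stopping_maxfun,
        "maxiter": stopping_maxiter,
    }
```

The point of the rename is visible here: `convergence_ftol_rel` now reads as a direct long form of scipy's `ftol`.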
diff --git a/src/optimagic/optimizers/tao_optimizers.py b/src/optimagic/optimizers/tao_optimizers.py
index f7f5039b2..012d1a934 100644
--- a/src/optimagic/optimizers/tao_optimizers.py
+++ b/src/optimagic/optimizers/tao_optimizers.py
@@ -9,10 +9,10 @@
from optimagic.decorators import mark_minimizer
from optimagic.exceptions import NotInstalledError
from optimagic.optimization.algo_options import (
- CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- CONVERGENCE_SCALED_GRADIENT_TOLERANCE,
- STOPPING_MAX_ITERATIONS,
+ CONVERGENCE_GTOL_ABS,
+ CONVERGENCE_GTOL_REL,
+ CONVERGENCE_GTOL_SCALED,
+ STOPPING_MAXITER,
)
from optimagic.utilities import calculate_trustregion_initial_radius
@@ -32,11 +32,11 @@ def tao_pounders(
lower_bounds,
upper_bounds,
*,
- convergence_absolute_gradient_tolerance=CONVERGENCE_ABSOLUTE_GRADIENT_TOLERANCE,
- convergence_relative_gradient_tolerance=CONVERGENCE_RELATIVE_GRADIENT_TOLERANCE,
- convergence_scaled_gradient_tolerance=CONVERGENCE_SCALED_GRADIENT_TOLERANCE,
+ convergence_gtol_abs=CONVERGENCE_GTOL_ABS,
+ convergence_gtol_rel=CONVERGENCE_GTOL_REL,
+ convergence_gtol_scaled=CONVERGENCE_GTOL_SCALED,
trustregion_initial_radius=None,
- stopping_max_iterations=STOPPING_MAX_ITERATIONS,
+ stopping_maxiter=STOPPING_MAXITER,
):
r"""Minimize a function using the POUNDERs algorithm.
@@ -102,21 +102,9 @@ def func_tao(tao, x, resid_out): # noqa: ARG001
# scaled_gradient_tolerance manually we manually set absolute_gradient_tolerance and
# or relative_gradient_tolerance to zero once a subset of these two is turned off
# and scaled_gradient_tolerance is still turned on.
- default_gatol = (
- convergence_absolute_gradient_tolerance
- if convergence_absolute_gradient_tolerance
- else -1
- )
- default_gttol = (
- convergence_scaled_gradient_tolerance
- if convergence_scaled_gradient_tolerance
- else -1
- )
- default_grtol = (
- convergence_relative_gradient_tolerance
- if convergence_relative_gradient_tolerance
- else -1
- )
+ default_gatol = convergence_gtol_abs if convergence_gtol_abs else -1
+ default_gttol = convergence_gtol_scaled if convergence_gtol_scaled else -1
+ default_grtol = convergence_gtol_rel if convergence_gtol_rel else -1
# Set tolerances for default convergence tests.
tao.setTolerances(
gatol=default_gatol,
@@ -126,28 +114,18 @@ def func_tao(tao, x, resid_out): # noqa: ARG001
# Set user defined convergence tests. Beware that specifying multiple tests could
# overwrite others or lead to unclear behavior.
- if stopping_max_iterations is not None:
- tao.setConvergenceTest(functools.partial(_max_iters, stopping_max_iterations))
- elif (
- convergence_scaled_gradient_tolerance is False
- and convergence_absolute_gradient_tolerance is False
- ):
- tao.setConvergenceTest(
- functools.partial(_grtol_conv, convergence_relative_gradient_tolerance)
- )
- elif (
- convergence_relative_gradient_tolerance is False
- and convergence_scaled_gradient_tolerance is False
- ):
- tao.setConvergenceTest(
- functools.partial(_gatol_conv, convergence_absolute_gradient_tolerance)
- )
- elif convergence_scaled_gradient_tolerance is False:
+ if stopping_maxiter is not None:
+ tao.setConvergenceTest(functools.partial(_max_iters, stopping_maxiter))
+ elif convergence_gtol_scaled is False and convergence_gtol_abs is False:
+ tao.setConvergenceTest(functools.partial(_grtol_conv, convergence_gtol_rel))
+ elif convergence_gtol_rel is False and convergence_gtol_scaled is False:
+ tao.setConvergenceTest(functools.partial(_gatol_conv, convergence_gtol_abs))
+ elif convergence_gtol_scaled is False:
tao.setConvergenceTest(
functools.partial(
_grtol_gatol_conv,
- convergence_relative_gradient_tolerance,
- convergence_absolute_gradient_tolerance,
+ convergence_gtol_rel,
+ convergence_gtol_abs,
)
)
@@ -254,8 +232,8 @@ def _process_pounders_results(residuals_out, tao):
"solution_criterion": tao.function,
"solution_derivative": None,
"solution_hessian": None,
- "n_criterion_evaluations": tao.getIterationNumber(),
- "n_derivative_evaluations": None,
+ "n_fun_evals": tao.getIterationNumber(),
+ "n_jac_evals": None,
"n_iterations": None,
"success": bool(convergence_code >= 0),
"reached_convergence_criterion": (
diff --git a/src/optimagic/shared/process_user_function.py b/src/optimagic/shared/process_user_function.py
index f53e7e57d..3a4471570 100644
--- a/src/optimagic/shared/process_user_function.py
+++ b/src/optimagic/shared/process_user_function.py
@@ -91,3 +91,20 @@ def get_arguments_without_default(func):
no_default = set(no_default)
return no_default
+
+
+def get_kwargs_from_args(args, func, offset=0):
+ """Convert positional arguments to a dict of keyword arguments.
+
+ Args:
+ args (list, tuple): Positional arguments.
+ func (callable): Function to be called.
+ offset (int, optional): Number of arguments to skip. Defaults to 0.
+
+ Returns:
+ dict: Keyword arguments.
+
+ """
+ names = list(inspect.signature(func).parameters)[offset:]
+ kwargs = {name: arg for name, arg in zip(names, args)}
+ return kwargs
diff --git a/src/optimagic/typing.py b/src/optimagic/typing.py
new file mode 100644
index 000000000..f3fb9a5c1
--- /dev/null
+++ b/src/optimagic/typing.py
@@ -0,0 +1,4 @@
+from typing import Any
+
+
+PyTree = Any
diff --git a/tests/estimagic/test_estimate_ml.py b/tests/estimagic/test_estimate_ml.py
index 17245517d..2604b4de6 100644
--- a/tests/estimagic/test_estimate_ml.py
+++ b/tests/estimagic/test_estimate_ml.py
@@ -131,7 +131,7 @@ def logit_loglike_and_derivative(params, y, x):
"scipy_lbfgsb",
{
"algorithm": "scipy_lbfgsb",
- "criterion_and_derivative": logit_loglike_and_derivative,
+ "fun_and_jac": logit_loglike_and_derivative,
},
], # optimize_options
[None, logit_jacobian, False], # jacobian
@@ -160,8 +160,8 @@ def test_estimate_ml_with_logit_no_constraints(
kwargs = {"y": logit_np_inputs["y"], "x": logit_np_inputs["x"]}
- if "criterion_and_derivative" in optimize_options:
- optimize_options["criterion_and_derivative_kwargs"] = kwargs
+ if "fun_and_jac" in optimize_options:
+ optimize_options["fun_and_jac_kwargs"] = kwargs
got = estimate_ml(
loglike=logit_loglike,
@@ -265,11 +265,11 @@ def test_estimate_ml_with_logit_constraints(
optimize_options = {
"algorithm": "scipy_lbfgsb",
- "algo_options": {"convergence.relative_criterion_tolerance": 1e-12},
+ "algo_options": {"convergence.ftol_rel": 1e-12},
}
- if "criterion_and_derivative" in optimize_options:
- optimize_options["criterion_and_derivative_kwargs"] = kwargs
+ if "fun_and_jac" in optimize_options:
+ optimize_options["fun_and_jac_kwargs"] = kwargs
got = estimate_ml(
loglike=logit_loglike,
diff --git a/tests/optimagic/benchmarking/test_cartis_roberts.py b/tests/optimagic/benchmarking/test_cartis_roberts.py
index 6aac6fd5d..b5a5ab53b 100644
--- a/tests/optimagic/benchmarking/test_cartis_roberts.py
+++ b/tests/optimagic/benchmarking/test_cartis_roberts.py
@@ -10,7 +10,7 @@
@pytest.mark.parametrize("name, specification", list(CARTIS_ROBERTS_PROBLEMS.items()))
def test_cartis_roberts_function_at_start_x(name, specification): # noqa: ARG001
- _criterion = specification["criterion"]
+ _criterion = specification["fun"]
_x = np.array(specification["start_x"])
assert isinstance(specification["start_x"], list)
_contributions = _criterion(_x)
@@ -22,7 +22,7 @@ def test_cartis_roberts_function_at_start_x(name, specification): # noqa: ARG00
@pytest.mark.parametrize("name, specification", list(CARTIS_ROBERTS_PROBLEMS.items()))
def test_cartis_roberts_function_at_solution_x(name, specification): # noqa: ARG001
- _criterion = specification["criterion"]
+ _criterion = specification["fun"]
_x = specification["solution_x"]
if _x is not None:
assert isinstance(_x, list)
diff --git a/tests/optimagic/benchmarking/test_get_benchmark_problems.py b/tests/optimagic/benchmarking/test_get_benchmark_problems.py
index aa628a92b..d08fbdae0 100644
--- a/tests/optimagic/benchmarking/test_get_benchmark_problems.py
+++ b/tests/optimagic/benchmarking/test_get_benchmark_problems.py
@@ -26,7 +26,7 @@ def test_get_problems(name, additive_noise, multiplicative_noise, scaling):
)
first_name = list(problems)[0]
first = problems[first_name]
- func = first["inputs"]["criterion"]
+ func = first["inputs"]["fun"]
params = first["inputs"]["params"]
first_eval = func(params)["value"]
diff --git a/tests/optimagic/benchmarking/test_more_wild.py b/tests/optimagic/benchmarking/test_more_wild.py
index 0e3b6c697..b7427cca0 100644
--- a/tests/optimagic/benchmarking/test_more_wild.py
+++ b/tests/optimagic/benchmarking/test_more_wild.py
@@ -8,7 +8,7 @@
@pytest.mark.parametrize("name, specification", list(MORE_WILD_PROBLEMS.items()))
def test_more_wild_function_at_start_x(name, specification): # noqa: ARG001
- _criterion = specification["criterion"]
+ _criterion = specification["fun"]
assert isinstance(specification["start_x"], list)
_x = np.array(specification["start_x"])
_contributions = _criterion(_x)
diff --git a/tests/optimagic/benchmarking/test_run_benchmark.py b/tests/optimagic/benchmarking/test_run_benchmark.py
index 9b0a1d3f0..aa32a293b 100644
--- a/tests/optimagic/benchmarking/test_run_benchmark.py
+++ b/tests/optimagic/benchmarking/test_run_benchmark.py
@@ -91,8 +91,8 @@ def get_benchmark_problems_custom(raw_problems, internal_criterion):
"inputs": inputs,
"solution": _create_problem_solution_custom(specification),
"info": specification.get("info", {}),
- "noise_free_criterion": partial(
- internal_criterion, criterion=specification["criterion"]
+ "noise_free_fun": partial(
+ internal_criterion, criterion=specification["fun"]
),
"noisy": False,
}
@@ -103,11 +103,11 @@ def get_benchmark_problems_custom(raw_problems, internal_criterion):
def _create_problem_inputs_custom(specification, internal_criterion_func):
_criterion = partial(
internal_criterion_func,
- criterion=specification["criterion"],
+ criterion=specification["fun"],
)
_params = specification["start_x"]
- inputs = {"criterion": _criterion, "params": _params}
+ inputs = {"fun": _criterion, "params": _params}
return inputs
@@ -155,14 +155,14 @@ def _internal_criterion_dict(params, criterion):
problems_pandas_input = {
"linear_full_rank_good_start": {
- "criterion": partial(linear_full_rank, dim_out=45),
+ "fun": partial(linear_full_rank, dim_out=45),
"start_x": pd.DataFrame(np.ones(9), columns=["value"]),
"solution_x": pd.DataFrame(linear_full_rank_solution_x, columns=["value"]),
"start_criterion": 72,
"solution_criterion": 36,
},
"rosenbrock_good_start": {
- "criterion": rosenbrock,
+ "fun": rosenbrock,
"start_x": pd.DataFrame([-1.2, 1], columns=["value"]),
"solution_x": pd.DataFrame(np.ones(2), columns=["value"]),
"start_criterion": 24.2,
@@ -173,7 +173,7 @@ def _internal_criterion_dict(params, criterion):
prolems_dict_input = {
"linear_full_rank_good_start": {
- "criterion": partial(linear_full_rank, dim_out=45),
+ "fun": partial(linear_full_rank, dim_out=45),
"start_x": {"a": 1, "b": np.ones((2, 2, 2))},
"solution_x": {
"a": linear_full_rank_solution_x[0],
@@ -183,7 +183,7 @@ def _internal_criterion_dict(params, criterion):
"solution_criterion": 36,
},
"rosenbrock_good_start": {
- "criterion": rosenbrock,
+ "fun": rosenbrock,
"start_x": {"a": np.array([-1.2, 1])},
"solution_x": {"a": np.ones(2)},
"start_criterion": 24.2,
diff --git a/tests/optimagic/logging/test_read_log.py b/tests/optimagic/logging/test_read_log.py
index a030675a1..4b6c890c7 100644
--- a/tests/optimagic/logging/test_read_log.py
+++ b/tests/optimagic/logging/test_read_log.py
@@ -21,7 +21,7 @@ def _crit(params):
return x @ x
minimize(
- criterion=_crit,
+ fun=_crit,
params={"a": 1, "b": 2, "c": 3},
algorithm="scipy_lbfgsb",
logging=path,
diff --git a/tests/optimagic/optimization/test_criterion_versions.py b/tests/optimagic/optimization/test_criterion_versions.py
index e111c84d2..d04d05f64 100644
--- a/tests/optimagic/optimization/test_criterion_versions.py
+++ b/tests/optimagic/optimization/test_criterion_versions.py
@@ -49,7 +49,7 @@ def test_valid_criterion_versions(criterion, algorithm):
start_params = pd.DataFrame()
start_params["value"] = [1, 2, 3]
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=start_params,
algorithm=algorithm,
)
@@ -64,7 +64,7 @@ def test_invalid_criterion_versions(criterion, algorithm):
with pytest.raises(InvalidFunctionError):
minimize(
- criterion=criterion,
+ fun=criterion,
params=start_params,
algorithm=algorithm,
)
diff --git a/tests/optimagic/optimization/test_derivative_versions.py b/tests/optimagic/optimization/test_derivative_versions.py
index 5f13173c9..e84595c99 100644
--- a/tests/optimagic/optimization/test_derivative_versions.py
+++ b/tests/optimagic/optimization/test_derivative_versions.py
@@ -69,11 +69,11 @@ def test_valid_derivative_versions(
if direction == "minimize":
res = minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=start_params,
algorithm=algorithm,
- derivative=derivative,
- criterion_and_derivative=criterion_and_derivative,
+ jac=derivative,
+ fun_and_jac=criterion_and_derivative,
error_handling="raise",
)
else:
@@ -84,11 +84,11 @@ def test_valid_derivative_versions(
else switch_sign(criterion_and_derivative)
)
res = maximize(
- criterion=switch_sign(sos_dict_criterion),
+ fun=switch_sign(sos_dict_criterion),
params=start_params,
algorithm=algorithm,
- derivative=deriv,
- criterion_and_derivative=crit_and_deriv,
+ jac=deriv,
+ fun_and_jac=crit_and_deriv,
error_handling="raise",
)
@@ -107,11 +107,11 @@ def test_invalid_derivative_versions(
if direction == "minimize":
with pytest.raises(ValueError):
minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=start_params,
algorithm=algorithm,
- derivative=derivative,
- criterion_and_derivative=criterion_and_derivative,
+ jac=derivative,
+ fun_and_jac=criterion_and_derivative,
)
else:
deriv = derivative if derivative is None else switch_sign(derivative)
@@ -122,11 +122,11 @@ def test_invalid_derivative_versions(
)
with pytest.raises(ValueError):
maximize(
- criterion=switch_sign(sos_dict_criterion),
+ fun=switch_sign(sos_dict_criterion),
params=start_params,
algorithm=algorithm,
- derivative=deriv,
- criterion_and_derivative=crit_and_deriv,
+ jac=deriv,
+ fun_and_jac=crit_and_deriv,
)
@@ -135,10 +135,10 @@ def test_dict_derivative():
start_params["value"] = [1, 2, 3]
res = minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=start_params,
algorithm="scipy_lbfgsb",
- derivative=sos_dict_derivative,
+ jac=sos_dict_derivative,
)
aaae(res.params["value"].to_numpy(), np.zeros(3))
diff --git a/tests/optimagic/optimization/test_history_collection.py b/tests/optimagic/optimization/test_history_collection.py
index f77d3620e..e3d2409a6 100644
--- a/tests/optimagic/optimization/test_history_collection.py
+++ b/tests/optimagic/optimization/test_history_collection.py
@@ -29,7 +29,7 @@ def test_history_collection_with_parallelization(algorithm, tmp_path):
logging = tmp_path / "log.db"
collected_hist = minimize(
- criterion=lambda x: {"root_contributions": x, "value": x @ x},
+ fun=lambda x: {"root_contributions": x, "value": x @ x},
params=np.arange(5),
algorithm=algorithm,
lower_bounds=lb,
@@ -68,7 +68,7 @@ def _dummy_optimizer(criterion, x, n_cores, batch_size, batch_evaluator):
out = {
"solution_x": xs[-1],
"solution_criterion": 5,
- "n_criterion_evaluations": 15,
+ "n_fun_evals": 15,
"n_iterations": 3,
"success": True,
}
@@ -112,7 +112,7 @@ def test_history_collection_with_dummy_optimizer(n_cores, batch_size):
}
res = minimize(
- criterion=_fake_criterion,
+ fun=_fake_criterion,
params=np.arange(5),
algorithm=_dummy_optimizer,
algo_options=options,
diff --git a/tests/optimagic/optimization/test_jax_derivatives.py b/tests/optimagic/optimization/test_jax_derivatives.py
index fd8c043a4..15c77d250 100644
--- a/tests/optimagic/optimization/test_jax_derivatives.py
+++ b/tests/optimagic/optimization/test_jax_derivatives.py
@@ -26,8 +26,8 @@ def criterion(x):
gradient = jax.grad(criterion)
res = minimize(
- criterion=criterion,
- derivative=gradient,
+ fun=criterion,
+ jac=gradient,
params=start_params,
algorithm="scipy_lbfgsb",
)
@@ -44,10 +44,10 @@ def criterion(x):
return x**2
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=jnp.array(1.0),
algorithm="scipy_lbfgsb",
- derivative=jax.grad(criterion),
+ jac=jax.grad(criterion),
)
assert isinstance(res.params, jnp.ndarray)
@@ -60,10 +60,10 @@ def criterion(x):
return x @ x
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=jnp.arange(3),
algorithm="scipy_lbfgsb",
- derivative=jax.grad(criterion),
+ jac=jax.grad(criterion),
)
assert isinstance(res.params, jnp.ndarray)
@@ -88,10 +88,10 @@ def ls_wrapper(x):
}
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=jnp.array([1.0, 2.0, 3.0]),
algorithm=algorithm,
- derivative=deriv_dict,
+ jac=deriv_dict,
)
assert isinstance(res.params, jnp.ndarray)
@@ -110,10 +110,10 @@ def ls_wrapper(x):
jac = jax.jacobian(ls_wrapper)
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=params,
algorithm="scipy_ls_lm",
- derivative=jac,
+ jac=jac,
)
assert isinstance(res.params, dict)
diff --git a/tests/optimagic/optimization/test_many_algorithms.py b/tests/optimagic/optimization/test_many_algorithms.py
index 98586e46e..429354d06 100644
--- a/tests/optimagic/optimization/test_many_algorithms.py
+++ b/tests/optimagic/optimization/test_many_algorithms.py
@@ -39,7 +39,7 @@ def sos(x):
@pytest.mark.parametrize("algorithm", LOCAL_ALGORITHMS)
def test_algorithm_on_sum_of_squares(algorithm):
res = minimize(
- criterion=sos,
+ fun=sos,
params=np.arange(3),
algorithm=algorithm,
collect_history=True,
@@ -52,7 +52,7 @@ def test_algorithm_on_sum_of_squares(algorithm):
@pytest.mark.parametrize("algorithm", BOUNDED_ALGORITHMS)
def test_algorithm_on_sum_of_squares_with_binding_bounds(algorithm):
res = minimize(
- criterion=sos,
+ fun=sos,
params=np.array([3, 2, -3]),
lower_bounds=np.array([1, -np.inf, -np.inf]),
upper_bounds=np.array([np.inf, np.inf, -1]),
@@ -75,7 +75,7 @@ def test_algorithm_on_sum_of_squares_with_binding_bounds(algorithm):
@pytest.mark.parametrize("algorithm", GLOBAL_ALGORITHMS_AVAILABLE)
def test_global_algorithms_on_sum_of_squares(algorithm):
res = minimize(
- criterion=sos,
+ fun=sos,
params=np.array([0.35, 0.35]),
lower_bounds=np.array([0.2, -0.5]),
upper_bounds=np.array([1, 0.5]),
diff --git a/tests/optimagic/optimization/test_multistart.py b/tests/optimagic/optimization/test_multistart.py
index f9b426679..fbcd5c92d 100644
--- a/tests/optimagic/optimization/test_multistart.py
+++ b/tests/optimagic/optimization/test_multistart.py
@@ -36,14 +36,14 @@ def test_multistart_minimize_with_sum_of_squares_at_defaults(
):
if direction == "minimize":
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
)
else:
res = maximize(
- criterion=switch_sign(sos_dict_criterion),
+ fun=switch_sign(sos_dict_criterion),
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
@@ -56,7 +56,7 @@ def test_multistart_minimize_with_sum_of_squares_at_defaults(
assert all(isinstance(entry, float) for entry in ms_info["exploration_results"])
assert all(isinstance(entry, OptimizeResult) for entry in ms_info["local_optima"])
assert all(isinstance(entry, pd.DataFrame) for entry in ms_info["start_parameters"])
- assert np.allclose(res.criterion, 0)
+ assert np.allclose(res.fun, 0)
aaae(res.params["value"], np.zeros(4))
@@ -68,7 +68,7 @@ def test_multistart_with_existing_sample(params):
options = {"sample": sample}
res = minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
@@ -86,7 +86,7 @@ def test_convergence_via_max_discoveries_works(params):
}
res = maximize(
- criterion=switch_sign(sos_dict_criterion),
+ fun=switch_sign(sos_dict_criterion),
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
@@ -103,7 +103,7 @@ def test_steps_are_logged_as_skipped_if_convergence(params):
}
minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
@@ -120,7 +120,7 @@ def test_all_steps_occur_in_optimization_iterations_if_no_convergence(params):
options = {"convergence_max_discoveries": np.inf}
minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=params,
algorithm="scipy_lbfgsb",
multistart=True,
@@ -143,7 +143,7 @@ def test_all_steps_occur_in_optimization_iterations_if_no_convergence(params):
def test_with_non_transforming_constraints(params):
res = minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=params,
constraints=[{"loc": [0, 1], "type": "fixed", "value": [0, 1]}],
algorithm="scipy_lbfgsb",
@@ -156,7 +156,7 @@ def test_with_non_transforming_constraints(params):
def test_error_is_raised_with_transforming_constraints(params):
with pytest.raises(NotImplementedError):
minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=params,
constraints=[{"loc": [0, 1], "type": "probability"}],
algorithm="scipy_lbfgsb",
@@ -171,7 +171,7 @@ def _params_list_to_aray(params_list):
def test_multistart_with_numpy_params():
res = minimize(
- criterion=lambda params: params @ params,
+ fun=lambda params: params @ params,
params=np.arange(5),
algorithm="scipy_lbfgsb",
soft_lower_bounds=np.full(5, -10),
@@ -185,7 +185,7 @@ def test_multistart_with_numpy_params():
def test_with_invalid_bounds():
with pytest.raises(ValueError):
minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(5),
algorithm="scipy_neldermead",
multistart=True,
@@ -198,7 +198,7 @@ def _crit(params):
return x @ x
res = minimize(
- criterion=_crit,
+ fun=_crit,
params=np.full(5, 10),
soft_lower_bounds=np.full(5, -1),
soft_upper_bounds=np.full(5, 11),
@@ -222,11 +222,11 @@ def ackley(x):
dim = 5
kwargs = {
- "criterion": ackley,
+ "fun": ackley,
"params": np.full(dim, -10),
"lower_bounds": np.full(dim, -32),
"upper_bounds": np.full(dim, 32),
- "algo_options": {"stopping.max_criterion_evaluations": 1000},
+ "algo_options": {"stopping.maxfun": 1000},
}
minimize(
@@ -243,7 +243,7 @@ def ackley(x):
def test_multistart_with_least_squares_optimizers():
est = minimize(
- criterion=sos_dict_criterion,
+ fun=sos_dict_criterion,
params=np.array([-1, 1.0]),
lower_bounds=np.full(2, -10.0),
upper_bounds=np.full(2, 10.0),
diff --git a/tests/optimagic/optimization/test_optimizations_with_scaling.py b/tests/optimagic/optimization/test_optimizations_with_scaling.py
index 4679727b6..5d6db79f7 100644
--- a/tests/optimagic/optimization/test_optimizations_with_scaling.py
+++ b/tests/optimagic/optimization/test_optimizations_with_scaling.py
@@ -41,13 +41,13 @@ def test_optimizations_with_scaling(algorithm, scaling_options):
constraints = [{"loc": [3, 4], "type": "fixed"}]
res = minimize(
- criterion=sos_scalar_criterion,
+ fun=sos_scalar_criterion,
params=params,
constraints=constraints,
algorithm=algorithm,
scaling=True,
scaling_options=scaling_options,
- derivative=sos_gradient,
+ jac=sos_gradient,
)
expected_solution = np.array([0, 0, 0, 3, 4])
diff --git a/tests/optimagic/optimization/test_optimize.py b/tests/optimagic/optimization/test_optimize.py
index 2d3094e4c..2f9b86d84 100644
--- a/tests/optimagic/optimization/test_optimize.py
+++ b/tests/optimagic/optimization/test_optimize.py
@@ -17,7 +17,7 @@ def test_sign_is_switched_back_after_maximization():
algorithm="scipy_lbfgsb",
)
- assert np.allclose(res.criterion, 1)
+ assert np.allclose(res.fun, 1)
def test_scipy_lbfgsb_actually_calls_criterion_and_derivative():
@@ -28,17 +28,17 @@ def raising_crit_and_deriv(params): # noqa: ARG001
with pytest.raises(InvalidFunctionError, match="Error while evaluating"):
minimize(
- criterion=sos_scalar_criterion,
+ fun=sos_scalar_criterion,
params=params,
algorithm="scipy_lbfgsb",
- criterion_and_derivative=raising_crit_and_deriv,
+ fun_and_jac=raising_crit_and_deriv,
)
def test_with_invalid_numdiff_options():
with pytest.raises(InvalidKwargsError):
minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(5),
algorithm="scipy_lbfgsb",
numdiff_options={"bla": 15},
diff --git a/tests/optimagic/optimization/test_optimize_result.py b/tests/optimagic/optimization/test_optimize_result.py
index a32470c92..24100bf6b 100644
--- a/tests/optimagic/optimization/test_optimize_result.py
+++ b/tests/optimagic/optimization/test_optimize_result.py
@@ -31,14 +31,14 @@ def convergence_report():
def base_inputs():
out = {
"params": np.ones(3),
- "criterion": 500,
- "start_criterion": 1000,
+ "fun": 500,
+ "start_fun": 1000,
"start_params": np.full(3, 10),
"direction": "minimize",
"message": "OPTIMIZATION TERMINATED SUCCESSFULLY",
"success": True,
- "n_criterion_evaluations": 100,
- "n_derivative_evaluations": 0,
+ "n_fun_evals": 100,
+ "n_jac_evals": 0,
"n_iterations": 80,
"history": {"criterion": list(range(10))},
"algorithm": "scipy_lbfgsb",
@@ -68,3 +68,9 @@ def test_to_pickle(base_inputs, convergence_report, tmp_path):
**base_inputs,
)
res.to_pickle(tmp_path / "bla.pkl")
+
+
+def test_dict_access(base_inputs):
+ res = OptimizeResult(**base_inputs)
+ assert res["fun"] == 500
+ assert res["nfev"] == 100
diff --git a/tests/optimagic/optimization/test_params_versions.py b/tests/optimagic/optimization/test_params_versions.py
index 8172d59a8..8557cecac 100644
--- a/tests/optimagic/optimization/test_params_versions.py
+++ b/tests/optimagic/optimization/test_params_versions.py
@@ -51,7 +51,7 @@ def test_tree_params_numerical_derivative_scalar_criterion(params):
expected = np.zeros_like(flat)
res = minimize(
- criterion=flexible_sos_scalar,
+ fun=flexible_sos_scalar,
params=params,
algorithm="scipy_lbfgsb",
)
@@ -65,8 +65,8 @@ def test_tree_params_scalar_criterion(params):
expected = np.zeros_like(flat)
res = minimize(
- criterion=flexible_sos_scalar,
- derivative=flexible_sos_scalar_derivative,
+ fun=flexible_sos_scalar,
+ jac=flexible_sos_scalar_derivative,
params=params,
algorithm="scipy_lbfgsb",
)
@@ -86,7 +86,7 @@ def test_tree_params_numerical_derivative_sos_ls(params, algorithm):
expected = np.zeros_like(flat)
res = minimize(
- criterion=flexible_sos_ls,
+ fun=flexible_sos_ls,
params=params,
algorithm=algorithm,
)
@@ -105,8 +105,8 @@ def test_tree_params_sos_ls(params, algorithm):
}
res = minimize(
- criterion=flexible_sos_ls,
- derivative=derivatives,
+ fun=flexible_sos_ls,
+ jac=derivatives,
params=params,
algorithm=algorithm,
)
diff --git a/tests/optimagic/optimization/test_scipy_aliases.py b/tests/optimagic/optimization/test_scipy_aliases.py
new file mode 100644
index 000000000..e0cfbb730
--- /dev/null
+++ b/tests/optimagic/optimization/test_scipy_aliases.py
@@ -0,0 +1,203 @@
+import optimagic as om
+import numpy as np
+from numpy.testing import assert_array_almost_equal as aaae
+from optimagic.exceptions import AliasError
+import pytest
+
+
+def test_x0_works_in_minimize():
+ res = om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ )
+ aaae(res.params, np.zeros(3))
+
+
+def test_x0_works_in_maximize():
+ res = om.maximize(
+ fun=lambda x: -x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ )
+ aaae(res.params, np.zeros(3))
+
+
+def test_x0_and_params_do_not_work_together_in_minimize():
+ with pytest.raises(AliasError, match="x0 is an alias"):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ )
+
+
+def test_x0_and_params_do_not_work_together_in_maximize():
+ with pytest.raises(AliasError, match="x0 is an alias"):
+ om.maximize(
+ fun=lambda x: -x @ x,
+ x0=np.arange(3),
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ )
+
+
+METHODS = [
+ "Nelder-Mead",
+ "Powell",
+ "CG",
+ "BFGS",
+ "Newton-CG",
+ "L-BFGS-B",
+ "TNC",
+ "COBYLA",
+ "SLSQP",
+ "trust-constr",
+]
+
+
+@pytest.mark.parametrize("method", METHODS)
+def test_method_works_in_minimize(method):
+ res = om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ method=method,
+ )
+ aaae(res.params, np.zeros(3))
+
+
+@pytest.mark.parametrize("method", METHODS)
+def test_method_works_in_maximize(method):
+ res = om.maximize(
+ fun=lambda x: -x @ x,
+ x0=np.arange(3),
+ method=method,
+ )
+ aaae(res.params, np.zeros(3))
+
+
+def test_method_and_algorithm_do_not_work_together_in_minimize():
+ with pytest.raises(AliasError, match="method is an alias"):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ method="L-BFGS-B",
+ )
+
+
+def test_method_and_algorithm_do_not_work_together_in_maximize():
+ with pytest.raises(AliasError, match="method is an alias"):
+ om.maximize(
+ fun=lambda x: -x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ method="L-BFGS-B",
+ )
+
+
+def test_exception_for_hess():
+ msg = "The hess argument is not yet supported"
+ with pytest.raises(NotImplementedError, match=msg):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ hess=lambda x: np.eye(len(x)),
+ )
+
+
+def test_exception_for_hessp():
+ msg = "The hessp argument is not yet supported"
+ with pytest.raises(NotImplementedError, match=msg):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ hessp=lambda x, p: np.eye(len(x)) @ p,
+ )
+
+
+def test_exception_for_callback():
+ msg = "The callback argument is not yet supported"
+ with pytest.raises(NotImplementedError, match=msg):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ callback=lambda x: print(x),
+ )
+
+
+def test_exception_for_options():
+ msg = "The options argument is not supported"
+ with pytest.raises(NotImplementedError, match=msg):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ options={"maxiter": 100},
+ )
+
+
+def test_exception_for_tol():
+ msg = "The tol argument is not supported"
+ with pytest.raises(NotImplementedError, match=msg):
+ om.minimize(
+ fun=lambda x: x @ x,
+ x0=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ tol=1e-6,
+ )
+
+
+def test_args_works_in_minimize():
+ res = om.minimize(
+ fun=lambda x, a: ((x - a) ** 2).sum(),
+ x0=np.arange(3),
+ args=(1,),
+ algorithm="scipy_lbfgsb",
+ )
+ aaae(res.params, np.ones(3))
+
+
+def test_args_works_in_maximize():
+ res = om.maximize(
+ fun=lambda x, a: -((x - a) ** 2).sum(),
+ x0=np.arange(3),
+ args=(1,),
+ algorithm="scipy_lbfgsb",
+ )
+ aaae(res.params, np.ones(3))
+
+
+def test_args_does_not_work_together_with_any_kwargs():
+ with pytest.raises(AliasError, match="args is an alternative"):
+ om.minimize(
+ fun=lambda x, a: ((x - a) ** 2).sum(),
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ args=(1,),
+ fun_kwargs={"a": 1},
+ )
+
+
+def test_jac_equal_true_works_in_minimize():
+ res = om.minimize(
+ fun=lambda x: (x @ x, 2 * x),
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ jac=True,
+ )
+ aaae(res.params, np.zeros(3))
+
+
+def test_jac_equal_true_works_in_maximize():
+ res = om.maximize(
+ fun=lambda x: (-x @ x, -2 * x),
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ jac=True,
+ )
+ aaae(res.params, np.zeros(3))
diff --git a/tests/optimagic/optimization/test_useful_exceptions.py b/tests/optimagic/optimization/test_useful_exceptions.py
index f5ace0662..ee5c02fc6 100644
--- a/tests/optimagic/optimization/test_useful_exceptions.py
+++ b/tests/optimagic/optimization/test_useful_exceptions.py
@@ -16,7 +16,7 @@ def f(params, bla, blubb): # noqa: ARG001
params = pd.DataFrame(np.ones((3, 1)), columns=["value"])
with pytest.raises(InvalidKwargsError):
- minimize(f, params, "scipy_lbfgsb", criterion_kwargs={"bla": 3})
+ minimize(f, params, "scipy_lbfgsb", fun_kwargs={"bla": 3})
def test_missing_derivative_kwargs():
@@ -29,9 +29,7 @@ def grad(params, bla, blubb): # noqa: ARG001
params = pd.DataFrame(np.ones((3, 1)), columns=["value"])
with pytest.raises(InvalidKwargsError):
- minimize(
- f, params, "scipy_lbfgsb", derivative=grad, derivative_kwargs={"bla": 3}
- )
+ minimize(f, params, "scipy_lbfgsb", jac=grad, jac_kwargs={"bla": 3})
def test_missing_criterion_and_derivative_kwargs():
@@ -48,8 +46,8 @@ def f_and_grad(params, bla, blubb): # noqa: ARG001
f,
params,
"scipy_lbfgsb",
- criterion_and_derivative=f_and_grad,
- criterion_and_derivative_kwargs={"bla": 3},
+ fun_and_jac=f_and_grad,
+ fun_and_jac_kwargs={"bla": 3},
)
@@ -61,7 +59,7 @@ def f(params, bla, foo): # noqa: ARG001
snippet = "Did you mean"
with pytest.raises(InvalidKwargsError, match=snippet):
- minimize(f, params, "scipy_lbfgsb", criterion_kwargs={"bla": 3, "foa": 4})
+ minimize(f, params, "scipy_lbfgsb", fun_kwargs={"bla": 3, "foa": 4})
def test_criterion_with_runtime_error_derivative_free():
diff --git a/tests/optimagic/optimization/test_with_advanced_constraints.py b/tests/optimagic/optimization/test_with_advanced_constraints.py
index d5403f587..d43e52bed 100644
--- a/tests/optimagic/optimization/test_with_advanced_constraints.py
+++ b/tests/optimagic/optimization/test_with_advanced_constraints.py
@@ -70,10 +70,10 @@ def test_with_covariance_constraint_bounds_distance(derivative, constr_name):
params = pd.Series(START_INFO[constr_name], name="value").to_frame()
res = minimize(
- criterion=sos_scalar_criterion,
+ fun=sos_scalar_criterion,
params=params,
algorithm="scipy_lbfgsb",
- derivative=derivative,
+ jac=derivative,
constraints=CONSTR_INFO[constr_name],
)
diff --git a/tests/optimagic/optimization/test_with_constraints.py b/tests/optimagic/optimization/test_with_constraints.py
index af21e59cb..733b29d14 100644
--- a/tests/optimagic/optimization/test_with_constraints.py
+++ b/tests/optimagic/optimization/test_with_constraints.py
@@ -168,12 +168,12 @@ def test_constrained_minimization(
params = np.array(START_INFO[constraint_name])
res = minimize(
- criterion=criterion,
+ fun=criterion,
params=params,
algorithm=algorithm,
- derivative=derivative,
+ jac=derivative,
constraints=constraints,
- algo_options={"convergence.relative_criterion_tolerance": 1e-12},
+ algo_options={"convergence.ftol_rel": 1e-12},
)
if params_type == "pandas":
@@ -191,7 +191,7 @@ def test_constrained_minimization(
def test_fix_that_differs_from_start_value_raises_an_error():
with pytest.raises(InvalidParamsError):
minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(3),
algorithm="scipy_lbfgsb",
constraints=[{"loc": [1], "type": "fixed", "value": 10}],
@@ -209,11 +209,11 @@ def test_three_independent_constraints():
]
res = minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=params,
algorithm="scipy_lbfgsb",
constraints=constraints,
- algo_options={"convergence.relative_criterion_tolerance": 1e-12},
+ algo_options={"convergence.ftol_rel": 1e-12},
)
expected = np.array([0] * 4 + [4, 5] + [0] + [7.5] * 2 + [0])
@@ -235,7 +235,7 @@ def test_incompatible_constraints_raise_errors(constraints):
with pytest.raises(InvalidConstraintError):
minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=params,
algorithm="scipy_lbfgsb",
constraints=constraints,
@@ -258,7 +258,7 @@ def return_all_but_working_hours(params):
return out
res = maximize(
- criterion=u,
+ fun=u,
params=start_params,
algorithm="scipy_lbfgsb",
constraints=[
@@ -279,7 +279,7 @@ def test_constraint_inheritance():
equality constraint, no matter to which set they were applied originally."""
for loc in [[0, 1], [2, 3]]:
res = minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.array([0.1, 0.9, 0.9, 0.1]),
algorithm="scipy_lbfgsb",
constraints=[
@@ -314,8 +314,8 @@ def test_covariance_constraint_in_2_by_2_case():
kwargs = {"y": spector_data.endog, "x": x_df.to_numpy()}
result = maximize(
- criterion=logit_loglike,
- criterion_kwargs=kwargs,
+ fun=logit_loglike,
+ fun_kwargs=kwargs,
params=start_params,
algorithm="scipy_lbfgsb",
constraints={"loc": [1, 2, 3], "type": "covariance"},
diff --git a/tests/optimagic/optimization/test_with_nonlinear_constraints.py b/tests/optimagic/optimization/test_with_nonlinear_constraints.py
index ee7f00726..367a30da8 100644
--- a/tests/optimagic/optimization/test_with_nonlinear_constraints.py
+++ b/tests/optimagic/optimization/test_with_nonlinear_constraints.py
@@ -169,7 +169,7 @@ def test_documentation_example(algorithm):
del kwargs["upper_bounds"]
minimize(
- criterion=criterion,
+ fun=criterion,
params=np.ones(6),
algorithm=algorithm,
constraints={
diff --git a/tests/optimagic/optimizers/test_bhhh.py b/tests/optimagic/optimizers/test_bhhh.py
index 789a622b8..a5dbd1168 100644
--- a/tests/optimagic/optimizers/test_bhhh.py
+++ b/tests/optimagic/optimizers/test_bhhh.py
@@ -129,8 +129,8 @@ def test_maximum_likelihood(criterion_and_derivative, result_statsmodels, reques
result_bhhh = bhhh_internal(
criterion_and_derivative,
x=x,
- convergence_absolute_gradient_tolerance=1e-8,
- stopping_max_iterations=200,
+ convergence_gtol_abs=1e-8,
+ stopping_maxiter=200,
)
aaae(result_bhhh["solution_x"], result_expected.params, decimal=4)
diff --git a/tests/optimagic/optimizers/test_fides_options.py b/tests/optimagic/optimizers/test_fides_options.py
index 5a0c65ab2..389f83f77 100644
--- a/tests/optimagic/optimizers/test_fides_options.py
+++ b/tests/optimagic/optimizers/test_fides_options.py
@@ -22,11 +22,11 @@
{"hessian_update_strategy": "sr1"},
{"hessian_update_strategy": "DFP"},
{"hessian_update_strategy": "bb"},
- {"convergence_relative_criterion_tolerance": 1e-6},
- {"convergence_absolute_params_tolerance": 1e-6},
- {"convergence_absolute_gradient_tolerance": 1e-6},
- {"convergence_relative_gradient_tolerance": 1e-6},
- {"stopping_max_iterations": 100},
+ {"convergence_ftol_rel": 1e-6},
+ {"convergence_xtol_abs": 1e-6},
+ {"convergence_gtol_abs": 1e-6},
+ {"convergence_gtol_rel": 1e-6},
+ {"stopping_maxiter": 100},
{"stopping_max_seconds": 200},
{"trustregion_initial_radius": 20, "trustregion_stepback_strategy": "truncate"},
{"trustregion_subspace_dimension": "full"},
@@ -81,7 +81,7 @@ def test_fides_stop_after_one_iteration():
x=np.array([1, -5, 3]),
lower_bounds=np.array([-10, -10, -10]),
upper_bounds=np.array([10, 10, 10]),
- stopping_max_iterations=1,
+ stopping_maxiter=1,
)
assert not res["success"]
assert res["n_iterations"] == 1
diff --git a/tests/optimagic/optimizers/test_ipopt_options.py b/tests/optimagic/optimizers/test_ipopt_options.py
index d50bab776..6f9aba809 100644
--- a/tests/optimagic/optimizers/test_ipopt_options.py
+++ b/tests/optimagic/optimizers/test_ipopt_options.py
@@ -8,8 +8,8 @@
test_cases = [
{},
- {"convergence_relative_criterion_tolerance": 1e-7},
- {"stopping_max_iterations": 1_100_000},
+ {"convergence_ftol_rel": 1e-7},
+ {"stopping_maxiter": 1_100_000},
{"mu_target": 1e-8},
{"s_max": 200},
{"stopping_max_wall_time_seconds": 200},
diff --git a/tests/optimagic/optimizers/test_neldermead.py b/tests/optimagic/optimizers/test_neldermead.py
index d04727677..62ece7a81 100644
--- a/tests/optimagic/optimizers/test_neldermead.py
+++ b/tests/optimagic/optimizers/test_neldermead.py
@@ -128,7 +128,7 @@ def test_fides_stop_after_one_iteration():
res = neldermead_parallel(
criterion=sphere,
x=np.array([1, -5, 3]),
- stopping_max_iterations=1,
+ stopping_maxiter=1,
)
assert not res["success"]
assert res["n_iterations"] == 1
diff --git a/tests/optimagic/shared/__init__.py b/tests/optimagic/shared/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/tests/optimagic/test_process_function.py b/tests/optimagic/shared/test_process_function.py
similarity index 67%
rename from tests/optimagic/test_process_function.py
rename to tests/optimagic/shared/test_process_function.py
index 6c4cfdff7..44d67defd 100644
--- a/tests/optimagic/test_process_function.py
+++ b/tests/optimagic/shared/test_process_function.py
@@ -1,6 +1,9 @@
import pytest
from optimagic.exceptions import InvalidKwargsError
-from optimagic.shared.process_user_function import process_func_of_params
+from optimagic.shared.process_user_function import (
+ process_func_of_params,
+ get_kwargs_from_args,
+)
def test_process_func_of_params():
@@ -26,3 +29,13 @@ def f(params, b, c):
with pytest.raises(InvalidKwargsError):
process_func_of_params(f, {"c": 3})
+
+
+def test_get_kwargs_from_args():
+ def f(a, b, c=3, d=4):
+ return a + b + c
+
+ got = get_kwargs_from_args([1, 2], f, offset=1)
+ expected = {"b": 1, "c": 2}
+
+ assert got == expected
diff --git a/tests/optimagic/visualization/test_history_plots.py b/tests/optimagic/visualization/test_history_plots.py
index 903c25e5d..a31c7ecdc 100644
--- a/tests/optimagic/visualization/test_history_plots.py
+++ b/tests/optimagic/visualization/test_history_plots.py
@@ -13,7 +13,7 @@ def minimize_result():
res = []
for algorithm in ["scipy_neldermead", "scipy_lbfgsb"]:
_res = minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(5),
algorithm=algorithm,
soft_lower_bounds=np.full(5, -1),
@@ -96,7 +96,7 @@ def test_criterion_plot_wrong_results():
def test_criterion_plot_different_input_types():
# logged result
minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(5),
algorithm="scipy_lbfgsb",
soft_lower_bounds=np.full(5, -1),
@@ -108,7 +108,7 @@ def test_criterion_plot_different_input_types():
)
res = minimize(
- criterion=lambda x: x @ x,
+ fun=lambda x: x @ x,
params=np.arange(5),
algorithm="scipy_lbfgsb",
soft_lower_bounds=np.full(5, -1),
diff --git a/tests/test_deprecations.py b/tests/test_deprecations.py
index d442e2a35..a492b9f5a 100644
--- a/tests/test_deprecations.py
+++ b/tests/test_deprecations.py
@@ -25,6 +25,7 @@
from estimagic import OptimizeLogReader, OptimizeResult
from estimagic import criterion_plot, params_plot
import optimagic as om
+import warnings
# ======================================================================================
# Deprecated in 0.5.0, remove in 0.6.0
@@ -121,7 +122,7 @@ def _crit(params):
return x @ x
om.minimize(
- criterion=_crit,
+ fun=_crit,
params={"a": 1, "b": 2, "c": 3},
algorithm="scipy_lbfgsb",
logging=path,
@@ -142,8 +143,8 @@ def test_estimagic_optimize_result_is_deprecated():
with pytest.warns(FutureWarning, match=msg):
OptimizeResult(
params=res.params,
- criterion=res.criterion,
- start_criterion=res.start_criterion,
+ fun=res.fun,
+ start_fun=res.start_fun,
start_params=res.start_params,
algorithm=res.algorithm,
direction=res.direction,
@@ -275,3 +276,115 @@ def test_estimagic_params_plot_is_deprecated():
res = om.minimize(lambda x: x @ x, np.arange(3), algorithm="scipy_lbfgsb")
with pytest.warns(FutureWarning, match=msg):
params_plot(res)
+
+
+def test_criterion_is_deprecated():
+ msg = "the `criterion` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ criterion=lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ )
+
+
+def test_criterion_kwargs_is_deprecated():
+ msg = "the `criterion_kwargs` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x, a: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ criterion_kwargs={"a": 1},
+ )
+
+
+def test_derivative_is_deprecated():
+ msg = "the `derivative` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ derivative=lambda x: 2 * x,
+ )
+
+
+def test_derivative_kwargs_is_deprecated():
+ msg = "the `derivative_kwargs` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ jac=lambda x, a: 2 * x,
+ derivative_kwargs={"a": 1},
+ )
+
+
+def test_criterion_and_derivative_is_deprecated():
+ msg = "the `criterion_and_derivative` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ criterion_and_derivative=lambda x: (x @ x, 2 * x),
+ )
+
+
+def test_criterion_and_derivative_kwargs_is_deprecated():
+ msg = "the `criterion_and_derivative_kwargs` argument has been renamed"
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ fun_and_jac=lambda x, a: (x @ x, 2 * x),
+ criterion_and_derivative_kwargs={"a": 1},
+ )
+
+
+ALGO_OPTIONS = [
+ {"convergence_absolute_criterion_tolerance": 1e-8},
+ {"convergence_relative_criterion_tolerance": 1e-8},
+ {"convergence_absolute_params_tolerance": 1e-8},
+ {"convergence_relative_params_tolerance": 1e-8},
+ {"convergence_absolute_gradient_tolerance": 1e-8},
+ {"convergence_relative_gradient_tolerance": 1e-8},
+ {"convergence_scaled_gradient_tolerance": 1e-8},
+ {"stopping_max_iterations": 1_000},
+ {"stopping_max_criterion_evaluations": 1_000},
+]
+
+
+@pytest.mark.parametrize("algo_option", ALGO_OPTIONS)
+def test_old_algo_options_are_deprecated(algo_option):
+ msg = "The following keys in `algo_options` are deprecated"
+ with warnings.catch_warnings():
+ warnings.simplefilter("ignore", category=UserWarning)
+ with pytest.warns(FutureWarning, match=msg):
+ om.minimize(
+ lambda x: x @ x,
+ params=np.arange(3),
+ algorithm="scipy_lbfgsb",
+ algo_options=algo_option,
+ )
+
+
+def test_deprecated_attributes_of_optimize_result():
+ res = om.minimize(lambda x: x @ x, np.arange(3), algorithm="scipy_lbfgsb")
+
+ msg = "attribute is deprecated"
+
+ with pytest.warns(FutureWarning, match=msg):
+ _ = res.n_criterion_evaluations
+
+ with pytest.warns(FutureWarning, match=msg):
+ _ = res.n_derivative_evaluations
+
+ with pytest.warns(FutureWarning, match=msg):
+ _ = res.criterion
+
+ with pytest.warns(FutureWarning, match=msg):
+ _ = res.start_criterion