Commit: Polishing (#532)

janosg authored Sep 18, 2024
1 parent 786247c commit 42fd8a5
Showing 131 changed files with 302 additions and 184 deletions.
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
- id: check-useless-excludes
# - id: identity # Prints all files passed to pre-commits. Debugging.
- repo: https://github.com/lyz-code/yamlfix
rev: 1.16.0
rev: 1.17.0
hooks:
- id: yamlfix
exclude: tests/optimagic/optimizers/_pounders/fixtures
@@ -68,7 +68,7 @@ repos:
- --blank
exclude: src/optimagic/optimization/algo_options.py
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.5.7
rev: v0.6.3
hooks:
# Run the linter.
- id: ruff
@@ -119,7 +119,7 @@ repos:
args:
- --drop-empty-cells
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.11.1
rev: v1.11.2
hooks:
- id: mypy
files: src|tests
@@ -42,13 +42,14 @@
"metadata": {},
"outputs": [],
"source": [
"import estimagic as em\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import pandas as pd\n",
"import scipy\n",
"import statsmodels.api as sm\n",
"from joblib import Parallel, delayed"
"from joblib import Parallel, delayed\n",
"\n",
"import estimagic as em"
]
},
{
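Note on the notebook diffs in this commit: the only change to the import cells is that the first-party `estimagic` / `optimagic` imports move out of the third-party block into their own group after a blank line. This matches the standard-library / third-party / first-party grouping enforced by isort-style import sorting (for example ruff's isort-compatible rules, whose pre-commit hook is bumped above). A minimal sketch of the resulting layout; the particular modules are only illustrative:

```python
# Standard-library imports form the first block.
import warnings
from pathlib import Path

# Third-party imports form the second block.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from joblib import Parallel, delayed

# First-party imports (the project's own package) form the last block,
# separated from the third-party block by a blank line.
import estimagic as em
from estimagic.config import EXAMPLE_DIR
```

Most of the notebook diffs that follow apply exactly this regrouping.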
5 changes: 3 additions & 2 deletions docs/source/estimagic/tutorials/bootstrap_overview.ipynb
@@ -19,11 +19,12 @@
"metadata": {},
"outputs": [],
"source": [
"import estimagic as em\n",
"import numpy as np\n",
"import pandas as pd\n",
"import seaborn as sns\n",
"import statsmodels.api as sm"
"import statsmodels.api as sm\n",
"\n",
"import estimagic as em"
]
},
{
@@ -24,11 +24,12 @@
"outputs": [],
"source": [
"# Make necessary imports\n",
"import estimagic as em\n",
"import pandas as pd\n",
"import statsmodels.formula.api as sm\n",
"from estimagic.config import EXAMPLE_DIR\n",
"from IPython.core.display import HTML"
"from IPython.core.display import HTML\n",
"\n",
"import estimagic as em\n",
"from estimagic.config import EXAMPLE_DIR"
]
},
{
3 changes: 2 additions & 1 deletion docs/source/estimagic/tutorials/likelihood_overview.ipynb
@@ -34,11 +34,12 @@
"metadata": {},
"outputs": [],
"source": [
"import estimagic as em\n",
"import numpy as np\n",
"import pandas as pd\n",
"from scipy.stats import norm\n",
"\n",
"import estimagic as em\n",
"\n",
"rng = np.random.default_rng(seed=0)"
]
},
3 changes: 2 additions & 1 deletion docs/source/estimagic/tutorials/msm_overview.ipynb
@@ -42,10 +42,11 @@
"metadata": {},
"outputs": [],
"source": [
"import estimagic as em\n",
"import numpy as np\n",
"import pandas as pd\n",
"\n",
"import estimagic as em\n",
"\n",
"rng = np.random.default_rng(seed=0)"
]
},
5 changes: 3 additions & 2 deletions docs/source/explanation/why_optimization_is_hard.ipynb
@@ -28,8 +28,9 @@
"outputs": [],
"source": [
"import numpy as np\n",
"import optimagic as om\n",
"import seaborn as sns"
"import seaborn as sns\n",
"\n",
"import optimagic as om"
]
},
{
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_algorithm_selection.ipynb
@@ -34,6 +34,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om"
]
},
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_bounds.ipynb
@@ -32,6 +32,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om"
]
},
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_criterion_function.ipynb
@@ -25,6 +25,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om\n",
"\n",
"\n",
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_derivatives.ipynb
@@ -38,6 +38,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om\n",
"\n",
"\n",
3 changes: 2 additions & 1 deletion docs/source/how_to/how_to_errors_during_optimization.ipynb
@@ -49,9 +49,10 @@
"import warnings\n",
"\n",
"import numpy as np\n",
"import optimagic as om\n",
"from scipy.optimize import minimize as scipy_minimize\n",
"\n",
"import optimagic as om\n",
"\n",
"warnings.simplefilter(\"ignore\")"
]
},
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_logging.ipynb
@@ -30,6 +30,7 @@
"from pathlib import Path\n",
"\n",
"import numpy as np\n",
"\n",
"import optimagic as om"
]
},
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_multistart.ipynb
@@ -36,6 +36,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om\n",
"\n",
"\n",
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_slice_plot.ipynb
@@ -31,6 +31,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om"
]
},
1 change: 1 addition & 0 deletions docs/source/how_to/how_to_visualize_histories.ipynb
@@ -23,6 +23,7 @@
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import optimagic as om"
]
},
5 changes: 3 additions & 2 deletions docs/source/tutorials/numdiff_overview.ipynb
@@ -17,8 +17,9 @@
"outputs": [],
"source": [
"import numpy as np\n",
"import optimagic as om\n",
"import pandas as pd"
"import pandas as pd\n",
"\n",
"import optimagic as om"
]
},
{
67 changes: 32 additions & 35 deletions docs/source/tutorials/optimization_overview.ipynb
@@ -16,15 +16,18 @@
"outputs": [],
"source": [
"import numpy as np\n",
"import optimagic as om\n",
"import pandas as pd"
"import pandas as pd\n",
"\n",
"import optimagic as om"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Basic usage of `minimize`"
"## Basic usage of `minimize`\n",
"\n",
"The basic usage of `optimagic.minimize` is very similar to `scipy.optimize.minimize`"
]
},
{
@@ -43,13 +46,13 @@
"metadata": {},
"outputs": [],
"source": [
"res = om.minimize(\n",
"lbfgsb_res = om.minimize(\n",
" fun=sphere,\n",
" params=np.arange(5),\n",
" algorithm=\"scipy_lbfgsb\",\n",
")\n",
"\n",
"res.params.round(5)"
"lbfgsb_res.params.round(5)"
]
},
{
Expand All @@ -58,7 +61,7 @@
"source": [
"## `params` do not have to be vectors\n",
"\n",
"In optimagic, params can by arbitrary [pytrees](https://jax.readthedocs.io/en/latest/pytrees.html). Examples are (nested) dictionaries of numbers, arrays and pandas objects. "
"In optimagic, params can by arbitrary [pytrees](https://jax.readthedocs.io/en/latest/pytrees.html). Examples are (nested) dictionaries of numbers, arrays and pandas objects. This is very useful if you have many parameters!"
]
},
{
@@ -77,20 +80,22 @@
"metadata": {},
"outputs": [],
"source": [
"res = om.minimize(\n",
"nm_res = om.minimize(\n",
" fun=dict_sphere,\n",
" params={\"a\": 0, \"b\": 1, \"c\": pd.Series([2, 3, 4])},\n",
" algorithm=\"scipy_powell\",\n",
" algorithm=\"scipy_neldermead\",\n",
")\n",
"\n",
"res.params"
"nm_res.params"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## The result contains all you need to know"
"## You can compare optimizers\n",
"\n",
"In practice, it is super hard to pick the right optimizer for your problem. With optimagic, you can simply try a few and compare their results!"
]
},
{
@@ -99,29 +104,16 @@
"metadata": {},
"outputs": [],
"source": [
"res = om.minimize(\n",
" fun=dict_sphere,\n",
" params={\"a\": 0, \"b\": 1, \"c\": pd.Series([2, 3, 4])},\n",
" algorithm=\"scipy_neldermead\",\n",
")\n",
"res"
"results = {\"lbfgsb\": lbfgsb_res, \"nelder_mead\": nm_res}\n",
"fig = om.criterion_plot(results, max_evaluations=300)\n",
"fig.show(renderer=\"png\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## You can visualize the convergence"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig = om.criterion_plot(res, max_evaluations=300)\n",
"fig.show(renderer=\"png\")"
"You can also zoom in on the history of specific parameters. This can be super helpful to diagnose problems in the optimization. "
]
},
{
@@ -131,7 +123,7 @@
"outputs": [],
"source": [
"fig = om.params_plot(\n",
" res,\n",
" nm_res,\n",
" max_evaluations=300,\n",
" # optionally select a subset of parameters to plot\n",
" selector=lambda params: params[\"c\"],\n",
@@ -145,20 +137,23 @@
"source": [
"## There are many optimizers\n",
"\n",
"If you install some optional dependencies, you can choose from a large (and growing) set of optimization algorithms -- all with the same interface!\n",
"By default, optimagic comes with optimizers from scipy, including global optimizers \n",
"and least-squares optimizers. But we also have wrappers for algorithms from **NlOpt**, \n",
"**Pygmo**, as well as several optimizers from individual packages like **fides**, \n",
"**ipopt**, **pybobyqa** and **dfols**. \n",
"\n",
"For example, we wrap optimizers from `scipy.optimize`, `nlopt`, `cyipopt`, `pygmo`, `fides`, `tao` and others. \n",
"To use optimizers that are not from scipy, follow our [installation guide](../installation.md) for optional dependencies. To see which optimizers we have, check out the [full list](../algorithms.md).\n",
"\n",
"We also have some optimizers that are not part of other packages. Examples are a `parallel Nelder-Mead` algorithm, The `BHHH` algorithm and a `parallel Pounders` algorithm.\n",
"\n",
"See the full list [here](../how_to_guides/optimization/how_to_specify_algorithm_and_algo_options"
"If you are missing your favorite optimizer in the list, let us know with an [issue](https://github.com/optimagic-dev/optimagic/issues)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## You can add bounds"
"## You can add bounds\n",
"\n",
"As any optimizer library, optimagic lets you specify bounds for the parameters."
]
},
{
@@ -183,7 +178,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## You can fix parameters "
"## You can fix parameters \n",
"\n",
"On top of bounds, you can also fix one or more parameters during the optimization. "
]
},
{
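The code cells under the new "You can add bounds" and "You can fix parameters" headings are collapsed in this diff view. For orientation only, here is a minimal sketch of what such cells look like with optimagic's current interface; the `sphere` definition, the `om.Bounds` object and the `om.FixedConstraint` object are assumptions based on optimagic's documentation, not lines taken from this commit:

```python
import numpy as np

import optimagic as om


def sphere(params):
    # Stand-in for the notebook's (collapsed) sum-of-squares test function.
    return np.sum(params**2)


# Bounds: restrict every parameter to an interval.
bounded_res = om.minimize(
    fun=sphere,
    params=np.arange(5),
    algorithm="scipy_lbfgsb",
    bounds=om.Bounds(lower=np.full(5, -2), upper=np.full(5, 10)),
)

# Fixed parameters: keep selected entries at their start values while the
# remaining parameters are optimized.
fixed_res = om.minimize(
    fun=sphere,
    params=np.arange(5),
    algorithm="scipy_lbfgsb",
    constraints=om.FixedConstraint(selector=lambda params: params[2]),
)

print(bounded_res.params.round(5))
print(fixed_res.params.round(5))
```

As in the cells shown in the diff above, the returned result objects expose the optimal parameters via `.params`.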
5 changes: 3 additions & 2 deletions pyproject.toml
@@ -40,8 +40,9 @@ classifiers = [
"Operating System :: MacOS :: MacOS X",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering",
]
authors = [