Doc fixes #893

Merged
merged 40 commits on Dec 20, 2022

Commits
2cb6556
[DOC] Read-through and correcting mistakes up to advanced usage
Dec 8, 2022
7671563
[DOC] Correct spelling, grammar and some broken links/ lists
helegraf Dec 8, 2022
0057b05
Add overview figure of components
Dec 14, 2022
4245c42
Fix spelling and links in README.md, rename changelog.md to CHANGELOG.md
helegraf Dec 15, 2022
ecdd18c
[DOC] runhistory.py typos
Dec 15, 2022
1394a96
[DOC] runhistory/enumerations.py killed the old no longer existing ty…
Dec 15, 2022
5130e88
Fix docstrings
Dec 15, 2022
0b44442
Change docstring style to numpy in CONTRIBUTING.md + typos
helegraf Dec 15, 2022
7633c5f
[DOC] dataclasses.py typos & minor rephrasing
Dec 15, 2022
1290e0a
Merge branch 'doc_fixes' of https://github.com/automl/SMAC3 into doc_…
Dec 15, 2022
a583f34
Fix docstrings
Dec 15, 2022
67eb862
Fix docstrings
Dec 15, 2022
a40ccc7
Fix docstrings
Dec 15, 2022
79e9c58
Fix docstrings
Dec 15, 2022
a273b53
Fix docstrings
Dec 15, 2022
14ba440
[DOC] Abstract_encoder typos & annotations. added doc to private getters
Dec 15, 2022
b0f95ae
[DOC] eips_encoder typos
Dec 15, 2022
ceb147e
[DOC] runhistory/encoder: typos
Dec 15, 2022
0cf19fa
Merge remote-tracking branch 'origin/doc_fixes' into doc_fixes
Dec 15, 2022
80b3ca4
doc for intensifier
dengdifan Dec 15, 2022
cd9ece3
[DOC] runner (all of it)
Dec 15, 2022
7880734
Fix doc string issues in facade package.
Dec 15, 2022
adf51a3
Move boing_facade to old folder
Dec 15, 2022
c01d272
Remove old boing_facade
Dec 15, 2022
716df02
Fix doc string problems in main package
Dec 15, 2022
2dc7d46
Fix doc string errors in the random_design package
Dec 15, 2022
27f9463
Fix doc string problems in the utils package
Dec 15, 2022
3238692
Fix import for boing_facade in tests
Dec 15, 2022
1f649b8
Merge remote-tracking branch 'origin/doc_fixes' into doc_fixes
Dec 15, 2022
757982e
Fix docstrings
Dec 15, 2022
a2ebd26
Merge branch 'doc_fixes' of https://github.com/automl/SMAC3 into doc_…
Dec 15, 2022
231f0bb
Add link for latin hypercube design
helegraf Dec 15, 2022
f836d06
Merge remote-tracking branch 'origin/doc_fixes' into doc_fixes
helegraf Dec 15, 2022
4f1516c
docs for multi-objective
dengdifan Dec 15, 2022
659f34f
Merge branch 'doc_fixes' of https://github.com/automl/SMAC3 into doc_…
dengdifan Dec 15, 2022
eddb313
Fix doc string and comment typos in model
Dec 15, 2022
8668ace
Fix doc string and comment typos in model
Dec 15, 2022
a526909
Add pycharm to gitignore
Dec 15, 2022
44b9597
Fix doc string and comment typos in model
Dec 15, 2022
8d3c347
Merge branch 'development' into doc_fixes
renesass Dec 20, 2022
10 changes: 5 additions & 5 deletions docs/2_package_overview.rst
@@ -2,7 +2,7 @@ Package Overview
================

SMAC supports you in determining well-performing hyperparameter configurations for your algorithms. By being a robust
and flexible framework for :term:`Bayesian Optimization<BO>`, SMAC can improve performance within few function
and flexible framework for :term:`Bayesian Optimization<BO>`, SMAC can improve performance within a few function
evaluations. It offers several entry points and pre-sets for typical use cases, such as optimizing
hyperparameters, solving low dimensional continuous (artificial) global optimization problems and configuring algorithms
to perform well across multiple problem :term:`instances<Instances>`.
@@ -11,7 +11,7 @@ to perform well across multiple problem :term:`instances<Instances>`.
Features
--------

SMAC has following characteristics and capabilities:
SMAC has the following characteristics and capabilities:

Global Optimizer
:term:`Bayesian Optimization` is used for sample-efficient optimization.
@@ -28,7 +28,7 @@ Any Objectives
Optimization with any :term:`objective<Objective>` (e.g., accuracy, runtime, cross-validation, ...) is possible.

:ref:`Multi-Objective<Multi-Objective Optimization>`
Optimize arbitrary number of objectives using scalarized multi-ojective algorithms. Both ParEGO [Know06]_ and
Optimize arbitrary number of objectives using scalarized multi-objective algorithms. Both ParEGO [Know06]_ and
mean aggregation strategies are supported.

:ref:`Multi-Fidelity<Multi-Fidelity Optimization>` Optimization
@@ -41,11 +41,11 @@ Any Objectives

Command-Line Interface
SMAC can not only be executed within a python file but also from the commandline. Consequently,
not only algorithms in python can be optimized but in other languages as well.
not only algorithms in python can be optimized, but implementations in other languages as well.

.. note ::

Command-line interface has been temporarely disabled in v2.0. Please fall back to v1.4 if you need it.
Command-line interface has been temporarily disabled in v2.0. Please fall back to v1.4 if you need it.


Comparison
32 changes: 20 additions & 12 deletions docs/3_getting_started.rst
@@ -1,9 +1,15 @@
Getting Started
===============

In the core, SMAC needs four components (configuration space, target function, scenario and a facade) to run an
SMAC needs four core components (configuration space, target function, scenario and a facade) to run an
optimization process, all of which are explained on this page.

They interact in the following way:

.. image:: ../../images/smac_components_interaction.jpg
:width: 400
:alt: Interaction of SMAC's components


Configuration Space
-------------------
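The body of this section is collapsed in the diff view. As a purely illustrative sketch in plain Python (SMAC itself uses the ConfigSpace library, whose API is not shown here), a configuration space can be thought of as a set of named hyperparameter ranges that configurations are sampled from:

```python
import random

# Hypothetical stand-in for a real configuration space: each
# hyperparameter maps to a (low, high) range. The names and ranges
# below are made up for illustration.
search_space = {
    "learning_rate": (1e-5, 1e-1),  # float range
    "num_layers": (1, 8),           # integer range
}

def sample_configuration(space, seed=None):
    """Draw one random configuration from the space."""
    rng = random.Random(seed)
    config = {}
    for name, (low, high) in space.items():
        if isinstance(low, int) and isinstance(high, int):
            config[name] = rng.randint(low, high)
        else:
            config[name] = rng.uniform(low, high)
    return config

config = sample_configuration(search_space, seed=0)
```

In real SMAC usage the configuration space additionally carries types, defaults, log-scaling, and conditional dependencies between hyperparameters, which this sketch omits.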
@@ -28,9 +34,11 @@ Target Function
---------------

The target function takes a configuration from the configuration space and returns a performance value.
For example, you could use a Neural Network and predict the performance based on the learning rate. Every configuration
would (most likely) return a different value. However, SMAC tries to find the best learning rate by trying
different and potentially improving configurations.
For example, you could use a Neural Network to predict on your data and get some validation performance.
If, for instance, you tune the learning rate of the network's optimizer, every learning rate value will
change the final validation performance of the network. This is the target function.
SMAC tries to find the best performing learning rate by trying different values and evaluating the target function -
in an efficient way.

.. code-block:: python

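The original code example is collapsed in this view. As a hypothetical illustration (not the snippet from the SMAC docs), a target function simply maps a configuration to a cost:

```python
def train(config, seed=0):
    """Toy target function: maps a configuration to a cost.

    A real target function would train a model with the given learning
    rate and return its validation loss; here we simulate that with a
    quadratic whose (made-up) optimum lies at learning_rate = 0.01.
    """
    learning_rate = config["learning_rate"]
    return (learning_rate - 0.01) ** 2
```

SMAC minimizes the returned cost; in the actual API the argument would be a `Configuration` object rather than the plain dict assumed here.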
@@ -65,7 +73,7 @@ Scenario
--------

The :ref:`Scenario<smac.scenario>` is used to provide environment variables. For example,
you want to limit the optimization process by a time limit or want to specify where to save the results.
if you want to limit the optimization process by a time limit or want to specify where to save the results.

.. code-block:: python

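The original Scenario snippet is collapsed above. As a rough sketch of the idea (field names modeled loosely on SMAC's Scenario and not verified against the actual API), a scenario just bundles environment settings for the optimization run:

```python
from dataclasses import dataclass

@dataclass
class ToyScenario:
    """Illustrative stand-in for SMAC's Scenario object."""
    n_trials: int = 100                     # stop after this many target function evaluations
    walltime_limit: float = float("inf")    # optional wallclock time limit in seconds
    output_directory: str = "smac3_output"  # where results are written

scenario = ToyScenario(n_trials=50, walltime_limit=3600)
```

The real Scenario also takes the configuration space itself plus further limits (seeds, instances, budgets); consult the SMAC documentation for the authoritative parameter list.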
@@ -85,11 +93,11 @@ Facade
------

A :ref:`facade<smac.facade>` is the entry point to SMAC, which constructs a default optimization
pipeline for you. SMAC offers various facades, which satisfy many use cases and are crucial to
achieving peak performance. The idea behind the facades is to provide a simple interface to SMAC,
which is easy to use and understand without diving deep into the material. However, experts are
invited to change the components as they please to achieve even better performance potentially. The following
table (horizontal scrollable) shows you what is supported and reveals the default :ref:`components<Components>`:
pipeline for you. SMAC offers various facades, which satisfy many common use cases and are crucial to
achieving peak performance. The idea behind the facades is to provide a simple interface to all of SMAC's components,
which is easy to use and understand without the need to dive deep into the material. However, experts are
invited to change the components to their specific hyperparameter optimization needs. The following
table (horizontally scrollable) shows you what is supported and reveals the default :ref:`components<Components>`:


.. csv-table::
@@ -101,7 +109,7 @@ table (horizontal scrollable) shows you what is supported and reveals the defaul
"Initial Design", ":ref:`Sobol<smac.initial\\_design.sobol\\_design>`", ":ref:`Sobol<smac.initial\\_design.sobol\\_design>`", ":ref:`Random<smac.initial\\_design.random\\_design>`", ":ref:`Default<smac.initial\\_design.default\\_design>`", ":ref:`Default<smac.initial\\_design.default\\_design>`", ":ref:`Default<smac.initial\\_design.default\\_design>`"
"Surrogate Model", ":ref:`Gaussian Process<smac.model.gaussian\\_process.gaussian\\_process>`", ":ref:`Random Forest<smac.model.random\\_forest.random\\_forest>`", ":ref:`Random Forest<smac.model.random\\_forest.random\\_forest>`", ":ref:`Random Forest<smac.model.random\\_forest.random\\_forest>`", "Not used", "Not used"
"Acquisition Function", ":ref:`Expected Improvement<smac.acquisition.function.expected\\_improvement>`", ":ref:`Log Expected Improvement<smac.acquisition.function.expected\\_improvement>`", ":ref:`Log Expected Improvement<smac.acquisition.function.expected\\_improvement>`", ":ref:`Expected Improvement<smac.acquisition.function.expected\\_improvement>`", "Not used", "Not used"
"Acquisition Maximier", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.random\\_search>`"
"Acquisition Maximizer", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.local\\_and\\_random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.random\\_search>`", ":ref:`Local and Sorted Random Search<smac.acquisition.maximizer.random\\_search>`"
"Intensifier", ":ref:`Default<smac.intensifier.intensifier>`", ":ref:`Default<smac.intensifier.intensifier>`", ":ref:`Hyperband<smac.intensifier.hyperband>`", ":ref:`Hyperband<smac.intensifier.hyperband>`", ":ref:`Default<smac.intensifier.intensifier>`", ":ref:`Hyperband<smac.intensifier.hyperband>`",
"Runhistory Encoder", ":ref:`Default<smac.runhistory.encoder.encoder>`", ":ref:`Log<smac.runhistory.encoder.log\\_encoder>`", ":ref:`Log<smac.runhistory.encoder.log\\_encoder>`", ":ref:`Default<smac.runhistory.encoder.encoder>`", ":ref:`Default<smac.runhistory.encoder.encoder>`", ":ref:`Default<smac.runhistory.encoder.encoder>`"
"Random Design Probability", "8.5%", "20%", "20%", "50%", "Not used", "Not used"
@@ -119,7 +127,7 @@ table (horizontal scrollable) shows you what is supported and reveals the defaul
to previous versions, instantiated objects are passed instead of *kwargs*.


The facades can be imported directely from the ``smac`` module.
The facades can be imported directly from the ``smac`` module.

.. code-block:: python

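To make the interplay of the four components concrete, here is a deliberately simplified stand-in for what a facade wires together: sample configurations from the space, evaluate the target function (one trial each), and keep the incumbent. This is not SMAC code; a real facade replaces the uniform sampling below with a surrogate model, an acquisition function, and an intensifier.

```python
import random

def optimize(target, space, n_trials=50, seed=0):
    """Random-search sketch of a facade's optimization loop."""
    rng = random.Random(seed)
    incumbent, incumbent_cost = None, float("inf")
    for _ in range(n_trials):
        # Sample a configuration; a real facade would instead ask the
        # surrogate model and acquisition function for a promising one.
        config = {name: rng.uniform(low, high) for name, (low, high) in space.items()}
        cost = target(config)  # one trial
        if cost < incumbent_cost:  # trivially simplified intensification
            incumbent, incumbent_cost = config, cost
    return incumbent, incumbent_cost

best, best_cost = optimize(
    target=lambda c: (c["learning_rate"] - 0.01) ** 2,  # toy target function
    space={"learning_rate": (1e-5, 1e-1)},
)
```

The point of the facades is that this loop, with all the model-based machinery filled in, is constructed for you from sensible defaults.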
111 changes: 57 additions & 54 deletions docs/7_glossary.rst
@@ -3,96 +3,99 @@ Glossary

.. glossary::

SMAC
Sequential Model-Based Algorithm Configuration.
BB
See :term:`Black-Box`.

BO
See :term:`Bayesian Optimization`.

HB
See :term:`Hyperband`.

BOHB
`Bayesian optimization and Hyperband <https://arxiv.org/abs/1807.01774>`_.

ROAR
See :term:`Random Online Adaptive Racing`.

BB
See :term:`Black-Box`.

MF
See :term:`Multi-Fidelity`.
CLI
Command-Line Interface.

RF
Random Forest.
CV
Cross-Validation.

GP
Gaussian Process.

GP-MCMC
Gaussian Process with Markov-Chain Monte-Carlo.

CV
Cross-Validation.

CLI
Command-Line Interface.
HB
See :term:`Hyperband`.

HP
Hyperparameter.

MF
See :term:`Multi-Fidelity`.

RF
Random Forest.

ROAR
See :term:`Random Online Adaptive Racing`.

SMAC
Sequential Model-Based Algorithm Configuration.

SMBO
Sequential Model-Based Optimization.

Bayesian Optimization
Bayesian optimization is a sequential design strategy for global optimization of black-box functions that does
not assume any functional forms. It is usually employed to optimize expensive-to-evaluate functions.
Bayesian optimization balances exploration against exploitation to find the minimum of its objective.

Hyperband
`Hyperband <https://arxiv.org/abs/1603.06560>`_. A novel bandit-based algorithm for hyperparameter
optimization. Hyperband is an extension of successive halving and therefore works with
multi-fidelities.

Random Online Adaptive Racing
Random Online Adaptive Racing. A simple model-free instantiation of the general SMBO framework.
It selects configurations uniformly random and iteratively compares them against the current incumbent
using the intensification mechanism. See `SMAC extended <https://ai.dmi.unibas.ch/research/reading_group/hutter-et-al-tr2010.pdf>`_
chapter 3.2 for details.

Black-Box
Refers to an algorithm being optimized, where only input and output are observable.

Target Function
Your model, which returns a cost based on the given config, seed, budget, and/or instance.

Trial
Trial is a single run of a target function on a combination of configuration, seed, budget and/or instance.

Objective
An objective is a metric to evaluate the quality or performance of an algorithm.

Multi-Objective
A multi-objective optimization problem is a problem with more than one objective.
The goal is to find a solution that is optimal or at least a good compromise in all objectives.

Budget
Budget is another word for fidelity. Examples are the number of training epochs or the size of
the data subset the algorithm is trained on. However, budget can also be used in the context of
instances. For example, if you have 100 instances (let's say we optimize across datasets) and you want to run
your algorithm on 10 of them, then the budget is 10.
Reviewer comment (Contributor):
This is a bit confusing to me. It sounds like we will only use 10 instances for all budget levels.


Multi-Fidelity
Multi-fidelity refers to running an algorithm on multiple budgets (such as number of epochs or
subsets of data) and thereby evaluating the performance prematurely.
Hyperband
`Hyperband <https://arxiv.org/abs/1603.06560>`_. A novel bandit-based algorithm for hyperparameter
optimization. Hyperband is an extension of successive halving and therefore works with
multi-fidelities.

Incumbent
The incumbent is the current best known configuration.

Instances
Often you want to optimize across different datasets, subsets, or even different transformations (e.g.
augmentation). In general, each of these is called an instance. Configurations are evaluated on multiple
instances so that a configuration found which performs superior on all instances instead of only
instances so that a configuration is found which performs superior on all instances instead of only
a few.

Intensification
A mechanism, that governs how many evaluations to perform with each configuration and when to trust a
A mechanism that governs how many evaluations to perform with each configuration and when to trust a
configuration enough to make it the new current best known configuration (the incumbent).

Incumbent
The incumbent is the current best known configuration.
Multi-Fidelity
Multi-fidelity refers to running an algorithm on multiple budgets (such as number of epochs or
subsets of data) and thereby evaluating the performance prematurely.

Multi-Objective
A multi-objective optimization problem is a problem with more than one objective.
The goal is to find a solution that is optimal or at least a good compromise in all objectives.

Objective
An objective is a metric to evaluate the quality or performance of an algorithm.

Random Online Adaptive Racing
Random Online Adaptive Racing. A simple model-free instantiation of the general :term:`SMBO` framework.
It selects configurations uniformly at random and iteratively compares them against the current incumbent
using the intensification mechanism. See `SMAC extended <https://ai.dmi.unibas.ch/research/reading_group/hutter-et-al-tr2010.pdf>`_
chapter 3.2 for details.

Target Function
Your model, which returns a cost based on the given config, seed, budget, and/or instance.

Trial
Trial is a single run of a target function on a combination of configuration, seed, budget and/or instance.
4 changes: 2 additions & 2 deletions docs/8_faq.rst
@@ -21,7 +21,7 @@ pyrfr raises cryptic import errors.


How can I use :term:`BOHB` and/or `HpBandSter <https://github.com/automl/HpBandSter>`_ with SMAC?
The facade MultiFidelityFacade is the closes implementation to :term:`BOHB` and/or `HpBandSter <https://github.com/automl/HpBandSter>`_.
The facade MultiFidelityFacade is the closest implementation to :term:`BOHB` and/or `HpBandSter <https://github.com/automl/HpBandSter>`_.


I discovered a bug or SMAC does not behave as expected. Where should I report to?
@@ -35,7 +35,7 @@ I discovered a bug or SMAC does not behave as expected. Where should I report to
* A description of the problem.
* An example to reproduce the problem.
* Any information about your setup that could be helpful to resolve the bug (such as installed python packages).
* Feel free, to add a screenshot showing the issue.
* Feel free to add a screenshot showing the issue.


I want to contribute code or discuss a new idea. Where should I report to?
4 changes: 2 additions & 2 deletions docs/advanced_usage/10_continue.rst
@@ -2,12 +2,12 @@ Continue
========

SMAC automatically restores states where it left off if a run was interrupted or finished. To do so, it reads in old
files (derivided from scenario's name, output_directory and seed) and sets the components.
files (derived from scenario's name, output_directory and seed) and sets the components.

.. warning::

If you changed any code and specified a name, SMAC will ask you whether you still want to resume or
delete the old run completely. If you not specified a name, SMAC generates a new name and the old run is
delete the old run completely. If you did not specify a name, SMAC generates a new name and the old run is
not affected.

Please have a look at our :ref:`continue example<Continue an Optimization>`.
@@ -1,4 +1,4 @@
Reproducability
Reproducibility
===============

Reproducibility can only be ensured if one worker is used and no time (wallclock or CPU time) is involved.