Add GP Tuner and related doc (#1191)
* fix link err in docs * add spaces * re-organise links for detailed descriptions of the tuners and accessors; fix link err in HpoComparision.md * add in-page link by change .md to .html * delete #section from cross-file links to make links work in both readthedocs and github docs * gp_tuner init from fmfn's repo * fix params bug by adding float>int transition * add optimal choices; support randint&quniform type; add doc * refine doc and code * change mnist yml comments * typo fix * fix val err * fix minimize mode err * add config test and Hpo result * support quniform type; update doc; update test config * update doc * un-commit changed in yarn.lock * fix optimize mode bug * optimize mode * optimize mode * reset pylint, gitignore * revert .gitignore yarn.lock
1 parent c217992, commit a587648. Showing 21 changed files with 679 additions and 13 deletions.
@@ -0,0 +1,10 @@
GP Tuner on NNI
===

## GP Tuner

Bayesian optimization works by constructing a posterior distribution over functions (a Gaussian Process here) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain about which regions of the parameter space are worth exploring and which are not.

GP Tuner is designed to minimize the number of steps required to find a combination of parameters close to the optimal one. To do so, it solves a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is computationally much cheaper and can be tackled with common tools. Bayesian optimization is therefore best suited to situations where evaluating the function to be optimized is very expensive.

This optimization approach is described in Section 3 of [Algorithms for Hyper-Parameter Optimization](https://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf).
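
To make the loop concrete, here is a minimal, self-contained sketch of the idea (not NNI's implementation): fit a Gaussian Process to the observations collected so far, then pick the next point by maximizing an acquisition function. The toy objective, the Upper Confidence Bound acquisition with kappa = 2, and the grid-based maximization are simplifying assumptions; the actual tuner uses the acquisition functions and the `acq_max` routine shipped with this commit.

```python
# Illustrative sketch only; the objective, the UCB acquisition and the
# grid-based maximization are assumptions, not the tuner's actual code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def objective(x):
    # toy function we pretend is expensive to evaluate; we want its maximum
    return -(x - 2.0) ** 2


rng = np.random.RandomState(0)
low, high = -5.0, 5.0

# cold start: a few random evaluations before the GP is fitted
xs = list(rng.uniform(low, high, size=5))
ys = [objective(x) for x in xs]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)

for _ in range(20):
    gp.fit(np.array(xs).reshape(-1, 1), np.array(ys))
    # acquisition step: Upper Confidence Bound evaluated on a dense grid
    grid = np.linspace(low, high, 1000).reshape(-1, 1)
    mean, std = gp.predict(grid, return_std=True)
    ucb = mean + 2.0 * std  # kappa = 2.0 balances exploration and exploitation
    x_next = float(grid[np.argmax(ucb)])
    xs.append(x_next)
    ys.append(objective(x_next))

print("best x found:", xs[int(np.argmax(ys))])
```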
Empty file.
@@ -0,0 +1,170 @@
# Copyright (c) Microsoft Corporation
# All rights reserved.
#
# MIT License
#
# Permission is hereby granted, free of charge,
# to any person obtaining a copy of this software and associated
# documentation files (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and
# to permit persons to whom the Software is furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING
# BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
'''
gp_tuner.py
'''

import warnings
import logging
import numpy as np

from sklearn.gaussian_process.kernels import Matern
from sklearn.gaussian_process import GaussianProcessRegressor

from nni.tuner import Tuner
from nni.utils import OptimizeMode, extract_scalar_reward

from .target_space import TargetSpace
from .util import UtilityFunction, acq_max

logger = logging.getLogger("GP_Tuner_AutoML")


class GPTuner(Tuner):
    '''
    GPTuner
    '''

    def __init__(self, optimize_mode="maximize", utility='ei', kappa=5, xi=0, nu=2.5, alpha=1e-6, cold_start_num=10,
                 selection_num_warm_up=100000, selection_num_starting_points=250):
        self.optimize_mode = OptimizeMode(optimize_mode)

        # utility (acquisition) function related
        self.utility = utility
        self.kappa = kappa
        self.xi = xi

        # target space
        self._space = None

        self._random_state = np.random.RandomState()

        # nu and alpha are GPR-related params
        self._gp = GaussianProcessRegressor(
            kernel=Matern(nu=nu),
            alpha=alpha,
            normalize_y=True,
            n_restarts_optimizer=25,
            random_state=self._random_state
        )
        # number of random evaluations before fitting the GPR
        self._cold_start_num = cold_start_num

        # params for acq_max
        self._selection_num_warm_up = selection_num_warm_up
        self._selection_num_starting_points = selection_num_starting_points

        # number of imported data points
        self.supplement_data_num = 0

    def update_search_space(self, search_space):
        """Update the tuner's target space according to search_space.json

        Parameters
        ----------
        search_space : dict
        """
        self._space = TargetSpace(search_space, self._random_state)

    def generate_parameters(self, parameter_id):
        """Generate the next parameter set for a trial.

        If the number of received trial results is lower than the cold start number,
        the tuner first generates parameters at random.
        Otherwise, it chooses the parameters by maximizing the acquisition function
        under the Gaussian Process model.

        Parameters
        ----------
        parameter_id : int

        Returns
        -------
        result : dict
        """
        if self._space.len() < self._cold_start_num:
            results = self._space.random_sample()
        else:
            # Sklearn's GP throws a large number of warnings at times, but
            # we don't really need to see them here.
            with warnings.catch_warnings():
                warnings.simplefilter("ignore")
                self._gp.fit(self._space.params, self._space.target)

            util = UtilityFunction(
                kind=self.utility, kappa=self.kappa, xi=self.xi)

            results = acq_max(
                f_acq=util.utility,
                gp=self._gp,
                y_max=self._space.target.max(),
                bounds=self._space.bounds,
                space=self._space,
                num_warmup=self._selection_num_warm_up,
                num_starting_points=self._selection_num_starting_points
            )

        results = self._space.array_to_params(results)
        logger.info("Generated parameters:\n %s", results)
        return results

    def receive_trial_result(self, parameter_id, parameters, value):
        """Tuner receives a result from a trial.

        Parameters
        ----------
        parameter_id : int
        parameters : dict
        value : dict/float
            if value is a dict, it should have a "default" key.
        """
        value = extract_scalar_reward(value)
        if self.optimize_mode == OptimizeMode.Minimize:
            value = -value

        logger.info("Received trial result.")
        logger.info("value: %s", value)
        logger.info("parameter: %s", parameters)
        self._space.register(parameters, value)

    def import_data(self, data):
        """Import additional data for tuning.

        Parameters
        ----------
        data : list of dict
            Each dict has at least two keys, 'parameter' and 'value'.
        """
        _completed_num = 0
        for trial_info in data:
            logger.info("Importing data, current processing progress %s / %s",
                        _completed_num, len(data))
            _completed_num += 1
            assert "parameter" in trial_info
            _params = trial_info["parameter"]
            assert "value" in trial_info
            _value = trial_info['value']
            if not _value:
                logger.info(
                    "Useless trial data, value is %s, skip this trial data.", _value)
                continue
            self.supplement_data_num += 1
            _parameter_id = '_'.join(
                ["ImportData", str(self.supplement_data_num)])
            self.receive_trial_result(
                parameter_id=_parameter_id, parameters=_params, value=_value)
        logger.info("Successfully imported data into the GP tuner.")