This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Dev smac #116

Merged 8 commits on Sep 25, 2018
2 changes: 2 additions & 0 deletions docs/SearchSpaceSpec.md
@@ -66,3 +66,5 @@ The candidate type and value for variable is here:
* Which means the variable value is a value like round(exp(normal(mu, sigma)) / q) * q
* Suitable for a discrete variable with respect to which the objective is smooth and gets smoother with the size of the variable, which is bounded from one side.
<br/>

Note that SMAC only supports a subset of the types above, including `choice`, `randint`, `uniform`, `loguniform`, `quniform(q=1)`.
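The restriction above can be illustrated with a search space limited to SMAC-compatible types. This is a hypothetical example (the parameter names are made up); the `_type`/`_value` layout follows the NNI search space JSON format this file describes:

```python
# Hypothetical search space using only the types SMAC supports; the
# "_type"/"_value" layout follows the NNI search space JSON format.
smac_search_space = {
    "optimizer":  {"_type": "choice",     "_value": ["sgd", "adam"]},
    "num_layers": {"_type": "randint",    "_value": [2, 8]},
    "dropout":    {"_type": "uniform",    "_value": [0.0, 0.5]},
    "lr":         {"_type": "loguniform", "_value": [1e-5, 1e-1]},
    "batch_size": {"_type": "quniform",   "_value": [16, 128, 1]},  # q must be 1
}
```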
8 changes: 1 addition & 7 deletions setup.py
@@ -81,16 +81,10 @@ def run(self):
'pyyaml',
'requests',
'scipy',
'schema'
],
dependency_links = [
'git+https://github.com/hyperopt/hyperopt.git'
'schema'
],

cmdclass={
'install': CustomInstallCommand
},
entry_points={
'console_scripts': ['nnictl = nnicmd.nnictl:parse_args']
}
)
2 changes: 1 addition & 1 deletion src/nni_manager/rest_server/restValidationSchemas.ts
@@ -48,7 +48,7 @@ export namespace ValidationSchemas {
searchSpace: joi.string().required(),
maxExecDuration: joi.number().min(0).required(),
tuner: joi.object({
builtinTunerName: joi.string().valid('TPE', 'Random', 'Anneal', 'Evolution'),
builtinTunerName: joi.string().valid('TPE', 'Random', 'Anneal', 'Evolution', 'SMAC'),
codeDir: joi.string(),
classFileName: joi.string(),
className: joi.string(),
36 changes: 25 additions & 11 deletions src/sdk/pynni/nni/README.md
@@ -1,41 +1,53 @@
# How to use Tuner that NNI support?
# How to use Tuner that NNI supports?

For now, NNI could support tuner algorithm as following:
For now, NNI supports the following tuner algorithms:

- TPE
- Random Search
- Anneal
- Naive Evolution
- ENAS (on going)
- SMAC
- ENAS (ongoing)
- Batch (ongoing)


**1. Tuner algorithm introduction**
## 1. Tuner algorithm introduction


We will introduce some basic knowledge about tuner algorithms here. If you are an expert, you can skip this part and jump to how to use them.

*1.1 TPE*
**TPE**

The Tree-structured Parzen Estimator (TPE) is a sequential model-based optimization (SMBO) approach. SMBO methods sequentially construct models to approximate the performance of hyperparameters based on historical measurements, and then choose new hyperparameters to test based on this model.

The TPE approach models P(x|y) and P(y), where x represents hyperparameters and y the associated evaluation metric. P(x|y) is modeled by transforming the generative process of hyperparameters, replacing the distributions of the configuration prior with non-parametric densities. This optimization approach is described in detail in [Algorithms for Hyper-Parameter Optimization][1].

Compared with other algorithms, TPE can achieve better results when the number of trial experiments is small. TPE also supports continuous and discrete hyper-parameters. A large number of experiments have shown that TPE performs far better than Random Search.
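The good/bad split at the heart of TPE can be sketched in a few lines. This is a toy illustration, not the actual hyperopt implementation: crude window counts stand in for the Parzen density estimators, and the history/candidate values are made up.

```python
import random

def tpe_suggest(history, candidates, gamma=0.25):
    """Toy TPE-style suggestion. Split observed (x, score) pairs into a
    'good' set (best gamma fraction, lower score is better) and a 'bad'
    set, then return the candidate where good-density / bad-density is
    largest. Window counts stand in for the Parzen estimators."""
    if len(history) < 4:
        return random.choice(candidates)  # not enough data: explore
    ordered = sorted(history, key=lambda pair: pair[1])
    cut = max(1, int(gamma * len(ordered)))
    good = [x for x, _ in ordered[:cut]]
    bad = [x for x, _ in ordered[cut:]]

    def density(x, points, bandwidth=0.1):
        # count of observed points within the bandwidth window
        return sum(1 for p in points if abs(p - x) < bandwidth) + 1e-9

    return max(candidates, key=lambda x: density(x, good) / density(x, bad))

# Scores simulate an objective minimized near x = 0.3, so the candidate
# closest to 0.3 should win.
history = [(0.1, 0.04), (0.3, 0.0), (0.9, 0.36), (0.5, 0.04), (0.7, 0.16)]
print(tpe_suggest(history, [0.25, 0.8]))
```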

*1.2 Random Search*
**Random Search**

[Random Search for Hyper-Parameter Optimization][2] shows that Random Search can be surprisingly simple and effective. We suggest using Random Search as a baseline when there is no knowledge about the prior distribution of hyper-parameters.

*1.3 Anneal*
**Anneal**

*1.4 Naive Evolution*
**Naive Evolution**

Naive Evolution comes from [Large-Scale Evolution of Image Classifiers][3]. Naive Evolution requires more experiments to work, but it is very simple and easy to extend with new features. Some tips for users:

1) A large initial population helps avoid falling into a local optimum.
2) Using strategies to maintain the diversity of the population can lead to better results.

**SMAC**

[SMAC][4] is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO in order to handle categorical parameters. The SMAC tuner supported by NNI is a wrapper around [the SMAC3 github repo][5].

Note that SMAC only supports a subset of the types in [search space spec](../../../../docs/SearchSpaceSpec.md), including `choice`, `randint`, `uniform`, `loguniform`, `quniform(q=1)`.

**Batch**

Batch allows users to simply provide several configurations (i.e., choices of hyper-parameters) for their trial code. After finishing all the configurations, the experiment is done.
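A hypothetical Batch tuner input might look like the following (the parameter names and values are made up for illustration):

```python
# Hypothetical Batch tuner input: the user enumerates the exact
# configurations to run, one trial per entry, and the experiment
# finishes once the list is exhausted.
batch_configs = [
    {"lr": 0.1,   "batch_size": 32},
    {"lr": 0.01,  "batch_size": 64},
    {"lr": 0.001, "batch_size": 128},
]
```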


**2. How to use the tuner algorithm in NNI?**
## 2. How to use the tuner algorithm in NNI?

Users only need to do one thing: choose a Tuner in ```config.yaml```.
Here is an example:
@@ -61,4 +73,6 @@ There are two fields you need to set:

[1]: https://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf
[2]: http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf
[3]: https://arxiv.org/pdf/1703.01041.pdf
[3]: https://arxiv.org/pdf/1703.01041.pdf
[4]: https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf
[5]: https://github.com/automl/SMAC3
67 changes: 45 additions & 22 deletions src/sdk/pynni/nni/__main__.py
@@ -27,28 +27,40 @@
import json
import importlib

from .constants import ModuleName, ClassName, ClassArgs
from nni.msg_dispatcher import MsgDispatcher
from nni.hyperopt_tuner.hyperopt_tuner import HyperoptTuner
from nni.evolution_tuner.evolution_tuner import EvolutionTuner
from nni.batch_tuner.batch_tuner import BatchTuner
from nni.medianstop_assessor.medianstop_assessor import MedianstopAssessor

logger = logging.getLogger('nni.main')
logger.debug('START')

BUILT_IN_CLASS_NAMES = ['HyperoptTuner', 'EvolutionTuner', 'BatchTuner', 'MedianstopAssessor']
def augment_classargs(input_class_args, classname):
if classname in ClassArgs:
for key, value in ClassArgs[classname].items():
if key not in input_class_args:
input_class_args[key] = value
return input_class_args

def create_builtin_class_instance(classname, jsonstr_args):
if classname not in ModuleName or \
importlib.util.find_spec(ModuleName[classname]) is None:
raise RuntimeError('Tuner module is not found: {}'.format(classname))
class_module = importlib.import_module(ModuleName[classname])
class_constructor = getattr(class_module, ClassName[classname])
if jsonstr_args:
class_args = json.loads(jsonstr_args)
instance = eval(classname)(**class_args)
class_args = augment_classargs(class_args, classname)
else:
instance = eval(classname)()
class_args = augment_classargs({}, classname)
if class_args:
instance = class_constructor(**class_args)
else:
instance = class_constructor()
return instance
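The registry pattern this function relies on (module path, class name, and default constructor args looked up from `constants.py`) can be sketched standalone. The registry entries below are hypothetical stand-ins built on a stdlib class, not the real NNI tuner tables:

```python
import importlib
import json

# Hypothetical registry mirroring the ModuleName / ClassName / ClassArgs
# tables in constants.py; 'Counter' stands in for a real tuner entry.
MODULE_NAME = {'Counter': 'collections'}
CLASS_NAME = {'Counter': 'Counter'}
CLASS_ARGS = {'Counter': {'a': 1}}

def create_instance(key, jsonstr_args=None):
    """Import the registered module, look up the class, merge registry
    defaults under any user-supplied JSON args, and instantiate."""
    module = importlib.import_module(MODULE_NAME[key])
    constructor = getattr(module, CLASS_NAME[key])
    args = json.loads(jsonstr_args) if jsonstr_args else {}
    for k, v in CLASS_ARGS.get(key, {}).items():
        args.setdefault(k, v)  # user-supplied values take precedence
    return constructor(**args)

counter = create_instance('Counter', '{"b": 2}')
```

This mirrors why `augment_classargs` only fills in keys the user did not supply: registry defaults never override explicit arguments.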

def create_customized_class_instance(class_dir, class_filename, classname, jsonstr_args):
if not os.path.isfile(os.path.join(class_dir, class_filename)):
raise ValueError('Class file not found: {}'.format(os.path.join(class_dir, class_filename)))
raise ValueError('Class file not found: {}'.format(
os.path.join(class_dir, class_filename)))
sys.path.append(class_dir)
module_name = class_filename.split('.')[0]
class_module = importlib.import_module(module_name)
@@ -64,12 +76,12 @@ def parse_args():
parser = argparse.ArgumentParser(description='parse command line parameters.')
parser.add_argument('--tuner_class_name', type=str, required=True,
help='Tuner class name, the class must be a subclass of nni.Tuner')
parser.add_argument('--tuner_class_filename', type=str, required=False,
help='Tuner class file path')
parser.add_argument('--tuner_args', type=str, required=False,
help='Parameters pass to tuner __init__ constructor')
parser.add_argument('--tuner_directory', type=str, required=False,
help='Tuner directory')
parser.add_argument('--tuner_class_filename', type=str, required=False,
help='Tuner class file path')

parser.add_argument('--assessor_class_name', type=str, required=False,
help='Assessor class name, the class must be a subclass of nni.Assessor')
@@ -93,23 +105,34 @@ def main():
tuner = None
assessor = None

if args.tuner_class_name is None:
raise ValueError('Tuner must be specified')
if args.tuner_class_name in BUILT_IN_CLASS_NAMES:
tuner = create_builtin_class_instance(args.tuner_class_name, args.tuner_args)
if args.tuner_class_name in ModuleName:
tuner = create_builtin_class_instance(
args.tuner_class_name,
args.tuner_args)
else:
tuner = create_customized_class_instance(args.tuner_directory, args.tuner_class_filename, args.tuner_class_name, args.tuner_args)

if args.assessor_class_name:
if args.assessor_class_name in BUILT_IN_CLASS_NAMES:
assessor = create_builtin_class_instance(args.assessor_class_name, args.assessor_args)
else:
assessor = create_customized_class_instance(args.assessor_directory, \
args.assessor_class_filename, args.assessor_class_name, args.assessor_args)
tuner = create_customized_class_instance(
args.tuner_directory,
args.tuner_class_filename,
args.tuner_class_name,
args.tuner_args)

if tuner is None:
raise AssertionError('Failed to create Tuner instance')

if args.assessor_class_name:
if args.assessor_class_name in ModuleName:
assessor = create_builtin_class_instance(
args.assessor_class_name,
args.assessor_args)
else:
assessor = create_customized_class_instance(
args.assessor_directory,
args.assessor_class_filename,
args.assessor_class_name,
args.assessor_args)
if assessor is None:
raise AssertionError('Failed to create Assessor instance')

dispatcher = MsgDispatcher(tuner, assessor)

try:
22 changes: 13 additions & 9 deletions src/sdk/pynni/nni/common.py
@@ -20,6 +20,7 @@


from collections import namedtuple
from datetime import datetime
from io import TextIOBase
import logging
import os
@@ -39,13 +40,16 @@ def _load_env_args():
'''Arguments passed from environment'''


class _LoggerFile(TextIOBase):
def __init__(self, logger):
self.logger = logger
_time_format = '%Y-%m-%d %H:%M:%S'
class _LoggerFileWrapper(TextIOBase):
def __init__(self, logger_file):
self.file = logger_file

def write(self, s):
if s != '\n': # ignore line break, since logger will add it
self.logger.info(s)
if s != '\n':
time = datetime.now().strftime(_time_format)
self.file.write('[{}] PRINT '.format(time) + s + '\n')
self.file.flush()
return len(s)


@@ -58,12 +62,12 @@ def init_logger(logger_file_path):
logger_file_path = 'unittest.log'
elif env_args.log_dir is not None:
logger_file_path = os.path.join(env_args.log_dir, logger_file_path)
logger_file = open(logger_file_path, 'w')

fmt = '[%(asctime)s] %(levelname)s (%(name)s) %(message)s'
datefmt = '%Y-%m-%d %H:%M:%S'
formatter = logging.Formatter(fmt, datefmt)
formatter = logging.Formatter(fmt, _time_format)

handler = logging.FileHandler(logger_file_path)
handler = logging.StreamHandler(logger_file)
handler.setFormatter(formatter)

root_logger = logging.getLogger()
@@ -73,4 +77,4 @@
# these modules are too verbose
logging.getLogger('matplotlib').setLevel(logging.INFO)

sys.stdout = _LoggerFile(logging.getLogger('print'))
sys.stdout = _LoggerFileWrapper(logger_file)
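The wrapper's behaviour can be demonstrated standalone. This is a minimal sketch that mirrors `_LoggerFileWrapper` against an in-memory buffer instead of the real log file (the class name here is made up):

```python
import io
import sys
from datetime import datetime

class TimestampedWriter(io.TextIOBase):
    """Minimal stand-in for _LoggerFileWrapper: prefixes each print()
    payload with a timestamp and flushes immediately."""
    def __init__(self, f):
        self.file = f

    def write(self, s):
        if s != '\n':  # print() emits the trailing newline as a separate write
            stamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
            self.file.write('[{}] PRINT {}\n'.format(stamp, s))
            self.file.flush()
        return len(s)

buf = io.StringIO()
original_stdout, sys.stdout = sys.stdout, TimestampedWriter(buf)
print('hello')          # captured and timestamped in buf
sys.stdout = original_stdout
```

Flushing on every write matters here: trial output is tailed from the log file, so buffered lines would otherwise appear late.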
51 changes: 51 additions & 0 deletions src/sdk/pynni/nni/constants.py
@@ -0,0 +1,51 @@
# Copyright (c) Microsoft Corporation
# All rights reserved.
#
# MIT License
#
# Permission is hereby granted, free of charge,
# to any person obtaining a copy of this software and associated
# documentation files (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and
# to permit persons to whom the Software is furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING
# BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

ModuleName = {
'TPE': 'nni.hyperopt_tuner.hyperopt_tuner',
'Random': 'nni.hyperopt_tuner.hyperopt_tuner',
'Anneal': 'nni.hyperopt_tuner.hyperopt_tuner',
'Evolution': 'nni.evolution_tuner.evolution_tuner',
'SMAC': 'nni.smac_tuner.smac_tuner',

'Medianstop': 'nni.medianstop_assessor.medianstop_assessor'
}

ClassName = {
'TPE': 'HyperoptTuner',
'Random': 'HyperoptTuner',
'Anneal': 'HyperoptTuner',
'Evolution': 'EvolutionTuner',
'SMAC': 'SMACTuner',

'Medianstop': 'MedianstopAssessor'
}

ClassArgs = {
'TPE': {
'algorithm_name': 'tpe'
},
'Random': {
'algorithm_name': 'random_search'
},
'Anneal': {
'algorithm_name': 'anneal'
}
}
1 change: 1 addition & 0 deletions src/sdk/pynni/nni/smac_tuner/README.md
@@ -0,0 +1 @@
# Integration doc: SMAC on nni