
Add comprehensive tests for tuners (merge into master) #1681

Merged: 6 commits into master, Nov 4, 2019
8 changes: 7 additions & 1 deletion azure-pipelines.yml
@@ -25,6 +25,8 @@ jobs:
displayName: 'Run flake8 tests to find Python syntax errors and undefined names'
- script: |
cd test
sudo apt install -y swig
PATH=$HOME/.local/bin:$PATH nnictl package install --name=SMAC
source unittest.sh
displayName: 'Unit test'
- script: |
@@ -65,7 +67,11 @@ jobs:
displayName: 'Install nni toolkit via source code'
- script: |
cd test
PATH=$HOME/Library/Python/3.7/bin:$PATH && source unittest.sh
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" < /dev/null 2> /dev/null
brew install swig@3
ln -s /usr/local/opt/swig\@3/bin/swig /usr/local/bin/swig
PATH=$HOME/Library/Python/3.7/bin:$PATH nnictl package install --name=SMAC
PATH=$HOME/Library/Python/3.7/bin:$PATH source unittest.sh
displayName: 'Unit test'
- script: |
cd test
6 changes: 3 additions & 3 deletions docs/en_US/Tuner/BuiltinTuner.md
@@ -122,7 +122,7 @@ Its requirement of computation resource is relatively high. Specifically, it req

* **optimize_mode** (*maximize or minimize, optional, default = maximize*) - If 'maximize', the tuner will try to maximize metrics. If 'minimize', the tuner will try to minimize metrics.

* **population_size** (*int value(should >0), optional, default = 20*) - the initial size of the population(trial num) in evolution tuner.
* **population_size** (*int value(should >0), optional, default = 20*) - the initial size of the population (trial num) in evolution tuner.

**Usage example**

@@ -143,11 +143,11 @@ tuner:

> Built-in Tuner Name: **SMAC**

**Please note that SMAC doesn't support running on windows currently. The specific reason can be referred to this [GitHub issue](https://github.com/automl/SMAC3/issues/483).**
**Please note that SMAC currently does not support running on Windows. For the specific reason, see this [GitHub issue](https://github.com/automl/SMAC3/issues/483).**

**Installation**

SMAC need to be installed by following command before first use.
SMAC needs to be installed with the following command before first use. As a reminder, `swig` is required by SMAC; on Ubuntu, `swig` can be installed with `apt`.

```bash
nnictl package install --name=SMAC
```
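
On Ubuntu, the full setup can be sketched as below, mirroring the CI change in this PR (package names may differ on other distributions):

```bash
# swig must be present before SMAC can be built and installed.
sudo apt install -y swig
nnictl package install --name=SMAC
```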
12 changes: 8 additions & 4 deletions docs/en_US/Tutorial/SearchSpaceSpec.md
@@ -21,6 +21,8 @@ To define a search space, users should define the name of variable, the type of

Take the first line as an example. `dropout_rate` is defined as a variable whose prior distribution is a uniform distribution over the range `0.1` to `0.5`.

Note that the capability of a search space is closely tied to your tuner; the supported types for each built-in tuner are listed below. A customized tuner need not follow this convention and is free to define any type it wants.
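
For reference, the `dropout_rate` example above would be written like this in a search space file (a minimal sketch using the `_type`/`_value` convention described in this document):

```json
{
    "dropout_rate": {
        "_type": "uniform",
        "_value": [0.1, 0.5]
    }
}
```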

## Types

All types of sampling strategies and their parameters are listed here:
@@ -74,6 +76,8 @@ All types of sampling strategies and their parameter are listed here:
* `{"_type": "mutable_layer", "_value": {mutable_layer_infomation}}`
* Type for [Neural Architecture Search Space][1]. Value is also a dictionary, which contains key-value pairs representing respectively name and search space of each mutable_layer.
* For now, users can only use this type of search space with annotation, which means that there is no need to define a json file for search space since it will be automatically generated according to the annotation in trial code.
* The following HPO tuners can be adapted to tune this search space: TPE, Random, Anneal, Evolution, Grid Search,
Hyperband and BOHB.
* For detailed usage, please refer to [General NAS Interfaces][1].

## Search Space Types Supported by Each Tuner
@@ -86,20 +90,20 @@ All types of sampling strategies and their parameter are listed here:
| Evolution Tuner | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| SMAC Tuner | ✓ | ✓ | ✓ | ✓ | ✓ | | | | | |
| Batch Tuner | ✓ | | | | | | | | | |
| Grid Search Tuner | ✓ | ✓ | | ✓ | | | | | | |
| Grid Search Tuner | ✓ | ✓ | | ✓ | | | | | | |
| Hyperband Advisor | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Metis Tuner | ✓ | ✓ | ✓ | ✓ | | | | | | |
| GP Tuner | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |
| GP Tuner | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | | | | |


Known Limitations:

* GP Tuner and Metis Tuner support only **numerical values** in search space(`choice` type values can be no-numeraical with other tuners, e.g. string values). Both GP Tuner and Metis Tuner use Gaussian Process Regressor(GPR). GPR make predictions based on a kernel function and the 'distance' between different points, it's hard to get the true distance between no-numerical values.
* GP Tuner and Metis Tuner support only **numerical values** in the search space (`choice` values can be non-numerical with other tuners, e.g. string values). Both GP Tuner and Metis Tuner use a Gaussian Process Regressor (GPR). GPR makes predictions based on a kernel function and the 'distance' between points, and it is hard to define a true distance between non-numerical values; see the example after this list.

* Note that for nested search space:

  * Only the Random Search, TPE, Anneal and Evolution tuners support nested search space

  * We do not support nested search space "Hyper Parameter" in visualization now, the enhancement is being considered in #1110(https://github.com/microsoft/nni/issues/1110), any suggestions or discussions or contributions are warmly welcomed
  * We do not yet support nested search space in the "Hyper Parameter" visualization; the enhancement is being considered in [#1110](https://github.com/microsoft/nni/issues/1110), and any suggestions, discussions or contributions are warmly welcomed
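
As an illustration of the first limitation, a `choice` over strings like the sketch below can be sampled by tuners such as TPE or Random, but GP Tuner and Metis Tuner have no meaningful 'distance' between its values; this mirrors the `choice_str` entry in the test asset added by this PR (the variable name here is only illustrative):

```json
{
    "activation": {
        "_type": "choice",
        "_value": ["relu", "tanh", "sigmoid"]
    }
}
```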

[1]: ../AdvancedFeature/GeneralNasInterfaces.md
4 changes: 2 additions & 2 deletions src/sdk/pynni/nni/evolution_tuner/evolution_tuner.py
@@ -158,11 +158,11 @@ class EvolutionTuner(Tuner):
    EvolutionTuner is a tuner using a naive evolution algorithm.
    """

    def __init__(self, optimize_mode, population_size=32):
    def __init__(self, optimize_mode="maximize", population_size=32):
        """
        Parameters
        ----------
        optimize_mode : str
        optimize_mode : str, default 'maximize'
        population_size : int
            initial population size. The larger population size,
            the better evolution performance.
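
With the new default, the tuner can now be constructed without arguments; a minimal usage sketch (assuming the module path shown in this diff):

```python
from nni.evolution_tuner.evolution_tuner import EvolutionTuner

# optimize_mode defaults to "maximize"; population_size keeps its default of 32.
tuner = EvolutionTuner()
```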
2 changes: 2 additions & 0 deletions src/sdk/pynni/nni/nas_utils.py
@@ -265,6 +265,8 @@ def convert_nas_search_space(search_space):
    param search_space: raw search space
    return: the new search space, mutable_layers will be converted into choice
    """
    if not isinstance(search_space, dict):
        return search_space
    ret = dict()
    for k, v in search_space.items():
        if "_type" not in v:
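The new guard makes the converter a no-op for non-dict input instead of failing on `.items()`; a minimal sketch of the behavior (hypothetical input value, assuming the import path of this file):

```python
from nni.nas_utils import convert_nas_search_space

# Non-dict values now pass through unchanged.
assert convert_nas_search_space("not-a-dict") == "not-a-dict"
```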
2 changes: 1 addition & 1 deletion src/sdk/pynni/nni/parameter_expressions.py
@@ -48,7 +48,7 @@ def uniform(low, high, random_state):
    high: a float that represents an upper bound
    random_state: an object of numpy.random.RandomState
    '''
    assert high > low, 'Upper bound must be larger than lower bound'
    assert high >= low, 'Upper bound must not be smaller than lower bound'
    return random_state.uniform(low, high)


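The relaxed assertion means a degenerate range with `high == low` is now accepted; a minimal sketch (hypothetical values):

```python
import numpy as np
from nni.parameter_expressions import uniform

rng = np.random.RandomState(0)
# Previously this failed the assertion; now it simply returns the shared bound.
assert uniform(0.5, 0.5, rng) == 0.5
```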
88 changes: 88 additions & 0 deletions src/sdk/pynni/tests/assets/search_space.json
@@ -0,0 +1,88 @@
{
    "choice_str": {
        "_type": "choice",
        "_value": ["cat", "dog", "elephant", "cow", "sheep", "panda"],
        "fail": ["metis", "gp"]
    },
    "choice_int": {
        "_type": "choice",
        "_value": [42, 43, -1]
    },
    "choice_mixed": {
        "_type": "choice",
        "_value": [0.3, "cat", 1, null],
        "fail": ["metis", "gp"]
    },
    "choice_float": {
        "_type": "choice",
        "_value": [0.3, 1, 2.0]
    },
    "choice_single": {
        "_type": "choice",
        "_value": [1]
    },
    "randint_ok": {
        "_type": "randint",
        "_value": [-2, 3]
    },
    "randint_single": {
        "_type": "randint",
        "_value": [10, 11]
    },
    "randint_fail_equal": {
        "_type": "randint",
        "_value": [0, 0]
    },
    "uniform_ok": {
        "_type": "uniform",
        "_value": [-1.0, 1.5]
    },
    "uniform_equal": {
        "_type": "uniform",
        "_value": [99.9, 99.9]
    },
    "quniform_ok": {
        "_type": "quniform",
        "_value": [0.0, 10.0, 2.5]
    },
    "quniform_clip": {
        "_type": "quniform",
        "_value": [2.0, 10.0, 5.0]
    },
    "quniform_clip_2": {
        "_type": "quniform",
        "_value": [-5.5, -0.5, 6]
    },
    "loguniform_ok": {
        "_type": "loguniform",
        "_value": [0.001, 100]
    },
    "loguniform_equal": {
        "_type": "loguniform",
        "_value": [1, 1]
    },
    "qloguniform_ok": {
        "_type": "qloguniform",
        "_value": [0.001, 100, 1]
    },
    "qloguniform_equal": {
        "_type": "qloguniform",
        "_value": [2, 2, 1]
    },
    "normal_ok": {
        "_type": "normal",
        "_value": [-1.0, 5.0]
    },
    "qnormal_ok": {
        "_type": "qnormal",
        "_value": [-1.5, 5.0, 0.1]
    },
    "lognormal_ok": {
        "_type": "lognormal",
        "_value": [-1.0, 5.0]
    },
    "qlognormal_ok": {
        "_type": "qlognormal",
        "_value": [-1.5, 5.0, 0.1]
    }
}
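
The optional `fail` list in each entry names tuners that are expected to reject that variable type. A minimal sketch of how a test could consume the asset (hypothetical helper, not the PR's actual test code):

```python
import json

def iter_cases(path, tuner_name):
    # Yield (variable name, single-variable search space, expected failure) per entry.
    with open(path) as f:
        space = json.load(f)
    for name, spec in space.items():
        expect_fail = tuner_name in spec.get("fail", [])
        yield name, {name: {"_type": spec["_type"], "_value": spec["_value"]}}, expect_fail
```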
125 changes: 125 additions & 0 deletions src/sdk/pynni/tests/test_msg_dispatcher.py
@@ -0,0 +1,125 @@
# Copyright (c) Microsoft Corporation. All rights reserved.
#
# MIT License
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
# associated documentation files (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge, publish, distribute,
# sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies or
# substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT
# NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
# OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# ==================================================================================================


import json
from io import BytesIO
from unittest import TestCase, main

import nni.protocol
from nni.msg_dispatcher import MsgDispatcher
from nni.protocol import CommandType, send, receive
from nni.tuner import Tuner
from nni.utils import extract_scalar_reward


class NaiveTuner(Tuner):
    def __init__(self):
        self.param = 0
        self.trial_results = []
        self.search_space = None
        # accept trials submitted via AddCustomizedTrialJob commands
        self.accept_customized_trials()

    def generate_parameters(self, parameter_id, **kwargs):
        # report Tuner's internal states to generated parameters,
        # so we don't need to pause the main loop
        self.param += 2
        return {
            'param': self.param,
            'trial_results': self.trial_results,
            'search_space': self.search_space
        }

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        reward = extract_scalar_reward(value)
        self.trial_results.append((parameter_id, parameters['param'], reward, kwargs.get("customized")))

    def update_search_space(self, search_space):
        self.search_space = search_space


_in_buf = BytesIO()
_out_buf = BytesIO()


def _reverse_io():
    # Flip the protocol streams: what the test send()s becomes the
    # dispatcher's incoming command stream, and vice versa.
    _in_buf.seek(0)
    _out_buf.seek(0)
    nni.protocol._out_file = _in_buf
    nni.protocol._in_file = _out_buf


def _restore_io():
    # Restore the normal orientation: the dispatcher reads commands
    # from _in_buf and writes its responses to _out_buf.
    _in_buf.seek(0)
    _out_buf.seek(0)
    nni.protocol._in_file = _in_buf
    nni.protocol._out_file = _out_buf


class MsgDispatcherTestCase(TestCase):
    def test_msg_dispatcher(self):
        _reverse_io()  # now we are sending to Tuner's incoming stream
        send(CommandType.RequestTrialJobs, '2')
        send(CommandType.ReportMetricData, '{"parameter_id":0,"type":"PERIODICAL","value":10}')
        send(CommandType.ReportMetricData, '{"parameter_id":1,"type":"FINAL","value":11}')
        send(CommandType.UpdateSearchSpace, '{"name":"SS0"}')
        send(CommandType.AddCustomizedTrialJob, '{"param":-1}')
        send(CommandType.ReportMetricData, '{"parameter_id":2,"type":"FINAL","value":22}')
        send(CommandType.RequestTrialJobs, '1')
        send(CommandType.KillTrialJob, 'null')
        _restore_io()

        tuner = NaiveTuner()
        dispatcher = MsgDispatcher(tuner)
        # keep the expected error in worker_exceptions instead of exiting the process
        nni.msg_dispatcher_base._worker_fast_exit_on_terminate = False

        dispatcher.run()
        e = dispatcher.worker_exceptions[0]
        self.assertIs(type(e), AssertionError)
        self.assertEqual(e.args[0], 'Unsupported command: CommandType.KillTrialJob')

        _reverse_io()  # now we are receiving from Tuner's outgoing stream
        self._assert_params(0, 2, [], None)
        self._assert_params(1, 4, [], None)

        command, data = receive()  # this one is customized
        data = json.loads(data)
        self.assertIs(command, CommandType.NewTrialJob)
        self.assertEqual(data['parameter_id'], 2)
        self.assertEqual(data['parameter_source'], 'customized')
        self.assertEqual(data['parameters'], {'param': -1})

        self._assert_params(3, 6, [[1, 4, 11, False], [2, -1, 22, True]], {'name': 'SS0'})

        self.assertEqual(len(_out_buf.read()), 0)  # no more commands

    def _assert_params(self, parameter_id, param, trial_results, search_space):
        command, data = receive()
        self.assertIs(command, CommandType.NewTrialJob)
        data = json.loads(data)
        self.assertEqual(data['parameter_id'], parameter_id)
        self.assertEqual(data['parameter_source'], 'algorithm')
        self.assertEqual(data['parameters']['param'], param)
        self.assertEqual(data['parameters']['trial_results'], trial_results)
        self.assertEqual(data['parameters']['search_space'], search_space)


if __name__ == '__main__':
    main()
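
Because of the `main()` entry point, the file can also be run directly; a minimal sketch (assuming NNI and its test dependencies are installed, run from the file's directory):

```bash
python test_msg_dispatcher.py
```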