This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Add GP Tuner and related doc #1191

Merged (26 commits) on Jun 25, 2019

Commits
6099fb4
fix link err in docs
suiguoxin May 28, 2019
44cf69c
add spaces
suiguoxin May 28, 2019
3f87626
re-organise links for detailed descriptions of the tuners and accesso…
suiguoxin May 28, 2019
b92c4ab
add in-page link by change .md to .html
suiguoxin May 29, 2019
3be8892
Merge branch 'master' of git://github.com/microsoft/nni
suiguoxin May 29, 2019
0717988
delete #section from cross-file links to make links work in both read…
suiguoxin May 29, 2019
db20820
Merge branch 'master' of git://github.com/microsoft/nni
suiguoxin Jun 5, 2019
21725f9
gp_tuner init from fmfn's repo
suiguoxin Jun 20, 2019
e35fa2b
fix params bug by adding float>int transition
suiguoxin Jun 20, 2019
17be796
add optimal choices; support randint&quniform type; add doc
suiguoxin Jun 21, 2019
3906b34
refine doc and code
suiguoxin Jun 21, 2019
b776d7e
change mnist yml comments
suiguoxin Jun 21, 2019
dbac6ed
typo fix
suiguoxin Jun 21, 2019
df3952a
fix val err
suiguoxin Jun 23, 2019
37f4b12
fix minimize mode err
suiguoxin Jun 23, 2019
56a1575
add config test and Hpo result
suiguoxin Jun 23, 2019
ba8dccd
Merge branch 'master' of https://github.com/microsoft/nni
suiguoxin Jun 23, 2019
8cd50e3
support quniform type; update doc; update test config
suiguoxin Jun 24, 2019
942f519
update doc
suiguoxin Jun 24, 2019
ce4906f
un-commit changed in yarn.lock
suiguoxin Jun 24, 2019
a0d6cd1
fix optimize mode bug
suiguoxin Jun 25, 2019
10df680
optimize mode
suiguoxin Jun 25, 2019
c5f3da0
optimize mode
suiguoxin Jun 25, 2019
77c9547
reset pylint, gitignore
suiguoxin Jun 25, 2019
d6febf2
Merge branch 'master' of git://github.com/microsoft/nni
suiguoxin Jun 25, 2019
66ba114
revert .gitignore yarn.lock
suiguoxin Jun 25, 2019
44 changes: 43 additions & 1 deletion docs/en_US/BuiltinTuner.md
@@ -19,7 +19,7 @@ Currently we support the following algorithms:
|[__Network Morphism__](#NetworkMorphism)|Network Morphism provides functions to automatically search for architecture of deep learning models. Every child network inherits the knowledge from its parent network and morphs into diverse types of networks, including changes of depth, width, and skip-connection. Next, it estimates the value of a child network using the historic architecture and metric pairs. Then it selects the most promising one to train. [Reference Paper](https://arxiv.org/abs/1806.10282)|
|[__Metis Tuner__](#MetisTuner)|Metis offers the following benefits when it comes to tuning parameters: While most tools only predict the optimal configuration, Metis gives you two outputs: (a) current prediction of optimal configuration, and (b) suggestion for the next trial. No more guesswork. While most tools assume training datasets do not have noisy data, Metis actually tells you if you need to re-sample a particular hyper-parameter. [Reference Paper](https://www.microsoft.com/en-us/research/publication/metis-robustly-tuning-tail-latencies-cloud-systems/)|
|[__BOHB__](#BOHB)|BOHB is a follow-up work of Hyperband. It targets the weakness of Hyperband that new configurations are generated randomly without leveraging finished trials. In the name BOHB, HB means Hyperband and BO means Bayesian Optimization. BOHB leverages finished trials by building multiple TPE models; a proportion of new configurations are generated through these models. [Reference Paper](https://arxiv.org/abs/1807.01774)|
|[__GP Tuner__](#GPTuner)|Gaussian Process Tuner is a sequential model-based optimization (SMBO) approach that uses a Gaussian Process as the surrogate. [Reference Paper](https://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf), [GitHub Repo](https://github.com/fmfn/BayesianOptimization)|

Contributor (@xuehui1991, Jun 23, 2019):
Please book a meeting to review your code.
Also, could you give some experiment results in HPO.md, so that we could compare with other tuners?
<br>

## Usage of Builtin Tuners
@@ -366,3 +366,45 @@ advisor:
max_budget: 27
eta: 3
```
<br>

<a name="GPTuner"></a>

![](https://placehold.it/15/1589F0/000000?text=+) `GP Tuner`

> Builtin Tuner Name: **GPTuner**

Note that the only acceptable types in the search space are `choice`, `quniform`, `uniform` and `randint`.
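
For illustration, a hypothetical search space restricted to these types might look like the sketch below. It is written as a Python dict mirroring a `search_space.json` file; the parameter names and ranges are invented for this example and `randint` is omitted to keep the sketch minimal.

```python
# Hypothetical search space using only types GP Tuner accepts here
# (choice, quniform, uniform); names and ranges are illustrative.
search_space = {
    "batch_size":   {"_type": "choice",   "_value": [16, 32, 64, 128]},
    "hidden_size":  {"_type": "quniform", "_value": [64, 512, 64]},   # [low, high, q]
    "dropout_rate": {"_type": "uniform",  "_value": [0.1, 0.9]},      # [low, high]
}
```
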
Contributor:
Why can't other types be supported?

Member Author:
loguniform and qloguniform have been added. Types like normal are not supported here, since a bounded range is needed.


**Suggested scenario**

GP Tuner uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is computationally cheaper and can be solved with common tools. Therefore GP Tuner is best suited to situations where sampling the function to be optimized is a very expensive endeavor. Its computational cost grows as *O(N^3)* because it requires inverting the Gram matrix. [Detailed Description](./GPTuner.md)
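
To ground the *O(N^3)* claim, here is a minimal from-scratch sketch of the standard GP posterior computation (not NNI's implementation; `kernel` is assumed to be any positive-definite kernel function). The Cholesky factorization of the N x N Gram matrix is the cubic-cost step.

```python
import numpy as np

def gp_posterior(X_train, y_train, X_test, kernel, noise=1e-6):
    """Posterior mean and variance of a GP at X_test, given observations (X_train, y_train)."""
    K = kernel(X_train, X_train) + noise * np.eye(len(X_train))  # N x N Gram matrix
    K_s = kernel(X_train, X_test)
    K_ss = kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                                     # the O(N^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))     # K^{-1} y
    mu = K_s.T @ alpha                                            # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)                  # posterior variance
    return mu, var
```
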
Contributor:
"Therefore GP Tuner is most adequate for situations where sampling the function to be optimized is a very expensive endeavor" -- can you explain more?

Member Author:
Explanation added.


**Requirement of classArg**

* **optimize_mode** (*'maximize' or 'minimize', optional, default = 'maximize'*) - If 'maximize', the tuner will target to maximize metrics. If 'minimize', the tuner will target to minimize metrics.
* **utility** (*'ei', 'ucb' or 'poi', optional, default = 'ei'*) - The kind of utility (acquisition) function. 'ei', 'ucb' and 'poi' correspond to 'Expected Improvement', 'Upper Confidence Bound' and 'Probability of Improvement', respectively (a short sketch of these utility functions follows this list).
Contributor:
How should one select among these choices?

Member Author:
There is no fixed rule for choosing the utility function, since the black-box function to be optimized varies. Normally 'ei' is a good choice that balances exploration and exploitation well. I think it is interesting to expose these choices to users who are interested in the tuning algorithm.

* **kappa** (*float, optional, default = 5*) - Used by the 'ucb' utility function. The bigger `kappa` is, the more exploratory the tuner will be.
Contributor:
I think part of them are optional classArgs (if they have default values)...

Member Author:
All of them are optional. Normally there is no need to change them except for "optimize_mode".

* **xi** (*float, optional, default = 0*) - Used by the 'ei' and 'poi' utility functions. The bigger `xi` is, the more exploratory the tuner will be.
* **nu** (*float, optional, default = 2.5*) - Used to specify the Matern kernel. The smaller `nu` is, the less smooth the approximated function is.
* **alpha** (*float, optional, default = 1e-6*) - Used to specify the Gaussian Process Regressor. Larger values correspond to an increased noise level in the observations.
* **cold_start_num** (*int, optional, default = 10*) - Number of random explorations to perform before Gaussian Process fitting. Random exploration can help by diversifying the exploration space.
* **selection_num_warm_up** (*int, optional, default = 1e5*) - Number of random points to evaluate when finding the point that maximizes the acquisition function.
* **selection_num_starting_points** (*int, optional, default = 250*) - Number of times to run L-BFGS-B from a random starting point after the warm-up.
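
As referenced above, here is a minimal sketch of how the three utility (acquisition) functions are typically computed from the GP posterior. It illustrates the standard formulas rather than NNI's exact implementation; the function name and signature are invented for this sketch.

```python
import numpy as np
from scipy.stats import norm

def utility(kind, mu, sigma, y_best, kappa=5.0, xi=0.0):
    """Illustrative acquisition functions for maximization.

    mu, sigma : GP posterior mean / standard deviation at candidate points
    y_best    : best observed objective value so far
    """
    if kind == 'ucb':                          # Upper Confidence Bound
        return mu + kappa * sigma
    if kind == 'ei':                           # Expected Improvement
        z = (mu - y_best - xi) / sigma
        return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)
    if kind == 'poi':                          # Probability of Improvement
        z = (mu - y_best - xi) / sigma
        return norm.cdf(z)
    raise ValueError(f"unknown utility kind: {kind}")
```

With these formulas, a larger `kappa` (for 'ucb') or `xi` (for 'ei'/'poi') gives more weight to uncertain regions, which matches the exploration behaviour described in the bullets above.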

**Usage example**

```yaml
# config.yml
tuner:
builtinTunerName: GPTuner
classArgs:
optimize_mode: maximize
kappa: 5
xi: 0
nu: 2.5
alpha: 1e-6
cold_start_num: 10
selection_num_warm_up: 1e5
selection_num_starting_points: 250
```
10 changes: 10 additions & 0 deletions docs/en_US/GPTuner.md
@@ -0,0 +1,10 @@
GP Tuner on NNI
===

## GP Tuner

Bayesian optimization works by constructing a posterior distribution of functions (Gaussian Process here) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not.

GP Tuner is designed to minimize the number of steps required to find a combination of parameters close to the optimal one. To do so, this method uses a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is computationally cheaper and can be solved with common tools. Therefore Bayesian Optimization is best suited to situations where sampling the function to be optimized is a very expensive endeavor.
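
To make the loop described above concrete, here is a minimal, illustrative Bayesian-optimization sketch. It is not the NNI GP Tuner code: `objective` and `sample_candidates` are hypothetical helpers, `expected_improvement` is an acquisition function such as the 'ei' variant sketched in BuiltinTuner.md, and whether the tuner uses exactly scikit-learn's `GaussianProcessRegressor` is an assumption (the `nu` and `alpha` values simply mirror the defaults listed there).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def bayes_opt(objective, sample_candidates, expected_improvement,
              cold_start_num=10, n_iter=20, num_warm_up=10000):
    """Sketch of a Bayesian-optimization loop (maximization)."""
    X = sample_candidates(cold_start_num)          # random exploration first
    y = np.array([objective(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                  normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)                               # posterior over the objective
        cand = sample_candidates(num_warm_up)      # cheap proxy problem
        mu, sigma = gp.predict(cand, return_std=True)
        acq = expected_improvement(mu, sigma, y.max())
        x_next = cand[np.argmax(acq)]              # maximize acquisition, not objective
        y_next = objective(x_next)                 # one expensive evaluation per step
        X = np.vstack([X, x_next])
        y = np.append(y, y_next)
    return X[np.argmax(y)], y.max()
```
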
Contributor:
Any reference paper?

Member Author:
Paper link added.


Note that the only acceptable types of search space are `choice`, `quniform`, `uniform` and `randint`.
1 change: 1 addition & 0 deletions docs/en_US/SearchSpaceSpec.md
@@ -85,6 +85,7 @@ All types of sampling strategies and their parameters are listed here:
| Grid Search Tuner | &#10003; | | | &#10003; | | &#10003; | | | | |
| Hyperband Advisor | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; | &#10003; |
| Metis Tuner | &#10003; | &#10003; | &#10003; | &#10003; | | | | | | |
| GP Tuner | &#10003; | &#10003; | &#10003; | &#10003; | | | | | | |


Known Limitations:
2 changes: 1 addition & 1 deletion examples/trials/mnist/config.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
#SMAC (SMAC should be installed through nnictl)
builtinTunerName: TPE
classArgs:
2 changes: 1 addition & 1 deletion examples/trials/mnist/config_assessor.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
#SMAC (SMAC should be installed through nnictl)
builtinTunerName: TPE
classArgs:
2 changes: 1 addition & 1 deletion examples/trials/mnist/config_frameworkcontroller.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
builtinTunerName: TPE
classArgs:
#choice: maximize, minimize
2 changes: 1 addition & 1 deletion examples/trials/mnist/config_kubeflow.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
builtinTunerName: TPE
classArgs:
#choice: maximize, minimize
2 changes: 1 addition & 1 deletion examples/trials/mnist/config_pai.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
#SMAC (SMAC should be installed through nnictl)
builtinTunerName: TPE
classArgs:
2 changes: 1 addition & 1 deletion examples/trials/mnist/config_windows.yml
@@ -9,7 +9,7 @@ searchSpacePath: search_space.json
#choice: true, false
useAnnotation: false
tuner:
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
#choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner, GPTuner
#SMAC (SMAC should be installed through nnictl)
builtinTunerName: TPE
classArgs:
2 changes: 1 addition & 1 deletion src/nni_manager/rest_server/restValidationSchemas.ts
@@ -161,7 +161,7 @@ export namespace ValidationSchemas {
checkpointDir: joi.string().allow('')
}),
tuner: joi.object({
builtinTunerName: joi.string().valid('TPE', 'Random', 'Anneal', 'Evolution', 'SMAC', 'BatchTuner', 'GridSearch', 'NetworkMorphism', 'MetisTuner'),
builtinTunerName: joi.string().valid('TPE', 'Random', 'Anneal', 'Evolution', 'SMAC', 'BatchTuner', 'GridSearch', 'NetworkMorphism', 'MetisTuner', 'GPTuner'),
codeDir: joi.string(),
classFileName: joi.string(),
className: joi.string(),
112 changes: 42 additions & 70 deletions src/nni_manager/yarn.lock
@@ -161,9 +161,10 @@
version "10.5.2"
Contributor:
@chicm-ms pls take a look. Seems that we should not update this file in this commit.

Contributor:
@suiguoxin pls drop updates of this file.

Member Author:
I've undone the commit of this file and added it to .gitignore.

resolved "https://registry.yarnpkg.com/@types/node/-/node-10.5.2.tgz#f19f05314d5421fe37e74153254201a7bf00a707"

"@types/node@^10.5.5":
version "10.5.5"
resolved "https://registry.yarnpkg.com/@types/node/-/node-10.5.5.tgz#8e84d24e896cd77b0d4f73df274027e3149ec2ba"
"@types/node@10.12.18":
version "10.12.18"
resolved "https://registry.yarnpkg.com/@types/node/-/node-10.12.18.tgz#1d3ca764718915584fcd9f6344621b7672665c67"
integrity sha512-fh+pAqt4xRzPfqA6eh3Z2y6fyZavRIumvjhaCL753+TVkGKGhpPeyrJG2JftD0T9q4GF00KjefsQ+PQNDdWQaQ==

"@types/range-parser@*":
version "1.2.2"
@@ -342,10 +343,6 @@ ansi-regex@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-3.0.0.tgz#ed0317c322064f79466c02966bddb605ab37d998"

ansi-styles@^2.2.1:
version "2.2.1"
resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-2.2.1.tgz#b432dd3358b634cf75e1e4664368240533c1ddbe"

ansi-styles@^3.2.1:
version "3.2.1"
resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-3.2.1.tgz#41fbb20243e50b12be0f04b8dedbf07520ce841d"
@@ -451,14 +448,6 @@ azure-storage@^2.10.2:
xml2js "0.2.8"
xmlbuilder "^9.0.7"

babel-code-frame@^6.22.0:
version "6.26.0"
resolved "https://registry.yarnpkg.com/babel-code-frame/-/babel-code-frame-6.26.0.tgz#63fd43f7dc1e3bb7ce35947db8fe369a3f58c74b"
dependencies:
chalk "^1.1.3"
esutils "^2.0.2"
js-tokens "^3.0.2"

balanced-match@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.0.tgz#89b4d199ab2bee49de164ea02b89ce462d71b767"
@@ -575,16 +564,6 @@ chai@^4.1.2:
pathval "^1.0.0"
type-detect "^4.0.0"

chalk@^1.1.3:
version "1.1.3"
resolved "https://registry.yarnpkg.com/chalk/-/chalk-1.1.3.tgz#a8115c55e4a702fe4d150abd3872822a7e09fc98"
dependencies:
ansi-styles "^2.2.1"
escape-string-regexp "^1.0.2"
has-ansi "^2.0.0"
strip-ansi "^3.0.0"
supports-color "^2.0.0"

chalk@^2.0.0, chalk@^2.3.0:
version "2.4.1"
resolved "https://registry.yarnpkg.com/chalk/-/chalk-2.4.1.tgz#18c49ab16a037b6eb0152cc83e3471338215b66e"
@@ -854,7 +833,7 @@ escape-html@~1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/escape-html/-/escape-html-1.0.3.tgz#0258eae4d3d0c0974de1c169188ef0051d1d1988"

escape-string-regexp@1.0.5, escape-string-regexp@^1.0.2, escape-string-regexp@^1.0.5:
escape-string-regexp@1.0.5, escape-string-regexp@^1.0.5:
version "1.0.5"
resolved "https://registry.yarnpkg.com/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz#1b61c0562190a8dff6ae3bb2cf0200ca130b86d4"

@@ -1163,12 +1142,6 @@ har-validator@~5.1.0:
ajv "^5.3.0"
har-schema "^2.0.0"

has-ansi@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/has-ansi/-/has-ansi-2.0.0.tgz#34f5049ce1ecdf2b0649af3ef24e45ed35416d91"
dependencies:
ansi-regex "^2.0.0"

has-flag@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-3.0.0.tgz#b5d454dc2199ae225699f3467e5a07f3b955bafd"
@@ -1426,21 +1399,25 @@ js-base64@^2.4.9:
version "2.5.0"
resolved "https://registry.yarnpkg.com/js-base64/-/js-base64-2.5.0.tgz#42255ba183ab67ce59a0dee640afdc00ab5ae93e"

js-tokens@^3.0.2:
version "3.0.2"
resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-3.0.2.tgz#9866df395102130e38f7f996bceb65443209c25b"

js-tokens@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-4.0.0.tgz#19203fb59991df98e3a287050d4647cdeaf32499"

js-yaml@^3.10.0, js-yaml@^3.7.0:
js-yaml@^3.10.0:
version "3.12.0"
resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-3.12.0.tgz#eaed656ec8344f10f527c6bfa1b6e2244de167d1"
dependencies:
argparse "^1.0.7"
esprima "^4.0.0"

js-yaml@^3.13.1:
version "3.13.1"
resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-3.13.1.tgz#aff151b30bfdfa8e49e05da22e7415e9dfa37847"
integrity sha512-YfbcO7jXDdyj0DGxYVSlSeQNHbD7XPWvrVWeVUujrQEoZzWJIRrCPoyk6kL6IAjAG2IolMK4T0hNUe0HOUs5Jw==
dependencies:
argparse "^1.0.7"
esprima "^4.0.0"

jsbn@~0.1.0:
version "0.1.1"
resolved "https://registry.yarnpkg.com/jsbn/-/jsbn-0.1.1.tgz#a5e654c2e5a2deb5f201d96cefbca80c0ef2f513"
@@ -1797,12 +1774,6 @@ node-jose@^1.1.0:
node-forge "^0.7.6"
uuid "^3.3.2"

node-nvidia-smi@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/node-nvidia-smi/-/node-nvidia-smi-1.0.0.tgz#6aa57574540b2bed91c9a80218516ffa686e5ac7"
dependencies:
xml2js "^0.4.17"

node-pre-gyp@^0.10.3:
version "0.10.3"
resolved "https://registry.yarnpkg.com/node-pre-gyp/-/node-pre-gyp-0.10.3.tgz#3070040716afdc778747b61b6887bf78880b80fc"
@@ -2384,7 +2355,7 @@ sax@0.5.x:
version "0.5.8"
resolved "http://registry.npmjs.org/sax/-/sax-0.5.8.tgz#d472db228eb331c2506b0e8c15524adb939d12c1"

sax@>=0.6.0, sax@^1.2.4:
sax@^1.2.4:
version "1.2.4"
resolved "https://registry.yarnpkg.com/sax/-/sax-1.2.4.tgz#2816234e2378bddc4e5354fab5caa895df7100d9"

@@ -2619,10 +2590,6 @@ supports-color@5.4.0, supports-color@^5.3.0:
dependencies:
has-flag "^3.0.0"

supports-color@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-2.0.0.tgz#535d045ce6b6363fa40117084629995e9df324c7"

supports-color@^5.4.0:
version "5.5.0"
resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-5.5.0.tgz#e2e69a44ac8772f78a1ec0b35b689df6530efc8f"
@@ -2716,32 +2683,43 @@ tslib@^1.8.0, tslib@^1.8.1:
version "1.9.3"
resolved "https://registry.yarnpkg.com/tslib/-/tslib-1.9.3.tgz#d7e4dd79245d85428c4d7e4822a79917954ca286"

tslint-microsoft-contrib@^5.1.0:
version "5.1.0"
resolved "https://registry.yarnpkg.com/tslint-microsoft-contrib/-/tslint-microsoft-contrib-5.1.0.tgz#777c32d51aba16f4565e47aac749a1631176cd9f"
tslint-microsoft-contrib@^6.0.0:
version "6.2.0"
resolved "https://registry.yarnpkg.com/tslint-microsoft-contrib/-/tslint-microsoft-contrib-6.2.0.tgz#8aa0f40584d066d05e6a5e7988da5163b85f2ad4"
integrity sha512-6tfi/2tHqV/3CL77pULBcK+foty11Rr0idRDxKnteTaKm6gWF9qmaCNU17HVssOuwlYNyOmd9Jsmjd+1t3a3qw==
dependencies:
tsutils "^2.12.1"
tsutils "^2.27.2 <2.29.0"
Contributor:
Why change here?

Member Author:
I've undone the commit of this file and added it to .gitignore.


tslint@^5.11.0:
version "5.11.0"
resolved "https://registry.yarnpkg.com/tslint/-/tslint-5.11.0.tgz#98f30c02eae3cde7006201e4c33cb08b48581eed"
tslint@^5.12.0:
version "5.17.0"
resolved "https://registry.yarnpkg.com/tslint/-/tslint-5.17.0.tgz#f9f0ce2011d8e90debaa6e9b4975f24cd16852b8"
integrity sha512-pflx87WfVoYepTet3xLfDOLDm9Jqi61UXIKePOuca0qoAZyrGWonDG9VTbji58Fy+8gciUn8Bt7y69+KEVjc/w==
dependencies:
babel-code-frame "^6.22.0"
"@babel/code-frame" "^7.0.0"
builtin-modules "^1.1.1"
chalk "^2.3.0"
commander "^2.12.1"
diff "^3.2.0"
glob "^7.1.1"
js-yaml "^3.7.0"
js-yaml "^3.13.1"
minimatch "^3.0.4"
mkdirp "^0.5.1"
resolve "^1.3.2"
semver "^5.3.0"
tslib "^1.8.0"
tsutils "^2.27.2"
tsutils "^2.29.0"

"tsutils@^2.27.2 <2.29.0":
version "2.28.0"
resolved "https://registry.yarnpkg.com/tsutils/-/tsutils-2.28.0.tgz#6bd71e160828f9d019b6f4e844742228f85169a1"
integrity sha512-bh5nAtW0tuhvOJnx1GLRn5ScraRLICGyJV5wJhtRWOLsxW70Kk5tZtpK3O/hW6LDnqKS9mlUMPZj9fEMJ0gxqA==
dependencies:
tslib "^1.8.1"

tsutils@^2.12.1, tsutils@^2.27.2:
tsutils@^2.29.0:
version "2.29.0"
resolved "https://registry.yarnpkg.com/tsutils/-/tsutils-2.29.0.tgz#32b488501467acbedd4b85498673a0812aca0b99"
integrity sha512-g5JVHCIJwzfISaXpXE1qvNalca5Jwob6FjI4AoPlqMusJ6ftFE7IkkFoMhVLRgK+4Kx3gkzb8UZK5t5yTTvEmA==
dependencies:
tslib "^1.8.1"

@@ -2777,9 +2755,10 @@ typescript-string-operations@^1.3.1:
version "1.3.1"
resolved "https://registry.yarnpkg.com/typescript-string-operations/-/typescript-string-operations-1.3.1.tgz#461b886cc9ccd4dd16810b1f248b2e6f6580956b"

typescript@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.0.1.tgz#43738f29585d3a87575520a4b93ab6026ef11fdb"
typescript@^3.2.2:
version "3.5.2"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-3.5.2.tgz#a09e1dc69bc9551cadf17dba10ee42cf55e5d56c"
integrity sha512-7KxJovlYhTX5RaRbUdkAXN1KUZ8PwWlTzQdHV6xNqvuFOs7+WBo10TQUqT19Q/Jz2hk5v9TQDIhyLhhJY4p5AA==

uglify-js@^3.1.4:
version "3.4.9"
@@ -2900,14 +2879,7 @@ xml2js@0.2.8:
dependencies:
sax "0.5.x"

xml2js@^0.4.17:
version "0.4.19"
resolved "https://registry.yarnpkg.com/xml2js/-/xml2js-0.4.19.tgz#686c20f213209e94abf0d1bcf1efaa291c7827a7"
dependencies:
sax ">=0.6.0"
xmlbuilder "~9.0.1"

xmlbuilder@^9.0.7, xmlbuilder@~9.0.1:
xmlbuilder@^9.0.7:
version "9.0.7"
resolved "http://registry.npmjs.org/xmlbuilder/-/xmlbuilder-9.0.7.tgz#132ee63d2ec5565c557e20f4c22df9aca686b10d"

4 changes: 3 additions & 1 deletion src/sdk/pynni/nni/constants.py
@@ -29,7 +29,8 @@
'GridSearch': 'nni.gridsearch_tuner.gridsearch_tuner',
'NetworkMorphism': 'nni.networkmorphism_tuner.networkmorphism_tuner',
'Curvefitting': 'nni.curvefitting_assessor.curvefitting_assessor',
'MetisTuner': 'nni.metis_tuner.metis_tuner'
'MetisTuner': 'nni.metis_tuner.metis_tuner',
'GPTuner': 'nni.gp_tuner.gp_tuner'
}

ClassName = {
@@ -42,6 +43,7 @@
'GridSearch': 'GridSearchTuner',
'NetworkMorphism':'NetworkMorphismTuner',
'MetisTuner':'MetisTuner',
'GPTuner':'GPTuner',

'Medianstop': 'MedianstopAssessor',
'Curvefitting': 'CurvefittingAssessor'
Empty file.