Merge pull request #152 from Microsoft/master

merge master

SparkSnail authored Apr 1, 2019
2 parents f0a2d39 + dbace00 commit 77526d3
Showing 15 changed files with 7 additions and 30 deletions.
6 changes: 0 additions & 6 deletions docs/en_US/Builtin_Tuner.md
@@ -73,8 +73,6 @@ Random search is suggested when each trial does not take too long (e.g., each tr
# config.yml
tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
```

<br>
@@ -115,10 +113,6 @@ tuner:

Its requirement for computational resources is relatively high. Specifically, it requires a large initial population to avoid falling into a local optimum. If your trial is short or leverages an assessor, this tuner is a good choice. It is even more suitable when your trial code supports weight transfer, that is, when a trial can inherit the converged weights from its parent(s); this can greatly speed up the training process.

**Requirement of classArg**

* **optimize_mode** (*maximize or minimize, optional, default = maximize*) - If 'maximize', tuners will return the hyperparameter set with larger expectation. If 'minimize', tuner will return the hyperparameter set with smaller expectation.

**Usage example**

```yaml
2 changes: 1 addition & 1 deletion docs/en_US/PAIMode.md
@@ -55,7 +55,7 @@ Compared with LocalMode and [RemoteMachineMode](RemoteMachineMode.md), trial con
* Optional key. It specifies the HDFS data directory for the trial to download data. The format should be something like hdfs://{your HDFS host}:9000/{your data directory}
* outputDir
* Optional key. It specifies the HDFS output directory for the trial. Once the trial is completed (whether it succeeds or fails), the trial's stdout and stderr will be copied to this directory by the NNI SDK automatically. The format should be something like hdfs://{your HDFS host}:9000/{your output directory}
* virturlCluster
* virtualCluster
* Optional key. Set the virtualCluster of OpenPAI. If omitted, the job will run on the default virtual cluster.
* shmMB
* Optional key. Set the shmMB configuration of OpenPAI; it sets the shared memory for one task in the task role.
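For orientation, here is a minimal sketch of how these keys sit together in a PAI-mode trial section, parsed with PyYAML. Only dataDir, outputDir, virtualCluster, and shmMB come from the list above; the host, paths, and every other key are illustrative assumptions, not taken from this diff.

```python
import yaml  # pip install pyyaml

# Hypothetical trial section for PAI mode; only the four optional keys
# documented above are grounded in this diff, the rest is assumed.
trial_yaml = """
trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 1
  dataDir: hdfs://10.10.10.10:9000/nni/data
  outputDir: hdfs://10.10.10.10:9000/nni/output
  virtualCluster: default
  shmMB: 4096
"""
trial = yaml.safe_load(trial_yaml)["trial"]
print(trial["virtualCluster"], trial["shmMB"])  # default 4096
```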
3 changes: 1 addition & 2 deletions docs/en_US/SQuAD_evolution_examples.md
@@ -2,8 +2,7 @@
This example shows how to use a Genetic Algorithm to find good model architectures for Reading Comprehension.

## 1. Search Space
Since attention and recurrent neural network (RNN) have been proven effective in Reading Comprehension.
We conclude the search space as follow:
Since attention and RNN have been proven effective in Reading Comprehension, we conclude the search space as follow:

1. IDENTITY (Effectively means keep training).
2. INSERT-RNN-LAYER (Inserts an LSTM. Having compared the performance of GRU and LSTM in our experiment, we decided to use LSTM here.)
2 changes: 1 addition & 1 deletion src/sdk/pynni/nni/hyperopt_tuner/hyperopt_tuner.py
@@ -158,7 +158,7 @@ class HyperoptTuner(Tuner):
HyperoptTuner is a tuner which uses the hyperopt algorithm.
"""

def __init__(self, algorithm_name, optimize_mode):
def __init__(self, algorithm_name, optimize_mode = 'minimize'):
"""
Parameters
----------
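A hedged sketch of what the new default means for callers; the import path mirrors this file's location in the SDK, and constructing the tuner is assumed to require the hyperopt package:

```python
# Minimal sketch, assuming 'tpe' is a valid algorithm_name and that no other
# arguments are required: optimize_mode can now be omitted and defaults to
# 'minimize'.
from nni.hyperopt_tuner.hyperopt_tuner import HyperoptTuner

tuner = HyperoptTuner(algorithm_name='tpe')
```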
2 changes: 0 additions & 2 deletions test/config_test/examples/cifar10-pytorch.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ./cifar10_search_space.json

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
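After this change, the tuner block in these test configs reduces to a single key. A minimal sketch of the resulting section, parsed with PyYAML; the Medianstop classArgs are truncated in this diff, so their contents below are an assumption:

```python
import yaml  # pip install pyyaml

section = yaml.safe_load("""
tuner:
  builtinTunerName: Random
assessor:
  builtinAssessorName: Medianstop
  classArgs:
    optimize_mode: maximize  # assumed; truncated in the diff
""")
assert 'classArgs' not in section['tuner']  # Random no longer takes classArgs
```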
2 changes: 0 additions & 2 deletions test/config_test/examples/mnist-annotation.test.yml
@@ -6,8 +6,6 @@ trialConcurrency: 2

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/config_test/examples/mnist-keras.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ../../../examples/trials/mnist-keras/search_space.json

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/config_test/examples/mnist.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ./mnist_search_space.json

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/config_test/examples/sklearn-classification.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ../../../examples/trials/sklearn/classification/search_space.js

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/config_test/examples/sklearn-regression.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ../../../examples/trials/sklearn/regression/search_space.json

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/config_test/tuners/mnist-annotation-random.test.yml
@@ -6,8 +6,6 @@ trialConcurrency: 1

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize
assessor:
builtinAssessorName: Medianstop
classArgs:
2 changes: 0 additions & 2 deletions test/metrics_test/metrics.test.yml
@@ -7,8 +7,6 @@ searchSpacePath: ./search_space.json

tuner:
builtinTunerName: Random
classArgs:
optimize_mode: maximize

trial:
codeDir: .
2 changes: 1 addition & 1 deletion test/tuner_test.py
@@ -38,7 +38,7 @@ def switch(dispatch_type, dispatch_name):
'''Change dispatch in config.yml'''
config_path = 'tuner_test/local.yml'
experiment_config = get_yml_content(config_path)
if dispatch_name in ['GridSearch', 'BatchTuner']:
if dispatch_name in ['GridSearch', 'BatchTuner', 'Random']:
experiment_config[dispatch_type.lower()] = {
'builtin' + dispatch_type + 'Name': dispatch_name
}
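A standalone rerun of the updated branch, with the inputs assumed to be dispatch_type='Tuner' and dispatch_name='Random', shows the classArgs-free config it now produces:

```python
# Hedged illustration; the dispatch_type/dispatch_name values are assumed.
dispatch_type, dispatch_name = 'Tuner', 'Random'
experiment_config = {}
if dispatch_name in ['GridSearch', 'BatchTuner', 'Random']:
    experiment_config[dispatch_type.lower()] = {
        'builtin' + dispatch_type + 'Name': dispatch_name
    }
print(experiment_config)  # {'tuner': {'builtinTunerName': 'Random'}}
```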
2 changes: 1 addition & 1 deletion test/tuner_test/naive_trial.py
@@ -1,6 +1,6 @@
import nni

params = nni.get_parameters()
params = nni.get_next_parameter()
print('params:', params)
x = params['x']

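For context, a hedged reconstruction of the whole trial script; the final reporting call is an assumption about the truncated remainder, not part of this diff:

```python
import nni

params = nni.get_next_parameter()  # replaces the older get_parameters() call
print('params:', params)
x = params['x']

# Assumed remainder: report a metric derived from x as the final result.
nni.report_final_result(x)
```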
4 changes: 2 additions & 2 deletions tools/nni_cmd/config_schema.py
@@ -54,14 +54,14 @@
Optional('gpuNum'): And(int, lambda x: 0 <= x <= 99999),
}),
Optional('tuner'): Or({
'builtinTunerName': Or('TPE', 'Random', 'Anneal', 'SMAC', 'Evolution'),
'builtinTunerName': Or('TPE', 'Anneal', 'SMAC', 'Evolution'),
Optional('classArgs'): {
'optimize_mode': Or('maximize', 'minimize')
},
Optional('includeIntermediateResults'): bool,
Optional('gpuNum'): And(int, lambda x: 0 <= x <= 99999),
},{
'builtinTunerName': Or('BatchTuner', 'GridSearch'),
'builtinTunerName': Or('BatchTuner', 'GridSearch', 'Random'),
Optional('gpuNum'): And(int, lambda x: 0 <= x <= 99999),
},{
'builtinTunerName': 'NetworkMorphism',
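A hedged sketch of the behavioral effect, using the same schema package this file imports: after the move, a bare Random tuner config validates against the classArgs-free group, while one carrying classArgs is rejected.

```python
from schema import And, Optional, Or, Schema, SchemaError

# The classArgs-free tuner group after this change (copied from the hunk above).
random_group = Schema({
    'builtinTunerName': Or('BatchTuner', 'GridSearch', 'Random'),
    Optional('gpuNum'): And(int, lambda x: 0 <= x <= 99999),
})

random_group.validate({'builtinTunerName': 'Random'})  # passes
try:
    random_group.validate({'builtinTunerName': 'Random',
                           'classArgs': {'optimize_mode': 'maximize'}})
except SchemaError as err:
    print('rejected:', err)  # classArgs is not accepted in this group
```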
