Update multiphase doc #1242

Merged 16 commits on Jul 22, 2019
Changes from 14 commits
34 changes: 30 additions & 4 deletions docs/en_US/AdvancedFeature/MultiPhase.md
@@ -32,9 +32,31 @@ It is pretty simple to use multi-phase in trial code; an example is shown below:
# ...
```
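
The example above is collapsed in this diff view. As a rough, illustrative sketch (not taken from the original file), a multi-phase trial typically calls `nni.get_next_parameter()` several times within one trial job and reports one final result per configuration; the `train_and_evaluate` helper and the number of phases below are hypothetical.

```python
# Illustrative multi-phase trial sketch (not code from this PR):
# a single trial job asks the tuner for several configurations and
# reports one final result per configuration.
import nni


def train_and_evaluate(params):
    """Hypothetical stand-in for real training code; returns a score."""
    return sum(v for v in params.values() if isinstance(v, (int, float)))


if __name__ == '__main__':
    for _ in range(4):  # how many phases to run is up to the trial code
        params = nni.get_next_parameter()  # with multiPhase: true, each call can return a new configuration
        if not params:  # stop if the tuner hands out nothing more
            break
        score = train_and_evaluate(params)
        nni.report_final_result(score)  # one final result per configuration
```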

__2. Modify experiment configuration__
__2. Experiment configuration__

To enable multi-phase, you should also add `multiPhase: true` to your experiment YAML configuration file. If this line is not added, `nni.get_next_parameter()` would always return the same configuration. For all the built-in tuners/advisors, you can use multi-phase in your trial code without modifying the tuner/advisor spec in the YAML configuration file.
To enable multi-phase, you should also add `multiPhase: true` to your experiment YAML configuration file. If this line is not added, `nni.get_next_parameter()` would always return the same configuration.

Multi-phase experiment configuration example:

```yaml
authorName: default
experimentName: multiphase experiment
trialConcurrency: 2
maxExecDuration: 1h
maxTrialNum: 8
trainingServicePlatform: local
searchSpacePath: search_space.json
multiPhase: true
useAnnotation: false
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: maximize
trial:
  command: python3 mytrial.py
  codeDir: .
  gpuNum: 0
```

### Write a tuner that leverages multi-phase:

@@ -48,6 +70,10 @@ trial_end
```
With this information, the tuner can tell which trial is requesting a configuration and which trial is reporting results. This gives your tuner enough flexibility to deal with different trials and different phases. For example, you may want to use the `trial_job_id` parameter of the `generate_parameters` method to generate hyperparameters for a specific trial job, as in the sketch below.
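
For illustration only (this code is not part of the PR), the following sketch shows a tuner that keys its bookkeeping on `trial_job_id`. It assumes the `nni.tuner.Tuner` base class and that `trial_job_id` is delivered through the methods' keyword arguments in multi-phase mode; the class name and the naive random sampling are invented for the example.

```python
# Illustrative multi-phase tuner sketch (assumed API shape, not from this PR):
# trial_job_id is read from **kwargs so the tuner can track each trial job separately.
import random

from nni.tuner import Tuner


class PerJobRandomTuner(Tuner):
    def __init__(self):
        self.search_space = {}
        self.results = {}  # trial_job_id -> list of (parameters, value) pairs

    def update_search_space(self, search_space):
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        trial_job_id = kwargs.get('trial_job_id')  # which trial job is asking
        # Naive example: sample every 'uniform' dimension at random. A real
        # multi-phase tuner could instead look at self.results[trial_job_id]
        # and condition the next phase on what that job reported earlier.
        return {
            name: random.uniform(*spec['_value'])
            for name, spec in self.search_space.items()
            if spec.get('_type') == 'uniform'
        }

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        trial_job_id = kwargs.get('trial_job_id')  # which trial job reported
        self.results.setdefault(trial_job_id, []).append((parameters, value))

    def trial_end(self, parameter_id, success, **kwargs):
        # Called when a whole trial job finishes; per-job bookkeeping can be dropped here.
        self.results.pop(kwargs.get('trial_job_id'), None)
```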

Of course, to use your multi-phase tuner, __you should add `multiPhase: true` to your experiment YAML configuration file__.

[ENAS tuner](https://github.com/countif/enas_nni/blob/master/nni/examples/tuners/enas/nni_controller_ptb.py) is an example of a multi-phase tuner.

### Tuners that support multi-phase experiments:

[TPE](../Tuner/BuiltinTuner.md#TPE), [Random](../Tuner/BuiltinTuner.md#Random), [Anneal](../Tuner/BuiltinTuner.md#Anneal), [Evolution](../Tuner/BuiltinTuner.md#Evolution), [SMAC](../Tuner/BuiltinTuner.md#SMAC), [NetworkMorphism](../Tuner/BuiltinTuner.md#NetworkMorphism), [MetisTuner](../Tuner/BuiltinTuner.md#MetisTuner), [BOHB](../Tuner/BuiltinTuner.md#BOHB), [Hyperband](../Tuner/BuiltinTuner.md#Hyperband), [ENAS tuner](https://github.com/countif/enas_nni/blob/master/nni/examples/tuners/enas/nni_controller_ptb.py).

### Training services that support multi-phase experiments:
[Local Machine](../TrainingService/LocalMode.md), [Remote Servers](../TrainingService/RemoteMachineMode.md), [OpenPAI](../TrainingService/PaiMode.md)