
Commit

Sdk update (#272)
* Rename get_parameters to get_next_parameter

* annotations add get_next_parameter

* updates

* updates

* updates

* updates

* updates
chicm-ms authored Oct 26, 2018
1 parent a101461 commit 0c17e2d
Showing 25 changed files with 334 additions and 317 deletions.
13 changes: 8 additions & 5 deletions docs/AnnotationSpec.md
@@ -4,23 +4,26 @@ For good user experience and reduce user effort, we need to design a good annota

If users use NNI system, they only need to:

- 1. Annotation variable in code as:
+ 1. Use nni.get_next_parameter() to retrieve hyper-parameters from the Tuner. Before using any other annotation, place the following annotation at the beginning of the trial code:
+ '''@nni.get_next_parameter()'''

+ 2. Annotate a variable in code as:

'''@nni.variable(nni.choice(2,3,5,7),name=self.conv_size)'''

- 2. Annotation intermediate in code as:
+ 3. Annotate an intermediate result in code as:

'''@nni.report_intermediate_result(test_acc)'''

- 3. Annotation output in code as:
+ 4. Annotate the output in code as:

'''@nni.report_final_result(test_acc)'''

- 4. Annotation `function_choice` in code as:
+ 5. Annotate `function_choice` in code as:

'''@nni.function_choice(max_pool(h_conv1, self.pool_size),avg_pool(h_conv1, self.pool_size),name=max_pool)'''

- In this way, they can easily realize automatic tuning on NNI.
+ In this way, they can easily implement automatic tuning on NNI.

For `@nni.variable`, `nni.choice` is the type of search space and there are 10 types to express your search space as follows:
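Taken together, the annotation steps above describe one trial script. The sketch below is illustrative only: a tiny stand-in for the `nni` module replaces the real NNI SDK so the control flow can run end to end, and all parameter names and metric values are hypothetical.

```python
# Illustrative sketch only: FakeNNI stands in for the real NNI SDK so the
# annotated trial flow is visible. All names and values are hypothetical.
class FakeNNI:
    def get_next_parameter(self):
        # The real SDK returns hyper-parameters chosen by the Tuner.
        return {"conv_size": 5}

    def report_intermediate_result(self, metric):
        print("intermediate:", metric)

    def report_final_result(self, metric):
        print("final:", metric)

nni = FakeNNI()

params = nni.get_next_parameter()        # step 1: fetch parameters first
conv_size = params["conv_size"]          # step 2: a tuned variable
test_acc = 0.0
for epoch in range(3):                   # stand-in for a real training loop
    test_acc = 0.5 + 0.1 * epoch
    nni.report_intermediate_result(test_acc)  # step 3: per-epoch metric
nni.report_final_result(test_acc)        # step 4: final metric
```

In a real trial the same calls are written as `'''@nni....'''` annotation comments, which NNI rewrites into SDK calls at launch time.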

2 changes: 1 addition & 1 deletion docs/howto_1_WriteTrial.md
Original file line number Diff line number Diff line change
@@ -27,7 +27,7 @@ Refer to [SearchSpaceSpec.md](SearchSpaceSpec.md) to learn more about search spa
2.2 Get predefined parameters
Use the following code snippet:
- RECEIVED_PARAMS = nni.get_parameters()
+ RECEIVED_PARAMS = nni.get_next_parameter()
to get hyper-parameters' values assigned by tuner. `RECEIVED_PARAMS` is an object, for example:
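For illustration, a minimal stand-in for the call (in a real trial the Tuner supplies the values; the keys and numbers here are hypothetical):

```python
# Stand-in for nni.get_next_parameter(): in a real trial the Tuner chooses
# the values. The keys and numbers below are hypothetical placeholders.
def get_next_parameter():
    return {"dropout_rate": 0.5, "learning_rate": 0.001}

RECEIVED_PARAMS = get_next_parameter()
lr = RECEIVED_PARAMS["learning_rate"]  # read one hyper-parameter value
```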
2 changes: 1 addition & 1 deletion docs/howto_2_CustomizedTuner.md
Original file line number Diff line number Diff line change
@@ -61,7 +61,7 @@ If you implement ```generate_parameters``` like this:
# your code implements here.
return {"dropout": 0.3, "learning_rate": 0.4}
```
- It's means your Tuner will always generate parameters ```{"dropout": 0.3, "learning_rate": 0.4}```. Then Trial will receive ```{"dropout": 0.3, "learning_rate": 0.4}``` this object will using ```nni.get_parameters()``` API from NNI SDK. After training of Trial, it will send result to Tuner by calling ```nni.report_final_result(0.93)```. Then ```receive_trial_result``` will function will receied these parameters like:
+ It means your Tuner will always generate parameters ```{"dropout": 0.3, "learning_rate": 0.4}```. The Trial will receive ```{"dropout": 0.3, "learning_rate": 0.4}``` by calling the API ```nni.get_next_parameter()```. Once the trial ends with a result (normally some kind of metric), it sends the result to the Tuner by calling the API ```nni.report_final_result()```, for example ```nni.report_final_result(0.93)```. Then your Tuner's ```receive_trial_result``` function will receive the result like:
```
parameter_id = 82347
parameters = {"dropout": 0.3, "learning_rate": 0.4}
```
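The round trip described above can be sketched as a plain class. This is illustrative only: a real Tuner would subclass the NNI SDK's Tuner base class, and NNI itself would do the wiring shown at the bottom; the values mirror the doc's example.

```python
# Illustrative sketch of the Tuner round trip. A real Tuner would subclass
# the NNI SDK's Tuner base class; NNI itself performs this wiring.
class ConstantTuner:
    def __init__(self):
        self.results = {}

    def generate_parameters(self, parameter_id):
        # Always proposes the same parameters, as in the example above.
        return {"dropout": 0.3, "learning_rate": 0.4}

    def receive_trial_result(self, parameter_id, parameters, value):
        # Called with the metric the trial reported via report_final_result.
        self.results[parameter_id] = (parameters, value)

tuner = ConstantTuner()
params = tuner.generate_parameters(parameter_id=82347)
# ... the trial trains with `params` and reports a final metric, e.g. 0.93 ...
tuner.receive_trial_result(82347, params, 0.93)
```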
