This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

update dev-doc to sphinx #630

Merged · 57 commits · Jan 21, 2019
6224a4f
add trigger (#544)
SparkSnail Dec 29, 2018
cb83ac0
NNI logging architecture improvement (#539)
yds05 Dec 29, 2018
ce1bc48
[WebUI] Fix issue#517 & issue#459 (#524)
lvybriage Dec 29, 2018
37354df
[Logging architecture refactor] Remove unused metrics related code in…
yds05 Jan 2, 2019
d5f808b
Doc typo and format fixes (#560)
squirrelsc Jan 3, 2019
1fec96c
fix state transition (#504)
QuanluZhang Jan 3, 2019
17a39d4
Add enas nni version from contributor (#557)
xuehui1991 Jan 3, 2019
50697a2
Code coverage report (#559)
chicm-ms Jan 3, 2019
949aacb
Updating Readme to add the Related Projects like PAI, KubeLauncher an…
scarlett2018 Jan 4, 2019
e333264
Fix remote TrainingService bug, change forEach to "for of" (#564)
SparkSnail Jan 4, 2019
1159c85
To install the whole nni in an virtual environment (#538)
Crysple Jan 6, 2019
b7729f0
Make it feasible for annotation whether to add an extra line "nni.get…
Crysple Jan 6, 2019
573f23c
fix permision deny (#567)
Crysple Jan 6, 2019
816dd60
Add Metis Tuner (#534)
xuehui1991 Jan 7, 2019
6b8d385
Change WARNING to INFO (#574)
SparkSnail Jan 8, 2019
0a3cc45
Fix some bugs in doc and log (#561)
Jan 8, 2019
95d1947
Fix a race condidtion issue in trial_keeper for reading log from pip…
yds05 Jan 8, 2019
f437107
[WebUI] Fix issue#458 about final result as dict (#563)
lvybriage Jan 8, 2019
e6eb6ea
support frameworkcontroller log (#572)
SparkSnail Jan 8, 2019
358efb2
Dev weight sharing (#568) (#576)
leckie-chn Jan 8, 2019
d13964d
Add frameworkcontroller document (#530)
SparkSnail Jan 8, 2019
c288a16
[WebUI] Show trial log for pai and k8s (#580)
lvybriage Jan 8, 2019
5f4652d
[WebUI] Show trial log for pai and k8s (#580)
lvybriage Jan 8, 2019
dbe7c21
add __init__.py to metis_tuner (#588)
xuehui1991 Jan 9, 2019
c76dcce
[Document] Update webui doc (#587)
lvybriage Jan 10, 2019
e209869
Update Dockerfile and README (#589)
Jan 11, 2019
db973f9
[WebUI] Fix bug (#591)
lvybriage Jan 11, 2019
e33b14d
update README in metis and update RuntimeError info (#595)
xuehui1991 Jan 11, 2019
a18b26f
udpate NFS setup tutorial (#597)
leckie-chn Jan 11, 2019
63aa63b
Remove unused example (#600)
xuehui1991 Jan 14, 2019
3988b69
Update release note (#603)
xuehui1991 Jan 14, 2019
48ff5c5
Merge remote-tracking branch 'upstream/v0.5' into dev-doc
leckie-chn Jan 15, 2019
310318e
sphinx init
leckie-chn Jan 15, 2019
698c02b
Merge remote-tracking branch 'upstream/dev-doc' into dev-sphinx
leckie-chn Jan 15, 2019
4528ed3
migration to sphinx
leckie-chn Jan 16, 2019
93cc776
update requirements
leckie-chn Jan 16, 2019
f29ca41
fix read the docs bug
leckie-chn Jan 16, 2019
75932dd
fix reference
leckie-chn Jan 16, 2019
2b41372
fix bug
leckie-chn Jan 16, 2019
fab9265
fix logo for homepage
leckie-chn Jan 17, 2019
c3d9941
remote github root
leckie-chn Jan 17, 2019
8de482f
update conf
leckie-chn Jan 17, 2019
3fe1370
Revert "remote github root"
leckie-chn Jan 17, 2019
5bf6d91
Revert "update conf"
leckie-chn Jan 17, 2019
5250870
Revert "Revert "remote github root""
leckie-chn Jan 17, 2019
c74118b
update tutorial
leckie-chn Jan 18, 2019
de40f84
add reference for nni.tuner.Tuner
leckie-chn Jan 18, 2019
0a72648
udpate sdk reference
leckie-chn Jan 18, 2019
c5270ae
update assessors
leckie-chn Jan 18, 2019
8c8b6b4
update index
leckie-chn Jan 18, 2019
6745449
fix bug
leckie-chn Jan 18, 2019
f0c04a7
fix doc bug
leckie-chn Jan 18, 2019
95a83fb
update
leckie-chn Jan 18, 2019
8757441
Merge remote-tracking branch 'upstream/dev-doc' into dev-sphinx
leckie-chn Jan 18, 2019
c3b7a66
delete yml
leckie-chn Jan 18, 2019
99e897f
Revert "Merge remote-tracking branch 'upstream/v0.5' into dev-doc"
leckie-chn Jan 18, 2019
997f0fc
fix doc
leckie-chn Jan 18, 2019
1 change: 0 additions & 1 deletion _config.yml

This file was deleted.

3 changes: 3 additions & 0 deletions docs/.gitignore
@@ -0,0 +1,3 @@
_build
_static
_templates
14 changes: 1 addition & 13 deletions docs/Assessors.md → docs/Builtin_Assessors.md
@@ -1,16 +1,4 @@
# Assessors

## Overview

To save computing resources, NNI supports an early-stop policy and provides the **Assessor** for this job.

An Assessor receives intermediate results from a Trial and decides, by a specific algorithm, whether the trial should be killed. Once a trial meets the early-stop condition (meaning the Assessor is pessimistic about its final result), the Assessor kills the trial and the trial's status becomes `"EARLY_STOPPED"`.

Here is an experimental result on MNIST using the 'Curvefitting' Assessor in 'maximize' mode; you can see that the Assessor successfully **early stopped** many trials with bad hyperparameters in advance. With an Assessor, you may get better hyperparameters under the same computing resources.

*Implemented code directory: [config_assessor.yml][5]*

![](./img/Assessor.png)
# Builtin Assessors

NNI provides state-of-the-art tuning algorithms in its builtin Assessors and makes them easy to use. Below is a brief overview of NNI's current builtin Assessors:

10 changes: 1 addition & 9 deletions docs/Tuners.md → docs/Builtin_Tuner.md
@@ -1,12 +1,4 @@
# Tuners

## Overview

NNI provides an easy way to adopt parameter tuning algorithms, which we call **Tuners**.

A Tuner receives the result from a `Trial` as a metric to evaluate the performance of a specific parameter/architecture configuration, and sends the next hyper-parameter or architecture configuration to the Trial.

NNI supports two approaches to set up a tuner: either directly use a builtin tuner provided by the NNI SDK, or customize a tuner file yourself.
# Builtin Tuners

NNI provides state-of-the-art tuning algorithms in its builtin Tuners and makes them easy to use. Below is a brief overview of NNI's current builtin Tuners:

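The Tuner overview above describes a receive-result / propose-next loop. Here is a minimal sketch of that loop in plain Python, using random search over a toy search space — an illustration of the idea only, not the NNI SDK's actual builtin tuner API:

```python
import random

# Toy search space: each hyper-parameter has a list of candidate values.
SEARCH_SPACE = {
    "lr": [0.1, 0.01, 0.001],
    "batch_size": [16, 32, 64],
}

def generate_parameters():
    """Propose the next hyper-parameter configuration (random search)."""
    return {name: random.choice(values) for name, values in SEARCH_SPACE.items()}

class RandomSearchTuner:
    """Minimal sketch of a tuner: remember the best configuration seen."""

    def __init__(self):
        self.best = None  # (metric, params)

    def receive_trial_result(self, params, metric):
        """Record a trial's final metric; keep the best one (maximize)."""
        if self.best is None or metric > self.best[0]:
            self.best = (metric, params)

tuner = RandomSearchTuner()
for _ in range(10):
    params = generate_parameters()
    # Toy objective standing in for a real trial: prefer lr == 0.01.
    metric = -abs(params["lr"] - 0.01)
    tuner.receive_trial_result(params, metric)
```

The same two operations — propose a configuration, then consume its metric — are what a real tuner implements, whatever the search strategy.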
8 changes: 8 additions & 0 deletions docs/Contribution.rst
@@ -0,0 +1,8 @@
###############################
Contribution to NNI
###############################

.. toctree::
Development Setup<SetupNNIDeveloperEnvironment>
Contribution Guide<CONTRIBUTING>
Debug HowTo<HowToDebug>
6 changes: 6 additions & 0 deletions docs/Examples.rst
@@ -0,0 +1,6 @@
######################
Examples
######################

.. toctree::
MNIST_Examples<mnist_examples>
2 changes: 2 additions & 0 deletions docs/FAQ.md
@@ -1,3 +1,5 @@
# FAQ

This page is for frequently asked questions and answers.


3 changes: 1 addition & 2 deletions docs/GetStarted.md
@@ -1,5 +1,4 @@
**Get Started with NNI**
===
# Get Started with NNI

## **Installation**
* __Dependencies__
3 changes: 1 addition & 2 deletions docs/Installation.md
@@ -1,5 +1,4 @@
**Installation of NNI**
===
# Installation of NNI

Currently we only support installation on Linux & Mac.

19 changes: 19 additions & 0 deletions docs/Makefile
@@ -0,0 +1,19 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SOURCEDIR = .
BUILDDIR = _build

# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
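The Makefile above drives `sphinx-build`; alongside it, a `conf.py` ties the Markdown and reStructuredText sources together. A minimal sketch under assumptions (the recommonmark parser line and theme are illustrative — the actual `conf.py` added by this PR may differ):

```python
# conf.py -- minimal sketch of a Sphinx configuration that builds the
# existing Markdown docs alongside the new .rst files.

project = 'NNI'
master_doc = 'index'

# Let Sphinx pick up both reStructuredText and Markdown sources.
source_suffix = ['.rst', '.md']
source_parsers = {
    '.md': 'recommonmark.parser.CommonMarkParser',  # assumed Markdown parser
}

html_theme = 'sphinx_rtd_theme'  # assumed theme
```

With this in place, `make html` builds both file types into `_build/html`.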
28 changes: 16 additions & 12 deletions docs/README.md
@@ -1,22 +1,24 @@
# Overview

<p align="center">
<img src="img/nni_logo.png" alt="drawing" width="300"/>
<img src="https://raw.githubusercontent.com/Microsoft/nni/master/docs/img/nni_logo.png" alt="drawing" width="300"/>
</p>

-----------

[![MIT licensed](https://img.shields.io/badge/license-MIT-yellow.svg)](https://github.com/Microsoft/nni/blob/master/LICENSE)
[![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/Microsoft.nni)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=6)
[![Issues](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen)
[![Bugs](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
[![Pull Requests](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen)
[![Version](https://img.shields.io/github/release/Microsoft/nni.svg)](https://github.com/Microsoft/nni/releases) [![Join the chat at https://gitter.im/Microsoft/nni](https://badges.gitter.im/Microsoft/nni.svg)](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![](https://img.shields.io/badge/license-MIT-yellow.svg)](https://github.com/Microsoft/nni/blob/master/LICENSE)
[![](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/Microsoft.nni)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=6)
[![](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen)
[![](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
[![](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen)
[![](https://img.shields.io/github/release/Microsoft/nni.svg)](https://github.com/Microsoft/nni/releases)
[![](https://badges.gitter.im/Microsoft/nni.svg)](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning (AutoML) experiments.
The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different environments, such as the local machine, remote servers, and the cloud.

<p align="center">
<img src="img/nni_arch_overview.png" alt="drawing"/>
</p>

![](img/nni_arch_overview.png)

## **Who should consider using NNI**
* Those who want to try different AutoML algorithms in their training code (model) at their local machine.
@@ -86,11 +88,13 @@ You can use these commands to get more information about the experiment

* Open the `Web UI url` in your browser, you can view detail information of the experiment and all the submitted trial jobs as shown below. [Here](WebUI.md) are more Web UI pages.


<table style="border: none">
<th><img src="img/webui_overview_page.png" alt="drawing" width="395"/></th>
<th><img src="img/webui_trialdetail_page.png" alt="drawing" width="410"/></th>
<th><img src="https://raw.githubusercontent.com/Microsoft/nni/dev-doc/docs/img/webui_overview_page.png" alt="drawing" width="395"/></th>
<th><img src="https://raw.githubusercontent.com/Microsoft/nni/dev-doc/docs/img/webui_trialdetail_page.png" alt="drawing" width="410"/></th>
</table>


## **Documentation**
* [NNI overview](Overview.md)
* [Quick start](GetStarted.md)
61 changes: 45 additions & 16 deletions docs/RELEASE.md
@@ -1,6 +1,35 @@
# Release 0.4 - 12/6/2018

## Major Features
# ChangeLog

> **Contributor:** please add v0.5 release...
>
> **Author:** Actually I reverted the merge with the v0.5 branch before initiating this PR. Changes for v0.5 should not come into the dev-doc branch currently; at least there should be a separate merge PR.

## Release 0.4.1 - 12/14/2018
### Major Features
#### New tuner supports
* Support [network morphism](./HowToChooseTuner.md#NetworkMorphism) as a new tuner

#### Training Service improvements
* Migrate [Kubeflow training service](https://github.com/Microsoft/nni/blob/master/docs/KubeflowMode.md)'s dependency from kubectl CLI to [Kubernetes API](https://kubernetes.io/docs/concepts/overview/kubernetes-api/) client
* [Pytorch-operator](https://github.com/kubeflow/pytorch-operator) support for Kubeflow training service
* Improvement on local code files uploading to OpenPAI HDFS
* Fixed OpenPAI integration WebUI bug: WebUI doesn't show latest trial job status, which is caused by OpenPAI token expiration

#### NNICTL improvements
* Show version information both in nnictl and WebUI. You can run **nnictl -v** to show your current installed NNI version

#### WebUI improvements
* Enable modifying the concurrency number during an experiment
* Add a feedback link to NNI's GitHub 'create issue' page
* Enable customizing the top-10 trials with regard to metric number (largest or smallest)
* Enable downloading logs for dispatcher & nnimanager
* Enable automatic scaling of axes for metric numbers
* Update annotation to support displaying real choices in the search space

### New examples
* [FashionMnist](https://github.com/Microsoft/nni/tree/master/examples/trials/network_morphism), work together with network morphism tuner
* [Distributed MNIST example](https://github.com/Microsoft/nni/tree/master/examples/trials/mnist-distributed-pytorch) written in PyTorch


## Release 0.4 - 12/6/2018

### Major Features
* [Kubeflow Training service](./KubeflowMode.md)
* Support tf-operator
* [Distributed trial example](../examples/trials/mnist-distributed/dist_mnist.py) on Kubeflow
@@ -15,7 +44,7 @@
* Support search a specific trial by trial number
* Show trial's hdfsLogPath
* Download experiment parameters
## Others
### Others
* Asynchronous dispatcher
* Docker file update, add pytorch library
* Refactor 'nnictl stop' process, send SIGTERM to nni manager process, rather than calling stop Rest API.
@@ -25,8 +54,8 @@
* Don’t print useless ‘metrics is empty’ logs in PAI jobs’ stdout. Only print useful messages once new metrics are recorded, to reduce confusion when users check PAI trial output for debugging purposes
* Add timestamp at the beginning of each log entry in trial keeper.

# Release 0.3.0 - 11/2/2018
## NNICTL new features and updates
## Release 0.3.0 - 11/2/2018
### NNICTL new features and updates
* Support running multiple experiments simultaneously.

Before v0.3, NNI only supported running a single experiment at a time. After this release, users are able to run multiple experiments simultaneously. Each experiment requires a unique port; the first experiment uses the default port, as in previous versions. You can specify a unique port for the other experiments as below:
@@ -35,7 +64,7 @@
* Support updating max trial number.
use ```nnictl update --help``` to learn more, or refer to [NNICTL Spec](https://github.com/Microsoft/nni/blob/master/docs/NNICTLDOC.md) for the full usage of NNICTL.

## API new features and updates
### API new features and updates
* <span style="color:red">**breaking change**</span>: `nni.get_parameters()` is refactored to `nni.get_next_parameter()`. All examples from prior releases cannot run on v0.3; please clone the nni repo to get new examples. If you had applied NNI to your own code, please update the API accordingly.

* New API **nni.get_sequence_id()**.
@@ -48,23 +77,23 @@
* float
* A python dict containing 'default' key, the value of 'default' key should be of type int or float. The dict can contain any other key value pairs.
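The accepted final-result types above (an int, a float, or a dict whose `'default'` entry is numeric) can be captured in a small validator. `is_valid_final_result` is a hypothetical helper written for illustration, not part of the NNI SDK:

```python
def is_valid_final_result(result):
    """Check a value against the final-result rule described above:
    an int, a float, or a dict whose 'default' entry is int or float.
    Hypothetical helper for illustration -- not part of the NNI SDK."""
    if isinstance(result, bool):  # bool is an int subclass; reject it explicitly
        return False
    if isinstance(result, (int, float)):
        return True
    if isinstance(result, dict) and 'default' in result:
        default = result['default']
        return isinstance(default, (int, float)) and not isinstance(default, bool)
    return False

print(is_valid_final_result(0.97))                       # True
print(is_valid_final_result({'default': 1, 'loss': 2}))  # True
print(is_valid_final_result({'loss': 2}))                # False: no 'default' key
```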

## New tuner support
### New tuner support
* **Batch Tuner** which iterates all parameter combination, can be used to submit batch trial jobs.

## New examples
### New examples
* A NNI Docker image for public usage:
```docker pull msranni/nni:latest```
* New trial example: [NNI Sklearn Example](https://github.com/Microsoft/nni/tree/master/examples/trials/sklearn)
* New competition example: [Kaggle Competition TGS Salt Example](https://github.com/Microsoft/nni/tree/master/examples/trials/kaggle-tgs-salt)

## Others
### Others
* UI refactoring, refer to [WebUI doc](WebUI.md) for how to work with the new UI.
* Continuous Integration: NNI had switched to Azure pipelines
* [Known Issues in release 0.3.0](https://github.com/Microsoft/nni/labels/nni030knownissues).


# Release 0.2.0 - 9/29/2018
## Major Features
## Release 0.2.0 - 9/29/2018
### Major Features
* Support [OpenPAI](https://github.com/Microsoft/pai) (aka pai) Training Service (See [here](./PAIMode.md) for instructions about how to submit NNI job in pai mode)
* Support training services on pai mode. NNI trials will be scheduled to run on OpenPAI cluster
* NNI trial's output (including logs and model file) will be copied to OpenPAI HDFS for further debugging and checking
@@ -75,14 +104,14 @@
* Update ga squad example and related documentation
* WebUI UX small enhancement and bug fix

## Known Issues
### Known Issues
[Known Issues in release 0.2.0](https://github.com/Microsoft/nni/labels/nni020knownissues).

# Release 0.1.0 - 9/10/2018 (initial release)
## Release 0.1.0 - 9/10/2018 (initial release)

Initial release of Neural Network Intelligence (NNI).

## Major Features
### Major Features
* Installation and Deployment
* Support pip install and source codes install
* Support training services on local mode(including Multi-GPU mode) as well as multi-machines mode
@@ -99,7 +128,7 @@ Initial release of Neural Network Intelligence (NNI).
* Others
* Support simple GPU job scheduling

## Known Issues
### Known Issues
[Known Issues in release 0.1.0](https://github.com/Microsoft/nni/labels/nni010knownissues).


11 changes: 11 additions & 0 deletions docs/Reference.rst
@@ -0,0 +1,11 @@
References
==================

.. toctree::
:maxdepth: 3

Command Line <NNICTLDOC>
Python API <sdk_reference>
Annotation <AnnotationSpec>
Configuration<ExperimentConfig>
Search Space <SearchSpaceSpec>
13 changes: 13 additions & 0 deletions docs/Tutorials.rst
@@ -0,0 +1,13 @@
######################
Tutorials
######################

.. toctree::
QuickStart
Installation
Trial<howto_1_WriteTrial>
Tuners<tuners>
Assessors<assessors>
WebUI
Training Platform<training_services>
AdvancedNAS
19 changes: 19 additions & 0 deletions docs/assessors.rst
@@ -0,0 +1,19 @@
Assessors
==============
Overview
--------------
To save computing resources, NNI supports an early-stop policy and provides the **Assessor** for this job.

An Assessor receives intermediate results from a Trial and decides, by a specific algorithm, whether the trial should be killed. Once a trial meets the early-stop condition (meaning the Assessor is pessimistic about its final result), the Assessor kills the trial and the trial's status becomes `"EARLY_STOPPED"`.

Here is an experimental result on MNIST using the 'Curvefitting' Assessor in 'maximize' mode; you can see that the Assessor successfully **early stopped** many trials with bad hyperparameters in advance. With an Assessor, you may get better hyperparameters under the same computing resources.

*Implemented code directory: [config_assessor.yml][5]*

![](./img/Assessor.png)

Like Tuners, users can either use builtin Assessors or customize an Assessor on their own. Please refer to the following tutorials for details:

.. toctree::
Builtin Assessors<Builtin_Assessors>
Customized Assessors<Customize_Assessor>
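The kill-or-continue decision described above can be sketched with a simple median-stopping rule in plain Python — an illustration of the idea only, not the Curvefitting assessor that NNI ships:

```python
from statistics import median

def assess_trial(trial_history, completed_histories):
    """Return True to keep the trial running, False to early-stop it.

    Median-stopping sketch (maximize mode): stop a trial whose latest
    intermediate result falls below the median of what other trials
    had achieved at the same step. Illustration only -- not NNI's
    builtin assessor implementation.
    """
    step = len(trial_history) - 1
    peers = [h[step] for h in completed_histories if len(h) > step]
    if not peers:  # nothing to compare against yet
        return True
    return trial_history[-1] >= median(peers)

completed = [[0.2, 0.5, 0.8], [0.3, 0.6, 0.9]]
print(assess_trial([0.1, 0.2], completed))  # False: 0.2 < median(0.5, 0.6)
print(assess_trial([0.4, 0.7], completed))  # True: 0.7 >= median(0.5, 0.6)
```

A real Assessor is fed intermediate results incrementally and returns the same kind of kill-or-continue verdict each time.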