This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Doc update index #2017

Merged
4 changes: 2 additions & 2 deletions docs/en_US/Compressor/Overview.md
Original file line number Diff line number Diff line change
@@ -1,7 +1,7 @@
# Model Compression with NNI
As larger neural networks with more layers and nodes are considered, reducing their storage and computational cost becomes critical, especially for some real-time applications. Model compression can be used to address this problem.

We are glad to announce the alpha release for model compression toolkit on top of NNI, it's still in the experiment phase which might evolve based on usage feedback. We'd like to invite you to use, feedback and even contribute.
We are glad to introduce the model compression toolkit on top of NNI. It is still in the experimental phase and might evolve based on usage feedback. We'd like to invite you to use it, give feedback, and even contribute.

NNI provides an easy-to-use toolkit to help users design and use compression algorithms. It currently supports PyTorch with a unified interface. To compress their models, users only need to add several lines to their code. Some popular model compression algorithms are built into NNI. Users can further leverage NNI's auto tuning power to find the best compressed model, as detailed in [Auto Model Compression](./AutoCompression.md). Users can also easily customize new compression algorithms using NNI's interface; refer to the tutorial [here](#customize-new-compression-algorithms).
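The "several lines of code" workflow described above can be sketched with a mock in plain Python. This is illustrative only: the real toolkit operates on PyTorch modules, and `LevelPrunerMock` plus its weight list are stand-ins, not NNI's actual classes.

```python
# Minimal mock of the compression interface shape described above.
# Plain floats stand in for the weights of a PyTorch module.

class LevelPrunerMock:
    """Zeroes out the smallest-magnitude weights to reach a target sparsity."""

    def __init__(self, weights, config_list):
        self.weights = list(weights)
        self.sparsity = config_list[0]['sparsity']

    def compress(self):
        k = int(len(self.weights) * self.sparsity)  # number of weights to prune
        if k:
            threshold = sorted(abs(w) for w in self.weights)[k - 1]
            self.weights = [0.0 if abs(w) <= threshold else w
                            for w in self.weights]
        return self.weights

# A config list in the shape NNI's compressors accept: which ops, how sparse.
config_list = [{'sparsity': 0.5, 'op_types': ['default']}]
pruned = LevelPrunerMock([0.1, -0.9, 0.05, 1.2], config_list).compress()
# the two smallest-magnitude weights are now zero: [0.0, -0.9, 0.0, 1.2]
```

With the real toolkit, the same three steps are: build the config list, construct a pruner around the model, and call `compress()`.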

@@ -335,7 +335,7 @@ class YourQuantizer(Quantizer):
If you do not customize `QuantGrad`, the default backward is Straight-Through Estimator.
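As a rough illustration of the Straight-Through Estimator mentioned above (a plain-Python sketch, not NNI's `QuantGrad` implementation): the forward pass rounds to discrete levels, while the backward pass pretends rounding is the identity and passes the gradient through unchanged.

```python
def quantize(x, bits=8):
    # forward: uniform quantization of a value in [0, 1] onto 2**bits - 1 steps
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def ste_grad(grad_output):
    # backward (Straight-Through Estimator): round() has zero gradient almost
    # everywhere, so pretend it is the identity and pass the gradient through
    return grad_output
```

The forward error is bounded by one quantization step, and training can proceed because the backward path never sees the non-differentiable rounding.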
_Coming Soon_ ...

## **Reference and Feedback**
## Reference and Feedback
* To [report a bug](https://github.com/microsoft/nni/issues/new?template=bug-report.md) for this feature in GitHub;
* To [file a feature or improvement request](https://github.com/microsoft/nni/issues/new?template=enhancement.md) for this feature in GitHub;
* To know more about [Feature Engineering with NNI](https://github.com/microsoft/nni/blob/master/docs/en_US/FeatureEngineering/Overview.md);
80 changes: 80 additions & 0 deletions docs/en_US/autotune_ref.md
@@ -0,0 +1,80 @@
# Python API Reference of Auto Tune
Contributor: This file can be an rst.

Contributor (author): Just to be consistent with NasReference.md; what is the reason for preferring rst?

Contributor: It can be, not that it is preferred to be; rst would save a few lines.


```eval_rst
.. contents::
```

## Trial

```eval_rst
.. autofunction:: nni.get_next_parameter
.. autofunction:: nni.get_current_parameter
.. autofunction:: nni.report_intermediate_result
.. autofunction:: nni.report_final_result
.. autofunction:: nni.get_experiment_id
.. autofunction:: nni.get_trial_id
.. autofunction:: nni.get_sequence_id
```
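A typical trial script wires the functions above together roughly as follows. This is a sketch: it only runs inside a launched NNI experiment, and `train_one_epoch` as well as the `'learning_rate'` key are hypothetical stand-ins for the user's own training code and search space.

```python
import nni

params = nni.get_next_parameter()        # hyper-parameters chosen by the tuner
lr = params.get('learning_rate', 0.01)   # 'learning_rate' is an assumed key

best = 0.0
for epoch in range(10):
    accuracy = train_one_epoch(lr)             # user's training step (hypothetical)
    nni.report_intermediate_result(accuracy)   # per-epoch metric, seen by the assessor
    best = max(best, accuracy)
nni.report_final_result(best)                  # final metric, seen by the tuner
```

Intermediate results drive early stopping; the final result drives the tuner's next suggestion.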

## Tuner

```eval_rst
.. autoclass:: nni.tuner.Tuner
:members:

.. autoclass:: nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
:members:

.. autoclass:: nni.evolution_tuner.evolution_tuner.EvolutionTuner
:members:

.. autoclass:: nni.smac_tuner.SMACTuner
:members:

.. autoclass:: nni.gridsearch_tuner.GridSearchTuner
:members:

.. autoclass:: nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner
:members:

.. autoclass:: nni.metis_tuner.metis_tuner.MetisTuner
:members:

.. autoclass:: nni.ppo_tuner.PPOTuner
:members:

.. autoclass:: nni.batch_tuner.batch_tuner.BatchTuner
:members:

.. autoclass:: nni.gp_tuner.gp_tuner.GPTuner
:members:
```

## Assessor

```eval_rst
.. autoclass:: nni.assessor.Assessor
:members:

.. autoclass:: nni.assessor.AssessResult
:members:

.. autoclass:: nni.curvefitting_assessor.CurvefittingAssessor
:members:

.. autoclass:: nni.medianstop_assessor.MedianstopAssessor
:members:
```

## Advisor

```eval_rst
.. autoclass:: nni.msg_dispatcher_base.MsgDispatcherBase
:members:

.. autoclass:: nni.hyperband_advisor.hyperband_advisor.Hyperband
:members:

.. autoclass:: nni.bohb_advisor.bohb_advisor.BOHB
:members:
```
7 changes: 5 additions & 2 deletions docs/en_US/feature_engineering.rst
@@ -1,13 +1,16 @@
###################
Feature Engineering
===================
###################

We are glad to announce the alpha release for Feature Engineering toolkit on top of NNI,
We are glad to introduce the Feature Engineering toolkit on top of NNI.
It is still in the experimental phase and might evolve based on usage feedback.
We'd like to invite you to use it, give feedback, and even contribute.

For details, please refer to the following tutorials:

.. toctree::
:maxdepth: 2

Overview <FeatureEngineering/Overview>
GradientFeatureSelector <FeatureEngineering/GradientFeatureSelector>
GBDTSelector <FeatureEngineering/GBDTSelector>
18 changes: 15 additions & 3 deletions docs/en_US/hyperparameter_tune.rst
@@ -1,6 +1,18 @@
######################
Hyper-parameter Tuning
######################
#############################
Auto (Hyper-parameter) Tuning
#############################

Auto tuning is one of the key features provided by NNI; a main application scenario is
hyper-parameter tuning. The trial code is what gets tuned, and NNI provides many popular
auto tuning algorithms (called Tuners) as well as early stopping algorithms (called Assessors).
NNI supports running trials on various training platforms, for example on a local machine,
on several servers in a distributed manner, or on platforms such as OpenPAI and Kubernetes.

Other key features of NNI, such as model compression and feature engineering, can also be
further enhanced by auto tuning, as described when introducing those features.

NNI is highly extensible; advanced users can customize their own Tuner, Assessor, and
Training Service according to their needs.
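The moving parts described above (trial, tuner, assessor, training platform) come together in an experiment configuration file. Below is a minimal sketch in the YAML format used at the time of this PR; values such as `mnist.py` are placeholders, and exact field names may differ between NNI releases.

```yaml
authorName: default
experimentName: example_mnist
trialConcurrency: 1              # number of trials to run at the same time
maxTrialNum: 10
trainingServicePlatform: local   # or remote, pai, kubeflow, ...
searchSpacePath: search_space.json
useAnnotation: false
tuner:
  builtinTunerName: TPE            # one of the built-in auto tuning algorithms
  classArgs:
    optimize_mode: maximize
assessor:
  builtinAssessorName: Medianstop  # optional early stopping
trial:
  command: python3 mnist.py        # the user's trial code
  codeDir: .
  gpuNum: 0
```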

.. toctree::
:maxdepth: 2
5 changes: 1 addition & 4 deletions docs/en_US/index.rst
@@ -2,9 +2,6 @@
Neural Network Intelligence
###########################

********
Contents
********

.. toctree::
:caption: Table of Contents
@@ -14,7 +11,7 @@ Contents
Overview
Installation <installation>
QuickStart <Tutorial/QuickStart>
Hyper-parameter Tuning <hyperparameter_tune>
Auto (Hyper-parameter) Tuning <hyperparameter_tune>
Neural Architecture Search <nas>
Model Compression <model_compression>
Feature Engineering <feature_engineering>
13 changes: 4 additions & 9 deletions docs/en_US/model_compression.rst
@@ -13,14 +13,9 @@ On the other hand, users could easily customize their new compression algorithms
For details, please refer to the following tutorials:

.. toctree::
:maxdepth: 2

Overview <Compressor/Overview>
Level Pruner <Compressor/Pruner>
AGP Pruner <Compressor/Pruner>
L1Filter Pruner <Compressor/l1filterpruner>
Slim Pruner <Compressor/SlimPruner>
Lottery Ticket Pruner <Compressor/LotteryTicketHypothesis>
FPGM Pruner <Compressor/Pruner>
Naive Quantizer <Compressor/Quantizer>
QAT Quantizer <Compressor/Quantizer>
DoReFa Quantizer <Compressor/Quantizer>
Pruners <pruners>
Quantizers <quantizers>
Automatic Model Compression <Compressor/AutoCompression>
21 changes: 11 additions & 10 deletions docs/en_US/nas.rst
@@ -1,26 +1,27 @@
##############
NAS Algorithms
##############
##########################
Neural Architecture Search
##########################

Automatic neural architecture search is taking an increasingly important role on finding better models.
Recent research works have proved the feasibility of automatic NAS, and also found some models that could beat manually designed and tuned models.
Some of representative works are NASNet, ENAS, DARTS, Network Morphism, and Evolution. There are new innovations keeping emerging.
Recent research works have proved the feasibility of automatic NAS and have found models that can beat manually tuned ones.
Some representative works are NASNet, ENAS, DARTS, Network Morphism, and Evolution. Moreover, new innovations keep emerging.

However, it takes great efforts to implement NAS algorithms, and it is hard to reuse code base of existing algorithms in new one.
However, it takes great effort to implement NAS algorithms, and it is hard to reuse the code base of an existing algorithm in a new one.
To facilitate NAS innovations (e.g., design and implement new NAS models, compare different NAS models side-by-side),
an easy-to-use and flexible programming interface is crucial.

With this motivation, our ambition is to provide a unified architecture in NNI,
Therefore, we provide a unified interface for NAS,
to accelerate innovation on NAS and apply state-of-the-art algorithms to real-world problems faster.

For details, please refer to the following tutorials:

.. toctree::
:maxdepth: 2

Overview <NAS/Overview>
Guide <NAS/NasGuide>
API Reference <NAS/NasReference>
Tutorial <NAS/NasGuide>
ENAS <NAS/ENAS>
DARTS <NAS/DARTS>
P-DARTS <NAS/PDARTS>
SPOS <NAS/SPOS>
CDARTS <NAS/CDARTS>
API Reference <NAS/NasReference>
16 changes: 16 additions & 0 deletions docs/en_US/pruners.rst
@@ -0,0 +1,16 @@
############################
Supported Pruning Algorithms
############################

.. toctree::
:maxdepth: 1

Level Pruner <Compressor/Pruner>
AGP Pruner <Compressor/Pruner>
Lottery Ticket Pruner <Compressor/LotteryTicketHypothesis>
FPGM Pruner <Compressor/Pruner>
L1Filter Pruner <Compressor/l1filterpruner>
L2Filter Pruner <Compressor/Pruner>
ActivationAPoZRankFilterPruner <Compressor/Pruner>
ActivationMeanRankFilterPruner <Compressor/Pruner>
Slim Pruner <Compressor/SlimPruner>
11 changes: 11 additions & 0 deletions docs/en_US/quantizers.rst
@@ -0,0 +1,11 @@
#################################
Supported Quantization Algorithms
#################################

.. toctree::
:maxdepth: 1

Naive Quantizer <Compressor/Quantizer>
QAT Quantizer <Compressor/Quantizer>
DoReFa Quantizer <Compressor/Quantizer>
BNN Quantizer <Compressor/Quantizer>
74 changes: 6 additions & 68 deletions docs/en_US/sdk_reference.rst
@@ -1,72 +1,10 @@
###########################
####################
Python API Reference
###########################
####################

Trial
------------------------
.. autofunction:: nni.get_next_parameter
.. autofunction:: nni.get_current_parameter
.. autofunction:: nni.report_intermediate_result
.. autofunction:: nni.report_final_result
.. autofunction:: nni.get_experiment_id
.. autofunction:: nni.get_trial_id
.. autofunction:: nni.get_sequence_id

.. toctree::
:maxdepth: 1

Tuner
------------------------
.. autoclass:: nni.tuner.Tuner
:members:

.. autoclass:: nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
:members:

.. autoclass:: nni.evolution_tuner.evolution_tuner.EvolutionTuner
:members:

.. autoclass:: nni.smac_tuner.SMACTuner
:members:

.. autoclass:: nni.gridsearch_tuner.GridSearchTuner
:members:

.. autoclass:: nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner
:members:

.. autoclass:: nni.metis_tuner.metis_tuner.MetisTuner
:members:

.. autoclass:: nni.ppo_tuner.PPOTuner
:members:

.. autoclass:: nni.batch_tuner.batch_tuner.BatchTuner
:members:

.. autoclass:: nni.gp_tuner.gp_tuner.GPTuner
:members:

Assessor
------------------------
.. autoclass:: nni.assessor.Assessor
:members:

.. autoclass:: nni.assessor.AssessResult
:members:

.. autoclass:: nni.curvefitting_assessor.CurvefittingAssessor
:members:

.. autoclass:: nni.medianstop_assessor.MedianstopAssessor
:members:


Advisor
------------------------
.. autoclass:: nni.msg_dispatcher_base.MsgDispatcherBase
:members:

.. autoclass:: nni.hyperband_advisor.hyperband_advisor.Hyperband
:members:

.. autoclass:: nni.bohb_advisor.bohb_advisor.BOHB
:members:
Auto Tune <autotune_ref>
NAS <NAS/NasReference>