merge master #261

Merged (25 commits) on Jul 10, 2020

Commits
a767b79
Fix win32 build (#2623)
chicm-ms Jul 1, 2020
f761b91
add execute permission to run.sh for opevo (#2639)
XiaotianGao Jul 6, 2020
66fec2c
Bugbash for speedup module. (#2634)
zheng-ningxin Jul 6, 2020
9c4a1c7
fix docker in aml (#2633)
SparkSnail Jul 6, 2020
00ddf3a
[bug bash] Rm default key from table column(because had "Default" col…
Lijiaoa Jul 7, 2020
c9f70a6
Fix customized tuner doc (#2628)
chicm-ms Jul 7, 2020
3cffe34
remove gupNum in accessor (#2641)
SparkSnail Jul 7, 2020
c525566
FLOPs/Params counter refinement (#2632)
colorjam Jul 7, 2020
4a76b68
refine opevo doc (#2646)
XiaotianGao Jul 7, 2020
5cbe425
Fix aml message index (#2648)
SparkSnail Jul 7, 2020
0b78b8c
[bug bash] fix a small bug. (#2649)
zheng-ningxin Jul 7, 2020
fe6111d
Eject docker usages in NAS benchmarks (#2645)
ultmaster Jul 7, 2020
e4a6427
fix empty nniManagerIp doesn't work (#2650)
squirrelsc Jul 7, 2020
a77f52e
Update search space documentation (#2637)
tomzx Jul 8, 2020
c4fd715
Update docs for aml and nasbenchmark, re-submit from #2647 (#2651)
JunweiSUN Jul 8, 2020
688feed
Typo fixes in the documentation (#2638)
tomzx Jul 8, 2020
7922875
specific branch version of `git clone` (#2659)
JunweiSUN Jul 8, 2020
47b75d8
Add nniignore example link in training service docs (#2660)
ultmaster Jul 8, 2020
954800e
V1.7 release note (#2662)
chicm-ms Jul 8, 2020
a38df50
Add torch requirement to nasbench201 (#2663)
JunweiSUN Jul 8, 2020
51aebf1
Fix aml doc (#2631)
SparkSnail Jul 8, 2020
755e313
Merge pull request #2664 from microsoft/v1.7
chicm-ms Jul 8, 2020
f80f30d
update version number (#2665)
chicm-ms Jul 8, 2020
44fdc01
Add crowdin integration action (#2654)
squirrelsc Jul 9, 2020
810efed
Update main.yml (#2670)
squirrelsc Jul 9, 2020
83 changes: 83 additions & 0 deletions .github/workflows/main.yml
@@ -0,0 +1,83 @@
# This is a basic workflow to help you get started with Actions

name: CI

# Controls when the action will run. Triggers the workflow on pushes to the master
# branch that touch documentation files, and on a 30-minute schedule
on:
  push:
    branches: [ master ]
    paths:
      - '**.md'
      - '**.rst'
      - '/docs/en_US/**'
      - '.github/workflows/**'
      - '/crowdin.yml'
  schedule:
    - cron: '*/30 * * * *'


# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "translation-syncup"
  translation-syncup:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2

      # crowdin-translation
      - name: crowdin-action
        uses: crowdin/github-action@1.0.8
        with:
          # Upload sources to Crowdin
          upload_sources: true # optional, default is true
          # Upload translations to Crowdin
          upload_translations: false # optional, default is false
          # Automatically approve uploaded translations
          auto_approve_imported: false # optional, default is false
          # Defines whether to add a translation if it is equal to the source string in the Crowdin project
          import_eq_suggestions: false # optional, default is false
          # Make a pull request of Crowdin translations
          download_translations: true # optional, default is false
          # Use this option to download translations for a single specified language
          # download_language: # optional
          # Skip untranslated strings in exported files (does not work with .docx, .html, .md and other document files)
          # skip_untranslated_strings: # optional, default is false
          # Omit not fully translated files when downloading
          # skip_untranslated_files: # optional, default is false
          # Include only approved translations in exported files. If not combined with the --skip-untranslated-strings option, strings without approval are filled with the source language
          export_only_approved: false # optional, default is false
          # Download translations by pushing them to a branch
          push_translations: true # optional, default is true
          # Download translations to the specified version branch
          localization_branch_name: l10n_${{ github.base_ref }} # optional, default is l10n_crowdin_action
          # Create a pull request after pushing to the branch
          create_pull_request: false # optional, default is true
          # Option to upload or download files to the specified version branch in your Crowdin project
          crowdin_branch_name: ${{ github.base_ref }} # optional
          # Option to specify a path to user-specific credentials
          # identity: # optional
          # Option to specify a path to the configuration file
          config: crowdin.yml # optional
          # Option to preview the list of managed files
          # dryrun_action: true # optional, default is false
          # Numerical ID of the project
          project_id: 304950 # optional
          # Personal access token required for authentication
          token: ${{ secrets.CROWDIN_TOKEN }} # optional
          # Base URL of the Crowdin server for API requests
          # base_url: # optional
          # Path to your project directory on a local machine
          # base_path: # optional
          # Path to the source files
          # source: # optional
          # Path to the translation files
          # translation: # optional
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          CROWDIN_PROJECT_ID: 304950
          CROWDIN_PERSONAL_TOKEN: ${{ secrets.CROWDIN_TOKEN }}
7 changes: 4 additions & 3 deletions README.md
@@ -16,7 +16,7 @@

**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit to help users **automate** <a href="docs/en_US/FeatureEngineering/Overview.md">Feature Engineering</a>, <a href="docs/en_US/NAS/Overview.md">Neural Architecture Search</a>, <a href="docs/en_US/Tuner/BuiltinTuner.md">Hyperparameter Tuning</a> and <a href="docs/en_US/Compressor/Overview.md">Model Compression</a>.

The tool manages automated machine learning (AutoML) experiments, **dispatches and runs** experiments' trial jobs generated by tuning algorithms to search the best neural architecture and/or hyper-parameters in **different training environments** like <a href="docs/en_US/TrainingService/LocalMode.md">Local Machine</a>, <a href="docs/en_US/TrainingService/RemoteMachineMode.md">Remote Servers</a>, <a href="docs/en_US/TrainingService/PaiMode.md">OpenPAI</a>, <a href="docs/en_US/TrainingService/KubeflowMode.md">Kubeflow</a>, <a href="docs/en_US/TrainingService/FrameworkControllerMode.md">FrameworkController on K8S (AKS etc.)</a>, <a href="docs/en_US/TrainingService/DLTSMode.md">DLWorkspace (aka. DLTS)</a> and other cloud options.
The tool manages automated machine learning (AutoML) experiments, **dispatches and runs** experiments' trial jobs generated by tuning algorithms to search the best neural architecture and/or hyper-parameters in **different training environments** like <a href="docs/en_US/TrainingService/LocalMode.md">Local Machine</a>, <a href="docs/en_US/TrainingService/RemoteMachineMode.md">Remote Servers</a>, <a href="docs/en_US/TrainingService/PaiMode.md">OpenPAI</a>, <a href="docs/en_US/TrainingService/KubeflowMode.md">Kubeflow</a>, <a href="docs/en_US/TrainingService/FrameworkControllerMode.md">FrameworkController on K8S (AKS etc.)</a>, <a href="docs/en_US/TrainingService/DLTSMode.md">DLWorkspace (aka. DLTS)</a>, <a href="docs/en_US/TrainingService/AMLMode.md">AML (Azure Machine Learning)</a> and other cloud options.

## **Who should consider using NNI**

@@ -25,7 +25,7 @@ The tool manages automated machine learning (AutoML) experiments, **dispatches a
* Researchers and data scientists who want to easily **implement and experiment with new AutoML algorithms**, be it a hyperparameter tuning algorithm, a neural architecture search algorithm, or a model compression algorithm.
* ML Platform owners who want to **support AutoML in their platform**.

### **[NNI v1.6 has been released!](https://github.com/microsoft/nni/releases) &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**
### **[NNI v1.7 has been released!](https://github.com/microsoft/nni/releases) &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**

## **NNI capabilities in a glance**

@@ -170,6 +170,7 @@ Within the following table, we summarized the current NNI capabilities, we are g
<ul>
<li><a href="docs/en_US/TrainingService/LocalMode.md">Local Machine</a></li>
<li><a href="docs/en_US/TrainingService/RemoteMachineMode.md">Remote Servers</a></li>
<li><a href="docs/en_US/TrainingService/AMLMode.md">AML(Azure Machine Learning)</a></li>
<li><b>Kubernetes based services</b></li>
<ul><li><a href="docs/en_US/TrainingService/PaiMode.md">OpenPAI</a></li>
<li><a href="docs/en_US/TrainingService/KubeflowMode.md">Kubeflow</a></li>
@@ -245,7 +246,7 @@ The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is
* Download the examples by cloning the source code.

```bash
git clone -b v1.6 https://github.com/Microsoft/nni.git
git clone -b v1.7 https://github.com/Microsoft/nni.git
```

* Run the MNIST example.
3 changes: 2 additions & 1 deletion azure-pipelines.yml
@@ -30,7 +30,7 @@ jobs:
python3 -m pip install torch==1.5.0+cpu torchvision==0.6.0+cpu -f https://download.pytorch.org/whl/torch_stable.html --user
python3 -m pip install tensorflow==2.2.0 --user
python3 -m pip install keras==2.4.2 --user
python3 -m pip install gym onnx peewee --user
python3 -m pip install gym onnx peewee thop --user
python3 -m pip install sphinx==1.8.3 sphinx-argparse==0.2.5 sphinx-markdown-tables==0.0.9 sphinx-rtd-theme==0.4.2 sphinxcontrib-websupport==1.1.0 recommonmark==0.5.0 nbsphinx --user
sudo apt-get install swig -y
nnictl package install --name=SMAC
@@ -59,6 +59,7 @@ jobs:
python3 -m pip install --upgrade pip setuptools --user
python3 -m pip install pylint==2.3.1 astroid==2.2.5 --user
python3 -m pip install coverage --user
python3 -m pip install thop --user
echo "##vso[task.setvariable variable=PATH]${HOME}/.local/bin:${PATH}"
displayName: 'Install python tools'
- script: |
15 changes: 9 additions & 6 deletions crowdin.yml
@@ -1,12 +1,15 @@
project_id_env: CROWDIN_PROJECT_ID
api_token_env: CROWDIN_PERSONAL_TOKEN
preserve_hierarchy: true
files:
- source: /docs/en_US/**/*
ignore:
- /docs/zh_CN/**/*
translation: /docs/%locale_with_underscore%/**/%original_file_name%
- source: '/**/*.[mM][dD]'
ignore:
- '*_%locale_with_underscore%.md'
- '/**/*_%locale_with_underscore%.md'
- /docs
- /%locale_with_underscore%
- '**/ISSUE_TEMPLATE/**'
- /.github
translation: /%original_path%/%file_name%_%locale_with_underscore%.md
- source: /docs/en_US/**/*
ignore:
- /docs/%locale_with_underscore%/**/*.*
translation: /docs/%locale_with_underscore%/**/%original_file_name%
12 changes: 12 additions & 0 deletions deployment/docker/Dockerfile
@@ -29,6 +29,11 @@ RUN DEBIAN_FRONTEND=noninteractive && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

#
# make `python` available as a copy of python3
#
RUN cp /usr/bin/python3 /usr/bin/python

#
# update pip
#
@@ -69,6 +74,13 @@ RUN python3 -m pip --no-cache-dir install pandas==0.23.4 lightgbm==2.2.2
#
RUN python3 -m pip --no-cache-dir install nni

#
# install aml package
#
RUN python3 -m pip --no-cache-dir install azureml
RUN python3 -m pip --no-cache-dir install azureml-sdk


ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/root/.local/bin:/usr/bin:/bin:/sbin

WORKDIR /root
4 changes: 2 additions & 2 deletions docs/en_US/CommunitySharings/community_sharings.rst
@@ -9,8 +9,8 @@ In addition to the official tutorials and examples, we encourage community contri

NNI in Recommenders <RecommendersSvd>
Automatically tuning SPTAG with NNI <SptagAutoTune>
Neural Architecture Search Comparison <NasComparision>
Hyper-parameter Tuning Algorithm Comparsion <HpoComparision>
Neural Architecture Search Comparison <NasComparison>
Hyper-parameter Tuning Algorithm Comparison <HpoComparison>
Parallelizing Optimization for TPE <ParallelizingTpeSearch>
Automatically tune systems with NNI <TuningSystems>
NNI review article from Zhihu: - By Garvin Li <NNI_AutoFeatureEng>
3 changes: 2 additions & 1 deletion docs/en_US/Compressor/CompressionUtils.md
@@ -127,7 +127,8 @@ We provide a model counter for calculating the model FLOPs and parameters. This
```
from nni.compression.torch.utils.counter import count_flops_params

# Given input size (1, 1, 28, 28)
flops, params = count_flops_params(model, (1, 1, 28, 28))
# Format output size to M (i.e., 10^6)
print(f'FLOPs: {flops/1e6:.3f}M, Params: {params/1e6:.3f}M')
```
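
As a rough end-to-end sketch of the counter (the `torchvision` ResNet-18 and the import path are assumptions based on NNI v1.7; adapt them to your setup), the function can be applied to any `torch.nn.Module`:

```python
import torchvision.models as models
from nni.compression.torch.utils.counter import count_flops_params

# Any torch.nn.Module works; resnet18 is used here purely for illustration.
model = models.resnet18()

# The second argument is the dummy input shape: (batch, channels, height, width).
flops, params = count_flops_params(model, (1, 3, 224, 224))

# Report in millions, as in the documentation snippet above.
print(f'FLOPs: {flops / 1e6:.3f}M, Params: {params / 1e6:.3f}M')
```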
35 changes: 11 additions & 24 deletions docs/en_US/NAS/Benchmarks.md
@@ -7,40 +7,27 @@
Example Usages <BenchmarksExample>
```

## Introduction
To improve the reproducibility of NAS algorithms and reduce computing resource requirements, researchers have proposed a series of NAS benchmarks such as [NAS-Bench-101](https://arxiv.org/abs/1902.09635), [NAS-Bench-201](https://arxiv.org/abs/2001.00326), [NDS](https://arxiv.org/abs/1905.13214), etc. NNI provides a query interface for users to acquire these benchmarks. Within just a few lines of code, researchers are able to evaluate their NAS algorithms easily and fairly by utilizing these benchmarks.

## Prerequisites

* Please prepare a folder to hold all the benchmark databases. By default, it is `${HOME}/.nni/nasbenchmark`. You can place it anywhere you like and specify it via the `NASBENCHMARK_DIR` environment variable before importing NNI (see the sketch after this list).
* Please install `peewee` via `pip install peewee`, which NNI uses to connect to the databases.
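
A minimal sketch of pointing NNI at a custom benchmark folder (the path and the imported module below are illustrative assumptions):

```python
import os

# Hypothetical custom location for the generated benchmark databases.
os.environ['NASBENCHMARK_DIR'] = os.path.expanduser('~/data/nasbenchmark')

# The variable must be set before any nni.nas.benchmarks module is imported.
from nni.nas.benchmarks.nasbench101 import query_nb101_trial_stats
```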

## Data Preparation

To avoid storage and legal issues, we do not provide any prepared databases. We strongly recommend using docker to run the generation scripts, to ease the burden of installing multiple dependencies. Please follow the steps below.

**Step 1.** Clone the NNI repo. Replace `${NNI_VERSION}` with a released version name or branch name, e.g., `v1.6`.

```bash
git clone -b ${NNI_VERSION} https://github.com/microsoft/nni
```

**Step 2.** Run docker.

For NAS-Bench-101,

```bash
docker run -v ${HOME}/.nni/nasbenchmark:/outputs -v /path/to/your/nni:/nni tensorflow/tensorflow:1.15.2-py3 /bin/bash /nni/examples/nas/benchmarks/nasbench101.sh
```

For NAS-Bench-201,

```bash
docker run -v ${HOME}/.nni/nasbenchmark:/outputs -v /path/to/your/nni:/nni ufoym/deepo:pytorch-cpu /bin/bash /nni/examples/nas/benchmarks/nasbench201.sh
```

For NDS,

```bash
docker run -v ${HOME}/.nni/nasbenchmark:/outputs -v /path/to/your/nni:/nni python:3.7 /bin/bash /nni/examples/nas/benchmarks/nds.sh
```

To avoid storage and legality issues, we do not provide any prepared databases. Please follow the steps below.

1. Clone NNI to your machine and enter the `examples/nas/benchmarks` directory.

```bash
git clone -b ${NNI_VERSION} https://github.com/microsoft/nni
cd nni/examples/nas/benchmarks
```

Replace `${NNI_VERSION}` with a released version name or branch name, e.g., `v1.7`.

2. Install dependencies via `pip3 install -r xxx.requirements.txt`, where `xxx` can be `nasbench101`, `nasbench201` or `nds`.
3. Generate the database via `./xxx.sh`. The directory that stores the benchmark files can be configured with the `NASBENCHMARK_DIR` environment variable, which defaults to `~/.nni/nasbenchmark`. Note that the NAS-Bench-201 dataset will be downloaded from Google Drive.

Please make sure there is at least 10 GB of free disk space and note that the conversion process can take several hours to complete.
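
Once a database has been generated, querying it takes only a few lines. A minimal sketch for NAS-Bench-101 (the architecture dict is an illustrative example and the module path assumes NNI v1.7):

```python
import pprint
from nni.nas.benchmarks.nasbench101 import query_nb101_trial_stats

# Example NAS-Bench-101 cell: an operator per node plus each node's input edges.
arch = {
    'op1': 'conv3x3-bn-relu',
    'op2': 'maxpool3x3',
    'op3': 'conv3x3-bn-relu',
    'op4': 'conv3x3-bn-relu',
    'op5': 'conv1x1-bn-relu',
    'input1': [0], 'input2': [1], 'input3': [2],
    'input4': [0], 'input5': [0, 3, 4], 'input6': [2, 5],
}

# Iterate over all recorded runs of this architecture trained for 108 epochs.
for trial_stats in query_nb101_trial_stats(arch, 108):
    pprint.pprint(trial_stats)
```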

@@ -169,4 +156,4 @@ Here is a list of available operators used in NDS.
.. autoclass:: nni.nas.benchmarks.nds.NdsTrialStats

.. autoclass:: nni.nas.benchmarks.nds.NdsIntermediateStats
```
```
81 changes: 57 additions & 24 deletions docs/en_US/NAS/BenchmarksExample.ipynb
@@ -1,27 +1,4 @@
{
"nbformat": 4,
"nbformat_minor": 2,
"metadata": {
"language_info": {
"name": "python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"version": "3.6.10-final"
},
"orig_nbformat": 2,
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"npconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": 3,
"kernelspec": {
"name": "python361064bitnnilatestcondabff8d66a619a4d26af34fe0fe687c7b0",
"display_name": "Python 3.6.10 64-bit ('nnilatest': conda)"
}
},
"cells": [
{
"cell_type": "markdown",
@@ -53,6 +30,14 @@
"## NAS-Bench-101"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Use the following architecture as an example:<br>\n",
"![nas-101](../../img/nas-bench-101-example.png)"
]
},
{
"cell_type": "code",
"execution_count": 2,
@@ -82,13 +67,28 @@
" pprint.pprint(t)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"An architecture of NAS-Bench-101 could be trained more than once. Each element of the returned generator is a dict which contains one of the training results of this trial config (architecture + hyper-parameters) including train/valid/test accuracy, training time, number of epochs, etc. The results of NAS-Bench-201 and NDS follow similar formats."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## NAS-Bench-201"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Use the following architecture as an example:<br>\n",
"![nas-201](../../img/nas-bench-201-example.png)"
]
},
{
"cell_type": "code",
"execution_count": 3,
@@ -120,6 +120,16 @@
"## NDS"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Use the following architecture as an example:<br>\n",
"![nds](../../img/nas-bench-nds-example.png)\n",
"\n",
"Here, `bot_muls`, `ds`, `num_gs`, `ss` and `ws` stand for \"bottleneck multipliers\", \"depths\", \"number of groups\", \"strides\" and \"widths\" respectively."
]
},
{
"cell_type": "code",
"execution_count": 4,
@@ -273,5 +283,28 @@
"print('Elapsed time: ', time.time() - ti, 'seconds')"
]
}
]
],
"metadata": {
"language_info": {
"name": "python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"version": "3.6.10-final"
},
"orig_nbformat": 2,
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"npconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": 3,
"kernelspec": {
"name": "python361064bitnnilatestcondabff8d66a619a4d26af34fe0fe687c7b0",
"display_name": "Python 3.6.10 64-bit ('nnilatest': conda)"
}
},
"nbformat": 4,
"nbformat_minor": 2
}