
doc: automatically tune SPTAG with NNI #1501

Merged · 1 commit · Sep 2, 2019
8 changes: 8 additions & 0 deletions docs/en_US/CommunitySharings/SptagAutoTune.md
@@ -0,0 +1,8 @@
# Automatically tuning SPTAG with NNI

[SPTAG](https://github.com/microsoft/SPTAG) (Space Partition Tree And Graph) is a library for large-scale approximate nearest neighbor search over vectors, released by [Microsoft Research (MSR)](https://www.msra.cn/) and [Microsoft Bing](https://www.bing.com/).

The library assumes that samples are represented as vectors and that vectors can be compared by L2 or cosine distance. For a query vector, it returns the vectors with the smallest L2 or cosine distance to the query.
SPTAG provides two methods: kd-tree with relative neighborhood graph (SPTAG-KDT) and balanced k-means tree with relative neighborhood graph (SPTAG-BKT). SPTAG-KDT has a lower index-building cost, while SPTAG-BKT achieves higher search accuracy on very high-dimensional data.
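To make the distance criterion concrete, here is a minimal brute-force sketch (assuming NumPy; this is not SPTAG's API) of what an exact search would return. SPTAG's tree and graph indexes exist to approximate this result without scanning every vector.

```python
import numpy as np

def l2_nearest(data, query, k=5):
    """Indices of the k vectors with the smallest L2 distance to the query."""
    dists = np.linalg.norm(data - query, axis=1)   # distance to every vector
    return np.argsort(dists)[:k]

def cosine_nearest(data, query, k=5):
    """Indices of the k vectors with the smallest cosine distance (largest similarity)."""
    data_n = data / np.linalg.norm(data, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    return np.argsort(-(data_n @ query_n))[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vectors = rng.standard_normal((10000, 128)).astype(np.float32)
    query = rng.standard_normal(128).astype(np.float32)
    print(l2_nearest(vectors, query))
    print(cosine_nearest(vectors, query))
```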

SPTAG exposes tens of parameters that can be tuned for a specific scenario or dataset, and NNI is a great tool for tuning them automatically. The SPTAG authors tried NNI for this auto-tuning and easily found well-performing parameters, so they shared their practice of tuning SPTAG with NNI in their documentation [here](https://github.com/microsoft/SPTAG/blob/master/docs/Parameters.md). Please refer to it for a detailed tutorial.
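As a rough illustration of how such tuning plugs into NNI (not the exact setup from the SPTAG document linked above), a trial script only needs to fetch a sampled parameter set, build and evaluate an index with it, and report the resulting metric. `build_index_and_measure_recall` below is a hypothetical placeholder for the user's own SPTAG build-and-evaluate code.

```python
import nni

def build_index_and_measure_recall(params):
    """Hypothetical placeholder: build an SPTAG index with the sampled
    parameters, run queries against ground truth, and return recall@k.
    A constant is returned here only so the sketch runs standalone."""
    return 0.0

if __name__ == "__main__":
    # One parameter set sampled by the NNI tuner from the experiment's search space.
    params = nni.get_next_parameter()
    recall = build_index_and_measure_recall(params)
    # The tuner uses this metric to propose the next parameter set.
    nni.report_final_result(recall)
```

The experiment is then launched with `nnictl create --config <config file>`, where the config file points to this trial script and the search space of parameters to explore.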
1 change: 1 addition & 0 deletions docs/en_US/CommunitySharings/community_sharings.rst
@@ -8,6 +8,7 @@ In addition to the official tutorials and examples, we encourage community contributions
:maxdepth: 2

NNI in Recommenders <RecommendersSvd>
Automatically tuning SPTAG with NNI <SptagAutoTune>
Neural Architecture Search Comparison <NasComparision>
Hyper-parameter Tuning Algorithm Comparison <HpoComparision>
Parallelizing Optimization for TPE <ParallelizingTpeSearch>