
Commit

Typos on paragraph #28. (#184)
* Typos on paragraph #28.

Typo on paragraph #34

* Update README.md
kant authored and xuehui1991 committed Oct 10, 2018
1 parent 52b4478 commit 115028f
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions src/sdk/pynni/nni/README.md
@@ -25,16 +25,16 @@ Comparing with other algorithm, TPE could be achieve better result when the numb

**Random Search**

- In [Random Search for Hyper-Parameter Optimization][2] show that Random Search might be surprsingly simple and effective. We suggests that we could use Random Search as basline when we have no knowledge about the prior distribution of hyper-parameters.
+ In [Random Search for Hyper-Parameter Optimization][2] show that Random Search might be surprisingly simple and effective. We suggests that we could use Random Search as baseline when we have no knowledge about the prior distribution of hyper-parameters.

**Anneal**

**Naive Evolution**

- Naive Evolution comes from [Large-Scale Evolution of Image Classifiers][3]. Naive Evolution requir more experiments to works, but it's very simple and easily to expand new features. There are some tips for user:
+ Naive Evolution comes from [Large-Scale Evolution of Image Classifiers][3]. Naive Evolution require more experiments to works, but it's very simple and easily to expand new features. There are some tips for user:

1) large initial population could avoid to fall into local optimum
- 2) use some strategies to keep the deversity of population could be better.
+ 2) use some strategies to keep the diversity of population could be better.

**SMAC**

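The Random Search paragraph touched by this diff treats random sampling as a baseline tuner when nothing is known about the prior distribution of hyper-parameters. As a minimal sketch of that idea only — not part of this commit and not the NNI SDK's Random tuner; the search space and `objective` below are made-up placeholders — a random-search baseline can be as small as sampling a hand-written space and keeping the best trial:

```python
import random

# Hypothetical search space: each entry maps a hyper-parameter name to a sampler.
SEARCH_SPACE = {
    "learning_rate": lambda: 10 ** random.uniform(-5, -1),   # log-uniform
    "batch_size": lambda: random.choice([16, 32, 64, 128]),
    "dropout": lambda: random.uniform(0.0, 0.5),
}

def objective(params):
    """Placeholder for training + evaluating a model; returns a score to maximize."""
    return -(params["learning_rate"] - 0.01) ** 2 - params["dropout"] * 0.1

def random_search(n_trials=50):
    # Sample independently on every trial and remember the best-scoring configuration.
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: sample() for name, sample in SEARCH_SPACE.items()}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

if __name__ == "__main__":
    params, score = random_search()
    print("best params:", params, "score:", score)
```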
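The Naive Evolution tips in the diff (a large initial population, and strategies that keep the population diverse) can likewise be illustrated with a toy evolutionary loop. This is a sketch under assumed placeholder functions (`fitness`, `mutate`, `random_individual`), not the SDK's actual NaiveEvolution tuner:

```python
import random

def fitness(individual):
    # Toy objective, maximized near learning_rate ~= 0.01 and dropout ~= 0.2.
    return -(individual["learning_rate"] - 0.01) ** 2 - (individual["dropout"] - 0.2) ** 2

def random_individual():
    return {"learning_rate": 10 ** random.uniform(-5, -1),
            "dropout": random.uniform(0.0, 0.5)}

def mutate(individual):
    # Perturb one hyper-parameter of a copy of the parent.
    child = dict(individual)
    key = random.choice(list(child))
    child[key] *= random.uniform(0.5, 1.5)
    return child

def evolve(population_size=100, generations=20, survivors=20):
    # Tip 1: a large initial population covers more of the search space,
    # reducing the chance of getting stuck in a local optimum.
    population = [random_individual() for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:survivors]
        # Tip 2: preserve diversity -- children are mutated from many parents,
        # and a few fresh random individuals join each generation, rather than
        # cloning only the single best individual.
        children = [mutate(random.choice(parents))
                    for _ in range(population_size - survivors - 5)]
        newcomers = [random_individual() for _ in range(5)]
        population = parents + children + newcomers
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best individual:", best, "fitness:", fitness(best))
```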
