
kimhc6028/forward-thinking-pytorch


forward-thinking-pytorch

PyTorch implementation of "Forward Thinking: Building and Training Neural Networks One Layer at a Time".
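The layer-wise scheme can be sketched roughly as follows: each new layer is trained together with a temporary classifier head on top of the frozen, already-trained layers, so gradients never have to flow through the whole stack. This is a minimal illustration under assumed shapes and hyperparameters, not the repository's actual code (which trains on MNIST):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data standing in for a real dataset (hypothetical shapes)
X = torch.randn(64, 20)
y = torch.randint(0, 3, (64,))

hidden_sizes = [16, 16, 16]
frozen_layers = []  # layers already trained; their weights stay fixed
in_dim = 20

for h in hidden_sizes:
    # New layer plus a temporary classifier head (head is discarded
    # when the next layer is added)
    new_layer = nn.Sequential(nn.Linear(in_dim, h), nn.ReLU())
    head = nn.Linear(h, 3)
    opt = torch.optim.SGD(
        list(new_layer.parameters()) + list(head.parameters()), lr=0.1
    )
    for _ in range(50):
        with torch.no_grad():  # frozen layers only produce features
            feats = X
            for layer in frozen_layers:
                feats = layer(feats)
        loss = nn.functional.cross_entropy(head(new_layer(feats)), y)
        opt.zero_grad()
        loss.backward()  # gradient only reaches the newest layer + head
        opt.step()
    for p in new_layer.parameters():
        p.requires_grad_(False)  # freeze before growing the network
    frozen_layers.append(new_layer)
    in_dim = h

# Final network: all frozen layers plus the last trained head
model = nn.Sequential(*frozen_layers, head)
```

Because each stage only backpropagates through one trainable layer, depth does not degrade the gradient signal, which is the behavior compared against plain backpropagation below.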

Requirements

Usage

Run

  $ ./run.sh

to train with forward-thinking, forward-thinking (deep), backpropagation, and backpropagation (deep).

To run the forward-thinking experiment (5 layers):

$ python forward_thinking.py

To do the same with a very deep model (20 layers):

$ python forward_thinking.py --epoch 200 --deep

For comparison, train with backpropagation:

$ python backpropagate.py

To backpropagate with a very deep model (20 layers):

$ python backpropagate.py --epoch 200 --deep

Result

With 5 layers, forward-thinking learns slightly faster than backpropagation. Brief dips in accuracy are observed each time a new layer is added.

When the model becomes very deep (20 layers), backpropagation fails to train it, while forward-thinking still trains the model as usual.
