Batch-Normalized LSTM (Recurrent Batch Normalization) implementation in Torch.

iassael/torch-bnlstm

Recurrent Batch Normalization

Batch-Normalized LSTMs

Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville

http://arxiv.org/abs/1603.09025

Usage

local rnn = LSTM(input_size, rnn_size, n, dropout, bn)

input_size = dimensionality of the input at each time step

rnn_size = number of hidden units per layer

n = number of layers (1 to N)

dropout = probability of dropping a neuron (0 to 1)

bn = enable batch normalization (true or false)
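A minimal sketch of constructing the module and running one forward step, assuming Torch with the nn and nngraph packages and that LSTM.lua from this repository is on the Lua path (the require path below is hypothetical; adjust it to your setup):

```lua
require 'nn'
require 'nngraph'
local LSTM = require 'LSTM'  -- hypothetical require path

local input_size, rnn_size = 64, 128
local n, dropout, bn = 2, 0.5, true

-- Build a 2-layer batch-normalized LSTM with 50% dropout.
local rnn = LSTM(input_size, rnn_size, n, dropout, bn)

-- One forward step: char-rnn-style LSTM modules take
-- {x, prev_c_1, prev_h_1, ..., prev_c_n, prev_h_n}
-- and return the next cell/hidden states for each layer.
local batch = 16
local inputs = {torch.randn(batch, input_size)}
for L = 1, n do
  table.insert(inputs, torch.zeros(batch, rnn_size))  -- prev_c for layer L
  table.insert(inputs, torch.zeros(batch, rnn_size))  -- prev_h for layer L
end
local outputs = rnn:forward(inputs)
```

As in char-rnn, the states returned in outputs are fed back in as the previous cell/hidden states on the next time step.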

Example

https://github.com/iassael/char-rnn

Performance

(Figure: validation scores on char-rnn with default options.)

Implemented in Torch by Yannis M. Assael (www.yannisassael.com)
