adbrebs/rnn_reader

Implementation of "Teaching Machines to Read and Comprehend" in Theano/Lasagne

Fast attentive reader from "Teaching Machines to Read and Comprehend"

Lasagne/Theano implementation of the attentive reader from the following Google DeepMind paper.

Teaching Machines to Read and Comprehend,
Karl Moritz Hermann, Tomáš Kočiský, Edward Grefenstette, Lasse Espeholt, Will Kay, Mustafa Suleyman, Phil Blunsom,
NIPS 2015

Our attentive reader architecture is fast to train: it reaches 62.1% accuracy in only 4-5 hours (green curve below). Learning starts straight away, and the plateau phase is very short.

[Figure: validation accuracy as a function of training time; our model is the green curve.]

Instructions

  1. Clone the repository

  2. Follow the instructions at https://github.com/thomasmesnard/DeepMind-Teaching-Machines-to-Read-and-Comprehend to download and preprocess the data.

  3. Create a $DATA_PATH environment variable pointing to the dataset folder. More precisely, the dataset folder should have the following structure: $DATA_PATH > deepmind-qa > cnn > questions and stats folders (a quick sanity check is sketched just after this list).

  4. Go to the cloned repository and run `python main.py -s config/big/attention_softmax.py`.
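To verify step 3, a minimal sketch like the following can help. It is not part of the repository; it only restates the folder layout described above:

```python
import os

# Illustrative sanity check (not part of the repo): confirm that
# $DATA_PATH points at the deepmind-qa > cnn layout from step 3.
data_path = os.environ['DATA_PATH']
for sub in ('questions', 'stats'):
    folder = os.path.join(data_path, 'deepmind-qa', 'cnn', sub)
    if not os.path.isdir(folder):
        raise RuntimeError('Missing expected folder: %s' % folder)
print('Dataset layout looks correct.')
```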

It should take about 4-5 hours to reach a validation performance of 62% on a Titan X.

Differences between DeepMind's model and ours

While the attentive reader mechanism is the same, there are several architectural differences compared to DeepMind's model. In particular:

  • we use GRUs instead of LSTMs,
  • we don't use dropout,
  • we use Adam as the gradient-descent rule,
  • we use a single small layer of GRUs (256 units),
  • no bidirectional layer,
  • no skip connections,
  • no weight noise,
  • the vocabulary is limited to 30K words.

The config file of our architecture: https://github.com/adbrebs/rnn_reader/blob/master/config/big/attention_softmax.py
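To make the list above concrete, here is a minimal Lasagne sketch of such a model: one unidirectional 256-unit GRU layer per sequence, softmax attention of the question over the document, and Adam updates. It is an illustration written for this README, not the repository's actual model definition (which lives in the config file above). Names such as VOCAB_SIZE, EMB_DIM and W_out are assumptions, and the attention score here is a plain dot product rather than the paper's learned tanh scoring.

```python
import numpy as np
import theano
import theano.tensor as T
import lasagne
from lasagne.layers import InputLayer, EmbeddingLayer, GRULayer, get_output

VOCAB_SIZE = 30000  # vocabulary limited to 30K words
EMB_DIM = 256       # assumed embedding size
N_UNITS = 256       # a single small layer of GRUs

# Token ids for the document and the question, shape (batch, time).
doc_var, q_var, y_var = T.imatrix('doc'), T.imatrix('question'), T.ivector('answer')
l_doc = EmbeddingLayer(InputLayer((None, None), doc_var), VOCAB_SIZE, EMB_DIM)
l_q = EmbeddingLayer(InputLayer((None, None), q_var), VOCAB_SIZE, EMB_DIM)

# Unidirectional GRUs (no bidirectional layer, no skip connections):
# all hidden states for the document, only the final state for the question.
l_doc_gru = GRULayer(l_doc, N_UNITS)                      # (batch, time, units)
l_q_gru = GRULayer(l_q, N_UNITS, only_return_final=True)  # (batch, units)
doc_states, q_state = get_output(l_doc_gru), get_output(l_q_gru)

# Softmax attention of the question state over the document states.
scores = T.batched_dot(doc_states, q_state)             # (batch, time)
alpha = T.nnet.softmax(scores)                          # attention weights
context = (alpha[:, :, None] * doc_states).sum(axis=1)  # (batch, units)

# Illustrative output layer: predict the answer token from the attended
# context combined with the question state.
W_out = theano.shared(
    np.random.normal(scale=0.01, size=(N_UNITS, VOCAB_SIZE))
    .astype(theano.config.floatX))
probs = T.nnet.softmax(T.dot(T.tanh(context + q_state), W_out))
loss = T.nnet.categorical_crossentropy(probs, y_var).mean()

# Adam as the gradient-descent rule.
params = lasagne.layers.get_all_params([l_doc_gru, l_q_gru], trainable=True)
updates = lasagne.updates.adam(loss, params + [W_out])
train_fn = theano.function([doc_var, q_var, y_var], loss, updates=updates)
```

Swapping GRULayer for LSTMLayer and adding dropout would move this sketch back toward DeepMind's original setup.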

Requirements

  • Theano
  • Lasagne
  • Fuel (for the data pipeline)
  • Raccoon (to monitor training). Note: you need an earlier version: git checkout 5174d65e69f7cf7a7b8fd26db6b6eab9a48d0339

Credit

Our code relies on the Fuel data extensions developed by Thomas Mesnard, Alex Auvolat and Étienne Simon: https://github.com/thomasmesnard/DeepMind-Teaching-Machines-to-Read-and-Comprehend
