Chatbot-Reddit

A chatbot trained on a Reddit dump from 2005. This is a word-level model that predicts the next sentence from a question statement and context (comments and replies in this case). The model uses an encoder-decoder architecture with teacher forcing, as sketched below.
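
A minimal sketch of this kind of encoder-decoder model with teacher forcing in Keras (layer sizes, vocabulary size, and variable names are illustrative placeholders, not the values used in the notebook):

    from keras.models import Model
    from keras.layers import Input, LSTM, Dense, Embedding

    vocab_size = 10000   # placeholder vocabulary size
    embed_dim = 128      # placeholder embedding size
    latent_dim = 256     # placeholder LSTM size

    # Encoder: reads the question/context and keeps only its final states.
    enc_inputs = Input(shape=(None,))
    enc_emb = Embedding(vocab_size, embed_dim)(enc_inputs)
    _, state_h, state_c = LSTM(latent_dim, return_state=True)(enc_emb)

    # Decoder: during training it is fed the target reply shifted by one
    # token (teacher forcing) and is conditioned on the encoder states.
    dec_inputs = Input(shape=(None,))
    dec_emb = Embedding(vocab_size, embed_dim)(dec_inputs)
    dec_outputs, _, _ = LSTM(latent_dim, return_sequences=True,
                             return_state=True)(dec_emb,
                                                initial_state=[state_h, state_c])
    dec_outputs = Dense(vocab_size, activation="softmax")(dec_outputs)

    # Targets are the reply shifted one step ahead, one-hot encoded.
    model = Model([enc_inputs, dec_inputs], dec_outputs)
    model.compile(optimizer="rmsprop", loss="categorical_crossentropy")

During training the decoder receives the ground-truth previous word at each step (teacher forcing); at inference time its own predictions are fed back in instead.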

Dependencies

  • Keras
  • Jupyter Notebook with the IPython kernel

Usage

  1. Download and install Jupyter Notebook and the IPython kernel.
  2. Run a Jupyter environment locally with jupyter notebook in the terminal.
  3. Call load_model() to load the pretrained model in the /models dir, or train the model for at least 30 epochs.
  4. Change the temperature of the output (random sampling coefficient) in sample(); a higher temperature gives more randomness, while a lower temperature preserves local structure but increases redundancy (see the sketch after this list).
  5. Call make_inference() with question statements to generate outputs.
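
The temperature parameter can be implemented as in the standard Keras text-generation sampler; the sample() in this repository may differ in detail, so treat the following as a sketch:

    import numpy as np

    def sample(preds, temperature=1.0):
        # Rescale the predicted next-word distribution by the temperature,
        # renormalize, and draw a single word index from the result.
        preds = np.asarray(preds).astype("float64")
        preds = np.log(preds + 1e-8) / temperature
        exp_preds = np.exp(preds)
        preds = exp_preds / np.sum(exp_preds)
        return int(np.argmax(np.random.multinomial(1, preds, 1)))

Temperatures near 1.0 sample close to the model's raw distribution; values well below 1.0 let the most likely word dominate, which reduces randomness but tends to repeat frequent words.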

Example Output

Question: life
Reply: joke not did apparently best non wrong usually when call love you seriously 

References

TODO

  • [ ] Train on a dialog dataset for proper answer structure
  • [ ] Clean the dataset of stop words using the nltk library (see the sketch below)
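
The stop-word cleaning step could look roughly like this with nltk (a sketch; this preprocessing is not part of the repository yet):

    import nltk
    from nltk.corpus import stopwords

    nltk.download("stopwords")
    stop_words = set(stopwords.words("english"))

    def remove_stop_words(text):
        # Drop English stop words, keeping the remaining word order intact.
        return " ".join(w for w in text.split() if w.lower() not in stop_words)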

* The bot might say some offensive things (it is trained on Reddit data).
