A Markov chain model that generates text and tweets it. In this example we used the scripts of ~600 episodes of The Simpsons to build the Homer Simpson text model.
In the notebook `explore_dataset_and_create_bot.ipynb` you will find the code snippets to build a Markov chain generator model trained on a dataset. For this particular model we chose to train it on Homer Simpson's lines.
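The general idea is illustrated by the minimal sketch below. It assumes markovify and a newline-separated text file of Homer's lines; the file names (`homer_lines.txt`, `homer_model.json`) are placeholders for illustration, not necessarily the ones used in the notebook.

```python
import markovify

# Assumed input: a plain-text file with one Homer Simpson line per row (placeholder path).
with open("homer_lines.txt", encoding="utf-8") as f:
    corpus = f.read()

# Build a Markov chain model over the corpus (state_size=2 is markovify's default).
model = markovify.NewlineText(corpus, state_size=2)

# Print a few tweet-sized example sentences (make_short_sentence may return None).
for _ in range(3):
    print(model.make_short_sentence(280))

# Persist the model so the bot script can reload it later (placeholder filename).
with open("homer_model.json", "w", encoding="utf-8") as f:
    f.write(model.to_json())
```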
See `requirements.txt` for the required packages.
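For example, assuming a standard Python/pip environment, the dependencies can be installed with:

```
pip install -r requirements.txt
```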
- Follow the steps in `explore_dataset_and_create_bot.ipynb`.
- Set up `keys.py` (see below).
- Run `python bot.py model_name`, using the same `model_name` that you defined in the notebook (a minimal sketch of what `bot.py` does is included further down).
Generate a file `keys.py` that contains

```python
API_KEY = 'FILL_API_KEY'
API_SECRET = 'FILL_API_SECRET'
ACCES_TOKEN = 'FILL_ACCES_TOKEN'
ACCES_TOKEN_SECRET = 'FILL_ACCES_TOKEN_SECRET'
```

and save it in the root of the repository (`./`).
You can get these keys by following [this tutorial](https://www.mattcrampton.com/blog/step_by_step_tutorial_to_post_to_twitter_using_python/).
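For reference, here is a minimal sketch of what `bot.py` could look like. It assumes the notebook saved the model as `<model_name>.json` via markovify's `to_json()` and that tweets are posted with tweepy (the library used in the linked tutorial); the actual script may differ.

```python
import sys

import markovify
import tweepy

from keys import API_KEY, API_SECRET, ACCES_TOKEN, ACCES_TOKEN_SECRET


def load_model(model_name):
    # Assumption: the notebook saved the model as <model_name>.json with model.to_json().
    with open(f"{model_name}.json", encoding="utf-8") as f:
        return markovify.NewlineText.from_json(f.read())


def main():
    model_name = sys.argv[1]  # same model_name defined in the notebook
    model = load_model(model_name)

    # make_short_sentence may return None, so retry a bounded number of times.
    tweet = None
    for _ in range(100):
        tweet = model.make_short_sentence(280)
        if tweet:
            break

    # Authenticate against the Twitter API with the credentials from keys.py and post.
    auth = tweepy.OAuthHandler(API_KEY, API_SECRET)
    auth.set_access_token(ACCES_TOKEN, ACCES_TOKEN_SECRET)
    api = tweepy.API(auth)
    api.update_status(tweet)


if __name__ == "__main__":
    main()
```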
- [markovify](https://github.com/jsvine/markovify) - A simple, extensible Markov chain generator.
- [Ivan Lengyel](http://ivanlen.github.io/)
See also the list of contributors who participated in this project.
This project is licensed under the MIT License - see the LICENSE.md file for details