diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 3135238e..36df95eb 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -62,33 +62,33 @@ Ready to contribute? Here's how to set up `scrapy-redis` for local development.
 1. Fork the `scrapy-redis` repo on GitHub.
 2. Clone your fork locally::
 
-    $ git clone git@github.com:your_name_here/scrapy-redis.git
+    git clone git@github.com:your_name_here/scrapy-redis.git
 
 3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::
 
-    $ mkvirtualenv scrapy-redis
-    $ cd scrapy-redis/
-    $ python setup.py develop
+    mkvirtualenv scrapy-redis
+    cd scrapy-redis/
+    python setup.py develop
 
 4. Create a branch for local development::
 
-    $ git checkout -b name-of-your-bugfix-or-feature
+    git checkout -b name-of-your-bugfix-or-feature
 
    Now you can make your changes locally.
 
 5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::
 
-    $ flake8 scrapy_redis tests
-    $ pytest --ignore=setup.py
-    $ tox
+    flake8 scrapy_redis tests
+    pytest --ignore=setup.py
+    tox
 
    To get flake8 and tox, just pip install them into your virtualenv.
 
 6. Commit your changes and push your branch to GitHub::
 
-    $ git add .
-    $ git commit -m "Your detailed description of your changes."
-    $ git push origin name-of-your-bugfix-or-feature
+    git add .
+    git commit -m "Your detailed description of your changes."
+    git push origin name-of-your-bugfix-or-feature
 
 7. Submit a pull request through the GitHub website.
@@ -110,4 +110,4 @@ Tips
 
 To run a subset of tests::
 
-    $ pytest tests/test_scrapy_redis
+    pytest tests/test_scrapy_redis
diff --git a/README.rst b/README.rst
index 54435e0c..fd2cc8ed 100644
--- a/README.rst
+++ b/README.rst
@@ -77,9 +77,9 @@ Installation
 
 From `github`::
 
-    $ git clone https://github.com/darkrho/scrapy-redis.git
-    $ cd scrapy-redis
-    $ python setup.py install
+    git clone https://github.com/darkrho/scrapy-redis.git
+    cd scrapy-redis
+    python setup.py install
 
 .. note:: For using this json supported data feature, please make sure you have not installed the scrapy-redis through pip. If you already did it, you first uninstall that one.
 
 .. code::
@@ -190,28 +190,28 @@ across multiple spider instances, highly suitable for broad crawls.
 2. Run the crawler for first time then stop it::
 
-    $ cd example-project
-    $ scrapy crawl dmoz
-    ... [dmoz] ...
-    ^C
+    cd example-project
+    scrapy crawl dmoz
+    ... [dmoz] ...
+    ^C
 
 3. Run the crawler again to resume stopped crawling::
 
-    $ scrapy crawl dmoz
-    ... [dmoz] DEBUG: Resuming crawl (9019 requests scheduled)
+    scrapy crawl dmoz
+    ... [dmoz] DEBUG: Resuming crawl (9019 requests scheduled)
 
 4. Start one or more additional scrapy crawlers::
 
-    $ scrapy crawl dmoz
-    ... [dmoz] DEBUG: Resuming crawl (8712 requests scheduled)
+    scrapy crawl dmoz
+    ... [dmoz] DEBUG: Resuming crawl (8712 requests scheduled)
 
 5. Start one or more post-processing workers::
 
-    $ python process_items.py dmoz:items -v
-    ...
-    Processing: Kilani Giftware (http://www.dmoz.org/Computers/Shopping/Gifts/)
-    Processing: NinjaGizmos.com (http://www.dmoz.org/Computers/Shopping/Gifts/)
-    ...
+    python process_items.py dmoz:items -v
+    ...
+    Processing: Kilani Giftware (http://www.dmoz.org/Computers/Shopping/Gifts/)
+    Processing: NinjaGizmos.com (http://www.dmoz.org/Computers/Shopping/Gifts/)
+    ...
 
 
 Feeding a Spider from Redis
@@ -240,11 +240,11 @@ Then:
 
 1. run the spider::
 
-    scrapy runspider myspider.py
+    scrapy runspider myspider.py
 
 2. push json data to redis::
 
-    redis-cli lpush myspider '{"url": "https://exaple.com", "meta": {"job-id":"123xsd", "start-date":"dd/mm/yy"}, "url_cookie_key":"fertxsas" }'
+    redis-cli lpush myspider '{"url": "https://exaple.com", "meta": {"job-id":"123xsd", "start-date":"dd/mm/yy"}, "url_cookie_key":"fertxsas" }'
 
 .. note::
 
diff --git a/docs/installation.rst b/docs/installation.rst
index acb737f0..179e246a 100644
--- a/docs/installation.rst
+++ b/docs/installation.rst
@@ -12,7 +12,7 @@ To install Scrapy-Redis, run this command in your terminal:
 
 .. code-block:: console
 
-    $ pip install scrapy-redis
+    pip install scrapy-redis
 
 If you don't have `pip`_ installed, this `Python installation guide`_ can guide you through the process.
@@ -30,19 +30,19 @@ You can either clone the public repository:
 
 .. code-block:: console
 
-    $ git clone git://github.com/rolando/scrapy-redis
+    git clone git://github.com/rolando/scrapy-redis
 
 Or download the `tarball`_:
 
 .. code-block:: console
 
-    $ curl -OL https://github.com/rolando/scrapy-redis/tarball/master
+    curl -OL https://github.com/rolando/scrapy-redis/tarball/master
 
 Once you have a copy of the source, you can install it with:
 
 .. code-block:: console
 
-    $ pip install -e .
+    pip install -e .
 
 .. _Github repo: https://github.com/rolando/scrapy-redis