[docs] Remove docs $ prefix (#229)
* remove docs $ prefix

* align code indent
LuckyPigeon authored Mar 28, 2022
1 parent 9a6d2da commit 5d7f1ee
Showing 3 changed files with 34 additions and 34 deletions.
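The change is mechanical: every indented console snippet loses its leading ``$ `` while the indentation itself is kept. A hypothetical sketch of that edit as a small script (the file name and sample content here are illustrative, not taken from the repository):

```python
# Hypothetical sketch of the mechanical edit this commit performs:
# strip a leading "$ " from indented console snippets in an .rst file.
import re
from pathlib import Path

sample = Path("sample.rst")
sample.write_text("Install it with::\n\n    $ pip install scrapy-redis\n")

# Drop "$ " only when it follows leading indentation, keeping the indent.
cleaned = re.sub(r"(?m)^(\s+)\$ ", r"\1", sample.read_text())
sample.write_text(cleaned)

print(cleaned)  # the snippet line is now "    pip install scrapy-redis"
```

Anchoring on the indentation avoids touching literal ``$`` characters elsewhere in the prose.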
24 changes: 12 additions & 12 deletions CONTRIBUTING.rst
@@ -62,33 +62,33 @@ Ready to contribute? Here's how to set up `scrapy-redis` for local development.
1. Fork the `scrapy-redis` repo on GitHub.
2. Clone your fork locally::

-    $ git clone git@github.com:your_name_here/scrapy-redis.git
+    git clone git@github.com:your_name_here/scrapy-redis.git

3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::

-    $ mkvirtualenv scrapy-redis
-    $ cd scrapy-redis/
-    $ python setup.py develop
+    mkvirtualenv scrapy-redis
+    cd scrapy-redis/
+    python setup.py develop

4. Create a branch for local development::

-    $ git checkout -b name-of-your-bugfix-or-feature
+    git checkout -b name-of-your-bugfix-or-feature

Now you can make your changes locally.

5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::

-    $ flake8 scrapy_redis tests
-    $ pytest --ignore=setup.py
-    $ tox
+    flake8 scrapy_redis tests
+    pytest --ignore=setup.py
+    tox

To get flake8 and tox, just pip install them into your virtualenv.

6. Commit your changes and push your branch to GitHub::

-    $ git add .
-    $ git commit -m "Your detailed description of your changes."
-    $ git push origin name-of-your-bugfix-or-feature
+    git add .
+    git commit -m "Your detailed description of your changes."
+    git push origin name-of-your-bugfix-or-feature

7. Submit a pull request through the GitHub website.

@@ -110,4 +110,4 @@ Tips

To run a subset of tests::

-    $ pytest tests/test_scrapy_redis
+    pytest tests/test_scrapy_redis
36 changes: 18 additions & 18 deletions README.rst
@@ -77,9 +77,9 @@ Installation

From `github`::

-    $ git clone https://github.com/darkrho/scrapy-redis.git
-    $ cd scrapy-redis
-    $ python setup.py install
+    git clone https://github.com/darkrho/scrapy-redis.git
+    cd scrapy-redis
+    python setup.py install

.. note:: To use the JSON-supported data feature, make sure scrapy-redis was not installed through pip. If it was, uninstall it first.
.. code::
@@ -190,28 +190,28 @@ across multiple spider instances, highly suitable for broad crawls.

2. Run the crawler for first time then stop it::

-    $ cd example-project
-    $ scrapy crawl dmoz
-    ... [dmoz] ...
-    ^C
+    cd example-project
+    scrapy crawl dmoz
+    ... [dmoz] ...
+    ^C

3. Run the crawler again to resume stopped crawling::

-    $ scrapy crawl dmoz
-    ... [dmoz] DEBUG: Resuming crawl (9019 requests scheduled)
+    scrapy crawl dmoz
+    ... [dmoz] DEBUG: Resuming crawl (9019 requests scheduled)

4. Start one or more additional scrapy crawlers::

-    $ scrapy crawl dmoz
-    ... [dmoz] DEBUG: Resuming crawl (8712 requests scheduled)
+    scrapy crawl dmoz
+    ... [dmoz] DEBUG: Resuming crawl (8712 requests scheduled)

5. Start one or more post-processing workers::

-    $ python process_items.py dmoz:items -v
-    ...
-    Processing: Kilani Giftware (http://www.dmoz.org/Computers/Shopping/Gifts/)
-    Processing: NinjaGizmos.com (http://www.dmoz.org/Computers/Shopping/Gifts/)
-    ...
+    python process_items.py dmoz:items -v
+    ...
+    Processing: Kilani Giftware (http://www.dmoz.org/Computers/Shopping/Gifts/)
+    Processing: NinjaGizmos.com (http://www.dmoz.org/Computers/Shopping/Gifts/)
+    ...


Feeding a Spider from Redis
@@ -240,11 +240,11 @@ Then:

1. run the spider::

-    scrapy runspider myspider.py
+    scrapy runspider myspider.py

2. push json data to redis::

-    redis-cli lpush myspider '{"url": "https://exaple.com", "meta": {"job-id":"123xsd", "start-date":"dd/mm/yy"}, "url_cookie_key":"fertxsas" }'
+    redis-cli lpush myspider '{"url": "https://exaple.com", "meta": {"job-id":"123xsd", "start-date":"dd/mm/yy"}, "url_cookie_key":"fertxsas" }'


.. note::
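The JSON payload pushed with ``redis-cli lpush`` above has to be decoded on the consumer side. A minimal, hypothetical sketch of that decoding step (pure ``json``, no Redis server required; the ``parse_item`` helper is illustrative, and in practice the bytes would come from a Redis list pop such as ``BLPOP``):

```python
import json

def parse_item(raw: bytes) -> dict:
    """Decode one serialized item, as a consumer of the Redis list might."""
    return json.loads(raw)

# Illustrative payload mirroring the redis-cli lpush example in the diff;
# in a real consumer these bytes would come from a Redis pop (e.g. BLPOP).
raw = b'{"url": "https://exaple.com", "meta": {"job-id": "123xsd", "start-date": "dd/mm/yy"}, "url_cookie_key": "fertxsas"}'
item = parse_item(raw)
print("Fetching:", item["url"], "with meta:", item["meta"])
```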
8 changes: 4 additions & 4 deletions docs/installation.rst
@@ -12,7 +12,7 @@ To install Scrapy-Redis, run this command in your terminal:

.. code-block:: console
-    $ pip install scrapy-redis
+    pip install scrapy-redis
If you don't have `pip`_ installed, this `Python installation guide`_ can guide
you through the process.
@@ -30,19 +30,19 @@ You can either clone the public repository:

.. code-block:: console
-    $ git clone git://github.com/rolando/scrapy-redis
+    git clone git://github.com/rolando/scrapy-redis
Or download the `tarball`_:

.. code-block:: console
-    $ curl -OL https://github.com/rolando/scrapy-redis/tarball/master
+    curl -OL https://github.com/rolando/scrapy-redis/tarball/master
Once you have a copy of the source, you can install it with:

.. code-block:: console
-    $ pip install -e .
+    pip install -e .
.. _Github repo: https://github.com/rolando/scrapy-redis
