[test] Dev coverage #235

Merged
merged 15 commits on Jul 16, 2022
Changes from all commits
18 changes: 18 additions & 0 deletions .bandit.yml
@@ -0,0 +1,18 @@
skips:
- B101
- B105
- B301
- B303
- B306
- B307
- B311
- B320
- B321
- B324
- B403
- B404
- B406
- B410
- B503
- B603
- B605
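
A minimal usage sketch (not part of this diff): bandit accepts a YAML config with a `skips` list via its `-c` flag, so with the `src/` layout used elsewhere in this PR a run could look like::

    pip install bandit
    bandit -c .bandit.yml -r src/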
1 change: 1 addition & 0 deletions .coveragerc
@@ -3,6 +3,7 @@ source =
src

[run]
omit = setup.py
branch = true
source =
scrapy_redis
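
With `branch = true` and `setup.py` omitted, coverage can be driven through pytest; a minimal sketch, assuming the `tests/` directory referenced elsewhere in this PR::

    pip install coverage pytest
    coverage run -m pytest tests/
    coverage report -m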
9 changes: 9 additions & 0 deletions .flake8
@@ -0,0 +1,9 @@

[flake8]

max-line-length = 119
ignore = W503

exclude =
tests/test_spiders.py E731
docs/conf.py E265
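
flake8 discovers a `.flake8` file in the project root on its own, so the invocation added to CONTRIBUTING.rst below should pick up the 119-character limit and the W503 ignore; a minimal sketch::

    pip install flake8
    flake8 src/ tests/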
4 changes: 2 additions & 2 deletions .readthedocs.yml
@@ -10,9 +10,9 @@ build:
# For available versions, see:
# https://docs.readthedocs.io/en/stable/config-file/v2.html#build-tools-python
python: "3.7" # Keep in sync with .github/workflows/checks.yml
scrapy: "2.5.1"
scrapy: "2.6.1"

python:
install:
- requirements: ./requirements-dev.txt
- requirements: docs/requirements.txt
- path: .
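
To smoke-test the pinned docs requirements locally before relying on the Read the Docs build, a plain Sphinx run is usually enough; a sketch, assuming the `docs/` source tree referenced in `.flake8` above (the output path is arbitrary)::

    pip install -r docs/requirements.txt
    sphinx-build -b html docs docs/_build/html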
49 changes: 33 additions & 16 deletions CONTRIBUTING.rst
@@ -1,7 +1,7 @@
.. highlight:: shell

============
Contributing
Contribution
============

Contributions are welcome, and they are greatly appreciated! Every
@@ -12,10 +12,20 @@ You can contribute in many ways:
Types of Contributions
----------------------

New to here
~~~~~~~~~~~

Any issue with the good first issue tag on it is a great place to start! Feel free to ask any questions there.

Don't know how to start
~~~~~~~~~~~~~~~~~~~~~~~

Reviewing the codebase and PRs can give you quite a bit of knowledge about what's going on here!

Report Bugs
~~~~~~~~~~~

Report bugs at https://github.com/rolando/scrapy-redis/issues.
Report bugs at https://github.com/rmax/scrapy-redis/issues.

If you are reporting a bug, please include:

@@ -29,10 +39,10 @@ Fix Bugs
Look through the GitHub issues for bugs. Anything tagged with "bug"
is open to whoever wants to implement it.

Implement Features
Implement Features & improvements
~~~~~~~~~~~~~~~~~~

Look through the GitHub issues for features. Anything tagged with "feature"
Look through the GitHub issues for features. Anything tagged with "feature" or "improvements"
is open to whoever wants to implement it.

Write Documentation
@@ -45,7 +55,7 @@ articles, and such.
Submit Feedback
~~~~~~~~~~~~~~~

The best way to send feedback is to file an issue at https://github.com/rolando/scrapy-redis/issues.
The best way to send feedback is to file an issue at https://github.com/rmax/scrapy-redis/issues.

If you are proposing a feature:

@@ -59,48 +69,55 @@ Get Started!

Ready to contribute? Here's how to set up `scrapy-redis` for local development.

Setup environment
~~~~~~~~~~~~~~~~~

1. Fork the `scrapy-redis` repo on GitHub.
2. Clone your fork locally::

git clone git@github.com:your_name_here/scrapy-redis.git

3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::

mkvirtualenv scrapy-redis
pip install virtualenv==20.0.23
virtualenv --python=/usr/bin/python3 ~/scrapy_redis
source ~/scrapy_redis/bin/activate
cd scrapy-redis/
python setup.py develop
pip install -r requirements-install.txt
pip install .

4. Create a branch for local development::

git checkout -b name-of-your-bugfix-or-feature

Now you can make your changes locally.

5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::
Setup testing environment
~~~~~~~~~~~~~~~~~~~~~~~~~

1. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::

flake8 scrapy_redis tests
pip install .
pip install -r requirements-tests.txt
flake8 src/ tests/
python -m pytest --ignore=setup.py
tox

To get flake8 and tox, just pip install them into your virtualenv.

6. Note that if the error of `No module named scrapy_redis` shows, please install `scrapy-redis` of your branch by::
2. Note that if the error `No module named scrapy_redis` shows up, check that the `scrapy-redis` from your branch is installed by running::

pip install .

7. Or change the import lines::
3. Or change the import lines::

from scrapy_redis import xxx # from this
from src.scrapy_redis import xxx # to this

8. Commit your changes and push your branch to GitHub::
4. Commit your changes and push your branch to GitHub::

git add .
git commit -m "Your detailed description of your changes."
git push origin name-of-your-bugfix-or-feature

9. Submit a pull request through the GitHub website.
5. Submit a pull request through the GitHub website.

Pull Request Guidelines
-----------------------
7 changes: 5 additions & 2 deletions Makefile
@@ -5,10 +5,13 @@
.PHONY: release dist install build-inplace
define BROWSER_PYSCRIPT
import os, webbrowser, sys
FAIL = "\033[91m"
ENDC = "\033[0m"

try:
from urllib import pathname2url
except:
from urllib.request import pathname2url
except:
print(FAIL + "Python2 is deprecated, please upgrade your python >= 3.7" + ENDC)

webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1])))
endef
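
Outside of make, the helper amounts to a one-liner that opens a file in the default browser; an equivalent shell sketch, with `htmlcov/index.html` as a purely hypothetical report path::

    python -c 'import os, sys, webbrowser; from urllib.request import pathname2url; webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1])))' htmlcov/index.html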