
A modern cookiecutter template utilising many of my favourite Python development tools


A general-purpose development environment template for the generation of Python modules & packages,
utilizing many of my favourite development tools and deployable with the cookiecutter module.

🌟 Features


The following tool is mandatory (you will thank me later):

  • pip-tools - Pins every single package dependency (even the dependency's dependencies). See the short example after this list.

All the following features are optional:

  • flake8 - linting wrapper that glues together pycodestyle, pyflakes & mccabe.
  • black - code formatter.
  • mypy - for static type checking.
  • pre-commit - a framework for managing pre-commit hooks (pre-configured with many hooks).
  • isort - organizes imports automatically.
  • bandit - finds common security issues.
  • sphinx - for creating documentation.
  • click - for creating Command line interfaces.
  • invoke - common package maintenance tasks are automated using a tasks.py module I have written with this Python library.
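
For anyone new to pip-tools, the pinning workflow it automates looks like this: a loosely specified requirements.in is compiled into a fully pinned requirements.txt, which pip-sync then applies exactly. (The file names below are the pip-tools defaults; the generated project may organise its requirements files differently.)

$ pip-compile requirements.in   # resolves and pins every dependency (and sub-dependency)
$ pip-sync requirements.txt     # installs/uninstalls until the environment matches the pins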

Other Features

  • Automatically configures a git commit message template.
  • Selection of licenses to choose from (or visit choosealicense.com for more choices).
  • Optionally include the following:
    • python-semantic-release or bump2version - version control and release.
    • commitizen - for parsing and enforcing descriptive git commits.
    • cruft - project boilerplate is maintained in sync with parent cookiecutter template.
    • safety - check dependencies for known security vulnerabilities.
    • flakeheaven - Flake8 wrapper to replace flakehell.
  • Included are GitHub Actions for CodeQL analysis, CI/CD tests and Dependabot.
  • Option to specify docstring style (Google or NumPy - see PEP 257 and Sphinx).
  • Optionally include a configuration file (toml, ini, json or yaml).
  • Optionally include resource files (e.g. sqlite3, pickle, png).
  • Option to include Logging using the dictConfig() configuration.
  • Automatically include badges based on your selections e.g.:

(badge images: pre-commit · Imports: isort · Code style: black · Checked with mypy · Security: bandit)

📋 Pre-Installation Requirements


$ pip install cookiecutter
$ pip install jinja2-time

💾 Installation


Assuming you are at the command prompt of the activated virtual environment, simply type the following:

$ cookiecutter https://github.com/stephen-ra-king/pydough

cookiecutter will prompt you for a selection of inputs and eventually pip-tools will pin your dependencies from the inputs you have given.

When the installation script has finished it would be prudent to check the following log for any errors:

logs / post_gen.log

This template uses layered requirements (the default being "development") so now install the packages using pip-tools as follows:

$ pip-sync <package-name>/requirements.txt

The following is no longer required as pip-tools is installed by default.

$ pip install -r <package-name>/requirements.txt
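
Later, when you want to refresh the pinned versions, the usual pip-tools workflow is a re-compile followed by a re-sync (the file names here are the pip-tools defaults and may differ from the generated layout):

$ pip-compile --upgrade requirements.in
$ pip-sync requirements.txt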

โš™๏ธ Optional Post-Installation steps


โ“ The majority of the following steps are automated by a post_installation.py file that I have written. However this file is removed by the post_gen_project.py hook if the name you have used is not "Stephen-RA-King". I have configured it this way as I use the keyring library to store my API keys. This way I have automated the entire environment creation. If you feel this is something that you would like to do, then feel free to configure this file to your needs.

Recommended Post Installation Requirements:

You will need accounts and API keys with the following services:

The following services can be linked to your GitHub account (Settings > Integrations > Applications)

Finalization Steps

  1. Put your PyPI and TestPyPI keys into the .pypirc file.
  2. Create a remote repository on GitHub.
  3. If using GitHub Actions - add your PyPI and TestPyPI tokens to the repository actions secrets with the following variable names (see the optional CLI snippet below):
    • TEST_PYPI_API_TOKEN
    • PYPI_API_TOKEN
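
If you have the GitHub CLI installed, one optional way to add these secrets without the web UI is shown below (the gh tool is not part of the template; it will prompt you for each value):

$ gh secret set TEST_PYPI_API_TOKEN
$ gh secret set PYPI_API_TOKEN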
  4. If you are using Python Semantic Release, create the following environment variables:
    • GH_TOKEN = GitHub token
    • REPOSITORY_PASSWORD = PyPI token
    • REPOSITORY_USERNAME = __token__
  5. Import the GitHub repository into Read the Docs.
  6. Install your package as an "editable" package.

Editable installs were not possible until 2021, but that has been remedied by PEP 660. This can be performed by either 'pip' or 'setuptools':

$ python -m pip install -e . 

or

$ python setup.py develop
  7. Push the local files to GitHub.

Note: Git is automatically initialized and the following is automatically run by the post-install hook:

$ git remote add origin git@github.com:<user>/<repository-name>.git 

So simply add, commit and push

$ git add *
$ git commit -m "chore: initial commit"
$ git push -u origin main 

Note: If you chose to use the pre-commit package then many hooks (e.g. Flake8, Black, Bandit, Prettier etc.) will now download and configure themselves and eventually be run against each file in the repository. This may take some time and some files may get modified. You will need to "git add" these files again.
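
If you would rather trigger the hooks yourself instead of waiting for a commit, pre-commit can also be run manually against every file in the repository:

$ pre-commit run --all-files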

  8. When the time comes, create a git tag (optionally signed) and push it to the remote:
$ git tag -s 0.1.0 -m "chore: 0.1.0 tag"
$ git push --tags
  9. Create build artefacts with the following command:
$ python -m build
  10. Now upload the build artefacts to the test repository for final testing:
$ python -m twine upload --config-file .pypirc -r testpypi dist/*
  11. When the time comes, release to the main repository:
$ python -m twine upload --config-file .pypirc dist/*
  12. If you are using the codecov.io, codefactor.io and deepsource.io services, you will need to log in to them with your GitHub account and import your repositories.

Note

Some badges can take up to 24 hours to update (yes, download badge ... I am looking at you). So please be patient.

๐Ÿ“ Using the template Features


Python Semantic Release

If you opted to use PSR, then future uploads to GitHub & PyPI will be done automatically. After you have committed changes to git, issue the following command:

$ semantic-release publish

Publish will do a sequence of things:

  • Update changelog file.
  • Run semantic-release version.
  • Push changes to git.
  • Run build_command and upload the distribution file to your repository.
  • Run semantic-release changelog and post to your vcs provider.
  • Attach the files created by build_command to GitHub releases.
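
If you want to preview what publish would do before letting it loose, python-semantic-release also offers a no-operation mode via its --noop flag (flag placement can vary between major versions, so treat this as a sketch):

$ semantic-release publish --noop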

Documentation

"Read the docs" will automatically generate new documentation when you push to GitHub. Howver you can manually generate local documenattion by simply moving to the "docs" directory and issuing the make command.

$ make html

You can then open the "...docs/_build/html/index.html" file with a browser. It is generally best to clear the _build directory when generating new documentation by using the following command:

$ make clean html

Logging

If you have chosen to use logging, then the ABSOLUTE path to the log file will need to be set. Currently it is set to a path RELATIVE to where the application is run. Everybody's file structure is different, so obviously it is not possible for me to know where you want your log file to be.

This can be set in the src / __init__.py file -> handlers -> file -> filename
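
For orientation, the handler being referred to sits inside a standard dictConfig() dictionary shaped roughly like the sketch below (a minimal illustration, not the template's exact configuration - the handler id, formatter and path are assumptions):

import logging.config

LOGGING = {
    "version": 1,
    "formatters": {
        "standard": {"format": "%(asctime)s %(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "file": {
            "class": "logging.FileHandler",
            "formatter": "standard",
            "filename": "/absolute/path/to/your.log",  # <- set your ABSOLUTE path here
        },
    },
    "root": {"level": "INFO", "handlers": ["file"]},
}

logging.config.dictConfig(LOGGING)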

Invoke

I have written a "tasks.py" file that takes care of many of the mundane repository maintenance tasks:

inv --list
Available tasks:

  bandit                     Runs bandit against selected python files.
  build                      Creates a new sdist & wheel build using the PyPA tool.
  clean                      Removes all test, build, log and lint artifacts from the environment.
  docs                       Build documentation.
  lint                       Run all lint tasks on 'src' files only.
  lint-all                   Run all lint tasks on all files.
  lint-black (bl, black)     Runs black formatter against selected python files.
  lint-flake8 (fl, flake8)   Run flake8 against selected files.
  lint-isort (is, isort)     Run isort against selected python files.
  mypy                       Run mypy against selected python files.
  psr                        Runs semantic-release publish.
  publish                    Uploads a build to the PyPI-test and PyPI python repositories.
  pypi                       Uploads a build to the PyPI python repository.
  pypi-test                  Uploads a build to the PyPI-test python repository.
  safety                     Runs safety to check for insecure requirements.
  secure                     Runs all security tools.
  tests                      Run tests using pytest.
  update                     Updates the development environment
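
For anyone wanting to tweak or extend it, an invoke tasks.py follows a very simple pattern. The sketch below is illustrative only and is not the template's actual file:

from invoke import task

@task
def clean(c):
    """Remove build artefacts."""
    c.run("rm -rf build dist *.egg-info")

@task(pre=[clean])
def build(c):
    """Create a fresh sdist & wheel using the PyPA build tool."""
    c.run("python -m build")

Individual tasks are then run with, for example, inv build.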

📆 Possible future enhancements


  • Use Ruff to replace Flake8 (plus dozens of plugins), isort, pydocstyle, eradicate, pyupgrade, and autoflake, all while executing tens or hundreds of times faster.

📜 License

Distributed under the MIT license. See the LICENSE file for more information.

<โ„น๏ธ> Meta


Author: Stephen King (sking.github@gmail.com)

Cookiecutter template: pydough version: 1.3.4

Digital Object Identifier: DOI