Update docs to reflect swapping Make in for Tox, application packaging.
* Rename tox-pytest and update-lockfile workflows.
* Make scheduled PRs against dev rather than conda-lockfile branch.
* Update to pyarrow 14 and grpcio 1.59 for security reasons.
* Update release notes
zaneselvans committed Nov 13, 2023
1 parent 29ef03f commit 071766a
Showing 21 changed files with 1,391 additions and 1,235 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/bot-auto-merge.yml
@@ -4,7 +4,7 @@ name: bot-auto-merge
on:
workflow_run:
types: [completed]
workflows: ["tox-pytest"]
workflows: ["pytest"]

jobs:
bot-auto-merge:
@@ -1,5 +1,5 @@
---
name: tox-pytest
name: pytest

on:
pull_request:
@@ -1,5 +1,5 @@
---
name: update-lockfile
name: update-conda-lockfile

on:
workflow_dispatch:
@@ -9,7 +9,7 @@ on:
paths:
- "pyproject.toml"
- "environments/*"
- ".github/workflows/update-lockfile.yml"
- ".github/workflows/update-conda-lockfile.yml"

# What branch does this action run on?
# - workflow_dispatch: Whatever branch it was run against.
@@ -31,7 +31,7 @@ jobs:
- name: Set GITHUB_REF for use with schedule
if: ${{ (github.event_name == 'schedule') }}
run: |
echo "GITHUB_REF=conda-lockfile" >> $GITHUB_ENV
echo "GITHUB_REF=dev" >> $GITHUB_ENV
- name: Set GITHUB_REF for use with push
if: ${{ (github.event_name == 'push') }}
run: |
@@ -75,7 +75,7 @@ jobs:
title: Update Lockfile
body: >
This pull request relocks the dependencies with conda-lock.
It is triggered by [update-lockfile](https://github.com/catalyst-cooperative/pudl/blob/main/.github/workflows/update-lockfile.yml).
It is triggered by [update-conda-lockfile](https://github.com/catalyst-cooperative/pudl/blob/main/.github/workflows/update-conda-lockfile.yml).
labels: dependencies, conda-lock
reviewers: zaneselvans
branch: update-conda-lockfile
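
The renamed workflow can still be dispatched manually in addition to the weekly schedule and push triggers shown above. A minimal sketch using the GitHub CLI; the gh commands below are illustrative and not part of this commit, and only the workflow file name comes from the diff:

$ gh workflow run update-conda-lockfile.yml --ref dev   # dispatch a relock against dev, which scheduled runs now target
$ gh run list --workflow=update-conda-lockfile.yml      # check on the resulting run and its "Update Lockfile" PR
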
14 changes: 0 additions & 14 deletions MANIFEST.in

This file was deleted.

5 changes: 3 additions & 2 deletions Makefile
@@ -52,6 +52,7 @@ conda-lock.yml: pyproject.toml
# Create the pudl-dev conda environment based on the universal lockfile
.PHONY: pudl-dev
pudl-dev: conda-lock.yml
${mamba} run --name base ${mamba} env remove --name pudl-dev
conda-lock install --name pudl-dev --${mamba} --dev environments/conda-lock.yml

.PHONY: install-pudl
@@ -123,12 +124,12 @@ pytest-coverage: coverage-erase docs-build pytest-unit pytest-integration

.PHONY: pytest-integration-full
pytest-integration-full:
pytest ${pytest_args} --etl-settings ${etl_full_yml} test/integration
pytest ${pytest_args} -n auto --live-dbs --etl-settings ${etl_full_yml} test/integration

.PHONY: pytest-validate
pytest-validate:
pytest --live-dbs test/validate
pudl_check_fks
pytest ${pytest_args} -n auto --live-dbs test/validate

# Run the full ETL, generating new FERC & PUDL SQLite DBs and EPA CEMS Parquet files.
# Then run the full integration tests and data validations on all years of data.
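
Taken together, the Makefile changes above translate into the following day-to-day commands. This is a sketch of intended usage based on the target names in the diff, not output from this commit:

$ make pudl-dev                  # rebuild the pudl-dev conda env from the lockfile, removing any existing env first
$ make pytest-integration-full   # run the full integration tests in parallel against live DBs
$ make pytest-validate           # check foreign keys and run the data validations against live DBs
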
18 changes: 3 additions & 15 deletions README.rst
@@ -8,9 +8,9 @@ The Public Utility Data Liberation Project (PUDL)
:target: https://www.repostatus.org/#active
:alt: Project Status: Active

.. image:: https://github.com/catalyst-cooperative/pudl/workflows/tox-pytest/badge.svg
:target: https://github.com/catalyst-cooperative/pudl/actions?query=workflow%3Atox-pytest
:alt: Tox-PyTest Status
.. image:: https://github.com/catalyst-cooperative/pudl/workflows/pytest/badge.svg
:target: https://github.com/catalyst-cooperative/pudl/actions?query=workflow%3Apytest
:alt: PyTest Status

.. image:: https://img.shields.io/codecov/c/github/catalyst-cooperative/pudl?style=flat&logo=codecov
:target: https://codecov.io/gh/catalyst-cooperative/pudl
@@ -20,18 +20,6 @@ The Public Utility Data Liberation Project (PUDL)
:target: https://catalystcoop-pudl.readthedocs.io/en/latest/
:alt: Read the Docs Build Status

.. image:: https://img.shields.io/pypi/v/catalystcoop.pudl
:target: https://pypi.org/project/catalystcoop.pudl/
:alt: PyPI Latest Version

.. image:: https://img.shields.io/conda/vn/conda-forge/catalystcoop.pudl
:target: https://anaconda.org/conda-forge/catalystcoop.pudl
:alt: conda-forge Version

.. image:: https://img.shields.io/pypi/pyversions/catalystcoop.pudl
:target: https://pypi.org/project/catalystcoop.pudl/
:alt: Supported Python Versions

.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Any color you want, so long as it's black.
1 change: 0 additions & 1 deletion docs/conf.py
@@ -91,7 +91,6 @@
"setuptools": ("https://setuptools.pypa.io/en/latest/", None),
"sklearn": ("https://scikit-learn.org/stable", None),
"sqlalchemy": ("https://docs.sqlalchemy.org/en/latest/", None),
"tox": ("https://tox.wiki/en/latest/", None),
}

# Add any paths that contain templates here, relative to this directory.
36 changes: 18 additions & 18 deletions docs/dev/annual_updates.rst
@@ -321,9 +321,9 @@ A. FERC 1 & EIA Plants & Utilities
:doc:`pudl_id_mapping` page for further instructions.


.. code-block:: bash
.. code-block:: console
tox -e get_unmapped_ids
$ make unmapped-ids
.. note::

@@ -369,21 +369,21 @@ called from within the :class:`pudl.output.pudltabl.PudlTabl` class.
* Are there new columns that should incorporated into the output tables?
* Are there new tables that need to have an output function defined for them?

**8.2)** To ensure that you (more) fully exercise all of the possible output functions,
run the entire CI test suite against your live databases with:
**8.2)** To ensure that you fully exercise all of the possible output functions,
run all the integration tests against your live PUDL DB with:

.. code-block:: bash
.. code-block:: console
tox -e full -- --live-dbs
$ make pytest-integration-full
9. Run and Update Data Validations
-----------------------------------
**9.1)** When the CI tests are passing against all years of data, sanity check the data
in the database and the derived outputs by running

.. code-block:: bash
.. code-block:: console
tox -e validate
$ make pytest-validate
We expect at least some of the validation tests to fail initially because we haven't
updated the number of records we expect to see in each table.
@@ -398,18 +398,18 @@ attention to how far off of previous expectations the new tables are. E.g. if th
are already 20 years of data, and you're integrating 1 new year of data, probably the
number of rows in the tables should be increasing by around 5% (since 1/20 = 0.05).

10. Run Additional Standalone Analyses
--------------------------------------
**10.1)** Run any important analyses that haven't been integrated into the CI
tests on the new year of data for sanity checking. For example the
:mod:`pudl.analysis.state_demand` script or generating the EIA Plant Parts List for
integration with FERC 1 data.

11. Update the Documentation
10. Update the Documentation
----------------------------
**11.1)** Once the new year of data is integrated, update the documentation
to reflect the new state of affairs. This will include updating at least:
**10.1)** Once the new year of data is integrated, update the documentation to reflect
the new state of affairs. This will include updating at least:

* the top-level :doc:`README </index>`
* the :doc:`data access </data_access>` page
* the :doc:`/release_notes`
* any updated :doc:`data sources </data_sources/index>`

Check that the docs still build with

.. code-block:: console
$ make docs-build
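
Put together, the annual-update steps documented above now run end to end through Make targets rather than Tox environments. A sketch of the sequence, with target names taken from the diffs:

$ make unmapped-ids              # find unmapped FERC 1 & EIA plant and utility IDs
$ make pytest-integration-full   # 8.2: run all integration tests against the live PUDL DB
$ make pytest-validate           # 9.1: sanity check the data with the validation tests
$ make docs-build                # 10: confirm the docs still build
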
8 changes: 4 additions & 4 deletions docs/dev/build_docs.rst
@@ -12,12 +12,12 @@ controlled by ``docs/conf.py``.

If you are editing the documentation and need to regenerate the outputs as
you go to see your changes reflected locally, the most reliable option is to
use Tox. Tox will remove the previously generated outputs and regenerate
use ``make``. Make will remove the previously generated outputs and regenerate
everything from scratch:

.. code-block:: console
$ tox -e docs
$ make docs-build
If you're just working on a single page and don't care about the entire set
of documents being regenerated and linked together, you can call Sphinx
@@ -44,5 +44,5 @@ documentation in your text editor with appropriate plugins.
need to be checked in to version control.

Similarly the :doc:`../data_dictionaries/pudl_db` is generated dynamically
by the :mod:`pudl.convert.metadata_to_rst` script that gets run by Tox when
it builds the docs.
by the :mod:`pudl.convert.metadata_to_rst` script that gets run by Sphinx during
the docs build.
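
For the single-page workflow mentioned above, Sphinx can also be invoked directly instead of going through make. A minimal sketch, in which the output directory is an assumption rather than something defined by this commit (the source directory is docs/, per docs/conf.py):

$ sphinx-build -b html docs docs/_build/html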