
fixup! Update dev/README_RELEASE_AIRFLOW.md
potiuk committed Jul 30, 2023
1 parent 8735de9 commit 3963955
Showing 1 changed file with 27 additions and 11 deletions.
38 changes: 27 additions & 11 deletions TESTING.rst
@@ -1475,13 +1475,13 @@ In order to build apache-airflow from sources, you need to run the following command:

.. code-block:: bash
breeze release-managment prepare-airflow-package
breeze release-management prepare-airflow-package
In order to build providers from sources, you need to run the following command:

.. code-block:: bash
breeze release-managment prepare-provider-packages <PROVIDER_1> <PROVIDER_2> ... <PROVIDER_N>
breeze release-management prepare-provider-packages <PROVIDER_1> <PROVIDER_2> ... <PROVIDER_N>
The packages are built in ``dist`` folder and the command will summarise what packages are available in the
``dist`` folder after it finishes.
@@ -1494,12 +1494,14 @@ If you want to download the packages from PyPI, you need to run the following command:
You can use it for both release and pre-release packages.

Examples of testing pre-release packages
----------------------------------------

Few examples below will explain how you can test pre-release (also release) packages.
A few examples below explain how you can test pre-release packages and combine them with locally built
and released packages.

This one will download ``airflow`` and ``celery`` and ``kubernetes`` provider packages from PyPI and
eventually start Airflow using the packages downloaded with the Celery Executor. It will also
load example dags and default connections:
The following example downloads the ``apache-airflow`` package and the ``celery`` and ``kubernetes`` provider
packages from PyPI and eventually starts Airflow with the Celery Executor. It also loads example dags
and default connections:

.. code:: bash
@@ -1510,19 +1512,33 @@ load example dags and default connections:
breeze start-airflow --mount-sources remove --use-packages-from-dist --executor CeleryExecutor --load-default-connections --load-example-dags
This one will download ``celery`` and ``kubernetes`` provider packages from PyPI but build
``airflow`` package from the main sources and eventually start Airflow using the packages downloaded
with the Celery Executor. It will also load example dags and default connections:
The following example downloads ``celery`` and ``kubernetes`` provider packages from PyPI, builds
the ``apache-airflow`` package from the main sources and eventually starts Airflow with the Celery Executor.
It also loads example dags and default connections:

.. code:: bash
rm dist/*
breeze release-managment prepare-airflow-package
breeze release-management prepare-airflow-package
pip download apache-airflow-providers-cncf-kubernetes==7.4.0rc1 --dest dist --no-deps
pip download apache-airflow-providers-cncf-kubernetes==3.3.0rc1 --dest dist --no-deps
breeze start-airflow --mount-sources remove --use-packages-from-dist --executor CeleryExecutor --load-default-connections --load-example-dags
You can mix and match PyPI and locally build packages this way as you see fit
The following example builds the ``celery`` and ``kubernetes`` provider packages from the main sources,
downloads version 2.6.3 of the ``apache-airflow`` package from PyPI and eventually starts Airflow using the
default executor for the chosen backend (no example dags, no default connections):

.. code:: bash
rm dist/*
pip download apache-airflow==2.6.3 --dest dist --no-deps
breeze release-management prepare-provider-packages celery cncf.kubernetes
breeze start-airflow --mount-sources remove --use-packages-from-dist
You can mix and match packages from PyPI (final or pre-release candidates) with locally built packages. You
can also choose which providers you install this way (the ``--mount-sources remove`` flag makes sure that the
installed Airflow does not contain all the providers - only those that you explicitly downloaded or built into
the ``dist`` folder). This way you can test all the combinations of Airflow + Providers you might need.
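As a hedged sketch of this mix-and-match workflow: before running ``breeze start-airflow --use-packages-from-dist``, a small shell helper can confirm that the ``dist`` folder really contains every artifact you downloaded or built. The helper name ``check_dist`` and the wheel-name prefixes below are illustrative assumptions, not part of breeze:

```shell
# Hypothetical helper (not part of breeze): check that the dist folder
# contains at least one file for every package prefix you pass in, so a
# forgotten `pip download` or `breeze release-management prepare-*` step
# is caught before Airflow is started.
check_dist() {
    dir="$1"
    shift
    for pkg in "$@"; do
        # at least one file in the folder must start with the package prefix
        if ! ls "$dir/$pkg"* >/dev/null 2>&1; then
            echo "missing: $pkg"
            return 1
        fi
    done
    echo "dist ok: $*"
}
```

For example, after downloading ``apache-airflow==2.6.3`` and a provider candidate into ``dist``, ``check_dist dist apache_airflow-2.6.3 apache_airflow_providers_celery`` would report success only if both wheel files are present.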


Airflow System Tests
