TEST-#7173: Update github actions #7168

Merged: 1 commit, Apr 11, 2024
4 changes: 2 additions & 2 deletions .github/actions/mamba-env/action.yml
@@ -23,14 +23,14 @@ runs:
shell: bash
- name: Cache conda
id: cache-conda
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: |
~/conda_pkgs_dir
~/.cache/pip
key:
${{ runner.os }}-conda-${{ steps.get-week.outputs.thisweek }}-${{ hashFiles(inputs.environment-file) }}
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3
with:
miniforge-variant: Mambaforge
miniforge-version: latest
2 changes: 1 addition & 1 deletion .github/actions/python-only/action.yml
@@ -8,7 +8,7 @@ inputs:
runs:
using: "composite"
steps:
- uses: actions/setup-python@v4
- uses: actions/setup-python@v5
with:
python-version: ${{ inputs.python-version }}
architecture: "x64"
5 changes: 3 additions & 2 deletions .github/actions/upload-coverage/action.yml
@@ -8,9 +8,10 @@ runs:
- run: |
COVERAGE_UUID=$(python3 -c "import uuid; print(uuid.uuid4())")
mv .coverage .coverage.${COVERAGE_UUID}
echo "COVERAGE_UUID=${COVERAGE_UUID}" >> $GITHUB_ENV
id: coverage-uuid
shell: bash
- uses: actions/upload-artifact@v3.1.2
- uses: actions/upload-artifact@v4
with:
name: coverage-data
name: coverage-data-${{ env.COVERAGE_UUID }}
path: .coverage*
2 changes: 1 addition & 1 deletion .github/workflows/ci-notebooks.yml
@@ -27,7 +27,7 @@ jobs:
matrix:
execution: [pandas_on_ray, pandas_on_dask, pandas_on_unidist, hdk_on_native]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
if: matrix.execution != 'hdk_on_native' && matrix.execution != 'pandas_on_unidist'
- uses: ./.github/actions/mamba-env
8 changes: 4 additions & 4 deletions .github/workflows/ci-required.yml
@@ -23,10 +23,10 @@ jobs:
name: build docs
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 1
- uses: actions/setup-python@v4
- uses: actions/setup-python@v5
with:
python-version: "3.9.x"
architecture: "x64"
@@ -39,7 +39,7 @@ jobs:
name: lint (pydocstyle)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
# The `numpydoc` version here MUST match the versions in the dev requirements files.
- run: pip install pytest pytest-cov pydocstyle numpydoc==1.1.0
@@ -110,7 +110,7 @@ jobs:
name: lint (black and isort)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
- run: pip install black>=24.1.0 isort>=5.12
# NOTE: keep the black command here in sync with the pre-commit hook in
67 changes: 41 additions & 26 deletions .github/workflows/ci.yml
@@ -30,7 +30,7 @@ jobs:
name: lint (mypy)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
- run: pip install -r requirements-dev.txt
- run: mypy --config-file mypy.ini
@@ -39,7 +39,7 @@
name: lint (flake8)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
# NOTE: If you are changing the set of packages installed here, make sure that
# the dev requirements match them.
@@ -55,7 +55,7 @@
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: requirements/requirements-no-engine.yml
@@ -78,7 +78,7 @@
shell: bash -l {0}
name: test-clean-install-${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
- run: python -m pip install -e ".[all]"
- name: Ensure Ray and Dask engines start up
@@ -101,7 +101,7 @@
shell: bash -l {0}
name: test-internals
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
@@ -131,7 +131,7 @@
MODIN_TEST_DATASET_SIZE: "small"
name: Test ${{ matrix.execution }} execution, Python 3.9
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
@@ -169,7 +169,7 @@
AWS_ACCESS_KEY_ID: foobar_key
AWS_SECRET_ACCESS_KEY: foobar_secret
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: requirements/env_hdk.yml
@@ -218,10 +218,10 @@
MODIN_TEST_DATASET_SIZE: small
name: test-asv-benchmarks
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 1
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3
with:
auto-activate-base: true
activate-environment: ""
@@ -271,7 +271,7 @@
if: always()

- name: Publish benchmarks artifact
uses: actions/upload-artifact@master
uses: actions/upload-artifact@v4
with:
name: Benchmarks log
path: asv_bench/benchmarks.log
@@ -287,8 +287,8 @@
engines: ${{ steps.engines.outputs.engines }}
experimental: ${{ steps.experimental.outputs.experimental }}
steps:
- uses: actions/checkout@v3
- uses: dorny/paths-filter@v2
- uses: actions/checkout@v4
- uses: dorny/paths-filter@v3
id: filter
with:
filters: |
@@ -305,7 +305,7 @@
- 'modin/core/execution/unidist/**'
experimental:
- 'modin/experimental/**'
- uses: actions/setup-python@v4
- uses: actions/setup-python@v5
- id: engines
run: |
python -c "import sys, json; print('engines=' + json.dumps(['python'] + (sys.argv[1] == 'true' and ['ray'] or []) + (sys.argv[2] == 'true' and ['dask'] or []) ))" \
@@ -340,7 +340,7 @@
AWS_ACCESS_KEY_ID: foobar_key
AWS_SECRET_ACCESS_KEY: foobar_secret
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: requirements/env_unidist_linux.yml
@@ -366,7 +366,7 @@
- run: ./.github/workflows/sql_server/set_up_sql_server.sh
# need an extra argument "genv" to set environment variables for mpiexec. We need
# these variables to test writing to the mock s3 filesystem.
- uses: nick-fields/retry@v2
- uses: nick-fields/retry@v3
# to avoid issues with non-stable `to_csv` tests for unidist on MPI backend.
# for details see: https://github.com/modin-project/modin/pull/6776
with:
@@ -436,15 +436,15 @@
- name: Tell Modin to use existing ray cluster
run: echo "MODIN_RAY_CLUSTER=True" >> $GITHUB_ENV
if: matrix.os == 'windows' && matrix.engine == 'ray'
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
python-version: ${{matrix.python-version}}
- name: Start local ray cluster
# Try a few times to start ray to work around
# https://github.com/modin-project/modin/issues/4562
uses: nick-fields/retry@v2
uses: nick-fields/retry@v3
with:
timeout_minutes: 5
max_attempts: 5
@@ -557,7 +557,7 @@
AWS_ACCESS_KEY_ID: foobar_key
AWS_SECRET_ACCESS_KEY: foobar_secret
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: ${{ matrix.os == 'ubuntu' && matrix.execution.name == 'unidist' && 'requirements/env_unidist_linux.yml' || matrix.os == 'windows' && matrix.execution.name == 'unidist' && 'requirements/env_unidist_win.yml' || 'environment-dev.yml' }}
@@ -575,7 +575,7 @@
- name: Start local ray cluster
# Try a few times to start ray to work around
# https://github.com/modin-project/modin/issues/4562
uses: nick-fields/retry@v2
uses: nick-fields/retry@v3
with:
timeout_minutes: 5
max_attempts: 5
@@ -632,7 +632,7 @@
- run: ${{ matrix.execution.shell-ex }} $PARALLEL modin/tests/numpy
- run: ${{ matrix.execution.shell-ex }} -m "not exclude_in_sanity" modin/tests/pandas/test_io.py --verbose
if: matrix.execution.name != 'unidist'
- uses: nick-fields/retry@v2
- uses: nick-fields/retry@v3
# to avoid issues with non-stable `to_csv` tests for unidist on MPI backend.
# for details see: https://github.com/modin-project/modin/pull/6776
with:
@@ -672,7 +672,7 @@
AWS_ACCESS_KEY_ID: foobar_key
AWS_SECRET_ACCESS_KEY: foobar_secret
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
@@ -699,32 +699,47 @@
MODIN_ENGINE: ${{matrix.engine}}
name: test-spreadsheet (engine ${{matrix.engine}}, python ${{matrix.python-version}})
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
python-version: ${{matrix.python-version}}
- run: python -m pytest modin/tests/experimental/spreadsheet/test_general.py

upload-coverage:
merge-coverage-artifacts:
Collaborator:
Why do we need this?

Collaborator (Author):
Version 4 of actions/upload-artifact is incompatible with version 3: artifacts are now immutable, so each job can only upload its own separately named artifact, and those per-job artifacts then need to be merged back into a single artifact for convenience (see the sketch after this file's diff).

For details:
https://github.com/actions/upload-artifact/blob/main/docs/MIGRATION.md#multiple-uploads-to-the-same-named-artifact

needs: [test-internals, test-api-and-no-engine, test-defaults, test-hdk, test-all-unidist, test-all, test-experimental, test-sanity]
if: always() # we need to run it regardless of some job being skipped, like in PR
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v3
- name: Merge Artifacts
uses: actions/upload-artifact/merge@v4
with:
name: coverage-data
pattern: coverage-data-*
delete-merged: true

upload-coverage:
needs: [merge-coverage-artifacts]
if: always() # we need to run it regardless of some job being skipped, like in PR
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v4
- uses: ./.github/actions/python-only
- name: Download coverage data
uses: actions/download-artifact@v3.0.2
uses: actions/download-artifact@v4
with:
name: coverage-data
- run: pip install coverage
- name: Combine coverage
run: python -m coverage combine
- name: Generate coverage report in xml format
run: python -m coverage xml
- uses: codecov/codecov-action@v3
- uses: codecov/codecov-action@v4
with:
fail_ci_if_error: ${{ github.event_name == 'push' }} # do not care about uploads in PR
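
To make the new coverage flow easier to follow, here is a minimal standalone sketch of the upload-artifact v4 pattern the diff above adopts: every job uploads a uniquely named artifact (v4 artifacts are immutable), a merge job folds them back together with actions/upload-artifact/merge, and the consumer downloads the single merged artifact. The workflow, job, matrix, and file names below are illustrative only; they are not taken verbatim from the Modin workflows.

```yaml
# Hypothetical workflow illustrating the v4 upload -> merge -> download pattern.
name: coverage-artifact-sketch
on: workflow_dispatch

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        part: [a, b]
    steps:
      # Each matrix job produces its own coverage fragment (illustrative file name).
      - run: echo "partial coverage" > coverage-fragment-${{ matrix.part }}.txt
        shell: bash
      # v4 artifacts are immutable, so every job must upload under a unique name.
      - uses: actions/upload-artifact@v4
        with:
          name: coverage-data-${{ matrix.part }}
          path: coverage-fragment-*

  merge-coverage-artifacts:
    needs: [test]
    runs-on: ubuntu-latest
    steps:
      # Fold all per-job artifacts back into one "coverage-data" artifact.
      - uses: actions/upload-artifact/merge@v4
        with:
          name: coverage-data
          pattern: coverage-data-*
          delete-merged: true

  upload-coverage:
    needs: [merge-coverage-artifacts]
    runs-on: ubuntu-latest
    steps:
      # Downstream consumers see a single artifact again, as they did with v3.
      - uses: actions/download-artifact@v4
        with:
          name: coverage-data
      - run: ls coverage-fragment-*
        shell: bash
```

The delete-merged flag mirrors the diff above: it removes the per-job intermediate artifacts once they have been combined, so only the merged coverage-data artifact remains attached to the run.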
2 changes: 1 addition & 1 deletion .github/workflows/codeql.yml
@@ -29,7 +29,7 @@ jobs:

steps:
- name: Checkout
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Initialize CodeQL
uses: github/codeql-action/init@v2
4 changes: 2 additions & 2 deletions .github/workflows/fuzzydata-test.yml
@@ -33,7 +33,7 @@ jobs:
python-version: ["3.9"]
engine: ["ray", "dask"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
@@ -42,7 +42,7 @@
run: python -m pytest modin/tests/experimental/test_fuzzydata.py -Wignore::UserWarning --log-file=/tmp/fuzzydata-test-wf-${{matrix.engine}}/run.log --log-file-level=INFO
env:
MODIN_ENGINE: ${{matrix.engine}}
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
if: success() || failure()
with:
name: fuzzydata-test-workflow-${{matrix.engine}}
4 changes: 2 additions & 2 deletions .github/workflows/push-to-master.yml
@@ -26,7 +26,7 @@ jobs:
AWS_ACCESS_KEY_ID: foobar_key
AWS_SECRET_ACCESS_KEY: foobar_secret
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml
@@ -76,7 +76,7 @@
shell: bash -l {0}
name: test docs
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: ./.github/actions/mamba-env
with:
environment-file: environment-dev.yml