Use CodeCov for Coverage Reports and Badges
rickyschools committed Apr 27, 2024
1 parent b440495 commit 4aef5df
Showing 2 changed files with 4 additions and 18 deletions.
20 changes: 3 additions & 17 deletions .github/workflows/tests.yaml
@@ -43,24 +43,10 @@ jobs:
       - name: Run Unit Tests
         run: tox -p -e $(tox -l | grep $(echo ${{ matrix.python-version }} | sed 's/\.//g') | paste -sd "," -) -vv

-      - name: "Combine Coverage Summary"
-        run: |
-          export TOTAL=$(python -c "import json;print(json.load(open('coverage.json'))['totals']['percent_covered_display'])")
-          echo "total=$TOTAL" >> $GITHUB_ENV
-          echo "### Total coverage: ${TOTAL}%" >> $GITHUB_STEP_SUMMARY
-      - name: Install coverage badge requirements
-        run: |
-          npm i coverage-badges-cli
-      - name: Create Coverage Badges
-        uses: jaywcjlove/coverage-badges-cli@main
+      - name: Upload coverage reports to Codecov
+        uses: codecov/codecov-action@v4.0.1
         with:
-          style: flat
-          source: ./coverage.json
-          output: coverage/badges.svg
-          jsonPath: totals.percent_covered
+          token: ${{ secrets.CODECOV_TOKEN }}

   docs:
     runs-on: ubuntu-latest
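The removed "Combine Coverage Summary" step extracted the total from `coverage.json` with an inline `python -c` one-liner. The same extraction can be sketched in plain Python; the sample data below is a hypothetical stand-in for the file that `coverage json` would normally write:

```python
import json

# Hypothetical sample mirroring the shape of coverage.py's JSON report;
# the removed workflow step read totals.percent_covered_display from it.
sample = {
    "totals": {
        "percent_covered": 87.5,
        "percent_covered_display": "88",
    }
}

with open("coverage.json", "w") as f:
    json.dump(sample, f)

# Equivalent of the one-liner the old step ran via `python -c`:
total = json.load(open("coverage.json"))["totals"]["percent_covered_display"]
print(f"### Total coverage: {total}%")
```

With Codecov taking over report parsing and badge rendering, this bookkeeping (and the `coverage-badges-cli` npm dependency) becomes unnecessary, which is what the deletion above reflects.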
2 changes: 1 addition & 1 deletion README.md
@@ -1,5 +1,5 @@
 # `dltflow`
-![Coverage](./coverage/badges.svg) ![Static Badge](https://img.shields.io/badge/python-3.9%2C_3.10%2C_3.11%2C-blue) ![Static Badge](https://img.shields.io/badge/pyspark-3.4%2C_3.5-blue)
+![Static Badge](https://img.shields.io/badge/python-3.9%2C_3.10%2C_3.11%2C-blue) ![Static Badge](https://img.shields.io/badge/pyspark-3.4%2C_3.5-blue) [![codecov](https://codecov.io/gh/rickyschools/dltflow/graph/badge.svg?token=OSHZBF2639)](https://codecov.io/gh/rickyschools/dltflow)

 `dltflow` is a Python package that provides authoring utilities and CD patterns for Databricks' DLT product. It intends
 make writing and deploying DLT code and pipelines to Databricks as easy as possible.