This repository is deprecated and no longer maintained. We invite you to start using the forked version maintained by Datacoves at datacoves/pre-commit-dbt.
Thank you to Radek Tomšej for initial development and maintenance of this great package, and for sharing your work with the community!
List of pre-commit hooks to ensure the quality of your dbt projects.
dbt is awesome, but as the number of models, sources, and macros grows, it becomes challenging to maintain quality. People often forget to update columns in schema files, add descriptions, or add tests. Moreover, with the growing number of objects, dbt slows down, users stop running models and tests (because they want to ship the feature quickly), and the demands on reviews increase.
If this is the case, pre-commit-dbt
is here to help you!
💡 Click on a hook name to view its details.
Model checks:
- `check-column-desc-are-same`: Check column descriptions are the same.
- `check-column-name-contract`: Check the column name abides by the contract.
- `check-model-columns-have-desc`: Check the model columns have descriptions.
- `check-model-has-all-columns`: Check the model has all columns in the properties file.
- `check-model-has-description`: Check the model has a description.
- `check-model-has-meta-keys`: Check the model has keys in the meta part.
- `check-model-has-properties-file`: Check the model has a properties file.
- `check-model-has-tests-by-name`: Check the model has a number of tests by test name.
- `check-model-has-tests-by-type`: Check the model has a number of tests by test type.
- `check-model-has-tests-by-group`: Check the model has a number of tests from a group of tests.
- `check-model-has-tests`: Check the model has a number of tests.
- `check-model-name-contract`: Check the model name abides by the contract.
- `check-model-parents-and-childs`: Check the model has a specific number (max/min) of parents and/or children.
- `check-model-parents-database`: Check the parent models are in a specific database.
- `check-model-parents-schema`: Check the parent models are in a specific schema.
- `check-model-tags`: Check the model has valid tags.
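For example, here is a minimal `.pre-commit-config.yaml` sketch combining a few of the model checks above (the revision, the test count, and the `files` pattern are illustrative values only; the full setup is described in the installation section below):

```yaml
repos:
  - repo: https://github.com/offbi/pre-commit-dbt
    rev: v1.0.0
    hooks:
      # every changed model must have a description
      - id: check-model-has-description
      # ...and at least two tests (note the trailing "--" separator)
      - id: check-model-has-tests
        args: ["--test-cnt", "2", "--"]
      # a `files` regex limits a hook to part of the project
      - id: check-model-has-properties-file
        files: ^models/mart
```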
Script checks:
- `check-script-semicolon`: Check the script does not contain a semicolon.
- `check-script-has-no-table-name`: Check the script has no hard-coded table names (i.e. it uses the `source()` or `ref()` macro for all tables).
- `check-script-ref-and-source`: Check the script has only existing refs and sources.
Source checks:
- `check-source-columns-have-desc`: Check the source columns have descriptions.
- `check-source-has-all-columns`: Check the source has all columns in the properties file.
- `check-source-table-has-description`: Check the source table has a description.
- `check-source-has-freshness`: Check the source has freshness defined.
- `check-source-has-loader`: Check the source has the loader option.
- `check-source-has-meta-keys`: Check the source has keys in the meta part.
- `check-source-has-tests-by-name`: Check the source has a number of tests by test name.
- `check-source-has-tests-by-type`: Check the source has a number of tests by test type.
- `check-source-has-tests`: Check the source has a number of tests.
- `check-source-tags`: Check the source has valid tags.
- `check-source-childs`: Check the source has a specific number (max/min) of children.
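Source checks follow the same configuration pattern as the model checks; for instance, these argument-free hooks from the list above could be added to the `hooks:` section of the sketch shown earlier:

```yaml
      # source-level counterparts of the model description checks
      - id: check-source-table-has-description
      - id: check-source-columns-have-desc
      - id: check-source-has-loader
```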
Macro checks:
- `check-macro-has-description`: Check the macro has a description.
- `check-macro-arguments-have-desc`: Check the macro arguments have descriptions.
Modifiers:
- `generate-missing-sources`: If any source is missing, this hook tries to create it.
- `generate-model-properties-file`: Generate a model properties file.
- `unify-column-description`: Unify column descriptions across all models.
- `replace-script-table-names`: Replace table names with `source()` or `ref()` macros in the script.
- `remove-script-semicolon`: Remove the semicolon at the end of the script.
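Unlike the checks, the modifier hooks rewrite your files, so review the resulting diff before committing. A short sketch of the corresponding `hooks:` entries (hook ids from the list above, no arguments):

```yaml
      # strip the trailing semicolon from changed scripts
      - id: remove-script-semicolon
      # swap hard-coded table names for ref()/source() macros
      - id: replace-script-table-names
```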
dbt commands:
- `dbt-clean`: Run the `dbt clean` command.
- `dbt-compile`: Run the `dbt compile` command.
- `dbt-deps`: Run the `dbt deps` command.
- `dbt-docs-generate`: Run the `dbt docs generate` command.
- `dbt-run`: Run the `dbt run` command.
- `dbt-test`: Run the `dbt test` command.
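Because most of the check hooks need the `manifest.json` that `dbt compile` produces (see the notes on `manifest.json` further below), the dbt command hooks are typically placed before them. A sketch of that ordering inside the `hooks:` section:

```yaml
      # install packages and build target/manifest.json first...
      - id: dbt-deps
      - id: dbt-compile
      # ...so hooks that read the manifest can run afterwards
      - id: check-model-has-tests
        args: ["--test-cnt", "2", "--"]
```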
❗If you have an idea for a new hook or you found a bug, let us know❗
For detailed installation and usage instructions, see the pre-commit.com site.
```
pip install pre-commit
```
- Create a file named `.pre-commit-config.yaml` in your `dbt` root folder.
- Add the list of hooks you want to run before every commit. E.g.:
```yaml
repos:
  - repo: https://github.com/offbi/pre-commit-dbt
    rev: v1.0.0
    hooks:
      - id: check-script-semicolon
      - id: check-script-has-no-table-name
      - id: dbt-test
      - id: dbt-docs-generate
      - id: check-model-has-all-columns
        name: Check columns - core
        files: ^models/core
      - id: check-model-has-all-columns
        name: Check columns - mart
        files: ^models/mart
      - id: check-model-columns-have-desc
        files: ^models/mart
```
- Optionally, run `pre-commit install` to set up the git hook scripts. With this, `pre-commit` will run automatically on `git commit`! You can also run `pre-commit run` manually after you stage the files you want to check, or `pre-commit run --all-files` to run the hooks against all of the files (not only the staged ones).
Unfortunately, you cannot natively use `pre-commit-dbt` if you are using dbt Cloud, but you can run the checks after you push changes to GitHub.
For most of the hooks, `pre-commit-dbt` needs `manifest.json` (see the requirements section in each hook's documentation), which lives in the `target` folder. Since the `target` folder is usually in `.gitignore`, you need to generate it, which means running the `dbt-compile` (or `dbt-run`) hook.
To be able to compile dbt, you also need a `profiles.yml` file with your credentials. To provide passwords and secrets, use GitHub Secrets (see the example below).
Say you want to, for example, run a check on the number of tests:
```yaml
repos:
  - repo: https://github.com/offbi/pre-commit-dbt
    rev: v1.0.0
    hooks:
      - id: check-model-has-tests
        args: ["--test-cnt", "2", "--"]
```
To be able to run this in GitHub Actions, you need to modify it to:
```yaml
repos:
  - repo: https://github.com/offbi/pre-commit-dbt
    rev: v1.0.0
    hooks:
      - id: dbt-compile
        args: ["--cmd-flags", "++profiles-dir", "."]
      - id: check-model-has-tests
        args: ["--test-cnt", "2", "--"]
```
The first step is to create a `profiles.yml` file, e.g.:
```yaml
# example profiles.yml file
jaffle_shop:
  target: dev
  outputs:
    dev:
      type: postgres
      host: localhost
      user: alice
      password: "{{ env_var('DB_PASSWORD') }}"
      port: 5432
      dbname: jaffle_shop
      schema: dbt_alice
      threads: 4
```
and store this file in the project root as `./profiles.yml`.
- Inside your GitHub repository, create the folder `.github/workflows` (unless it already exists).
- Create a new file, e.g. `main.yml`.
- Specify your workflow, e.g.:
```yaml
name: pre-commit

on:
  pull_request:
  push:
    branches: [main]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
      - id: file_changes
        uses: trilom/file-changes-action@v1.2.4
        with:
          output: ' '
      - uses: offbi/pre-commit-dbt@v1.0.0
        env:
          DB_PASSWORD: ${{ secrets.SuperSecret }}
        with:
          args: run --files ${{ steps.file_changes.outputs.files}}
```