Allow project-level severity config for specific test types #3841
@joellabes Thanks for opening! A totally reasonable thing to want. I do think the best option today is to override/reimplement the generic test definitions, which can set default configs for all the specific tests that use them. As a side effect of the way we've implemented generic tests in dbt, and some of the tests in dbt-utils (separating the test definition from a dispatched macro containing its core logic), I think you could do this a bit more subtly than you might think:

```sql
-- macros/test_config_overrides.sql

{% test unique(model, column_name) %}
    {{ config(error_if = ">10") }}
    {{ dbt.default__test_unique(model, column_name) }}
{% endtest %}

{% test relationships(model, column_name, to, field) %}
    {{ config(warn_if = ">10", error_if = ">20") }}
    {{ dbt.default__test_relationships(model, column_name, to, field) }}
{% endtest %}

{% test at_least_one(model, column_name) %}
    {{ config(severity = 'warn') }}
    {{ dbt_utils.default__at_least_one(model, column_name) }}
{% endtest %}
```

As a current workaround, that doesn't feel too shabby. (There are some project parse timing implications of overriding these generic test definitions.) I agree that the long-term sustainable answer for this looks much more like:
```yaml
# tests/test_config_overrides.yml

tests:
  - name: unique
    overrides: dbt  # perhaps just call this `package`?
    config:
      error_if: ">10"

  - name: relationships
    overrides: dbt
    config:
      warn_if: ">10"
      error_if: ">20"

  - name: at_least_one
    overrides: dbt_utils
    config:
      severity: warn
```
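For illustration only (the model and column names here are made up): with either the macro overrides above or a config block like the one proposed, an ordinary generic test in a schema.yml file would pick up those defaults without any per-test configuration:

```yaml
# models/schema.yml (illustrative model/column names)
version: 2

models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - unique   # would inherit error_if: ">10" from the override
```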
Yeah, this is really slick! I'll do that for now 🤩 thanks!
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days.
Describe the feature
I want up to 10* incorrect records to be allowed on a model before almost any generic test should fail.
Right now, I can set `error_if`/`warn_if` thresholds on each individual test, or override the built-in generic tests to change their defaults.
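Setting the thresholds test by test works, but it gets repetitive fast; a sketch of what that looks like in a schema.yml file (model and column names are made up):

```yaml
# models/schema.yml (illustrative model/column names)
version: 2

models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - unique:
              config:
                error_if: ">10"
      - name: customer_id
        tests:
          - relationships:
              to: ref('customers')
              field: id
              config:
                warn_if: ">10"
                error_if: ">20"
```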
Instead, I'd like to be able to do something along the lines of the following in dbt_project.yml.
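As a purely hypothetical sketch of the intent (per-test-type config like this is not existing dbt syntax; the keys below are made up):

```yaml
# dbt_project.yml (hypothetical; per-test-type keys here are not real dbt config)
tests:
  my_project:
    unique:
      +error_if: ">10"
    relationships:
      +warn_if: ">10"
      +error_if: ">20"
```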
*I would also like this threshold to be configurable, probably via an environment variable. For example, I'd like a tighter internal rule, while our exposure tile job might allow 30 failing records to reflect our true SLA, so that we have time to take action before the tiles turn red.
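One way an overridden generic test could read that threshold from an environment variable is dbt's env_var() function; a rough sketch (the DBT_TEST_ERROR_THRESHOLD name is made up, and 10 is the fallback default):

```sql
-- macros/test_config_overrides.sql (sketch; the env var name is illustrative)
{% test unique(model, column_name) %}
    {# fail once more than the configured number of records are bad #}
    {{ config(error_if = ">" ~ env_var("DBT_TEST_ERROR_THRESHOLD", "10")) }}
    {{ dbt.default__test_unique(model, column_name) }}
{% endtest %}
```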
Describe alternatives you've considered
Described above
Who will this benefit?
People who don't like overriding the built-in tests
Are you interested in contributing this feature?
No, but I'm also happy to wait for it to come in as part of a wider test configuration revamp