# Releases: calogica/dbt-expectations
## dbt-expectations v0.4.1
### Fixes
- `expect_table_columns_to_match_list` removed `''` to leave columns as numbers (#98)
- `expect_table_columns_to_match_ordered_list` now explicitly casts the column list to a string type (#99); a usage sketch follows this list
- Fixed regex matching tests for Redshift by adding a Redshift-specific adapter macro in `regexp_instr` (#99, #102 @mirosval)
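For context, a minimal usage sketch of the affected test; the model and column names are placeholders, not from this release:

```yaml
models:
  - name: my_model  # placeholder model name
    tests:
      - dbt_expectations.expect_table_columns_to_match_ordered_list:
          column_list: ["col_a", "col_b", "col_c"]  # placeholder column names
```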
## dbt-expectations v0.4.0
### Breaking Changes
- Requires dbt >= 0.20
- Requires dbt-date >= 0.4.0
- Updates test macros to `tests` to support dbt >= 0.20
- Updates calls to `adapter.dispatch` to support dbt >= 0.20 (see "Changes to dispatch in dbt v0.20" #78); a sketch of the new config follows this list
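For reference, dbt 0.20 replaced the `*_dispatch_list` variables with a project-level `dispatch` config. A minimal sketch of the new form in `dbt_project.yml`; the `spark_utils` entry is illustrative, not required by this release:

```yaml
dispatch:
  - macro_namespace: dbt_utils
    # search these package namespaces, in order, when resolving dispatched macros
    search_order: ['spark_utils', 'dbt_utils']
```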
## dbt-expectations v0.3.7
## dbt-expectations v0.3.6
- Remove unnecessary macro to fix issue with 0.19.2 (#88)
## dbt-expectations v0.3.5
### Features
- Added a new macro, `expect_row_values_to_have_data_for_every_n_datepart`, which tests whether a model has values for every grouped `date_part`. For example, this tests whether a model has data for every `day` (grouped on `date_col`), either from a specified `start_date` and `end_date`, or for the `min`/`max` value of the specified `date_col`.

```yaml
tests:
  - dbt_expectations.expect_row_values_to_have_data_for_every_n_datepart:
      date_col: date_day
      date_part: day
```
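To bound the tested range explicitly, a sketch assuming the `start_date` and `end_date` parameter names from the description above; the date values are placeholders:

```yaml
tests:
  - dbt_expectations.expect_row_values_to_have_data_for_every_n_datepart:
      date_col: date_day
      date_part: day
      start_date: "2021-01-01"  # assumed parameter name; placeholder value
      end_date: "2021-06-01"    # assumed parameter name; placeholder value
```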
### Fixes
- Fixed `join` syntax because Twitter induced guilt: https://twitter.com/emilyhawkins__/status/1400967270537564160
- Bump version of dbt-date to `< 0.4.0` (#85)
## dbt-expectations v0.3.4
### Features
- Added support for optional `min_value` and `max_value` parameters to all `*_between_*` tests (#70); see the sketch after this section
- Added support for a `strictly` parameter to `between` tests. If set to `True`, `strictly` changes the operators `>=` and `<=` to `>` and `<`. For example, while

```yaml
dbt_expectations.expect_column_stdev_to_be_between:
  min_value: 0
```

evaluates to `>= 0`,

```yaml
dbt_expectations.expect_column_stdev_to_be_between:
  min_value: 0
  strictly: True
```

evaluates to `> 0`.
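To illustrate the now-optional bounds, a sketch with only `max_value` set; the column name is a placeholder:

```yaml
columns:
  - name: amount  # placeholder column name
    tests:
      - dbt_expectations.expect_column_values_to_be_between:
          max_value: 100  # min_value omitted, so only the upper bound is checked
```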
### Fixes
- Corrected a typo in the README (#67)
## dbt-expectations v0.3.3

### Under the hood
Removes the dependency on `spark-utils`. Users running this package on platforms other than the core ones (Snowflake, BigQuery, Redshift, Postgres) will need to use a package like `spark_utils` to shim macros.
For example, in `packages.yml`, you will need to include the relevant package:

```yaml
  - package: fishtown-analytics/spark_utils
    version: <latest or range>
```
And reference it in the dispatch list for `dbt_utils` in `dbt_project.yml`:

```yaml
vars:
  dbt_utils_dispatch_list: [spark_utils]
```