Update my Branch #291

Merged
Changes from all commits (30 commits)
- f7acd92 Started new effectivtiy Satellite macro (tkirschke, Apr 12, 2024)
- a62a837 Added comments (tkirschke, Aug 23, 2024)
- 433369f Merge branch 'main' into effectivity_satellite (tkirschke, Sep 9, 2024)
- b0bcd49 adjust column spacing (tkirschke, Sep 9, 2024)
- d668af3 Multi source fix (tkirschke, Sep 11, 2024)
- 9b1178e copy eff_sat_v0 macros to all adapter folders (tkiehn, Sep 12, 2024)
- 543075c implement eff_sat_v0 for all adapters (tkiehn, Sep 12, 2024)
- 4fa65a3 change ldts of first appearing HKs from current_timestamp to the sour… (tkiehn, Sep 13, 2024)
- af155d0 fix records_to_inserts for multi models (tkiehn, Sep 13, 2024)
- f1e623b incorporate improvements from snowflake for all adapters, replace fir… (tkiehn, Sep 20, 2024)
- 7e0b471 Merge pull request #258 from ScalefreeCOM/cast-postgres-string-to-tim… (tkiehn, Sep 20, 2024)
- 42a6e9b change hardcoded default for is_active_alias to is_active (tkiehn, Sep 20, 2024)
- 771d7bf adapt eff sat macros for bigquery, exasol, postgres (tkiehn, Sep 20, 2024)
- 3cda87b cast exasol string_to_timestamp to timestamp_default_dtype (tkiehn, Sep 20, 2024)
- 130bba2 remove unused rownumber and LAG() (tkiehn, Sep 25, 2024)
- 3b42687 finalize eff_sats for bigquery, posgres, redshift and synapse (tkiehn, Sep 27, 2024)
- 58e1d65 Merge pull request #263 from ScalefreeCOM/main (tkiehn, Sep 27, 2024)
- 2554245 add eff_sat for oracle (tderk, Oct 11, 2024)
- 26823c8 add eff_sat_v0 for fabric (tkiehn, Oct 23, 2024)
- ab9dd5b Merge pull request #281 from ScalefreeCOM/main (tkiehn, Nov 7, 2024)
- 547447c fix:synapse/exasol: ghostrecords for timestamps didnt use alias-variable (tkiehn, Nov 11, 2024)
- b1487d5 Merge pull request #284 from ScalefreeCOM/fix/synapse&exasol_ghostrec… (tkiehn, Nov 11, 2024)
- aa6c942 add databricks__eff_sat_v0-macro (tkiehn, Nov 11, 2024)
- 4b9b528 Merge pull request #285 from ScalefreeCOM/effectivity_satellites_othe… (tkirschke, Nov 12, 2024)
- 5504a5e Update README.md (tkirschke, Nov 12, 2024)
- 7c32fcb Update README.md, add Snap Control view link, fix wiki link (tkiehn, Nov 12, 2024)
- b15e3e5 Update README.md (tkirschke, Nov 12, 2024)
- 44a2c91 Merge pull request #286 from ScalefreeCOM/tkirschke-patch-2 (tkirschke, Nov 12, 2024)
- dbf902e Fixed PARTION BY Clause in Sat v0 for Fabric (tkirschke, Nov 18, 2024)
- 8a73818 Merge pull request #289 from ScalefreeCOM/tkirschke-patch-2 (tkiehn, Nov 19, 2024)
32 changes: 15 additions & 17 deletions README.md
@@ -1,25 +1,26 @@
# datavault4dbt by [Scalefree International GmbH](https://www.scalefree.com)

<img src="https://user-images.githubusercontent.com/81677440/195860893-435b5faa-71f1-4e01-969d-3593a808daa8.png">
[<img src="https://user-images.githubusercontent.com/81677440/195860893-435b5faa-71f1-4e01-969d-3593a808daa8.png">](https://www.datavault4dbt.com/)


---

### Included Macros
- Staging Area (For Hashing, prejoins and ghost records)
- Hubs, Links & Satellites (allowing multiple deltas)
- Non-Historized Links and Satellites
- Multi-Active Satellites
- Virtualized End-Dating (in Satellites)
- Reference Hubs, - Satellites, and - Tables
- PIT Tables
- Hook for Cleaning up PITs
- Snapshot Control
- [Staging Area (For Hashing, prejoins and ghost records)](https://www.datavault4dbt.com/documentation/macro-instructions/staging/)
- [Hubs](https://www.datavault4dbt.com/documentation/macro-instructions/hubs/standard-hub/), [Links](https://www.datavault4dbt.com/documentation/macro-instructions/links/standard-link/) & [Satellites](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/standard-satellite/standard-satellite-v0/) (allowing multiple deltas)
- [Non-Historized Links](https://www.datavault4dbt.com/documentation/macro-instructions/links/non-historized-link/) and [Satellites](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/non-historized-satellite/)
- [Multi-Active Satellites](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/multi-active-satellite/multi-active-satellite-v0/)
- [Effectivity](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/effectivity-satellite/) and [Record Tracking Satellites](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/record-tracking-satellite/)
- [Virtualized End-Dating (in Satellites)](https://www.datavault4dbt.com/documentation/macro-instructions/satellites/standard-satellite/standard-satellite-v1/)
- [Reference Hubs](https://www.datavault4dbt.com/documentation/macro-instructions/reference-data/reference-hub/), [- Satellites](https://www.datavault4dbt.com/documentation/macro-instructions/reference-data/reference-satellite/reference-satellite-v0/), and [- Tables](https://www.datavault4dbt.com/documentation/macro-instructions/reference-data/reference-tables/)
- [PIT Tables](https://www.datavault4dbt.com/documentation/macro-instructions/business-vault/pit/)
- [Hook for Cleaning up PITs](https://www.datavault4dbt.com/documentation/macro-instructions/business-vault/pit/hook-cleanup-pits/)
- Snapshot Control [Tables](https://www.datavault4dbt.com/documentation/macro-instructions/business-vault/snapshot-control/snapshot-control-v0/) and [Views](https://www.datavault4dbt.com/documentation/macro-instructions/business-vault/snapshot-control/snapshot-control-v1/)

### Features
With datavault4dbt you will get a lot of awesome features, including:
- A Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt
- Ready for both Persistent Staging Areas and Transient Staging Areas, due to the allowance of multiple deltas in all macros, without losing any intermediate changes - Enforcing standards in naming conventions by implementing [global variables](https://github.com/ScalefreeCOM/datavault4dbt/wiki/Global-variables) for technical columns
- Ready for both Persistent Staging Areas and Transient Staging Areas, due to the allowance of multiple deltas in all macros, without losing any intermediate changes - Enforcing standards in naming conventions by implementing [global variables](https://www.datavault4dbt.com/documentation/general-usage-notes/global-variables/) for technical columns
- A fully auditable solution for a Data Warehouse
- Creating a centralized, snapshot-based Business interface by using a centralized snapshot table supporting logarithmic logic
- A modern insert-only approach that avoids updating data
@@ -36,11 +37,8 @@ To use the macros efficiently, there are a few prerequisites you need to provide
<img src="https://www.getdbt.com/ui/img/logos/dbt-logo.svg" width=33% align=right>

### Resources:
- Find technical information about the macros, examples, and more, on [the official datavault4dbt Website](https://www.datavault4dbt.com/)!
- Learn more about dbt [in the docs](https://docs.getdbt.com/docs/introduction)
- Check out [Discourse](https://discourse.getdbt.com/) for commonly asked questions and answers
- Join the [chat](https://community.getdbt.com/) on Slack for live discussions and support
- Check out [the blog](https://blog.getdbt.com/) for the latest news on dbt's development and best practices
- Find [dbt events](https://events.getdbt.com) near you
- Check out the [Scalefree-Blog](https://www.scalefree.com/blog/)
- [Data-Vault 2.0 with dbt #1](https://www.scalefree.com/scalefree-newsletter/data-vault-2-0-with-dbt-part-1/)
- [Data-Vault 2.0 with dbt #2](https://www.scalefree.com/scalefree-newsletter/data-vault-2-0-with-dbt-part-2/)
@@ -79,11 +77,11 @@ For further information on how to install packages in dbt, please visit the following link:
[https://docs.getdbt.com/docs/building-a-dbt-project/package-management](https://docs.getdbt.com/docs/building-a-dbt-project/package-management#how-do-i-add-a-package-to-my-project)

### Global variables
datavault4dbt is highly customizable by using many global variables. Since they are applied on multiple levels, a high rate of standardization across your data vault 2.0 solution is guaranteed. The default values of those variables are set inside the packages `dbt_project.yml` and should be copied to your own `dbt_project.yml`. For an explanation of all global variables see [the wiki](https://github.com/ScalefreeCOM/datavault4dbt/wiki/Global-variables).
datavault4dbt is highly customizable by using many global variables. Since they are applied on multiple levels, a high rate of standardization across your data vault 2.0 solution is guaranteed. The default values of those variables are set inside the packages `dbt_project.yml` and should be copied to your own `dbt_project.yml`. For an explanation of all global variables see [the docs](https://www.datavault4dbt.com/documentation/general-usage-notes/global-variables/).

---
## Usage
The datavault4dbt package provides macros for Staging and Creation of all DataVault-Entities you need, to build your own DataVault2.0 solution. The usage of the macros is well-explained in the documentation: https://github.com/ScalefreeCOM/datavault4dbt/wiki
The datavault4dbt package provides macros for Staging and Creation of all DataVault-Entities you need, to build your own DataVault2.0 solution. The usage of the macros is well-explained in the [documentation]([url](https://www.datavault4dbt.com/documentation/)).
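
As a minimal sketch of that usage (an editor's illustration, not part of this diff): a staged source can be turned into a Hub with a single macro call inside a dbt model. The model file name, hashkey, business key, and staging model below are hypothetical placeholders, and the exact parameter names should be checked against the documentation.

```sql
-- models/raw_vault/customer_h.sql  (hypothetical example model)
{{ config(materialized='incremental') }}

{{ datavault4dbt.hub(
    hashkey='hk_customer_h',          -- assumed hashkey column produced in staging
    business_keys=['customer_id'],    -- assumed business key(s)
    source_models='stg_customer'      -- assumed staging model name
) }}
```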

---
## Contributing
8 changes: 4 additions & 4 deletions macros/supporting/ghost_record_per_datatype.sql
@@ -82,7 +82,7 @@

{%- if ghost_record_type == 'unknown' -%}

{%- if datatype == 'TIMESTAMP' or datatype == 'TIMESTAMP WITH LOCAL TIMEZONE' %} {{- datavault4dbt.string_to_timestamp( timestamp_format , beginning_of_all_times) }} as "{{ column_name }}"
{%- if datatype == 'TIMESTAMP' or datatype == 'TIMESTAMP WITH LOCAL TIMEZONE' %} {{- datavault4dbt.string_to_timestamp( timestamp_format , beginning_of_all_times) }} as {{ alias }}
{%- elif datatype == 'DATE'-%} TO_DATE('{{ beginning_of_all_times_date }}', '{{ date_format }}' ) as {{ alias }}
{%- elif datatype.upper().startswith('VARCHAR') -%}
{%- if col_size is not none -%}
@@ -110,7 +110,7 @@

{%- elif ghost_record_type == 'error' -%}

{%- if datatype == 'TIMESTAMP' or datatype == 'TIMESTAMP WITH LOCAL TIME ZONE' %} {{- datavault4dbt.string_to_timestamp( timestamp_format , end_of_all_times) }} as "{{ column_name }}"
{%- if datatype == 'TIMESTAMP' or datatype == 'TIMESTAMP WITH LOCAL TIME ZONE' %} {{- datavault4dbt.string_to_timestamp( timestamp_format , end_of_all_times) }} as {{ alias }}
{%- elif datatype == 'DATE'-%} TO_DATE('{{ end_of_all_times_date }}', '{{ date_format }}' ) as {{ alias }}
{%- elif datatype.upper().startswith('VARCHAR') -%}
{%- if col_size is not none -%}
@@ -267,7 +267,7 @@

{%- if ghost_record_type == 'unknown' -%}

{%- if datatype in ['DATETIME', 'DATETIME2', 'DATETIMEOFFSET'] %} CONVERT({{ datatype }}, {{- datavault4dbt.string_to_timestamp( timestamp_format , beginning_of_all_times) }}) as "{{ column_name }}"
{%- if datatype in ['DATETIME', 'DATETIME2', 'DATETIMEOFFSET'] %} CONVERT({{ datatype }}, {{- datavault4dbt.string_to_timestamp( timestamp_format , beginning_of_all_times) }}) as "{{ alias }}"
{%- elif 'CHAR' in datatype -%}
{%- if col_size is not none -%}
{%- if (col_size | int) == -1 -%}
@@ -297,7 +297,7 @@

{%- elif ghost_record_type == 'error' -%}

{%- if datatype in ['DATETIME', 'DATETIME2', 'DATETIMEOFFSET'] %} CONVERT({{ datatype }}, {{- datavault4dbt.string_to_timestamp( timestamp_format , end_of_all_times) }}) as "{{ column_name }}"
{%- if datatype in ['DATETIME', 'DATETIME2', 'DATETIMEOFFSET'] %} CONVERT({{ datatype }}, {{- datavault4dbt.string_to_timestamp( timestamp_format , end_of_all_times) }}) as "{{ alias }}"
{%- elif 'CHAR' in datatype -%}
{%- if col_size is not none -%}
{%- if (col_size | int) == -1 -%}
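
For readers skimming the diff above, a minimal sketch of what the alias fix changes in the rendered Oracle "unknown" ghost record, assuming a load-date column whose configured alias is ldts; the timestamp literal and format string are illustrative assumptions, not verified package defaults:

```sql
-- before: the alias was hard-wired to the quoted source column name
TO_TIMESTAMP('0001-01-01T00-00-01', 'YYYY-MM-DD"T"HH24-MI-SS') as "load_date"

-- after: the alias comes from the configurable alias variable
TO_TIMESTAMP('0001-01-01T00-00-01', 'YYYY-MM-DD"T"HH24-MI-SS') as ldts
```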
2 changes: 1 addition & 1 deletion macros/supporting/string_to_timestamp.sql
@@ -8,7 +8,7 @@
{%- endmacro -%}

{%- macro exasol__string_to_timestamp(format, timestamp) -%}
TO_TIMESTAMP('{{ timestamp }}', '{{ format }}')
CAST(TO_TIMESTAMP('{{ timestamp }}', '{{ format }}') AS {{ datavault4dbt.timestamp_default_dtype() }})
{%- endmacro -%}

{%- macro snowflake__string_to_timestamp(format, timestamp) -%}
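
A minimal sketch of what the Exasol change renders to, assuming end_of_all_times is '8888-12-31T23-59-59', the timestamp format is 'YYYY-mm-ddTHH-MI-SS', and timestamp_default_dtype() resolves to TIMESTAMP; all three values are illustrative assumptions rather than verified package defaults:

```sql
-- before: the ghost-record timestamp kept whatever type TO_TIMESTAMP returns
TO_TIMESTAMP('8888-12-31T23-59-59', 'YYYY-mm-ddTHH-MI-SS')

-- after: the result is cast to the project-wide default timestamp datatype
CAST(TO_TIMESTAMP('8888-12-31T23-59-59', 'YYYY-mm-ddTHH-MI-SS') AS TIMESTAMP)
```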