V1.8.8 PR #246

Merged: 53 commits, Dec 1, 2024

Commits
99e45af  Testing OIDC (prdpsvs, Nov 22, 2024)
e5a6425  Testing OIDC (prdpsvs, Nov 22, 2024)
193819e  Testing OIDC (prdpsvs, Nov 22, 2024)
8a37a3d  Testing OIDC (prdpsvs, Nov 23, 2024)
0f23143  Testing OIDC (prdpsvs, Nov 23, 2024)
5c016f9  Testing OIDC (prdpsvs, Nov 23, 2024)
70d7d7a  Testing OIDC (prdpsvs, Nov 23, 2024)
0e37adb  Testing OIDC (prdpsvs, Nov 23, 2024)
a8408c1  Testing OIDC (prdpsvs, Nov 23, 2024)
5b02613  Testing OIDC (prdpsvs, Nov 23, 2024)
abef333  Testing OIDC (prdpsvs, Nov 23, 2024)
28eacc1  Testing OIDC (prdpsvs, Nov 23, 2024)
b438c4a  Testing OIDC (prdpsvs, Nov 23, 2024)
836c945  Testing OIDC (prdpsvs, Nov 23, 2024)
3fe221c  Testing OIDC (prdpsvs, Nov 23, 2024)
a798983  Testing OIDC (prdpsvs, Nov 23, 2024)
cc9827b  Testing OIDC (prdpsvs, Nov 23, 2024)
93ca65e  Testing OIDC (prdpsvs, Nov 23, 2024)
f093217  Testing OIDC (prdpsvs, Nov 23, 2024)
ef96e1a  Testing OIDC (prdpsvs, Nov 23, 2024)
15f3c7d  Testing OIDC (prdpsvs, Nov 23, 2024)
7dbc8a7  Testing OIDC (prdpsvs, Nov 23, 2024)
a3bf019  Testing OIDC (prdpsvs, Nov 23, 2024)
955a2f5  Testing OIDC (prdpsvs, Nov 23, 2024)
5b32d43  Testing OIDC (prdpsvs, Nov 23, 2024)
0a39666  Testing OIDC (prdpsvs, Nov 23, 2024)
5825ed2  Testing OIDC (prdpsvs, Nov 23, 2024)
30c23d7  Testing OIDC (prdpsvs, Nov 23, 2024)
b66838f  Testing OIDC (prdpsvs, Nov 25, 2024)
09480c4  Testing OIDC (prdpsvs, Nov 25, 2024)
050b35b  Testing OIDC (prdpsvs, Nov 25, 2024)
0961457  Testing OIDC (prdpsvs, Nov 25, 2024)
3d65834  Testing OIDC (prdpsvs, Nov 25, 2024)
3cf4060  Testing OIDC (prdpsvs, Nov 25, 2024)
aa37756  Testing OIDC (prdpsvs, Nov 25, 2024)
64e1a50  Testing OIDC (prdpsvs, Nov 25, 2024)
35491f1  Testing OIDC (prdpsvs, Nov 25, 2024)
3baeb4d  Testing OIDC (prdpsvs, Nov 25, 2024)
9cb00eb  Testing OIDC (prdpsvs, Nov 25, 2024)
fbe564d  Testing OIDC (prdpsvs, Nov 25, 2024)
46f0559  Testing OIDC (prdpsvs, Nov 26, 2024)
c86e466  Testing OIDC (prdpsvs, Nov 26, 2024)
6b3115f  Testing OIDC (prdpsvs, Nov 26, 2024)
6a5c076  Testing OIDC (prdpsvs, Nov 26, 2024)
61eb419  Testing OIDC (prdpsvs, Nov 26, 2024)
4e738e6  Testing OIDC (prdpsvs, Nov 26, 2024)
a650d99  Testing OIDC (prdpsvs, Nov 26, 2024)
9c85649  configuring local tests to run on user credentials. Dropping a relati… (prdpsvs, Nov 27, 2024)
19ff404  Addressing #243, #221, #228, #229, #232, #235 issues (prdpsvs, Nov 28, 2024)
bdb4b81  Updated test helper with ephemeral (prdpsvs, Dec 1, 2024)
899a990  Updated unit tests (prdpsvs, Dec 1, 2024)
4d94cdd  Updated get_pyodbc_attrs_before_credentials method (prdpsvs, Dec 1, 2024)
dd918ed  include only lines that start with order by (prdpsvs, Dec 1, 2024)
52 changes: 37 additions & 15 deletions .github/workflows/integration-tests-azure.yml
@@ -1,32 +1,56 @@
 ---
-name: Integration tests on Azure
-on: # yamllint disable-line rule:truthy
+name: Integration tests on Fabric DW
+on: # yamllint disable-line rule:truthy
 workflow_dispatch:
 pull_request:
 branches:
 - main

 jobs:
-integration-tests-azure:
+integration-tests-fabric-dw:
 name: Regular
 strategy:
 fail-fast: false
 max-parallel: 1
 matrix:
-profile: ["ci_azure_auto"]
+profile: ["integration_tests"]
 python_version: ["3.11"]
-msodbc_version: ["17", "18"]
+msodbc_version: ["18"]

 runs-on: ubuntu-latest
+permissions:
+contents: read # Required to access repository files
+packages: read # Grant explicit read access to packages
+id-token: write # Needed if using OIDC authentication
 container:
 image: ghcr.io/${{ github.repository }}:CI-${{ matrix.python_version }}-msodbc${{ matrix.msodbc_version }}
 steps:
-- name: AZ CLI login
-run: az login --service-principal --username="${AZURE_CLIENT_ID}" --password="${AZURE_CLIENT_SECRET}" --tenant="${AZURE_TENANT_ID}"
-env:
-AZURE_CLIENT_ID: ${{ secrets.DBT_AZURE_SP_NAME }}
-AZURE_CLIENT_SECRET: ${{ secrets.DBT_AZURE_SP_SECRET }}
-AZURE_TENANT_ID: ${{ secrets.DBT_AZURE_TENANT }}
+# Azure login using federated credentials
+- name: Azure login with OIDC
+uses: azure/login@v2
+with:
+client-id: ${{ secrets.DBT_AZURE_SP_NAME }}
+tenant-id: ${{ secrets.DBT_AZURE_TENANT }}
+allow-no-subscriptions: true
+federated-token: true
+
+- name: Test Connection To Fabric Data Warehouse
+id: fetch_token
+run: |
+pip install azure-identity pyodbc azure-core
+
+python - <<EOF
+from azure.core.credentials import AccessToken
+from azure.identity import DefaultAzureCredential
+import pyodbc
+import logging
+import struct
+try:
+credential = DefaultAzureCredential()
+token = credential.get_token("https://database.windows.net/.default")
+print(f"::set-output name=access_token::{token.token}")
+except pyodbc.Error as e:
+logging.error("Error occurred while connecting to the database.", exc_info=True)
+EOF

 - uses: actions/checkout@v4

@@ -37,9 +61,7 @@ jobs:
 env:
 DBT_AZURESQL_SERVER: ${{ secrets.DBT_AZURESQL_SERVER }}
 DBT_AZURESQL_DB: ${{ secrets.DBT_AZURESQL_DB }}
-AZURE_CLIENT_ID: ${{ secrets.DBT_AZURE_SP_NAME }}
-AZURE_CLIENT_SECRET: ${{ secrets.DBT_AZURE_SP_SECRET }}
-AZURE_TENANT_ID: ${{ secrets.DBT_AZURE_TENANT }}
+FABRIC_INTEGRATION_TESTS_TOKEN: ${{ steps.fetch_token.outputs.access_token }}
 FABRIC_TEST_DRIVER: 'ODBC Driver ${{ matrix.msodbc_version }} for SQL Server'
 DBT_TEST_USER_1: dbo
 DBT_TEST_USER_2: dbo
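One caveat in the token-fetch step above: the `::set-output` workflow command it uses has been deprecated by GitHub Actions in favor of appending `name=value` lines to the file named by the `GITHUB_OUTPUT` environment variable. A minimal sketch of the replacement; the helper name `export_output` is ours, not part of the PR:

```python
import os

def export_output(name: str, value: str) -> None:
    """Append a step output in the GITHUB_OUTPUT file format."""
    # GitHub Actions reads "name=value" lines from the file whose path
    # is supplied in the GITHUB_OUTPUT environment variable.
    with open(os.environ["GITHUB_OUTPUT"], "a", encoding="utf-8") as fh:
        fh.write(f"{name}={value}\n")
```

Inside the workflow's inline Python, `export_output("access_token", token.token)` would replace the `print(f"::set-output ...")` line while keeping `steps.fetch_token.outputs.access_token` usable downstream.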
4 changes: 4 additions & 0 deletions .github/workflows/publish-docker.yml
@@ -1,6 +1,7 @@
 ---
 name: Publish Docker images for CI/CD
 on: # yamllint disable-line rule:truthy
+workflow_dispatch:
 push:
 paths:
 - 'devops/**'
@@ -40,3 +41,6 @@ jobs:
 platforms: linux/amd64
 target: ${{ matrix.docker_target }}
 tags: ghcr.io/${{ github.repository }}:CI-${{ matrix.python_version }}-${{ matrix.docker_target }}
+
+- name: List Docker images
+run: docker images ghcr.io/${{ github.repository }}
2 changes: 1 addition & 1 deletion .github/workflows/unit-tests.yml
@@ -16,7 +16,7 @@ jobs:
 name: Unit tests
 strategy:
 matrix:
-python_version: ["3.8", "3.9", "3.10", "3.11"]
+python_version: ["3.10", "3.11"]
 runs-on: ubuntu-latest
 permissions:
 contents: read
45 changes: 41 additions & 4 deletions dbt/adapters/fabric/fabric_connection_manager.py
@@ -180,7 +180,7 @@ def get_environment_access_token(credentials: FabricCredentials) -> AccessToken:
     }


-def get_pyodbc_attrs_before(credentials: FabricCredentials) -> Dict:
+def get_pyodbc_attrs_before_credentials(credentials: FabricCredentials) -> Dict:
     """
     Get the pyodbc attrs before.

@@ -220,6 +220,36 @@ def get_pyodbc_attrs_before(credentials: FabricCredentials) -> Dict:
     return attrs_before


+def get_pyodbc_attrs_before_accesstoken(accessToken: str) -> Dict:
+    """
+    Get the pyodbc attrs before.
+
+    Parameters
+    ----------
+    credentials : Access Token for Integration Tests
+        Credentials.
+
+    Returns
+    -------
+    out : Dict
+        The pyodbc attrs before.
+
+    Source
+    ------
+    Authentication for SQL server with an access token:
+    https://docs.microsoft.com/en-us/sql/connect/odbc/using-azure-active-directory?view=sql-server-ver15#authenticating-with-an-access-token
+    """
+
+    access_token_utf16 = accessToken.encode("utf-16-le")
+    token_struct = struct.pack(
+        f"<I{len(access_token_utf16)}s", len(access_token_utf16), access_token_utf16
+    )
+    sql_copt_ss_access_token = 1256  # see source in docstring
+    attrs_before = {sql_copt_ss_access_token: token_struct}
+
+    return attrs_before
+
+
 def bool_to_connection_string_arg(key: str, value: bool) -> str:
     """
     Convert a boolean to a connection string argument.
@@ -323,15 +353,18 @@ def open(cls, connection: Connection) -> Connection:

         con_str.append(f"Database={credentials.database}")

-        #Enabling trace flag
+        # Enabling trace flag
         if credentials.trace_flag:
             con_str.append("SQL_ATTR_TRACE=SQL_OPT_TRACE_ON")
         else:
             con_str.append("SQL_ATTR_TRACE=SQL_OPT_TRACE_OFF")

         assert credentials.authentication is not None

-        if "ActiveDirectory" in credentials.authentication:
+        if (
+            "ActiveDirectory" in credentials.authentication
+            and credentials.authentication != "ActiveDirectoryAccessToken"
+        ):
             con_str.append(f"Authentication={credentials.authentication}")

         if credentials.authentication == "ActiveDirectoryPassword":
@@ -395,7 +428,11 @@ def open(cls, connection: Connection) -> Connection:
         def connect():
             logger.debug(f"Using connection string: {con_str_display}")

-            attrs_before = get_pyodbc_attrs_before(credentials)
+            if credentials.authentication == "ActiveDirectoryAccessToken":
+                attrs_before = get_pyodbc_attrs_before_accesstoken(credentials.access_token)
+            else:
+                attrs_before = get_pyodbc_attrs_before_credentials(credentials)
+
             handle = pyodbc.connect(
                 con_str_concat,
                 attrs_before=attrs_before,
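The access-token path added above hinges on the driver's SQL_COPT_SS_ACCESS_TOKEN pre-connect attribute. A standalone sketch of the same packing, pulled out of the diff for clarity (the helper name `make_attrs_before` is ours):

```python
import struct

# Pre-connect attribute ID defined by the Microsoft ODBC driver for SQL Server.
SQL_COPT_SS_ACCESS_TOKEN = 1256

def make_attrs_before(access_token: str) -> dict:
    # The driver expects the token as a length-prefixed UTF-16-LE blob:
    # a little-endian uint32 byte count followed by the encoded token.
    token_bytes = access_token.encode("utf-16-le")
    token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)
    return {SQL_COPT_SS_ACCESS_TOKEN: token_struct}
```

Usage mirrors what `open()` now does for `ActiveDirectoryAccessToken` authentication: `pyodbc.connect(con_str, attrs_before=make_attrs_before(token))`.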
1 change: 1 addition & 0 deletions dbt/adapters/fabric/fabric_credentials.py
@@ -17,6 +17,7 @@ class FabricCredentials(Credentials):
     tenant_id: Optional[str] = None
     client_id: Optional[str] = None
     client_secret: Optional[str] = None
+    access_token: Optional[str] = None
     authentication: Optional[str] = "ActiveDirectoryServicePrincipal"
     encrypt: Optional[bool] = True  # default value in MS ODBC Driver 18 as well
     trust_cert: Optional[bool] = False  # default value in MS ODBC Driver 18 as well
4 changes: 2 additions & 2 deletions dbt/include/fabric/macros/adapters/catalog.sql
@@ -96,7 +96,7 @@
     c.column_id as column_index,
     t.name as column_type
 from sys.columns as c {{ information_schema_hints() }}
-left join sys.types as t on c.system_type_id = t.system_type_id {{ information_schema_hints() }}
+left join sys.types as t on c.system_type_id = t.system_type_id
 )

 select
@@ -223,7 +223,7 @@
     c.column_id as column_index,
     t.name as column_type
 from sys.columns as c {{ information_schema_hints() }}
-left join sys.types as t on c.system_type_id = t.system_type_id {{ information_schema_hints() }}
+left join sys.types as t on c.system_type_id = t.system_type_id
 )

 select
30 changes: 9 additions & 21 deletions dbt/include/fabric/macros/adapters/show.sql
@@ -1,24 +1,12 @@
 {% macro fabric__get_limit_sql(sql, limit) %}

-    {% if limit == -1 or limit is none %}
-    {% if sql.strip().lower().startswith('with') %}
-    {{ sql }}
-    {% else -%}
-    select *
-    from (
-    {{ sql }}
-    ) as model_limit_subq
-    {%- endif -%}
-    {% else -%}
-    {% if sql.strip().lower().startswith('with') %}
-    {{ sql }} order by (select null)
-    offset 0 rows fetch first {{ limit }} rows only
-    {% else -%}
-    select *
-    from (
-    {{ sql }}
-    ) as model_limit_subq order by (select null)
-    offset 0 rows fetch first {{ limit }} rows only
-    {%- endif -%}
+    {%- if limit == -1 or limit is none -%}
+    {{ sql }}
+    {#- Special processing if the last non-blank line starts with order by -#}
+    {%- elif sql.strip().splitlines()[-1].strip().lower().startswith('order by') -%}
+    {{ sql }}
+    offset 0 rows fetch first {{ limit }} rows only
+    {%- else -%}
+    {{ sql }}
+    order by (select null) offset 0 rows fetch first {{ limit }} rows only
+    {%- endif -%}
 {% endmacro %}
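The rewritten macro's branching is easier to follow outside Jinja. A Python rendering of the same decision logic (our sketch, not code from the PR):

```python
def apply_limit(sql: str, limit) -> str:
    # No limit requested: pass the compiled SQL through untouched.
    if limit == -1 or limit is None:
        return sql
    # T-SQL's OFFSET/FETCH requires an ORDER BY clause, so check whether
    # the last non-blank line of the statement already supplies one.
    last_line = sql.strip().splitlines()[-1].strip().lower()
    if last_line.startswith("order by"):
        return f"{sql}\noffset 0 rows fetch first {limit} rows only"
    # Otherwise order by a constant so OFFSET/FETCH is syntactically valid.
    return f"{sql}\norder by (select null) offset 0 rows fetch first {limit} rows only"
```

Note this is also what commit dd918ed ("include only lines that start with order by") tightens: only the final line is inspected, so an `order by` buried inside a subquery no longer suppresses the dummy `order by (select null)`.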
25 changes: 15 additions & 10 deletions dbt/include/fabric/macros/materializations/models/table/table.sql
@@ -35,20 +35,25 @@

 {% if existing_relation is not none and existing_relation.is_table %}

-  -- making a backup relation, this will come in use when contract is enforced or not
-  {%- set backup_relation = make_backup_relation(existing_relation, 'table') -%}
+  -- making a backup relation, this will come in use when contract is enforced or not
+  {%- set set_backup_relation = adapter.get_relation(database=this.database, schema=this.schema, identifier=this.identifier) -%}
+  {% if (set_backup_relation != none and set_backup_relation.type == "table") %}
+    {%- set backup_relation = make_backup_relation(target_relation, 'table') -%}
+  {% elif (set_backup_relation != none and set_backup_relation.type == "view") %}
+    {%- set backup_relation = make_backup_relation(target_relation, 'view') -%}
+  {% endif %}

-  -- Dropping a temp relation if it exists
-  {{ adapter.drop_relation(backup_relation) }}
+  -- Dropping a temp relation if it exists
+  {{ adapter.drop_relation(backup_relation) }}

-  -- Rename existing relation to back up relation
-  {{ adapter.rename_relation(existing_relation, backup_relation) }}
+  -- Rename existing relation to back up relation
+  {{ adapter.rename_relation(existing_relation, backup_relation) }}

-  -- Renaming temp relation as main relation
-  {{ adapter.rename_relation(temp_relation, target_relation) }}
+  -- Renaming temp relation as main relation
+  {{ adapter.rename_relation(temp_relation, target_relation) }}

-  -- Drop backup relation
-  {{ adapter.drop_relation(backup_relation) }}
+  -- Drop backup relation
+  {{ adapter.drop_relation(backup_relation) }}

 {%- else %}
54 changes: 5 additions & 49 deletions dbt/include/fabric/macros/materializations/snapshots/helpers.sql
@@ -3,56 +3,12 @@
     {% do drop_relation_if_exists(staging_relation) %}
 {% endmacro %}

---Due to Alter not being supported, have to rely on this for temporarily
 {% macro fabric__create_columns(relation, columns) %}
-    {# default__ macro uses "add column"
-    TSQL preferes just "add"
-    #}
-
-    {% set columns %}
-    {% for column in columns %}
-    , CAST(NULL AS {{column.data_type}}) AS {{column.name}}
-    {% endfor %}
-    {% endset %}
-
-    {% set tempTableName %}
-    [{{relation.database}}].[{{ relation.schema }}].[{{ relation.identifier }}_{{ range(1300, 19000) | random }}]
-    {% endset %}
-    {{ log("Creating new columns are not supported without dropping a table. Using random table as a temp table. - " ~ tempTableName) }}
-
-    {% set tempTable %}
-    CREATE TABLE {{tempTableName}}
-    AS SELECT * {{columns}} FROM [{{relation.database}}].[{{ relation.schema }}].[{{ relation.identifier }}] {{ information_schema_hints() }} {{ apply_label() }}
-    {% endset %}
-
-    {% call statement('create_temp_table') -%}
-    {{ tempTable }}
-    {%- endcall %}
-
-    {% set dropTable %}
-    DROP TABLE [{{relation.database}}].[{{ relation.schema }}].[{{ relation.identifier }}]
-    {% endset %}
-
-    {% call statement('drop_table') -%}
-    {{ dropTable }}
-    {%- endcall %}
-
-    {% set createTable %}
-    CREATE TABLE {{ relation }}
-    AS SELECT * FROM {{tempTableName}} {{ information_schema_hints() }} {{ apply_label() }}
-    {% endset %}
-
-    {% call statement('create_Table') -%}
-    {{ createTable }}
-    {%- endcall %}
-
-    {% set dropTempTable %}
-    DROP TABLE {{tempTableName}}
-    {% endset %}
-
-    {% call statement('drop_temp_table') -%}
-    {{ dropTempTable }}
-    {%- endcall %}
+    {% for column in columns %}
+        {% call statement() %}
+            alter table {{ relation.render() }} add "{{ column.name }}" {{ column.data_type }} NULL;
+        {% endcall %}
+    {% endfor %}
 {% endmacro %}

 {% macro fabric__get_true_sql() %}
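The simplified macro above emits one ALTER per new column instead of rebuilding the table through a randomly named temp copy. The SQL it produces can be sketched as follows; the `render_add_column_sql` helper is hypothetical, for illustration only:

```python
def render_add_column_sql(relation: str, columns: list[tuple[str, str]]) -> list[str]:
    # Mirrors the loop in fabric__create_columns: each new column is added
    # as a NULLable column via its own ALTER TABLE statement.
    return [
        f'alter table {relation} add "{name}" {data_type} NULL;'
        for name, data_type in columns
    ]
```

Running it for a single snapshot column shows the shape of each statement the macro executes.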
@@ -31,10 +31,11 @@

 -- Create a temporary view to manage if user SQl uses CTE
 {% set temp_snapshot_relation_sql = model['compiled_code'].replace("'", "''") %}
-{% call statement('create temp_snapshot_relation') %}
-    EXEC('DROP VIEW IF EXISTS {{ temp_snapshot_relation.include(database=False) }};');
-    EXEC('create view {{ temp_snapshot_relation.include(database=False) }} as {{ temp_snapshot_relation_sql }};');
-{% endcall %}
+{{ adapter.drop_relation(temp_snapshot_relation) }}
+
+{% call statement('create temp_snapshot_relation') -%}
+    {{ get_create_view_as_sql(temp_snapshot_relation, temp_snapshot_relation_sql) }}
+{%- endcall %}

 {% if not target_relation_exists %}