diff --git a/website/docs/docs/build/about-metricflow.md b/website/docs/docs/build/about-metricflow.md index 6f4764a83b5..25e4e86b1c4 100644 --- a/website/docs/docs/build/about-metricflow.md +++ b/website/docs/docs/build/about-metricflow.md @@ -8,7 +8,7 @@ pagination_next: "docs/build/join-logic" pagination_prev: null --- -This guide introduces MetricFlow's fundamental ideas for people new to this feature. MetricFlow, which powers the dbt Semantic Layer, helps you define and manage the logic for your company's metrics. It's an opinionated set of abstractions and helps data consumers retrieve metric datasets from a data platform quickly and efficiently. +This guide introduces MetricFlow's fundamental ideas for people new to this feature. MetricFlow, which powers the dbt Semantic Layer, helps you define and manage the logic for your company's metrics. It's an opinionated set of abstractions and helps data consumers retrieve metric datasets from a data platform quickly and efficiently. MetricFlow handles SQL query construction and defines the specification for dbt semantic models and metrics. It allows you to define metrics in your dbt project and query them with [MetricFlow commands](/docs/build/metricflow-commands) whether in dbt Cloud or dbt Core. @@ -284,5 +284,4 @@ Yes, but because a dimension is considered an attribute of the primary or unique ## Related docs - [Joins](/docs/build/join-logic) -- [Validations](/docs/build/validation) - +- [Validations](/docs/build/validation) diff --git a/website/docs/docs/build/build-metrics-intro.md b/website/docs/docs/build/build-metrics-intro.md index 508f85fe1c6..8aa6d8ad5aa 100644 --- a/website/docs/docs/build/build-metrics-intro.md +++ b/website/docs/docs/build/build-metrics-intro.md @@ -5,7 +5,7 @@ description: "Learn about MetricFlow and build your metrics with semantic models sidebar_label: Build your metrics tags: [Metrics, Semantic Layer, Governance] hide_table_of_contents: true -pagination_next: "docs/build/sl-getting-started" +pagination_next: "guides/sl-snowflake-qs" pagination_prev: null --- @@ -20,9 +20,9 @@ MetricFlow allows you to:
- -## Prerequisites - -import SetUp from '/snippets/_v2-sl-prerequisites.md'; - - - -:::tip -New to dbt or metrics? Try our [Jaffle shop example project](https://github.com/dbt-labs/jaffle-sl-template) to help you get started! -::: - -## Create a semantic model - - - -## Define metrics - - - -## Configure the MetricFlow time spine model - - - -## Test and query metrics - - - -## Run a production job - - - -## Set up dbt Semantic Layer - -import SlSetUp from '/snippets/_new-sl-setup.md'; - - - -## Connect and query API - - - - -## Next steps - -- [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) -- [About MetricFlow](/docs/build/about-metricflow) -- [Build your metrics](/docs/build/build-metrics-intro) -- [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) -- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) -- [Billing](/docs/cloud/billing) - - diff --git a/website/docs/docs/cloud/billing.md b/website/docs/docs/cloud/billing.md index 0e88c8d9eb4..493d133f6b2 100644 --- a/website/docs/docs/cloud/billing.md +++ b/website/docs/docs/cloud/billing.md @@ -134,7 +134,7 @@ Customers who purchased the dbt Cloud Team plan before August 11, 2023, remain o :::note Legacy Semantic Layer -For customers using the legacy Semantic Layer with dbt_metrics package, this product will be deprecated in December 2023. Legacy users may choose to upgrade at any time to the revamped version, Semantic Layer powered by MetricFlow. The revamped version is available to most customers (see [prerequisites](/docs/use-dbt-semantic-layer/quickstart-sl#prerequisites)) for a limited time on a free trial basis, subject to reasonable use. +For customers using the legacy Semantic Layer with dbt_metrics package, this product will be deprecated in December 2023. Legacy users may choose to upgrade at any time to the revamped version, Semantic Layer powered by MetricFlow. The revamped version is available to most customers (see [prerequisites](/guides/sl-snowflake-qs#prerequisites)) for a limited time on a free trial basis, subject to reasonable use. ::: diff --git a/website/docs/docs/dbt-cloud-apis/sl-graphql.md b/website/docs/docs/dbt-cloud-apis/sl-graphql.md index ed4f07e0e2d..f43fa5a181a 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-graphql.md +++ b/website/docs/docs/dbt-cloud-apis/sl-graphql.md @@ -52,7 +52,7 @@ Note that the dbt Semantic Layer API doesn't support `ref` to call dbt objects. If you're a dbt user or partner with access to dbt Cloud and the [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), you can [setup](/docs/use-dbt-semantic-layer/setup-sl) and test this API with data from your own instance by configuring the Semantic Layer and obtaining the right GQL connection parameters described in this document. -Refer to [Get started with the dbt Semantic Layer](docs/use-dbt-semantic-layer/quickstart-sl) for more info. +Refer to [Get started with the dbt Semantic Layer](/guides/sl-snowflake-qs) for more info. 
### Authentication diff --git a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md index 991fc53b3dd..bb5c8fe7918 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md +++ b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md @@ -31,7 +31,7 @@ If you are a dbt user or partner with access to dbt Cloud and the [dbt Semantic You *may* be able to use our JDBC API with tools that do not have an official integration with the dbt Semantic Layer. If the tool you use allows you to write SQL and either supports a generic JDBC driver option (such as DataGrip) or supports Dremio and uses ArrowFlightSQL driver version 12.0.0 or higher, you can access the Semantic Layer API. -Refer to [Get started with the dbt Semantic Layer](/docs/use-dbt-semantic-layer/quickstart-sl) for more info. +Refer to [Get started with the dbt Semantic Layer](/guides/sl-snowflake-qs) for more info. Note that the dbt Semantic Layer API doesn't support `ref` to call dbt objects. Instead, use the complete qualified table name. If you're using dbt macros at query time to calculate your metrics, you should move those calculations into your Semantic Layer metric definitions as code. diff --git a/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md index 599257840bb..809068c752a 100644 --- a/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md +++ b/website/docs/docs/dbt-versions/release-notes/78-Aug-2023/sl-revamp-beta.md @@ -60,6 +60,6 @@ To experience the universal dbt Semantic Layer and its enhanced beta capabilitie - [Introducing the new dbt Semantic Layer](https://www.getdbt.com/blog/introducing-new-look-dbt-semantic-layer) - [dbt Semantic Layer docs](/docs/use-dbt-semantic-layer/dbt-sl) -- [dbt Semantic Layer get started guide](/docs/use-dbt-semantic-layer/quickstart-sl) +- [dbt Semantic Layer get started guide](/guides/sl-snowflake-qs) - [Build your metrics with MetricFlow](/docs/build/build-metrics-intro) diff --git a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md index 6fe63b08a7a..21b15d518f1 100644 --- a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md @@ -5,7 +5,7 @@ description: "Learn how the dbt Semantic Layer enables data teams to centrally d sidebar_label: "About the dbt Semantic Layer" tags: [Semantic Layer] hide_table_of_contents: true -pagination_next: "docs/use-dbt-semantic-layer/quickstart-sl" +pagination_next: "guides/sl-snowflake-qs" pagination_prev: null --- @@ -36,9 +36,9 @@ plan="dbt Cloud Team or Enterprise"
- -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - - - -The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies defining and using critical business metrics. It centralizes metric definitions, eliminates duplicate coding, and ensures consistent self-service access to metrics in downstream tools. - -MetricFlow, a powerful component of the dbt Semantic Layer, simplifies the creation and management of company metrics. It offers flexible abstractions, SQL query generation, and enables fast retrieval of metric datasets from a data platform. - -Use this guide to fully experience the power of the universal dbt Semantic Layer. Here are the following steps you'll take: - -- [Create a semantic model](#create-a-semantic-model) in dbt Cloud using MetricFlow -- [Define metrics](#define-metrics) in dbt using MetricFlow -- [Test and query metrics](#test-and-query-metrics) with MetricFlow -- [Run a production job](#run-a-production-job) in dbt Cloud -- [Set up dbt Semantic Layer](#set-up-dbt-semantic-layer) in dbt Cloud -- [Connect and query API](#connect-and-query-api) with dbt Cloud - -MetricFlow allows you to define metrics in your dbt project and query them whether in dbt Cloud or dbt Core with [MetricFlow commands](/docs/build/metricflow-commands). - -However, to experience the power of the universal [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) and query those metrics in downstream tools, you'll need a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. - -import SLCourses from '/snippets/_sl-course.md'; - - - -## Prerequisites - -import SetUp from '/snippets/_v2-sl-prerequisites.md'; - - - -:::tip -New to dbt or metrics? Try our [Jaffle shop example project](https://github.com/dbt-labs/jaffle-sl-template) to help you get started! -::: - -## Create a semantic model - - - -## Define metrics - - - -## Test and query metrics - - - -## Run a production job - - - - - -
- -What’s happening internally? -- Merging the code into your main branch allows dbt Cloud to pull those changes and builds the definition in the manifest produced by the run.
-- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date.
-- The Semantic Layer APIs pulls in the most recent manifest and allows your integration information to extract metadata from it. -
- -## Set up dbt Semantic Layer - -import SlSetUp from '/snippets/_new-sl-setup.md'; - - - - -## Connect and query API - - - -## Next steps - -- [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) -- [Set up dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) -- [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) -- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) -- [Billing](/docs/cloud/billing) diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md index a93390e8c4c..a2a09d0ec8f 100644 --- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md @@ -18,7 +18,6 @@ With the dbt Semantic Layer, you can centrally define business metrics, reduce c ## Prerequisites - import SetUp from '/snippets/_v2-sl-prerequisites.md'; @@ -49,5 +48,5 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; - [Build your metrics](/docs/build/build-metrics-intro) - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) - [Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) -- [Get started with the dbt Semantic Layer](/docs/use-dbt-semantic-layer/quickstart-sl) +- [Get started with the dbt Semantic Layer](/guides/sl-snowflake-qs) - [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-faqs.md b/website/docs/docs/use-dbt-semantic-layer/sl-faqs.md index c0ae011dda1..e9e33356aa2 100644 --- a/website/docs/docs/use-dbt-semantic-layer/sl-faqs.md +++ b/website/docs/docs/use-dbt-semantic-layer/sl-faqs.md @@ -136,7 +136,7 @@ MetricFlow is hosted in dbt Cloud. Requests from the [Semantic Layer APIs](/docs 2. Then you build your metrics on top of these semantic models. This is all done in `.yml` configurations alongside your dbt models in your projects. 3. Once you've defined your metrics and semantic models, you can [configure the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) in dbt Cloud. -Read our [Quickstart](/docs/use-dbt-semantic-layer/quickstart-sl) for more information. +Read our [dbt Semantic Layer quickstart](/guides/sl-snowflake-qs) guide for more information. diff --git a/website/docs/guides/sl-migration.md b/website/docs/guides/sl-migration.md index df0ada9d7e5..76446666638 100644 --- a/website/docs/guides/sl-migration.md +++ b/website/docs/guides/sl-migration.md @@ -133,12 +133,11 @@ If you created a new environment in [Step 3](#step-3-setup-the-semantic-layer-in ### Related docs +- [Quickstart guide with the dbt Semantic Layer](/guides/sl-snowflake-qs) - [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) -- [MetricFlow quickstart guide](/docs/build/sl-getting-started) -- [Example dbt project](https://github.com/dbt-labs/jaffle-sl-template) - [dbt metrics converter](https://github.com/dbt-labs/dbt-converter) - [Why we're deprecating the dbt_metrics package](/blog/deprecating-dbt-metrics) blog post - [dbt Semantic Layer API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) - [dbt Semantic Layer on-demand courses](https://courses.getdbt.com/courses/semantic-layer) -
\ No newline at end of file +
diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md new file mode 100644 index 00000000000..28e2be9fc66 --- /dev/null +++ b/website/docs/guides/sl-snowflake-qs.md @@ -0,0 +1,934 @@ +--- +title: "Quickstart for the dbt Cloud Semantic Layer and Snowflake" +id: sl-snowflake-qs +description: "Use this guide to build and define metrics, set up the dbt Cloud Semantic Layer, and query them using Google Sheets." +sidebar_label: "Quickstart with the dbt Semantic Layer and Snowflake" +meta: + api_name: dbt Semantic Layer APIs +icon: 'guides' +hide_table_of_contents: true +tags: ['Semantic Layer', 'Snowflake', 'dbt Cloud','Quickstart'] +keywords: ['dbt Semantic Layer','Metrics','dbt Cloud', 'Snowflake', 'Google Sheets'] +level: 'Intermediate' +recently_updated: true +--- + + +import CreateModel from '/snippets/_sl-create-semanticmodel.md'; +import DefineMetrics from '/snippets/_sl-define-metrics.md'; +import ConfigMetric from '/snippets/_sl-configure-metricflow.md'; +import TestQuery from '/snippets/_sl-test-and-query-metrics.md'; +import ConnectQueryAPI from '/snippets/_sl-connect-and-query-api.md'; +import RunProdJob from '/snippets/_sl-run-prod-job.md'; +import SlSetUp from '/snippets/_new-sl-setup.md'; + + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + + +## Introduction + +The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), powered by [MetricFlow](/docs/build/about-metricflow), simplifies the setup of key business metrics. It centralizes definitions, avoids duplicate code, and ensures easy access to metrics in downstream tools. MetricFlow helps manage company metrics easier, allowing you to define metrics in your dbt project and query them in dbt Cloud with [MetricFlow commands](/docs/build/metricflow-commands). + + +import SLCourses from '/snippets/_sl-course.md'; + + + +This quickstart guide is designed for dbt Cloud users using Snowflake as their data platform. It focuses on building and defining metrics, setting up the dbt Semantic Layer in a dbt Cloud project, and querying metrics in Google Sheets. + +**For users on different data platform** + +If you're using a data platform other than Snowflake, this guide is also be applicable to you. You can adapt the setup for your specific platform by following the account setup and data loading instructions detailed in the following tabs for each respective platform. + +The rest of this guide applies universally across all supported platforms, ensuring you can fully leverage the dbt Semantic Layer. 
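+Before you set up your data platform, here's a quick preview of where this guide ends up. Once your semantic models and metrics are defined, you can query them with a MetricFlow command like the following (a sketch using the `order_total` metric and `metric_time` dimension you'll create later in this guide):
+
+```sql
+dbt sl query --metrics order_total --group-by metric_time
+```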
+ + + + + +Open a new tab and follow these quick steps for account setup and data loading instructions: + +- [Step 2: Create a new GCP project](https://docs.getdbt.com/guides/bigquery?step=2) +- [Step 3: Create BigQuery dataset](https://docs.getdbt.com/guides/bigquery?step=3) +- [Step 4: Generate BigQuery credentials](https://docs.getdbt.com/guides/bigquery?step=4) +- [Step 5: Connect dbt Cloud to BigQuery](https://docs.getdbt.com/guides/bigquery?step=5) + + + + + +Open a new tab and follow these quick steps for account setup and data loading instructions: + +- [Step 2: Create a Databricks workspace](https://docs.getdbt.com/guides/databricks?step=2) +- [Step 3: Load data](https://docs.getdbt.com/guides/databricks?step=3) +- [Step 4: Connect dbt Cloud to Databricks](https://docs.getdbt.com/guides/databricks?step=4) + + + + + +Open a new tab and follow these quick steps for account setup and data loading instructions: + +- [Step 2: Load data into your Microsoft Fabric warehouse](https://docs.getdbt.com/guides/microsoft-fabric?step=2) +- [Step 3: Connect dbt Cloud to Microsoft Fabric](https://docs.getdbt.com/guides/microsoft-fabric?step=3) + + + + + +Open a new tab and follow these quick steps for account setup and data loading instructions: + +- [Step 2: Create a Redshift cluster](https://docs.getdbt.com/guides/redshift?step=2) +- [Step 3: Load data](https://docs.getdbt.com/guides/redshift?step=3) +- [Step 4: Connect dbt Cloud to Redshift](https://docs.getdbt.com/guides/redshift?step=3) + + + + + +Open a new tab and follow these quick steps for account setup and data loading instructions: + +- [Step 2: Load data to an Amazon S3 bucket](https://docs.getdbt.com/guides/starburst-galaxy?step=2) +- [Step 3: Connect Starburst Galaxy to Amazon S3 bucket data](https://docs.getdbt.com/guides/starburst-galaxy?step=3) +- [Step 4: Create tables with Starburst Galaxy](https://docs.getdbt.com/guides/starburst-galaxy?step=4) +- [Step 5: Connect dbt Cloud to Starburst Galaxy](https://docs.getdbt.com/guides/starburst-galaxy?step=5) + + + + + +## Prerequisites + +- You need a [dbt Cloud](https://www.getdbt.com/signup/) Trial, Team, or Enterprise account for all deployments. Contact your representative for Single-tenant setup; otherwise, create an account using this guide. +- Have the correct [dbt Cloud license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) based on your plan: + + + - Enterprise — Developer license with Account Admin permissions. Or "Owner" with a Developer license, assigned Project Creator, Database Admin, or Admin permissions. + - Team — "Owner" access with a Developer license. + - Trial — Automatic "Owner" access under a trail of the Team plan. + + + +- Production and development environments must be on [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). Alternatively, set your environment to[ Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always remain on the latest version. +- Create a [trial Snowflake account](https://signup.snowflake.com/): + - Select the Enterprise Snowflake edition with ACCOUNTADMIN access. Consider organizational questions when choosing a cloud provider, refer to Snowflake's [Introduction to Cloud Platforms](https://docs.snowflake.com/en/user-guide/intro-cloud-platforms). + - Select a cloud provider and region. All cloud providers and regions will work so choose whichever you prefer. +- Basic understanding of SQL and dbt. 
For example, you've used dbt before or have completed the [dbt Fundamentals](https://courses.getdbt.com/collections) course. + +### What you'll learn + +This guide will cover the following topics: + +- [Create a new Snowflake worksheet and set up your environment](/guides/sl-snowflake-qs?step=3) +- [Load sample data into your Snowflake account](/guides/sl-snowflake-qs?step=4) +- [Connect dbt Cloud to Snowflake](/guides/sl-snowflake-qs?step=5) +- [Set up a dbt Cloud managed repository](/guides/sl-snowflake-qs?step=6) +- [Initialized a dbt Cloud project and start developer](/guides/sl-snowflake-qs?step=7) +- [Build your dbt Cloud project](/guides/sl-snowflake-qs?step=8) +- [Create a semantic model in dbt Cloud](/guides/sl-snowflake-qs?step=9) +- [Define metrics in dbt Cloud](/guides/sl-snowflake-qs?step=10) +- [Add second semantic model](/guides/sl-snowflake-qs?step=11) +- [Test and query metrics in dbt Cloud](/guides/sl-snowflake-qs?step=12) +- [Run a production job in dbt Cloud](/guides/sl-snowflake-qs?step=13) +- [Set up dbt Semantic Layer in dbt Cloud](/guides/sl-snowflake-qs?step=14) +- [Connect and query metrics with Google Sheets](/guides/sl-snowflake-qs?step=15) + +## Create new Snowflake worksheet and set up environment + +1. Log in to your [trial Snowflake account](https://signup.snowflake.com). +2. In the Snowflake user interface (UI), click **+ Worksheet** in the upper right corner. +3. Select **SQL Worksheet** to create a new worksheet. + + +### Set up Snowflake environment + +The data used here is stored as CSV files in a public S3 bucket and the following steps will guide you through how to prepare your Snowflake account for that data and upload it. + +Create a new virtual warehouse, two new databases (one for raw data, the other for future dbt development), and two new schemas (one for `jaffle_shop` data, the other for `stripe` data). + +1. Run the following SQL commands one by one by typing them into the Editor of your new Snowflake SQL worksheet to set up your environment. + +2. Click **Run** in the upper right corner of the UI for each one: + +```sql +-- Create a virtual warehouse named 'transforming' +create warehouse transforming; + +-- Create two databases: one for raw data and another for analytics +create database raw; +create database analytics; + +-- Within the 'raw' database, create two schemas: 'jaffle_shop' and 'stripe' +create schema raw.jaffle_shop; +create schema raw.stripe; +``` + +## Load data into Snowflake +Now that your environment is set up, you can start loading data into it. You will be working within the raw database, using the `jaffle_shop` and stripe schemas to organize your tables. + +1. Create customer table. First, delete all contents (empty) in the Editor of the Snowflake worksheet. Then, run this SQL command to create the customer table in the `jaffle_shop` schema: + + ```sql + create table raw.jaffle_shop.customers + ( id integer, + first_name varchar, + last_name varchar + ); + ``` + + You should see a ‘Table `CUSTOMERS` successfully created.’ message. + +2. Load data. After creating the table, delete all contents in the Editor. Run this command to load data from the S3 bucket into the customer table: + + ```sql + copy into raw.jaffle_shop.customers (id, first_name, last_name) + from 's3://dbt-tutorial-public/jaffle_shop_customers.csv' + file_format = ( + type = 'CSV' + field_delimiter = ',' + skip_header = 1 + ); + ``` + + You should see a confirmation message after running the command. + +3. Create `orders` table. 
Delete all contents in the Editor. Run the following command to create… + + ```sql + create table raw.jaffle_shop.orders + ( id integer, + user_id integer, + order_date date, + status varchar, + _etl_loaded_at timestamp default current_timestamp + ); + ``` + + You should see a confirmation message after running the command. + +4. Load data. Delete all contents in the Editor, then run this command to load data into the orders table: + + ```sql + copy into raw.jaffle_shop.orders (id, user_id, order_date, status) + from 's3://dbt-tutorial-public/jaffle_shop_orders.csv' + file_format = ( + type = 'CSV' + field_delimiter = ',' + skip_header = 1 + ); + ``` + + You should see a confirmation message after running the command. + +5. Create `payment` table. Delete all contents in the Editor. Run the following command to create the payment table: + + ```sql + create table raw.stripe.payment + ( id integer, + orderid integer, + paymentmethod varchar, + status varchar, + amount integer, + created date, + _batched_at timestamp default current_timestamp + ); + ``` + + You should see a confirmation message after running the command. + +6. Load data. Delete all contents in the Editor. Run the following command to load data into the payment table: + + ```sql + copy into raw.stripe.payment (id, orderid, paymentmethod, status, amount, created) + from 's3://dbt-tutorial-public/stripe_payments.csv' + file_format = ( + type = 'CSV' + field_delimiter = ',' + skip_header = 1 + ); + ``` + + You should see a confirmation message after running the command. + +7. Verify data. Verify that the data is loaded by running these SQL queries. Confirm that you can see output for each one, like the following confirmation image. + + ```sql + select * from raw.jaffle_shop.customers; + select * from raw.jaffle_shop.orders; + select * from raw.stripe.payment; + ``` + + + +## Connect dbt Cloud to Snowflake + +There are two ways to connect dbt Cloud to Snowflake. The first option is Partner Connect, which provides a streamlined setup to create your dbt Cloud account from within your new Snowflake trial account. The second option is to create your dbt Cloud account separately and build the Snowflake connection yourself (connect manually). If you want to get started quickly, dbt Labs recommends using Partner Connect. If you want to customize your setup from the very beginning and gain familiarity with the dbt Cloud setup flow, dbt Labs recommends connecting manually. + + + + +Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials. + +1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Admin**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt. + + + + If you’re using the classic version of the Snowflake UI, you can click the **Partner Connect** button in the top bar of your account. From there, click on the dbt tile to open up the connect box. + + + +2. In the **Connect to dbt** popup, find the **Optional Grant** option and select the **RAW** and **ANALYTICS** databases. This will grant access for your new dbt user role to each database. Then, click **Connect**. + + + + + +3. Click **Activate** when a popup appears: + + + + + +4. 
After the new tab loads, you will see a form. If you already created a dbt Cloud account, you will be asked to provide an account name. If you haven't created account, you will be asked to provide an account name and password. + + + +5. After you have filled out the form and clicked **Complete Registration**, you will be logged into dbt Cloud automatically. + +6. From your **Account Settings** in dbt Cloud (using the gear menu in the upper right corner), choose the "Partner Connect Trial" project and select **snowflake** in the overview table. Select edit and update the fields **Database** and **Warehouse** to be `analytics` and `transforming`, respectively. + + + + + + + + + +1. Create a new project in dbt Cloud. From **Account settings** (using the gear menu in the top right corner), click **+ New Project**. +2. Enter a project name and click **Continue**. +3. For the warehouse, click **Snowflake** then **Next** to set up your connection. + + + +4. Enter your **Settings** for Snowflake with: + * **Account** — Find your account by using the Snowflake trial account URL and removing `snowflakecomputing.com`. The order of your account information will vary by Snowflake version. For example, Snowflake's Classic console URL might look like: `oq65696.west-us-2.azure.snowflakecomputing.com`. The AppUI or Snowsight URL might look more like: `snowflakecomputing.com/west-us-2.azure/oq65696`. In both examples, your account will be: `oq65696.west-us-2.azure`. For more information, see [Account Identifiers](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html) in the Snowflake docs. + + + + * **Role** — Leave blank for now. You can update this to a default Snowflake role later. + * **Database** — `analytics`. This tells dbt to create new models in the analytics database. + * **Warehouse** — `transforming`. This tells dbt to use the transforming warehouse that was created earlier. + + + +5. Enter your **Development Credentials** for Snowflake with: + * **Username** — The username you created for Snowflake. The username is not your email address and is usually your first and last name together in one word. + * **Password** — The password you set when creating your Snowflake account. + * **Schema** — You’ll notice that the schema name has been auto-created for you. By convention, this is `dbt_`. This is the schema connected directly to your development environment, and it's where your models will be built when running dbt within the Cloud IDE. + * **Target name** — Leave as the default. + * **Threads** — Leave as 4. This is the number of simultaneous connects that dbt Cloud will make to build models concurrently. + + + +6. Click **Test Connection**. This verifies that dbt Cloud can access your Snowflake account. +7. If the connection test succeeds, click **Next**. If it fails, you may need to check your Snowflake settings and credentials. + + + + +## Set up a dbt Cloud managed repository +If you used Partner Connect, you can skip to [initializing your dbt project](#initialize-your-dbt-project-and-start-developing) as the Partner Connect provides you with a managed repository. Otherwise, you will need to create your repository connection. + + + +## Initialize your dbt project and start developing +Now that you have a repository configured, you can initialize your project and start development in dbt Cloud: + +1. Click **Start developing in the dbt Cloud IDE**. 
It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse. +2. Above the file tree to the left, click **Initialize your project**. This builds out your folder structure with example models. +3. Make your initial commit by clicking **Commit and sync**. Use the commit message `initial commit`. This creates the first commit to your managed repo and allows you to open a branch where you can add new dbt code. +4. You can now directly query data from your warehouse and execute `dbt run`. You can try this out now: + - Delete the models/examples folder in the **File Explorer**. + - Click **+ Create new file**, add this query to the new file, and click **Save as** to save the new file: + ```sql + select * from raw.jaffle_shop.customers + ``` + - In the command line bar at the bottom, enter dbt run and click Enter. You should see a dbt run succeeded message. + +## Build your dbt project +The next step is to build your project. This involves adding sources, staging models, business-defined entities, and packages to your project. + +### Add sources + +[Sources](/docs/build/sources) in dbt are the raw data tables you'll transform. By organizing your source definitions, you document the origin of your data. It also makes your project and transformation more reliable, structured, and understandable. + +You have two options for working with files in the dbt Cloud IDE: + +- **Create a new branch (recommended)** — Create a new branch to edit and commit your changes. Navigate to **Version Control** on the left sidebar and click **Create branch**. +- **Edit in the protected primary branch** — If you prefer to edit, format, or lint files and execute dbt commands directly in your primary git branch, use this option. The dbt Cloud IDE prevents commits to the protected branch so you'll be prompted to commit your changes to a new branch. + +Name the new branch `build-project`. + +1. Hover over the `models` directory and click the three dot menu (**...**), then select **Create file**. +2. Name the file `staging/jaffle_shop/src_jaffle_shop.yml` , then click **Create**. +3. Copy the following text into the file and click **Save**. + +```yaml +version: 2 + +sources: + - name: jaffle_shop + database: raw + schema: jaffle_shop + tables: + - name: customers + - name: orders +``` + +:::tip +In your source file, you can also use the **Generate model** button to create a new model file for each source. This creates a new file in the `models` directory with the given source name and fill in the SQL code of the source definition. +::: + +4. Hover over the `models` directory and click the three dot menu (**...**), then select **Create file**. +5. Name the file `staging/stripe/src_stripe.yml` , then click **Create**. +6. Copy the following text into the file and click **Save**. + +```yaml +version: 2 + +sources: + - name: stripe + database: raw + schema: stripe + tables: + - name: payment +``` + +### Add staging models +[Staging models](/best-practices/how-we-structure/2-staging) are the first transformation step in dbt. They clean and prepare your raw data, making it ready for more complex transformations and analyses. Follow these steps to add your staging models to your project. + +1. Create the file `models/staging/jaffle_shop/stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source. +2. Copy the following query into the file and click **Save**. 
+ +```sql + select + id as customer_id, + first_name, + last_name +from {{ source('jaffle_shop', 'customers') }} +``` + +3. Create the file `models/staging/jaffle_shop/stg_orders.sql` +4. Copy the following query into the file and click **Save**. + +```sql + select + id as order_id, + user_id as customer_id, + order_date, + status + from {{ source('jaffle_shop', 'orders') }} +``` + +5. Create the file `models/staging/stripe/stg_payments.sql`. +6. Copy the following query into the file and click **Save**. + +```sql +select + id as payment_id, + orderid as order_id, + paymentmethod as payment_method, + status, + -- amount is stored in cents, convert it to dollars + amount / 100 as amount, + created as created_at + + +from {{ source('stripe', 'payment') }} +``` + +7. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models. + +### Add business-defined entities + +This phase involves creating [models that serve as the entity layer or concept layer of your dbt project](/best-practices/how-we-structure/4-marts), making the data ready for reporting and analysis. It also includes adding [packages](/docs/build/packages) and the [MetricFlow time spine](/docs/build/metricflow-time-spine) that extend dbt's functionality. + +This phase is the [marts layer](/best-practices/how-we-structure/1-guide-overview#guide-structure-overview), which brings together modular pieces into a wide, rich vision of the entities an organization cares about. + +1. Create the file `models/marts/fct_orders.sql`. +2. Copy the following query into the file and click **Save**. + +```sql +with orders as ( + select * from {{ ref('stg_orders' )}} +), + + +payments as ( + select * from {{ ref('stg_payments') }} +), + + +order_payments as ( + select + order_id, + sum(case when status = 'success' then amount end) as amount + + + from payments + group by 1 +), + + +final as ( + + + select + orders.order_id, + orders.customer_id, + orders.order_date, + coalesce(order_payments.amount, 0) as amount + + + from orders + left join order_payments using (order_id) +) + + +select * from final + +``` + +3. Create the file `models/marts/dim_customers.sql`. +4. Copy the following query into the file and click **Save**. + +```sql +with customers as ( + select * from {{ ref('stg_customers')}} +), +orders as ( + select * from {{ ref('fct_orders')}} +), +customer_orders as ( + select + customer_id, + min(order_date) as first_order_date, + max(order_date) as most_recent_order_date, + count(order_id) as number_of_orders, + sum(amount) as lifetime_value + from orders + group by 1 +), +final as ( + select + customers.customer_id, + customers.first_name, + customers.last_name, + customer_orders.first_order_date, + customer_orders.most_recent_order_date, + coalesce(customer_orders.number_of_orders, 0) as number_of_orders, + customer_orders.lifetime_value + from customers + left join customer_orders using (customer_id) +) +select * from final +``` + +5. Create the file `packages.yml` in your main directory +6. Copy the following text into the file and click **Save**. + +```sql +packages: + - package: dbt-labs/dbt_utils + version: 1.1.1 +``` + +7. Create the file `models/metrics/metricflow_time_spine.sql` in your main directory. +8. Copy the following query into the file and click **Save**. 
+ +```sql +{{ + config( + materialized = 'table', + ) +}} +with days as ( + {{ + dbt_utils.date_spine( + 'day', + "to_date('01/01/2000','mm/dd/yyyy')", + "to_date('01/01/2027','mm/dd/yyyy')" + ) + }} +), +final as ( + select cast(date_day as date) as date_day + from days +) +select * from final + +``` + +9. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run message and also see in the run details that dbt has successfully built five models. + +## Create semantic models + +[Semantic models](/docs/build/semantic-models) contain many object types (such as entities, measures, and dimensions) that allow MetricFlow to construct the queries for metric definitions. + +- Each semantic model will be 1:1 with a dbt SQL/Python model. +- Each semantic model will contain (at most) 1 primary or natural entity. +- Each semantic model will contain zero, one, or many foreign or unique entities used to connect to other entities. +- Each semantic model may also contain dimensions, measures, and metrics. This is what actually gets fed into and queried by your downstream BI tool. + +In the following steps, semantic models enable you to define how to interpret the data related to orders. It includes entities (like ID columns serving as keys for joining data), dimensions (for grouping or filtering data), and measures (for data aggregations). + +1. Create a new file `models/metrics/fct_orders.yml` +2. Add the following code to that newly created file: + +```yaml +semantic_models: + - name: orders + description: | + Order fact table. This table’s grain is one row per order. + model: ref('fct_orders') + +``` + +The following sections explain [dimensions](/docs/build/dimensions), [entities](/docs/build/entities), and [measures](/docs/build/measures) in more detail, showing how they each play a role in semantic models. + +- [Entities](#entities) act as unique identifiers (like ID columns) that link data together from different tables. +- [Dimensions](#dimensions) categorize and filter data, making it easier to organize. +- [Measures](#measures) calculates data, providing valuable insights through aggregation. + +### Entities + +[Entities](/docs/build/semantic-models#entities) are a real-world concept in a business, serving as the backbone of your semantic model. These are going to be ID columns (like `order_id`) in our semantic models. These will serve as join keys to other semantic models. + +Add entities to your `fct_orders.yml` semantic model file: + +```yaml +semantic_models: + - name: orders + defaults: + agg_time_dimension: order_date + description: | + Order fact table. This table’s grain is one row per order. + model: ref('fct_orders') + # Newly added + entities: + - name: order_id + type: primary + - name: customer_id + type: foreign +``` + +### Dimensions + +[Dimensions](/docs/build/semantic-models#entities) are a way to group or filter information based on categories or time. + +Add dimensions to your `fct_orders.yml` semantic model file: + +```yaml +semantic_models: + - name: orders + defaults: + agg_time_dimension: order_date + description: | + Order fact table. This table’s grain is one row per order. + model: ref('fct_orders') + entities: + - name: order_id + type: primary + - name: customer_id + type: foreign + # Newly added + dimensions: + - name: order_date + type: time + type_params: + time_granularity: day +``` + +### Measures + +[Measures](/docs/build/semantic-models#measures) are aggregations performed on columns in your model. 
Often, you’ll find yourself using them as final metrics themselves. Measures can also serve as building blocks for more complicated metrics. + +Add measures to your `fct_orders.yml` semantic model file: + +```yaml +semantic_models: + - name: orders + defaults: + agg_time_dimension: order_date + description: | + Order fact table. This table’s grain is one row per order. + model: ref('fct_orders') + entities: + - name: order_id + type: primary + - name: customer_id + type: foreign + dimensions: + - name: order_date + type: time + type_params: + time_granularity: day + # Newly added + measures: + - name: order_total + description: The total amount for each order including taxes. + agg: sum + expr: amount + - name: order_count + expr: 1 + agg: sum + - name: customers_with_orders + description: Distinct count of customers placing orders + agg: count_distinct + expr: customer_id + - name: order_value_p99 ## The 99th percentile order value + expr: amount + agg: percentile + agg_params: + percentile: 0.99 + use_discrete_percentile: True + use_approximate_percentile: False +``` + +## Define metrics + +[Metrics](/docs/build/metrics-overview) are the language your business users speak and measure business performance. They are an aggregation over a column in your warehouse that you enrich with dimensional cuts. + +There are different types of metrics you can configure: + +- [Conversion metrics](/docs/build/conversion) — Track when a base event and a subsequent conversion event occur for an entity within a set time period. +- [Cumulative metrics](/docs/build/metrics-overview#cumulative-metrics) — Aggregate a measure over a given window. If no window is specified, the window will accumulate the measure over all of the recorded time period. Note, that you must create the time spine model before you add cumulative metrics. +- [Derived metrics](/docs/build/metrics-overview#derived-metrics) — Allows you to do calculations on top of metrics. +- [Simple metrics](/docs/build/metrics-overview#simple-metrics) — Directly reference a single measure without any additional measures involved. +- [Ratio metrics](/docs/build/metrics-overview#ratio-metrics) — Involve a numerator metric and a denominator metric. A constraint string can be applied to both the numerator and denominator or separately to the numerator or denominator. + +Once you've created your semantic models, it's time to start referencing those measures you made to create some metrics: + +Add metrics to your `fct_orders.yml` semantic model file: + +```yaml +semantic_models: + - name: orders + defaults: + agg_time_dimension: order_date + description: | + Order fact table. This table’s grain is one row per order + model: ref('fct_orders') + entities: + - name: order_id + type: primary + - name: customer_id + type: foreign + dimensions: + - name: order_date + type: time + type_params: + time_granularity: day + measures: + - name: order_total + description: The total amount for each order including taxes. 
+ agg: sum + expr: amount + - name: order_count + expr: 1 + agg: sum + - name: customers_with_orders + description: Distinct count of customers placing orders + agg: count_distinct + expr: customer_id + - name: order_value_p99 + expr: amount + agg: percentile + agg_params: + percentile: 0.99 + use_discrete_percentile: True + use_approximate_percentile: False +# Newly added +metrics: + # Simple type metrics + - name: "order_total" + description: "Sum of orders value" + type: simple + label: "order_total" + type_params: + measure: order_total + - name: "order_count" + description: "number of orders" + type: simple + label: "order_count" + type_params: + measure: order_count + - name: large_orders + description: "Count of orders with order total over 20." + type: simple + label: "Large Orders" + type_params: + measure: order_count + filter: | + {{ Dimension('order_id__order_total_dim') }} >= 20 + # Ratio type metric + - name: "avg_order_value" + label: "avg_order_value" + description: "average value of each order" + type: ratio + type_params: + numerator: order_total + denominator: order_count + # Cumulative type metrics + - name: "cumulative_order_amount_mtd" + label: "cumulative_order_amount_mtd" + description: "The month to date value of all orders" + type: cumulative + type_params: + measure: order_total + grain_to_date: month + # Derived metric + - name: "pct_of_orders_that_are_large" + label: "pct_of_orders_that_are_large" + description: "percent of orders that are large" + type: derived + type_params: + expr: large_orders/order_count + metrics: + - name: large_orders + - name: order_count +``` + +## Add second semantic model to your project + +Great job, you've successfully built your first semantic model! It has all the required elements: entities, dimensions, measures, and metrics. + +Let’s expand your project's analytical capabilities by adding another semantic model in your other marts model, such as: `dim_customers.yml`. + +After setting up your orders model: + +1. Create the file `models/metrics/dim_customers.yml`. +2. Copy the following query into the file and click **Save**. + +```yaml +semantic_models: + - name: customers + defaults: + agg_time_dimension: most_recent_order_date + description: | + semantic model for dim_customers + model: ref('customers') + entities: + - name: customer + expr: customer_id + type: primary + dimensions: + - name: customer_name + type: categorical + - name: customer_type + type: categorical + - name: first_order_date + type: time + type_params: + time_granularity: day + - name: most_recent_order_date + type: time + type_params: + time_granularity: day + measures: + - name: count_lifetime_orders + description: Total count of orders per customer. + agg: sum + - name: lifetime_spend + agg: sum + description: Gross customer lifetime spend inclusive of taxes. + - name: customers + expr: customer_id + agg: count_distinct + +metrics: + - name: "customers_with_orders" + label: "customers_with_orders" + description: "Unique count of customers placing orders" + type: simple + type_params: + measure: customers +``` + +This semantic model uses simple metrics to focus on customer metrics and emphasizes customer dimensions like name, type, and order dates. It uniquely analyzes customer behavior, lifetime value, and order patterns. + +## Test and query metrics + + + + + +## Run a production job + + + + + +
+
+What’s happening internally?
+
+- Merging the code into your main branch allows dbt Cloud to pull those changes and build the definition in the manifest produced by the run.
+- Re-running the job in the deployment environment materializes the models your metrics depend on in the data platform and ensures the manifest is up to date.
+- The Semantic Layer APIs pulls in the most recent manifest and allows your integration information to extract metadata from it. + +
+ +## Set up dbt Semantic Layer + + + + + +## Connect and query with Google Sheets + + + + + +## What's next + + + +Great job on completing the comprehensive dbt Semantic Layer guide 🎉! You should hopefully have gained a clear understanding of what the dbt Semantic Layer is, its purpose, and when to use it in your projects. + +You've learned how to: + +- Set up your Snowflake environment and dbt Cloud, including creating worksheets and loading data. +- Connect and configure dbt Cloud with Snowflake. +- Build, test, and manage dbt Cloud projects, focusing on metrics and semantic layers. +- Run production jobs and query metrics with Google Sheets. + +For next steps, you can start defining your own metrics and learn additional configuration options such as [exports](/docs/use-dbt-semantic-layer/exports), [fill null values](/docs/build/advanced-topics), and more. + +Here are some additional resources to help you continue your journey: + +- [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) +- [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) +- Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) +- [Join our live demos](https://www.getdbt.com/resources/webinars/dbt-cloud-demos-with-experts) + + diff --git a/website/sidebars.js b/website/sidebars.js index cfd53c90d32..3bb85695879 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -306,7 +306,11 @@ const sidebarSettings = { collapsed: true, items: [ "docs/build/build-metrics-intro", - "docs/build/sl-getting-started", + { + type: "link", + label: "Quickstart with the dbt Cloud Semantic Layer", + href: `/guides/sl-snowflake-qs`, + }, { type: "category", label: "About MetricFlow", @@ -509,7 +513,11 @@ const sidebarSettings = { link: { type: "doc", id: "docs/use-dbt-semantic-layer/dbt-sl" }, items: [ "docs/use-dbt-semantic-layer/dbt-sl", - "docs/use-dbt-semantic-layer/quickstart-sl", + { + type: "link", + label: "Quickstart with the dbt Cloud Semantic Layer", + href: `/guides/sl-snowflake-qs`, + }, "docs/use-dbt-semantic-layer/setup-sl", "docs/use-dbt-semantic-layer/exports", "docs/use-dbt-semantic-layer/sl-architecture", diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index e7039c8103a..74273a4c617 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -3,17 +3,13 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and projec - You must be part of the Owner group, and have the correct [license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) to configure the Semantic Layer: * Enterprise plan — Developer license with Account Admin permissions. Or Owner with a Developer license, assigned Project Creator, Database Admin, or Admin permissions. * Team plan — Owner with a Developer license. + * If you are using a free trial dbt Cloud account, you are on a trial of the Team plan as an Owner, so you're good to go. - You must have a successful run in your new environment. -:::tip -If you've configured the legacy Semantic Layer, it has been deprecated. dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-dbt-version-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details. -::: - -1. 
In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher. - * Note — Deployment environment is currently supported (_development experience coming soon_) - -2. Navigate to **Account Settings** and select the specific project you want to enable the Semantic Layer for. +Now that we've created and successfully run a job in your environment, you're ready to configure the semantic layer. +1. Navigate to **Account Settings** in the navigation menu. +2. Use the sidebar to select your project settings. Select the specific project you want to enable the Semantic Layer for. 3. In the **Project Details** page, navigate to the **Semantic Layer** section, and select **Configure Semantic Layer**. @@ -26,13 +22,10 @@ If you've configured the legacy Semantic Layer, it has been deprecated. dbt Labs 5. Select the deployment environment you want for the Semantic Layer and click **Save**. - 6. After saving it, you'll be provided with the connection information that allows you to connect to downstream tools. If your tool supports JDBC, save the JDBC URL or individual components (like environment id and host). If it uses the GraphQL API, save the GraphQL API host information instead. -7. Save and copy your environment ID, service token, and host, which you'll need to use downstream tools. For more info on how to integrate with partner integrations, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations). - -8. Return to the **Project Details** page, then select **Generate Service Token**. You will need Semantic Layer Only and Metadata Only [service token](/docs/dbt-cloud-apis/service-tokens) permissions. +7. Save and copy your environment ID, service token, and host, which you'll need to use in the downstream tools. For more info on how to integrate with partner integrations, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations). -Great job, you've configured the Semantic Layer 🎉! +8. Return to the **Project Details** page and click the **[Generate a Service Token](/docs/dbt-cloud-apis/service-tokens)** button. Make sure it has Semantic Layer Only and Metadata Only permissions. Name the token and save it. Once the token is generated, you won't be able to view this token again so make sure to record it somewhere safe. diff --git a/website/snippets/_sl-connect-and-query-api.md b/website/snippets/_sl-connect-and-query-api.md index c155af4d11e..dd177f93604 100644 --- a/website/snippets/_sl-connect-and-query-api.md +++ b/website/snippets/_sl-connect-and-query-api.md @@ -1,8 +1,11 @@ -You can query your metrics in a JDBC-enabled tool or use existing first-class integrations with the dbt Semantic Layer. +This section will guide you on how to query and use the Google Sheets integration. You can also query your metrics using: +- [First-class integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) (such as Tableau, Hex, and more) +- Other tools using the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview). +- [Exports](/docs/use-dbt-semantic-layer/exports) to expose tables of metrics and dimensions in your data platform and create a custom integration with tools such as PowerBI, and more. -- You must have a dbt Cloud Team or Enterprise account. Suitable for both Multi-tenant and Single-tenant deployment. - - Single-tenant accounts should contact their account representative for necessary setup and enablement. 
-- To learn how to use the JDBC or GraphQL API and what tools you can query it with, refer to [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview). - * To authenticate, you need to [generate a service token](/docs/dbt-cloud-apis/service-tokens) with Semantic Layer Only and Metadata Only permissions. - * Refer to the [SQL query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. -- To learn more about the sophisticated integrations that connect to the dbt Semantic Layer, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) for more info. You can also use [Exports](/docs/use-dbt-semantic-layer/exports) to expose tables of metrics and dimensions in your data platform and create a custom integration with tools such as PowerBI, and more. +To query your metrics using Google Sheets: +1. Make sure you have a [Gmail](http://gmail.com/) account. +2. To set up Google Sheets and query your metrics, follow the detailed instructions on [Google Sheets integration](/docs/use-dbt-semantic-layer/gsheets). +3. Start using with it. You can query a metric, like `order_total`. Filter it with a dimension, like `order_date`. You can also use the `group_by` parameter to group your metrics by a specific dimension. + + diff --git a/website/snippets/_sl-run-prod-job.md b/website/snippets/_sl-run-prod-job.md index a637b0b431e..8eb4049efc8 100644 --- a/website/snippets/_sl-run-prod-job.md +++ b/website/snippets/_sl-run-prod-job.md @@ -1,7 +1,9 @@ -Once you’ve defined metrics in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer currently. +Once you’ve committed and merged your metric changes in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer currently. -1. Select **Deploy** from the top navigation bar. -2. Select **Jobs** to rerun the job with the most recent code in the deployment environment. -3. Your metric should appear as a red node in the dbt Cloud IDE and dbt directed acyclic graphs (DAG). - - +1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher. + * Note — Deployment environment is currently supported (_development experience coming soon_) +2. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**. +3. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment. +4. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**. +5. Set the job to run a `dbt build` and select the **Generate docs on run** checkbox. +6. Run the job and make sure it runs successfully. 
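+
+For reference, the job itself can stay minimal. A sketch of the only command the job's **Commands** section needs (the **Generate docs on run** option is a checkbox in the job settings rather than a command):
+
+```sql
+dbt build
+```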
diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md
index 129987661e0..936f4804f9f 100644
--- a/website/snippets/_sl-test-and-query-metrics.md
+++ b/website/snippets/_sl-test-and-query-metrics.md
@@ -1,21 +1,22 @@
-This section explains how you can test and run MetricFlow commands with dbt Cloud or dbt Core (dbt Cloud IDE support coming soon). dbt Cloud IDE users can skip to [Run a production job](#run-a-production-job) to run a model.
+To work with metrics in dbt, you have several tools for validating metric definitions and running commands. Here's how you can test and query metrics, depending on your setup:
 
-:::important Testing and querying metrics in the dbt Cloud IDE is currently not supported
+- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) — Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but support is coming soon. You can still validate metrics using the **Preview** or **Compile** options, or visually through the DAG for semantic checks. This ensures your metrics are correctly defined without directly running commands.
+- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) — The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) for direct interaction with metrics.
+- **dbt Core users** — Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a Team or Enterprise account.
 
-Support for running [MetricFlow commands](/docs/build/metricflow-commands) in the dbt Cloud IDE is not available but is coming soon.
+Alternatively, you can run commands with SQL client tools like DataGrip, DBeaver, or RazorSQL.
 
-You can use the **Preview** or **Compile** buttons in the IDE to run semantic validations and make sure your metrics are defined. Alternatively, you can run commands with the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or with SQL client tools like DataGrip, DBeaver, or RazorSQL.
+### dbt Cloud IDE users
-:::
+You can validate your metrics in the dbt Cloud IDE by selecting the metric you want to validate and viewing it in the **Lineage** tab.
-
+Once validated, make sure you commit and merge your changes in your project.
-
+
-This section is for people using the dbt Cloud CLI (support for dbt Cloud IDE is coming soon). With dbt Cloud:
+### dbt Cloud CLI users
 
-- You can run MetricFlow commands after installing the dbt Cloud CLI. They're integrated with dbt Cloud so you can use them immediately.
-- Your account will automatically manage version control for you.
+This section is for dbt Cloud CLI users. MetricFlow commands are integrated with dbt Cloud, which means you can run MetricFlow commands as soon as you install the dbt Cloud CLI. Your account will automatically manage version control for you.
 
 Refer to the following steps to get started:
 
@@ -24,41 +25,14 @@ Refer to the following steps to get started:
 3. MetricFlow builds a semantic graph and generates a `semantic_manifest.json` file in dbt Cloud, which is stored in the `/target` directory. If using the Jaffle Shop example, run `dbt seed && dbt run` to ensure the required data is in your data platform before proceeding.
 :::info Run dbt parse when you make changes to metrics
-Anytime you make changes to metrics, you need to run `dbt parse` at a minimum. This ensures the `semantic_manifest.json` file is updated and you can have your changes reflected when querying metrics.
+Any time you make changes to metrics, you need to run `dbt parse` at a minimum. This ensures the `semantic_manifest.json` file is updated and your changes are reflected when you query metrics.
 :::
 
 4. Run `dbt sl --help` to confirm you have MetricFlow installed and that you can view the available commands.
-5. Run `dbt sl query --metrics <metric_name> --group-by <dimension_name>` to query the metrics and dimensions. For example, `dbt sl query --metrics order_total --group-by metric_time`.
+5. Run `dbt sl query --metrics <metric_name> --group-by <dimension_name>` to query the metrics and dimensions. For example, to query the `order_total` and `order_count` metrics grouped by the `order_date` dimension, you would run:
+
+   ```sql
+   dbt sl query --metrics order_total,order_count --group-by order_date
+   ```
 6. Verify that the metric values are what you expect. To further understand how the metric is being generated, you can view the generated SQL if you type `--compile` in the command line.
 7. Commit and merge the code changes that contain the metric definitions.
 
-
-
-
-This step is for dbt Core users only. MetricFlow is compatible with Python versions 3.8, 3.9, 3.10, and 3.11. You need to use `pip` to install MetricFlow on Windows or Linux operating systems:
-
-:::note
-The dbt Cloud CLI is strongly recommended to define and query metrics for your dbt project in dbt Cloud or dbt Core with MetricFlow. If you're using dbt Core, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
-:::
-
-1. Install [MetricFlow](/docs/build/metricflow-commands) as an extension of a dbt adapter from PyPI.
-2. Create or activate your virtual environment with `python -m venv venv` or `source your-venv/bin/activate`.
-3. Run `python -m pip install dbt-metricflow`.
-   - You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `python -m pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. As an example for a Snowflake adapter, run `python -m pip install "dbt-metricflow[snowflake]"`.
-   - You'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
-4. Run `dbt parse`. This allows MetricFlow to build a semantic graph and generate a `semantic_manifest.json`.
-   - This creates the file in your `/target` directory. If you're working from the Jaffle shop example, run `dbt seed && dbt run` before proceeding to ensure the data exists in your warehouse.
-5. Run `mf --help` to confirm you have MetricFlow installed and that you can view the available commands.
-6. Run `mf query --metrics <metric_name> --group-by <dimension_name>` to query the metrics and dimensions. For example, `mf query --metrics order_total --group-by metric_time`.
-7. Verify that the metric values are what you expect. To further understand how the metric is being generated, you can view the generated SQL if you type `--explain` in the command line.
-8. Run `mf validate-configs` to run validation on your semantic models and metrics.
-9. Commit and merge the code changes that contain the metric definitions.
-
-
-
-
-To streamline your metric querying process, you can connect to the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to access your metrics programmatically. For SQL syntax, refer to [Querying the API for metric metadata](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).
-
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-gsheets.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-gsheets.jpg
new file mode 100644
index 00000000000..7ce8a801f88
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-gsheets.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-ide-dag.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-ide-dag.jpg
new file mode 100644
index 00000000000..2ba768c10f9
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-ide-dag.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-snowflake-confirm.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-snowflake-confirm.jpg
new file mode 100644
index 00000000000..956c09398c1
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-snowflake-confirm.jpg differ
diff --git a/website/vercel.json b/website/vercel.json
index 69d9505afcc..8c4d8f90909 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -2,6 +2,16 @@
   "cleanUrls": true,
   "trailingSlash": false,
   "redirects": [
+    {
+      "source": "/docs/build/sl-getting-started",
+      "destination": "/guides/sl-snowflake-qs",
+      "permanent": true
+    },
+    {
+      "source": "/docs/use-dbt-semantic-layer/quickstart-sl",
+      "destination": "/guides/sl-snowflake-qs",
+      "permanent": true
+    },
     {
       "source": "/docs/cloud/manage-access/set-up-sso-azure-active-directory",
       "destination": "/docs/cloud/manage-access/set-up-sso-microsoft-entra-id",