screenshot updates WMS ID 8001 (#313)

Database Actions updates, new screenshots
malcherm authored Jun 7, 2024
1 parent 190e97f commit f1acd7e
Showing 19 changed files with 47 additions and 73 deletions.
30 changes: 16 additions & 14 deletions data-lake/data-catalog.md
@@ -74,11 +74,15 @@ Under Quick Actions, click **Create Data Asset**

![Create Data Asset](./images/create_dataasset1.png " ")

Select Oracle ADW from the list of data asset types.

![Select Oracle ADW as the data asset type](./images/create_dataasset.png " ")

The first data asset is the ADW database we already created. Enter MOVIESTREAM_ADW as the name and set the type to Oracle Autonomous Data Warehouse.

Continue by selecting the database and region, and entering the Tenancy OCID that you saved off to the side. If you need to find it again, click your profile icon, click Tenancy, and copy the OCID. Make sure the compartment is lakehouse1 and the database is lakehousedb.

![Create Data Asset](./images/create_dataasset.png " ")
![Create Data Asset](./images/create_dataasset2.png " ")

Using the default connection, enter the user name ADMIN and the password you configured for your database, and select the TNS alias from the dropdown.
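The values gathered in this form (Tenancy OCID, ADMIN credentials, TNS alias) can be sanity-checked before use. The sketch below collects them the way you might later hand them to python-oracledb's `connect()`; every concrete value (`moviestream_low`, the wallet path, the sample OCID) is an illustrative placeholder, not taken from the lab, and the keyword names should be verified against your installed driver version.

```python
# Sketch only: placeholder values throughout. Keyword names follow
# python-oracledb's thin-mode connect() parameters (user, password, dsn,
# config_dir, wallet_location) -- verify against your driver's docs.

def looks_like_ocid(value: str, resource_type: str) -> bool:
    """Cheap shape check for an OCID copied from the console."""
    return value.startswith(f"ocid1.{resource_type}.")

def adw_connect_params(user, password, tns_alias, wallet_dir):
    """Collect the kwargs you would pass to oracledb.connect(**params)."""
    return {
        "user": user,
        "password": password,
        "dsn": tns_alias,            # e.g. the alias picked from the dropdown
        "config_dir": wallet_dir,    # folder holding the wallet's tnsnames.ora
        "wallet_location": wallet_dir,
    }

print(looks_like_ocid("ocid1.tenancy.oc1..exampleuniqueid", "tenancy"))  # → True
params = adw_connect_params("ADMIN", "<your password>", "moviestream_low", "/path/to/wallet")
print(params["dsn"])  # → moviestream_low
```

The shape check only guards against pasting the wrong kind of OCID (for example a bucket OCID where a tenancy OCID is expected); it is not a full validation.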

@@ -180,13 +184,9 @@ In this step, you will simply be creating a new table to verify that the table i

2. Click on the database, then click on the Tools tab and click **Open Database Actions**.

![Database Actions](./images/dbactions1.png " ")

3. Click on SQL to execute the query to create the table.
![Database Actions](./images/dbactionssql.png " ")

![SQL](./images/SQL_queries.png " ")

4. You are going to copy and paste the following code to build the MOVIE_GENRE table that we will use later in our data feed process and end queries.
3. You are going to copy and paste the following code to build the MOVIE_GENRE table that we will use later in our data feed process and end queries.

```
<copy>
@@ -201,24 +201,26 @@ In this step, you will simply be creating a new table to verify that the table i
</copy>
```
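The CREATE TABLE statement itself is collapsed in this diff view. As a rough stand-in, the sketch below builds a similarly named table in an in-memory SQLite database; the column names and types here are assumptions for illustration only (the lab's actual DDL runs in Database Actions and uses Oracle types such as VARCHAR2 and NUMBER).

```python
import sqlite3

# Illustrative stand-in only: the real MOVIE_GENRE DDL is collapsed in the
# diff above, so the column names/types below are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE movie_genre (
        movie_id INTEGER,
        genre    TEXT
    )
""")
conn.execute("INSERT INTO movie_genre VALUES (1, 'Comedy')")
rows = conn.execute("SELECT COUNT(*) FROM movie_genre").fetchone()[0]
print(rows)  # → 1
```

The point of the step is only that a new table exists to be harvested; any small table created through the SQL worksheet serves the same purpose.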

5. After the create table statement executes, you can close the Database Actions tab to return to the Oracle Cloud menu.
![SQL query](./images/dbactionssqlquery.png " ")

4. After the create table statement executes, you can close the Database Actions tab to return to the Oracle Cloud menu.

6. Now you can verify that the entity is available as part of the OCI Data Catalog. Navigate to the Oracle Cloud Menu. Click on Analytics & AI and click on Data Catalog under the Data Lake header.
5. Now you can verify that the entity is available as part of the OCI Data Catalog. Navigate to the Oracle Cloud Menu. Click on Analytics & AI and click on Data Catalog under the Data Lake header.

7. Click on lakehousecatalog from the Data Catalogs. Verify compartment if you do not see it listed.
6. Click on lakehousecatalog from the Data Catalogs. Verify compartment if you do not see it listed.

![SQL](./images/currentcatalog.png " ")

8. Click on Data Assets and click on Harvest using the dropdown menu for the database Data Asset. Harvesting for the Data Catalog would normally be scheduled to pull the entity information into the Data Asset automatically, but for this lab you can run it manually.
7. Click on Data Assets and click on Harvest using the dropdown menu for the database Data Asset. Harvesting for the Data Catalog would normally be scheduled to pull the entity information into the Data Asset automatically, but for this lab you can run it manually.
Select the ADMIN data entity and run the job now.

![Harvest](./images/harvest1.png " ")

9. Now if you go back to the Home tab from the Data Catalog, you will see that there are now 7 Data Entities being kept up to date in the Data Catalog.
8. Now if you go back to the Home tab from the Data Catalog, you will see that there are now 7 Data Entities being kept up to date in the Data Catalog.

![New Entities](./images/new_entities.png " ")

10. Click on Entities just to verify that all of the tables are now here.
9. Click on Entities just to verify that all of the tables are now here.

![Entities List](./images/entities_list.png " ")

@@ -230,4 +232,4 @@ You may now proceed to the next lab.

* **Author** - Michelle Malcher, Database Product Management, Massimo Castelli, Senior Director Product Management
* **Contributors** - Nagwang Gyamtso, Product Management
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2023
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2024
Binary file modified data-lake/images/backtodataload.png
Binary file modified data-lake/images/cloudloactions.png
Binary file removed data-lake/images/create_ADW2.png
Binary file modified data-lake/images/create_dataasset.png
Binary file modified data-lake/images/create_dataasset1.png
Binary file added data-lake/images/create_dataasset2.png
Binary file added data-lake/images/createadw23ai.png
Binary file added data-lake/images/createadw23ai4.png
Binary file modified data-lake/images/dataload.png
Binary file added data-lake/images/dbactionsload1.png
Binary file added data-lake/images/dbactionssql.png
Binary file added data-lake/images/dbactionssqlquery.png
Binary file modified data-lake/images/loadcompleted2.png
Binary file modified data-lake/images/runload2.png
6 changes: 3 additions & 3 deletions data-lake/lakehouse-configuration-green.md
@@ -163,7 +163,7 @@ In this step, you will create an Oracle Autonomous Data Warehouse.

*Note: You cannot scale up/down an Always Free autonomous database.*

![Enter the required details.](./images/create_ADW2.png " ")
![Enter the required details.](./images/createadw23ai.png " ")

5. Create administrator credentials:

@@ -186,7 +186,7 @@ In this step, you will create an Oracle Autonomous Data Warehouse.
- __Bring Your Own License (BYOL)__ - Select this type when your organization has existing database licenses.
- __License Included__ - Select this type when you want to subscribe to new database software licenses and the database cloud service.

![](./images/create_ADW4.png " ")
![](./images/createadwai234.png " ")

8. Click __Create Autonomous Database__.

@@ -204,4 +204,4 @@ You may now proceed to the next lab.

* **Author** - Michelle Malcher, Database Product Management
* **Contributors** - Massimo Castelli, Niay Panchal, Mike Matthew and Marty Gubar, Autonomous Database Product Management
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, September 2021, Nagwang Gyamtso, Solution Engineering, February 2022
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2024
6 changes: 3 additions & 3 deletions data-lake/lakehouse-configuration.md
@@ -167,7 +167,7 @@ In this task, you will create an Oracle Autonomous Data Warehouse (ADW).

*Note: You cannot scale up/down an Always Free autonomous database.*

![Enter the required details.](./images/create_ADW2.png " ")
![Enter the required details.](./images/createadw23ai.png " ")

5. Create administrator credentials:

@@ -190,7 +190,7 @@ In this task, you will create an Oracle Autonomous Data Warehouse (ADW).
- __Bring Your Own License (BYOL)__ - Select this type when your organization has existing database licenses.
- __License Included__ - Select this type when you want to subscribe to new database software licenses and the database cloud service.

![Choose License](./images/create_ADW4.png " ")
![Choose License](./images/createadw23ai4.png " ")

8. Click __Create Autonomous Database__.

@@ -208,4 +208,4 @@ You may now proceed to the next lab.

* **Author** - Michelle Malcher, Database Product Management
* **Contributors** - Massimo Castelli, Niay Panchal, Mike Matthew and Marty Gubar, Autonomous Database Product Management, Nagwang Gyamtso, Product Management
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2023
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2024
44 changes: 13 additions & 31 deletions data-lake/lakehouse-configuration2-green.md
@@ -25,23 +25,19 @@ In this lab, we will learn more about the Autonomous Database's built-in Data Lo

In this step, you will set up access to the two buckets on Oracle Object Store that contain data that we want to load - the landing area, and the 'gold' area.

1. In your ADW database's details page, click the Tools tab. Click **Open Database Actions**
1. On your ADW database's details page, open the Database actions tab and go to Data Load. Click **Data Load**.

![Click Tools, then Database Actions](./images/dbactions1.png " ")
![Under Database actions, click Data Load](./images/dbactionsload1.png " ")

2. On the login screen, enter the username ADMIN, then click the blue **Next** button.
3. Under **Data Studio**, click **LOAD DATA**

3. Enter the password for the ADMIN user you entered when creating the database.
![Click LOAD DATA](./images/dataload.png " ")

4. Under **Data Studio**, click **DATA LOAD**
4. Now you can enter the Cloud Store URL to connect to object storage for the files to load. Click **Cloud Store** to see the location field.

![Click DATA LOAD](./images/dataload.png " ")
![Click CLOUD Store](./images/cloudlocations.png " ")

5. In the **What do you want to do with your data?** section, click **LOAD DATA** and choose **CLOUD STORE** for **Where is your data?** to set up the connection from your Autonomous Database to OCI Object Storage.

![Click CLOUD LOCATIONS](./images/cloudlocations.png " ")

6. Data Load > Load Cloud Object window will pop up and you need to copy the bucket URI into this field.
5. Copy the bucket URI into the location field.

Copy and paste the following URI into the URI + Load Data from Cloud Store:

@@ -51,37 +47,23 @@
</copy>
```
Select No Credential as this is a public bucket and then click **Create**.
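The bucket URI itself is collapsed in this view. Its general shape, for a public bucket reached with no credential, can be sketched as below; the region, namespace, and bucket name are placeholders (the lab's real values are in the collapsed block), and this is the native Object Storage URL form, one of the formats the Data Load tool accepts.

```python
from urllib.parse import urlparse

# Placeholder values -- the lab's real namespace/bucket are collapsed above.
region, namespace, bucket = "us-ashburn-1", "mynamespace", "moviestream_landing"

# Native OCI Object Storage URL shape: .../n/<namespace>/b/<bucket>/o
uri = f"https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket}/o"
parts = urlparse(uri)
print(parts.netloc)  # → objectstorage.us-ashburn-1.oraclecloud.com
print(parts.path)    # → /n/mynamespace/b/moviestream_landing/o
```

Checking the `/n/…/b/…/o` path segments before pasting helps catch a URI that points at the wrong namespace or bucket.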


## Task 2: Load data from files in Object Storage using Data Tools

In this step, we will perform some simple data loading tasks, to load in CSV files from Object Storage into tables in our Autonomous Database.

1. To load or link data from our newly configured cloud storage, click the **Data Load** link in the top left of your screen. Or if you are still in the same view with MOVIESTREAMLANDING Cloud Storage, skip down to step 3 to choose the objects you want to load.

![Click Data Load](./images/backtodataload.png " ")

2. Under **What do you want to do with your data?** select **LOAD DATA**, and under **Where is your data?** select **CLOUD STORAGE**, then click **Next**.

![Select Load Data, then Cloud Storage](./images/loadfromstorage.png " ")

3. This time, select **MOVIESTREAMLANDING** in the top left of your screen.

![Click Data Load](./images/selectlanding.png " ")

4. From the MOVIESTREAMLANDING location, drag the **customer\_contact**, **customer\_extension**, and **customer\_segment** folders over to the right hand pane and click **OK** to load all objects into one table for each of these folders.

5. Drag the **genre** and **movie** folders over to the right hand pane and click **OK**.
1. You will see a list of folders, which is the data available in the object storage bucket that we can load into our Autonomous Database. Select all of the folders and drag them to the right-hand pane to add the files.

6. And for fun, drag the **pizza_location** folder over to the right hand pane and click **OK**.
![Drag files to load](./images/backtodataload.png " ")

7. Click the Play button to run the data load job.
2. Click the Start button to run the data load job.

![Run the data load job](./images/runload2.png " ")

The job should take about 20 seconds to run.
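Watching a short job like this generalizes to a simple poll-until-done loop. The sketch below is generic and makes no claims about the Data Load tool's API: the fake `job_done` iterator stands in for whatever status check the console or an SDK would actually give you.

```python
import time

def wait_for(check, timeout_s=60.0, interval_s=2.0):
    """Poll check() until it returns True or timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval_s)
    return False

# Fake status check standing in for the data load job's state:
# "not done" twice, then "done".
states = iter([False, False, True])
done = wait_for(lambda: next(states), timeout_s=10.0, interval_s=0.01)
print(done)  # → True
```

For a job expected to take about 20 seconds, a couple of seconds between polls and a timeout of a minute or two is a reasonable starting point.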

8. Check that all three data load cards have green tick marks in them, indicating that the data load tasks have completed successfully.
3. Check that all three data load cards have green tick marks in them, indicating that the data load tasks have completed successfully.

![Check the job is completed](./images/loadcompleted2.png " ")

@@ -93,4 +75,4 @@ You may now proceed to the next lab.

* **Author** - Michelle Malcher, Database Product Management
* **Contributors** - Niay Panchal, Mike Matthew and Marty Gubar, Autonomous Database Product Management
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, May 2022
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2024
34 changes: 12 additions & 22 deletions data-lake/lakehouse-configuration2.md
@@ -25,23 +25,19 @@ In this lab, we will learn more about the Autonomous Database's built-in Data Lo

In this step, you will set up access to the two buckets on Oracle Object Store that contain data that we want to load - the landing area, and the 'gold' area.

1. In your ADW database's details page, click the Tools tab. Click **Open Database Actions**
1. On your ADW database's details page, open the Database actions tab and go to Data Load. Click **Data Load**.

![Click Tools, then Database Actions](./images/dbactions1.png " ")
![Under Database actions, click Data Load](./images/dbactionsload1.png " ")

2. On the login screen, enter the username ADMIN, then click the blue **Next** button.
3. Under **Data Studio**, click **LOAD DATA**

3. Enter the password for the ADMIN user you entered when creating the database.
![Click LOAD DATA](./images/dataload.png " ")

4. Under **Data Studio**, click **DATA LOAD**
4. Now you can enter the Cloud Store URL to connect to object storage for the files to load. Click **Cloud Store** to see the location field.

![Click DATA LOAD](./images/dataload.png " ")
![Click CLOUD Store](./images/cloudlocations.png " ")

5. In the **What do you want to do with your data?** section, click **LOAD DATA** and choose **CLOUD STORE** for **Where is your data?** to set up the connection from your Autonomous Database to OCI Object Storage.

![Click CLOUD LOCATIONS](./images/cloudlocations.png " ")

6. Data Load > Load Cloud Object window will pop up and you need to copy the bucket URI into this field.
5. Copy the bucket URI into the location field.

Copy and paste the following URI into the URI + Load Data from Cloud Store:

@@ -57,23 +53,17 @@

In this step, we will perform some simple data loading tasks, to load in CSV files from Object Storage into tables in our Autonomous Database.

1. You will see a list of folders which is the data available from the object storage bucket that we can load into our Autonomous Database.

![Click Data Load](./images/backtodataload.png " ")

4. From the listed folders, drag the **customer\_contact**, **customer\_extension**, and **customer\_segment** folders over to the right hand pane and click **OK** to load all objects into one table for each of these folders.

5. Drag the **genre** and **movie** folders over to the right hand pane and click **OK**.
1. You will see a list of folders, which is the data available in the object storage bucket that we can load into our Autonomous Database. Select all of the folders and drag them to the right-hand pane to add the files.

6. And for fun, drag the **pizza_location** folder over to the right hand pane and click **OK**.
![Drag files to load](./images/backtodataload.png " ")

7. Click the Play button to run the data load job.
2. Click the Start button to run the data load job.

![Run the data load job](./images/runload2.png " ")

The job should take about 20 seconds to run.

8. Check that all three data load cards have green tick marks in them, indicating that the data load tasks have completed successfully.
3. Check that all three data load cards have green tick marks in them, indicating that the data load tasks have completed successfully.

![Check the job is completed](./images/loadcompleted2.png " ")

@@ -85,4 +75,4 @@ You may now proceed to the next lab.

* **Author** - Michelle Malcher, Database Product Management
* **Contributors** - Niay Panchal, Mike Matthew and Marty Gubar, Autonomous Database Product Management
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2023
* **Last Updated By/Date** - Michelle Malcher, Database Product Management, June 2024
