GoogleCloudPlatform/solutions-genai-llm-workshop


Overview

This is part of the JAPAC Generative AI Technical Workshop on Qwiklabs. The workshop walks the audience through:

  • Google Generative AI Language offerings
  • Langchain integration

Provision Cloud Resources

  1. Configure the Google Cloud environment.

    If you are running the lab in the Qwiklabs environment, you can skip step 2.

  2. To manually configure the Google Cloud project, use Terraform to create and configure the required resources.

  • Go to the terraform/qwiklabs folder.

    cd terraform/qwiklabs
  • Create a terraform.tfvars file with the following content:

    gcp_project_id = <YOUR GCP PROJECT ID>
    gcp_region = <DEFAULT GCP REGION>
    gcp_zone = <DEFAULT GCP ZONE>
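
    For example, the file can be generated from the shell (the values below are illustrative only; substitute your own project ID, region, and zone):

    # write terraform.tfvars with sample values (replace them before applying)
    cat > terraform.tfvars <<EOF
    gcp_project_id = "my-sample-project"
    gcp_region     = "us-central1"
    gcp_zone       = "us-central1-a"
    EOF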
  • Apply Terraform to provision the Google Cloud resources.

    terraform init
    terraform plan -var-file=terraform.tfvars
    terraform apply -var-file=terraform.tfvars

    This will create the following resources:

    1. A VPC with firewall rules that allow inbound TCP traffic on ports 80, 8080, and 23.
    2. Service Networking peering with the VPC.

At this point, you have provisioned required cloud resources.
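
To verify the provisioned networking resources, you can list them with gcloud (a quick optional check; NETWORK_NAME below is a placeholder for whatever VPC name the Terraform module created):

    # List the VPC networks and firewall rules created by Terraform
    gcloud compute networks list
    gcloud compute firewall-rules list

    # Check the Service Networking peering (replace NETWORK_NAME with the VPC name)
    gcloud services vpc-peerings list --network=NETWORK_NAME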

Create Vertex AI Workbench as the Lab Environment

In this lab, we use Vertex AI Workbench as the lab environment.

  1. Follow the instructions to provision a Vertex AI Workbench instance.
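
    If you prefer the command line, a user-managed notebooks instance can also be created with gcloud. This is a minimal sketch; the instance name, image family, machine type, and zone below are illustrative, not values prescribed by the workshop:

    # create a user-managed notebooks instance from the Deep Learning VM image family
    gcloud notebooks instances create genai-workshop-lab \
        --vm-image-project=deeplearning-platform-release \
        --vm-image-family=common-cpu-notebooks \
        --machine-type=n1-standard-4 \
        --location=us-central1-a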

  2. Once the Workbench instance is created, open the notebook.

  3. Open terminal.

  4. Run the following commands in the terminal.

    export GOOGLE_CLOUD_PROJECT=$(gcloud config get project)
    export GOOGLE_CLOUD_REGION=us-central1
    export GOOGLE_CLOUD_ZONE=us-central1-a
    
    git clone https://github.com/GoogleCloudPlatform/solutions-genai-llm-workshop
    
    cd solutions-genai-llm-workshop
    
    python3 -m venv .venv
    
    curl -sSL https://raw.githubusercontent.com/python-poetry/install.python-poetry.org/385616cd90816622a087450643fba971d3b46d8c/install-poetry.py | python3 -
    
    source .venv/bin/activate
    curl -sS https://bootstrap.pypa.io/get-pip.py | python3
    
    pip install -r requirements.in
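
    As a quick sanity check that the virtual environment is active and the dependencies were installed (the package names grepped for are assumptions based on the workshop topics, not an exhaustive list):

    which python3   # should resolve to .venv/bin/python3
    pip list | grep -iE 'langchain|aiplatform'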
  5. Authenticate to the Google Cloud Project

    gcloud auth login # Login with project owner account
    gcloud auth application-default login # Login with project owner account
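
    You can confirm which account is active and that Application Default Credentials were written:

    gcloud auth list   # lists credentialed accounts and marks the active one
    gcloud auth application-default print-access-token >/dev/null && echo "ADC OK"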
    
  6. Assign required roles to the user.

    export USER_EMAIL=<USER ACCOUNT EMAIL>
    gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT --member=user:$USER_EMAIL --role=roles/ml.admin
    
    gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT --member=user:$USER_EMAIL --role=roles/aiplatform.admin
    
    gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT --member=user:$USER_EMAIL --role=roles/aiplatform.user
    
    gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT --member=user:$USER_EMAIL --role=roles/serviceusage.serviceUsageConsumer
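
    To verify the bindings, you can list the roles granted to the account:

    # prints every role bound to $USER_EMAIL in the project
    gcloud projects get-iam-policy $GOOGLE_CLOUD_PROJECT \
        --flatten="bindings[].members" \
        --filter="bindings.members:user:$USER_EMAIL" \
        --format="value(bindings.role)"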
    
  7. Create the BigQuery dataset.

    python3 1-create-and-copy-bq-data.py
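
    To confirm the dataset exists, list the datasets in the project (the dataset name is whatever the script creates):

    # lists BigQuery datasets in the project
    bq ls --project_id=$GOOGLE_CLOUD_PROJECT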
  8. Create the Vertex AI Matching Engine index; this can take around 60 minutes.

    curl -L https://tinyurl.com/genai-202307-dataset --output dataset.zip
    unzip dataset.zip
    rm dataset.zip
    
    python3 0-setup-matching-enging.py
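
    Once the script finishes, you can confirm that the index and index endpoint exist; the region is assumed to be the GOOGLE_CLOUD_REGION exported earlier:

    gcloud ai indexes list --region=$GOOGLE_CLOUD_REGION
    gcloud ai index-endpoints list --region=$GOOGLE_CLOUD_REGION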
