- Query data from region-eu.INFORMATION_SCHEMA.JOBS_BY_PROJECT
- OUTPUT: VIEW some_data_monitoring.BigQuery (BigQuery.sql)
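The actual query lives in BigQuery.sql; as a rough illustration of what a task querying JOBS_BY_PROJECT looks like, here is a hypothetical sketch. The selected columns are standard JOBS_BY_PROJECT fields, but the exact query, columns, and time window used by this project are assumptions.

```python
# Hypothetical sketch only -- the real query is defined in BigQuery.sql.
def build_jobs_query(hours: int = 1) -> str:
    """Build a SQL string selecting recent jobs from JOBS_BY_PROJECT."""
    return f"""
        SELECT job_id, user_email, total_bytes_processed, creation_time
        FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
        WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL {hours} HOUR)
    """

query = build_jobs_query(hours=24)
```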
You must have set up Cloud Billing data export to BigQuery first.
- Get preprocessed data from the Billing export.
- FREQUENCY: Once an hour.
- OUTPUT: some_data_monitoring.billing
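For intuition, a billing task like this typically aggregates the exported billing rows before writing them to some_data_monitoring.billing. The sketch below is an assumption, not this project's actual query; in particular, the source table name depends on your billing account (the billing export table id is not stated here).

```python
# Hypothetical aggregation sketch; the real preprocessing is done by the task code.
def build_billing_query(source_table: str) -> str:
    """Build a SQL string aggregating exported billing cost per service and day."""
    return f"""
        SELECT service.description AS service,
               DATE(usage_start_time) AS usage_date,
               SUM(cost) AS total_cost
        FROM `{source_table}`
        GROUP BY service, usage_date
    """

# Placeholder table id for illustration only.
q = build_billing_query("my-project.billing_export.gcp_billing_export_v1")
```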
To test the code locally, you need to export this environment variable:
export GCP_SA=$(cat secret/some-data-monitoring.json)
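How the tasks consume GCP_SA is not shown here; a plausible sketch is that they parse it as the service-account JSON. The keys below (`type`, `project_id`) are standard fields of a GCP service-account file, but the exact usage is an assumption.

```python
import json
import os

# Demo value for illustration only; locally this comes from the export above.
os.environ["GCP_SA"] = '{"type": "service_account", "project_id": "demo"}'

# Parse the service-account JSON from the environment variable.
creds = json.loads(os.environ["GCP_SA"])
project = creds["project_id"]
```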
Create a local virtual environment (venv) in which to run the code:
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r src/requirements.txt
Then you can run the following command:
$ python src/[TASK].py --conf conf/some-data-monitoring.yaml
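Each task script presumably parses the --conf flag shown above; a minimal sketch of such a CLI is below. The --conf flag matches the invocation in this README, but the rest of the argument handling is an assumption.

```python
import argparse

def parse_args(argv=None):
    """Parse the command line of a some-data-monitoring task (sketch)."""
    parser = argparse.ArgumentParser(description="some-data-monitoring task")
    parser.add_argument("--conf", required=True,
                        help="Path to the YAML configuration file")
    return parser.parse_args(argv)

# Mirror the invocation shown above.
args = parse_args(["--conf", "conf/some-data-monitoring.yaml"])
```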
Make sure to have Airflow 2 (pip install "apache-airflow[kubernetes]") and Minikube (minikube_latest_amd64.deb) installed on your machine.
Start your Airflow webserver and scheduler:
$ airflow webserver -p PORT
$ airflow scheduler
From the project root directory, start Minikube, create the namespace, ConfigMap, and Secret, deploy the DAG, and build the code:
$ ./dev/deploy_local.sh full
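For reference, the Kubernetes objects the script creates might look like the following fragment. All names and keys here are hypothetical illustrations, not the script's actual output; check dev/deploy_local.sh for the real definitions.

```yaml
# Hypothetical sketch of the objects deploy_local.sh full might create.
apiVersion: v1
kind: Namespace
metadata:
  name: some-data-monitoring
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: some-data-monitoring-conf
  namespace: some-data-monitoring
data:
  some-data-monitoring.yaml: |
    # contents of conf/some-data-monitoring.yaml
---
apiVersion: v1
kind: Secret
metadata:
  name: some-data-monitoring-sa
  namespace: some-data-monitoring
type: Opaque
stringData:
  some-data-monitoring.json: |
    # contents of secret/some-data-monitoring.json
```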
Open the Airflow webserver at http://127.0.0.1:PORT/home