All-in-one repository to manage Cloud assets in Google Cloud Platform (GCP). This repository contains IaC (Terraform) for all GCP infrastructure and the Python code necessary for automating the export of cloud assets. Using this code, you can then visualize the data using Google's Looker Studio:
- Prerequisites
- Deploy IaC for GCP - Service account, IAM, PubSub Topic, Cloud scheduler
- Deploy Cloud function
- View Looker Studio Dashboard
## Prerequisites

Before you begin, ensure you have the following:
- Google Cloud SDK installed and configured on your local machine (see the official install and configuration guides).
- Terraform installed on your local machine for infrastructure provisioning (see the official install guide).
- Sufficient IAM permissions to deploy and make the following changes:
  - Org-level IAM
  - Project-level IAM
  - Create a BigQuery dataset
  - Deploy Cloud Scheduler
  - Deploy Cloud Functions
  - Deploy a PubSub Topic
  - Create a Service account
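Before deploying, you can confirm the two CLI tools are available on your PATH. A minimal sketch; adapt as needed:

```shell
# Report whether a required CLI tool is available on PATH
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: NOT found - install it before continuing"
  fi
}

# The two CLIs this guide depends on
check_tool gcloud
check_tool terraform
```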
## Deploy IaC for GCP

- From the repository root, change into the Terraform directory:

  ```shell
  cd infrastructure
  ```

- Update the `asset-exporter.auto.tfvars` file with your inputs.
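The exact variable names are defined by this repository's Terraform configuration (see its `variables.tf`); purely as an illustration, the tfvars file might look like:

```hcl
# asset-exporter.auto.tfvars — illustrative values only; the variable
# names here are assumptions, check variables.tf for the real ones
org_id     = "123456789"
project_id = "example-prj"
region     = "europe-west2"
```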
- Initialise Terraform:

  ```shell
  terraform init
  ```

- Plan the changes:

  ```shell
  terraform plan
  ```
- Apply the changes:

  ```shell
  terraform apply -auto-approve
  ```

## Deploy Cloud function

- From the repository root, change into the application directory:
  ```shell
  cd application
  ```

- Update the variables on line 6 and line 7 in `main.py`:

  ```python
  6 org_id = "123456789"
  7 project_id = "example-prj"
  ```

- Run the gcloud command, replacing the following placeholders with your values:
  - `CLOUD_FUNCTION_NAME`
  - `PUBSUB_TOPIC_NAME`
  - `GCP_PROJECT`
  - `REGION`
  - `SERVICE_ACCOUNT_EMAIL`
  ```shell
  gcloud beta functions deploy CLOUD_FUNCTION_NAME \
    --runtime python311 \
    --trigger-topic PUBSUB_TOPIC_NAME \
    --entry-point pubsub_to_bigquery \
    --project GCP_PROJECT \
    --region REGION \
    --service-account SERVICE_ACCOUNT_EMAIL
  ```

## View Looker Studio Dashboard

Using Looker Studio, you can select the dataset created and drill down into all the assets in your GCP organisation.
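The `--entry-point` flag above names `pubsub_to_bigquery`, the function in `main.py` that Cloud Functions invokes for each Pub/Sub message. The real implementation in this repository performs the asset export; the following is only an illustrative sketch of the background-function signature and payload decoding, not the actual code:

```python
import base64
import json

# Sketch of a Pub/Sub-triggered Cloud Function entry point.
# The repository's real function would call the Cloud Asset API here
# and write the export into the BigQuery dataset created by Terraform.
def pubsub_to_bigquery(event, context):
    # Pub/Sub delivers the message body base64-encoded under "data"
    payload = base64.b64decode(event.get("data", b"")).decode("utf-8")
    message = json.loads(payload) if payload else {}
    # ...export logic would go here...
    return message
```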

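To verify data is flowing before opening the dashboard, you can trigger an export run manually instead of waiting for Cloud Scheduler by publishing a message to the topic. `PUBSUB_TOPIC_NAME` and `GCP_PROJECT` are the same placeholders as in the deploy command; the empty JSON body is an assumption, as your function may expect a specific payload:

```shell
gcloud pubsub topics publish PUBSUB_TOPIC_NAME \
  --project GCP_PROJECT \
  --message '{}'
```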
