Why we created the GCP exporter
Our vision is to bring all relevant data into the internal developer portal, and Google Cloud Platform data - like all cloud resource data - is an important piece of the puzzle. Our goal is to bring live cloud resource data into the software catalog so that it can be shown in context, covering both metadata and runtime data, and making the lives of developers and DevOps engineers easier.
How does it work?
Port’s GCP exporter is based on our Terraform provider. The Terraform provider allows us to create the relevant Port Blueprints for GCP objects, while simultaneously ingesting the data and creating entities based on those blueprints.
In addition to creating the data in Port, the GCP exporter is integrated with the GCP Terraform provider, allowing you to select anything from the complete GCP resource list and add data about specific assets such as buckets, disks, container clusters and more. Terraform is an optimal technology for this solution since it natively keeps the state of the data in Port and compares it to GCP.
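To make this concrete, here is a minimal sketch of what such a Terraform configuration can look like: the Google provider reads live data about a project, and the Port provider turns it into a catalog entity. This is an illustrative sketch rather than the exporter’s actual module - the Port resource and attribute names (port_entity, client_id, string_props and so on) follow the port-labs Terraform provider but have changed between provider versions, so check the provider docs before using it.

```hcl
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
    port = {
      source = "port-labs/port-labs"
    }
  }
}

# GCP side: the Google provider reads live resource data
provider "google" {
  project = "my-gcp-project" # replace with your GCP project ID
}

# Port side: the Port provider writes blueprints and entities into the catalog
provider "port" {
  client_id = "PORT_CLIENT_ID"     # replace with your Port API credentials
  secret    = "PORT_CLIENT_SECRET"
}

# Read metadata about the current GCP project
data "google_project" "current" {}

# Map it onto a catalog entity; assumes a "gcpProject" blueprint exists in Port
# (a blueprint sketch appears later in this post)
resource "port_entity" "gcp_project" {
  identifier = data.google_project.current.project_id
  title      = data.google_project.current.name
  blueprint  = "gcpProject"
  properties = {
    string_props = {
      "number" = data.google_project.current.number
    }
  }
}
```

Because Terraform keeps this configuration in its state, running terraform plan shows how the catalog has drifted from GCP, and terraform apply reconciles the two.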
Working with the GCP exporter
Just like other Port exporters, such as the AWS exporter or the GitHub exporter, you can use the GCP exporter to take cloud data from your GCP account and insert it into the software catalog. This includes Projects, Storage Buckets, Service Accounts, Memorystores, Compute Instances, Container Clusters and essentially any GCP resource. Port covers the resource types GCP supports, giving you the ability to create the right abstractions and views in Port - by developer, team and more. Seeing your GCP accounts in the software catalog lets you tie cloud resources to services, alerts, FinOps and anything else in Port.
The way this works in Port is by defining blueprints. A blueprint defines the data model for a certain type of entity in your software catalog, along with its relations to other entities; as such, blueprints are the main building block in Port. Once a blueprint is populated with data (in our case, by Port’s GCP exporter), it creates software catalog entities. Blueprints can represent any asset in Port - in the case of the GCP exporter, entities such as Storage Buckets, Compute Instances, Memorystores and more. The list of blueprints isn’t closed, and you can create a blueprint to represent any GCP resource.
Port’s GCP exporter creates the blueprints and exports the data using Terraform. You can either work with a general GCP asset blueprint that pulls generic GCP metadata, or create specific blueprints for specific asset types, as in the sketch below.
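For example, a specific blueprint for Storage Buckets could look roughly like this. The properties and relations schema shown is an assumption based on the port-labs Terraform provider and may differ in the version you use; the gcpStorageBucket and gcpProject identifiers are just illustrative names.

```hcl
# A specific blueprint for GCP Storage Buckets, with a relation back to the project
resource "port_blueprint" "storage_bucket" {
  identifier = "gcpStorageBucket"
  title      = "Storage Bucket"
  properties = {
    string_props = {
      "location" = {
        title = "Location"
      }
      "storageClass" = {
        title = "Storage Class"
      }
    }
  }
  relations = {
    "project" = {
      title  = "Project"
      target = "gcpProject" # the Project blueprint shown in the next sketch
    }
  }
}
```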
As an example, consider a Project blueprint and its related entities: first the relevant blueprints are defined, and then the related entities are created, as shown in the sketch below.
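This sketch is illustrative rather than exporter output: it shows the Project blueprint itself, plus a bucket entity related to it. The google_storage_bucket data source needs a reasonably recent Google provider, the entity relations wiring is an assumption to verify against the port-labs provider docs, and the blueprint and data source references point back to the earlier sketches.

```hcl
# The Project blueprint that bucket entities relate to
resource "port_blueprint" "gcp_project" {
  identifier = "gcpProject"
  title      = "GCP Project"
  properties = {
    string_props = {
      "number" = {
        title = "Project Number"
      }
    }
  }
}

# Read an existing bucket from GCP (replace the name with one of your buckets)
data "google_storage_bucket" "assets" {
  name = "my-assets-bucket"
}

# Create the bucket entity and relate it to its parent project entity.
# The relations schema below is illustrative; check the port-labs provider docs.
resource "port_entity" "assets_bucket" {
  identifier = data.google_storage_bucket.assets.name
  title      = data.google_storage_bucket.assets.name
  blueprint  = port_blueprint.storage_bucket.identifier # from the blueprint sketch above
  properties = {
    string_props = {
      "location"     = data.google_storage_bucket.assets.location
      "storageClass" = data.google_storage_bucket.assets.storage_class
    }
  }
  relations = {
    single_relations = {
      "project" = data.google_project.current.project_id # from the first sketch
    }
  }
}
```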


Some more information about the GCP exporter
Want to know more? Check our docs.

Book a demo right now to check out Port's developer portal yourself
