Connect Your GCP Project
Enable Fireconduit to run backfill jobs in your GCP project
To run backfill jobs, Fireconduit needs to connect to your GCP project. This one-time setup takes about 5 minutes and deploys a small Cloud Function that receives triggers from Fireconduit and launches Dataflow jobs in your project.
Your data stays in your GCP project at all times; Fireconduit only orchestrates the jobs.
Prerequisites
Before you begin, make sure you have:
- A Fireconduit account with an API key (find it in Settings → API Keys)
- A GCP project with billing enabled
- Terraform installed (install guide)
- gcloud CLI installed and authenticated (install guide)
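Before running the wizard, you can confirm the tooling is in place. A minimal sketch that checks for the commands listed above:

```shell
#!/bin/sh
# Report whether each required CLI tool is on the PATH.
for cmd in terraform gcloud npx; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok: $cmd"
  else
    echo "missing: $cmd"
  fi
done
```

Any `missing:` line points at a prerequisite to install before continuing.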
Connect with the CLI
The quickest way to connect your project:
```shell
npx fireconduit init
```
This wizard will:
- Check that Terraform and gcloud are installed
- Ask for your GCP project ID and region
- Ask for your Fireconduit API key
- Generate Terraform configuration files
Then deploy:
```shell
cd .fireconduit
terraform init
terraform apply
```
Once complete, your GCP project appears in the Fireconduit dashboard and you can start creating pipelines.
Manual Setup
If you prefer to write the Terraform configuration yourself:
main.tf

```hcl
module "fireconduit" {
  source              = "github.com/fireconduit/terraform-fireconduit"
  project_id          = var.project_id
  region              = var.region
  fireconduit_api_key = var.fireconduit_api_key
}
```
variables.tf

```hcl
variable "project_id" {
  description = "Your GCP project ID"
  type        = string
}

variable "region" {
  description = "GCP region (e.g., us-central1, europe-west2)"
  type        = string
  default     = "us-central1"
}

variable "fireconduit_api_key" {
  description = "Your API key from the Fireconduit dashboard"
  type        = string
  sensitive   = true
}
```
terraform.tfvars

```hcl
project_id          = "your-project-id"
region              = "us-central1"
fireconduit_api_key = "fc_your_api_key_here"
```
Then run:

```shell
terraform init
terraform apply
```
Configuration Options
| Variable | Description | Default |
|---|---|---|
| `project_id` | Your GCP project ID | Required |
| `region` | GCP region for resources | `us-central1` |
| `fireconduit_api_key` | API key from the Fireconduit dashboard | Required |
| `function_name` | Cloud Function name | `fireconduit-trigger` |
| `dataflow_network` | VPC network for Dataflow jobs | `default` |
| `dataflow_subnetwork` | Subnetwork for Dataflow jobs | `null` |
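As an illustration, a module block that sets the optional variables from the table (the function name, network, and subnetwork values here are placeholders, not real resource names):

```hcl
module "fireconduit" {
  source              = "github.com/fireconduit/terraform-fireconduit"
  project_id          = var.project_id
  region              = var.region
  fireconduit_api_key = var.fireconduit_api_key

  # Optional overrides (example values)
  function_name       = "fireconduit-trigger-prod"
  dataflow_network    = "my-vpc"
  dataflow_subnetwork = "projects/my-project/regions/us-central1/subnetworks/my-subnet"
}
```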
VPC Service Controls
If your project uses VPC Service Controls:
```hcl
module "fireconduit" {
  source              = "github.com/fireconduit/terraform-fireconduit"
  project_id          = var.project_id
  region              = var.region
  fireconduit_api_key = var.fireconduit_api_key

  # VPC-SC configuration
  enable_vpc_connector = true
  vpc_connector        = "projects/my-project/locations/us-central1/connectors/my-connector"
  dataflow_network     = "my-vpc"
  dataflow_subnetwork  = "projects/my-project/regions/us-central1/subnetworks/my-subnet"
  disable_public_ips   = true
}
```
Troubleshooting
Permission errors during deployment
Your account needs these roles:
- `roles/cloudfunctions.admin`
- `roles/iam.serviceAccountAdmin`
- `roles/storage.admin`
- `roles/secretmanager.admin`
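If the roles need to be granted, a sketch using the standard `gcloud projects add-iam-policy-binding` command (project ID and member are placeholders; the loop prints each command for review rather than executing it, so remove the `echo` to apply):

```shell
#!/bin/sh
# Print the IAM binding commands for the four roles listed above.
PROJECT_ID="your-project-id"        # placeholder
MEMBER="user:you@example.com"       # placeholder
for role in roles/cloudfunctions.admin roles/iam.serviceAccountAdmin \
            roles/storage.admin roles/secretmanager.admin; do
  echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="$MEMBER" --role="$role"
done
```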
Backfill jobs failing
Run the doctor command to diagnose issues:
```shell
npx fireconduit doctor
```
This checks your configuration, permissions, and connectivity.
Updating
To update to the latest version:
```shell
terraform init -upgrade
terraform apply
```
Disconnecting
To remove Fireconduit from your project:
```shell
terraform destroy
```
This removes the Cloud Function and related resources. Your Firestore and BigQuery data are not affected.