# Terraform External Data Source - Run Scripts and Fetch API Data

Use the Terraform external data source to run scripts and fetch data from APIs: shell scripts, Python scripts, JSON output, and common use cases.
data "external" "ip" {
program = ["bash", "-c", "echo '{\"ip\": \"'$(curl -s ifconfig.me)'\"}'" ]
}
output "my_ip" {
value = data.external.ip.result.ip
}The external data source:
data "external" "latest_ami" {
program = ["bash", "${path.module}/scripts/get-latest-ami.sh"]
query = {
region = var.region
os = "ubuntu"
}
}
output "ami_id" {
value = data.external.latest_ami.result.ami_id
}#!/bin/bash
# scripts/get-latest-ami.sh
# Read JSON input from stdin
eval "$(jq -r '@sh "REGION=\(.region) OS=\(.os)"')"
# Query AWS
AMI_ID=$(aws ec2 describe-images \
--region "$REGION" \
--owners 099720109477 \
--filters "Name=name,Values=${OS}/images/hvm-ssd/*" \
--query 'sort_by(Images, &CreationDate)[-1].ImageId' \
--output text)
# Output JSON (all values must be strings)
jq -n --arg ami_id "$AMI_ID" '{"ami_id": $ami_id}'data "external" "config" {
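Consuming the result is then ordinary HCL; a minimal sketch, where the `aws_instance` resource and its arguments are illustrative:

```hcl
# Sketch: use the script's result like any other data source attribute
resource "aws_instance" "app" {
  ami           = data.external.latest_ami.result.ami_id
  instance_type = "t3.micro" # illustrative instance type
}
```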
program = ["python3", "${path.module}/scripts/get-config.py"]
query = {
environment = var.environment
service = "api"
}
}#!/usr/bin/env python3
```python
#!/usr/bin/env python3
# scripts/get-config.py
import json
import sys

# Read query from stdin
query = json.load(sys.stdin)
env = query["environment"]
service = query["service"]

# Your logic here
config = {
    "endpoint": f"https://{service}.{env}.example.com",
    "replicas": "3" if env == "production" else "1",
    "log_level": "warn" if env == "production" else "debug",
}

# Output JSON (all values must be strings!)
json.dump(config, sys.stdout)
```
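Because every result value is a string, convert types back on the Terraform side; a small sketch (the local names are illustrative):

```hcl
locals {
  api_endpoint = data.external.config.result.endpoint
  # result values are always strings, so convert numbers explicitly
  api_replicas = tonumber(data.external.config.result.replicas)
}
```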
program = ["bash", "-c", <<-EOF
RELEASE=$(curl -s https://api.github.com/repos/hashicorp/terraform/releases/latest)
VERSION=$(echo $RELEASE | jq -r '.tag_name')
DATE=$(echo $RELEASE | jq -r '.published_at')
jq -n --arg v "$VERSION" --arg d "$DATE" '{"version": $v, "date": $d}'
EOF
]
}data "external" "secret" {
program = ["bash", "-c", <<-EOF
SECRET=$(aws ssm get-parameter \
--name "/myapp/api-key" \
--with-decryption \
--query 'Parameter.Value' \
--output text)
jq -n --arg s "$SECRET" '{"value": $s}'
EOF
]
}# ❌ Wrong — nested object
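To at least keep the secret out of CLI output (though not out of state), mark anything derived from it as sensitive; a sketch:

```hcl
output "api_key" {
  value     = data.external.secret.result.value
  sensitive = true # hides the value in plan/apply output
}
```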
echo '{"config": {"port": 8080}}'
# ❌ Wrong — non-string value
echo '{"port": 8080}'
# ✅ Correct — flat object with strings
echo '{"port": "8080", "host": "localhost"}'#!/bin/bash
## Error handling

Validate inputs and fail loudly; when the program exits nonzero, Terraform surfaces whatever it wrote to stderr:

```bash
#!/bin/bash
set -e

# Validate inputs
eval "$(jq -r '@sh "REGION=\(.region)"')" 2>/dev/null

if [ -z "$REGION" ]; then
  echo "Error: region is required" >&2
  exit 1
fi

# Your logic...
```

## Alternatives

Before reaching for `external`, check for a native option:

- native provider data sources (`aws_ami`, etc.)
- the `http` data source for plain REST calls (see the sketch below)
- dedicated secret data sources (`vault_generic_secret`, `aws_ssm_parameter`)
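For comparison, the GitHub release lookup above needs no external script at all; a minimal sketch using the hashicorp/http provider (v3+, where the attribute is `response_body`):

```hcl
data "http" "release" {
  url = "https://api.github.com/repos/hashicorp/terraform/releases/latest"
}

locals {
  latest_version = jsondecode(data.http.release.response_body).tag_name
}
```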
## Summary

The external data source is an escape hatch for data Terraform can't fetch natively. Use it for custom scripts, legacy APIs, or complex data lookups. Keep scripts fast, handle errors properly, and always output flat JSON with string values. Prefer native Terraform data sources when available.