Terraform archive provider
Use the Terraform archive provider to create ZIP files for Lambda functions, Cloud Functions, and other deployment artifacts. The archive_file data source packages a directory (source_dir), a single file (source_file), or inline source blocks into an archive at output_path.
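The archive_file data source ships with the hashicorp/archive provider. A minimal provider requirement might look like this (the version constraint is illustrative, not prescriptive):

```hcl
terraform {
  required_providers {
    archive = {
      source  = "hashicorp/archive"
      version = "~> 2.4" # pin to whatever minor version you have tested
    }
  }
}
```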
data "archive_file" "lambda" {
type = "zip"
source_dir = "${path.module}/lambda"
output_path = "${path.module}/lambda.zip"
}
resource "aws_lambda_function" "main" {
filename = data.archive_file.lambda.output_path
source_code_hash = data.archive_file.lambda.output_base64sha256
# ...
}data "archive_file" "app" {
type = "zip"
source_dir = "${path.module}/src"
output_path = "${path.module}/dist/app.zip"
excludes = [
"node_modules",
".git",
"*.test.js",
"__pycache__",
]
}data "archive_file" "handler" {
type = "zip"
source_file = "${path.module}/handler.py"
output_path = "${path.module}/handler.zip"
}data "archive_file" "config" {
type = "zip"
output_path = "${path.module}/config.zip"
source {
content = jsonencode({ environment = var.environment })
filename = "config.json"
}
source {
content = templatefile("${path.module}/templates/startup.sh", { port = var.port })
filename = "startup.sh"
}
}data "archive_file" "deployment" {
type = "zip"
output_path = "${path.module}/deployment.zip"
source {
content = file("${path.module}/src/main.py")
filename = "main.py"
}
source {
content = file("${path.module}/src/utils.py")
filename = "utils.py"
}
source {
content = file("${path.module}/requirements.txt")
filename = "requirements.txt"
}
}

# Build step: install dependencies
resource "terraform_data" "pip_install" {
  triggers_replace = [filemd5("${path.module}/requirements.txt")]

  provisioner "local-exec" {
    command = "pip install -r requirements.txt -t ${path.module}/build/python/lib/python3.12/site-packages/ --upgrade"
  }
}
# Lambda layer (dependencies)
data "archive_file" "layer" {
  type        = "zip"
  source_dir  = "${path.module}/build"
  output_path = "${path.module}/dist/layer.zip"

  depends_on = [terraform_data.pip_install]
}
resource "aws_lambda_layer_version" "deps" {
filename = data.archive_file.layer.output_path
source_code_hash = data.archive_file.layer.output_base64sha256
layer_name = "${var.project}-deps"
compatible_runtimes = ["python3.12"]
}
# Lambda function (application code only)
data "archive_file" "function" {
  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "${path.module}/dist/function.zip"
}

resource "aws_lambda_function" "main" {
  filename         = data.archive_file.function.output_path
  source_code_hash = data.archive_file.function.output_base64sha256
  function_name    = var.project
  handler          = "main.handler"
  runtime          = "python3.12"
  layers           = [aws_lambda_layer_version.deps.arn]
  role             = aws_iam_role.lambda.arn
}data "archive_file" "function" {
type = "zip"
source_dir = "${path.module}/function-source"
output_path = "${path.module}/function-source.zip"
}
resource "google_storage_bucket_object" "source" {
name = "function-${data.archive_file.function.output_md5}.zip"
bucket = google_storage_bucket.source.name
source = data.archive_file.function.output_path
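Because the object name embeds the MD5 hash, a changed archive produces a new object, which in turn rolls out a new function version. A sketch of wiring the object into a first-generation Cloud Function follows; the resource name, var.project, entry point, and runtime value are assumptions for illustration:

```hcl
resource "google_cloudfunctions_function" "main" {
  name        = var.project       # assumed project-name variable
  runtime     = "python312"       # pick a runtime your region supports
  entry_point = "handler"         # assumed function entry point

  # New archive -> new object name -> function redeploys
  source_archive_bucket = google_storage_bucket.source.name
  source_archive_object = google_storage_bucket_object.source.name

  trigger_http = true
}
```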
}

| Attribute | Description |
|---|---|
| output_path | Path to the created archive |
| output_size | File size in bytes |
| output_base64sha256 | Base64-encoded SHA256 (for Lambda) |
| output_md5 | MD5 hash (for GCS object naming) |
| output_sha | SHA1 hash |
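These attributes can also be surfaced as outputs for debugging deployments; a small sketch, assuming the data.archive_file.lambda example above:

```hcl
output "artifact" {
  value = {
    path   = data.archive_file.lambda.output_path
    size   = data.archive_file.lambda.output_size        # bytes
    sha256 = data.archive_file.lambda.output_base64sha256
  }
}
```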
Use archive_file to package Lambda functions, Cloud Functions, and deployment artifacts directly in Terraform. Referencing output_base64sha256 (as source_code_hash) or output_md5 in resource arguments triggers redeployment whenever the source code changes. Use excludes to skip test files and vendored dependencies.