Use Terraform null_resource for custom provisioners, triggers, and workarounds: local-exec, remote-exec, dependency management, and the built-in terraform_data replacement.

Re-run a deploy script whenever its contents change:
resource "null_resource" "deploy" {
  triggers = {
    script_hash = filemd5("${path.module}/deploy.sh")
  }

  provisioner "local-exec" {
    command = "./deploy.sh"
  }
}

Seed a database after its instance is created, re-running if the database is replaced:

resource "null_resource" "init_db" {
  triggers = {
    db_id = aws_db_instance.main.id
  }

  provisioner "local-exec" {
    command = "psql -h ${aws_db_instance.main.address} -U admin -f schema.sql"

    environment = {
      PGPASSWORD = var.db_password
    }
  }

  depends_on = [aws_db_instance.main]
}

Rebuild a deployment package whenever the source files change:

resource "null_resource" "build" {
  triggers = {
    source_hash = sha256(join("", [
      filemd5("${path.module}/src/main.py"),
      filemd5("${path.module}/src/requirements.txt"),
    ]))
  }

  provisioner "local-exec" {
    command     = "cd src && pip install -r requirements.txt -t build/ && cp *.py build/"
    working_dir = path.module
  }
}

Copy a config file to an instance and restart a service over SSH:

resource "null_resource" "configure" {
  triggers = {
    instance_id = aws_instance.web.id
  }

  connection {
    type        = "ssh"
    user        = "ubuntu"
    private_key = file("~/.ssh/id_rsa")
    host        = aws_instance.web.public_ip
  }

  provisioner "file" {
    source      = "configs/nginx.conf"
    destination = "/tmp/nginx.conf"
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mv /tmp/nginx.conf /etc/nginx/nginx.conf",
      "sudo systemctl restart nginx",
    ]
  }
}

Apply cluster configuration when the version or config file changes, or on every run:

resource "null_resource" "cluster_config" {
  triggers = {
    cluster_version = var.cluster_version
    config_hash     = filemd5("${path.module}/cluster-config.yaml")
    always_run      = timestamp() # forces the provisioner to run on every apply
  }

  provisioner "local-exec" {
    command     = "kubectl apply -f cluster-config.yaml"
    working_dir = path.module
  }
}

terraform_data is the built-in replacement for null_resource; no provider needed:
resource "terraform_data" "deploy" {
  triggers_replace = [
    filemd5("${path.module}/deploy.sh"),
    var.app_version,
  ]

  provisioner "local-exec" {
    command = "./deploy.sh ${var.app_version}"
  }
}
# Store and reference values
resource "terraform_data" "values" {
  input = {
    timestamp = timestamp()
    version   = var.app_version
  }
}
output "deploy_time" {
  value = terraform_data.values.output.timestamp
}

| Feature | null_resource | terraform_data |
|---|---|---|
| Provider needed | Yes (hashicorp/null) | No (built-in) |
| Available since | Always | Terraform 1.4 |
| Triggers | triggers map | triggers_replace list |
| Store values | No | Yes (input/output) |
| Provisioners | Yes | Yes |
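Beyond provisioners, terraform_data also works as a plumbing resource for lifecycle.replace_triggered_by, which only accepts references to managed resources, not variables. A minimal sketch, where the aws_instance arguments and the revision variable are illustrative:

```hcl
variable "revision" {
  type = string
}

# terraform_data tracks var.revision; when the value changes, this
# resource is replaced, which in turn forces the instance to be replaced.
resource "terraform_data" "revision" {
  input = var.revision
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI
  instance_type = "t3.micro"

  lifecycle {
    replace_triggered_by = [terraform_data.revision]
  }
}
```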
Use terraform_data (Terraform 1.4+) instead of null_resource for new projects: it's built-in and can store values. Use triggers to control when provisioners re-run. Both are escape hatches; prefer native Terraform resources when possible.
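Hashing individual files, as in the build example above, breaks down as a project grows. One common pattern is to hash everything under a directory with fileset; a sketch, assuming sources live under src/ in the module directory and a hypothetical build.sh script:

```hcl
resource "terraform_data" "build" {
  # Combine the hash of every file under src/ so the trigger value
  # changes whenever any source file is added, removed, or edited.
  triggers_replace = [
    sha1(join("", [
      for f in fileset(path.module, "src/**") : filesha1("${path.module}/${f}")
    ])),
  ]

  provisioner "local-exec" {
    command     = "./build.sh"
    working_dir = path.module
  }
}
```

fileset returns paths relative to its first argument, so each entry is prefixed with path.module before hashing; iteration over the resulting set is lexically ordered, keeping the combined hash deterministic.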