Terraform for AWS Serverless: Lambda, API Gateway & DynamoDB
Provision a complete serverless stack on AWS using Terraform — Lambda functions, an API Gateway HTTP API, DynamoDB tables, IAM roles, and CloudWatch log groups.
Serverless on AWS: The Architecture
A classic serverless stack uses three AWS services working together:
Client
  │
  ▼
API Gateway (HTTP triggers)
  │
  ▼
Lambda Function (business logic, no servers)
  │
  ▼
DynamoDB (fast, scalable NoSQL)
Benefits of serverless:
- No servers to patch or manage
- Pay per invocation (pennies at low traffic)
- Automatic scaling from 0 to thousands of concurrent requests
- AWS handles availability and durability
In this lesson you'll provision this entire stack with Terraform — fully reproducible, version-controlled infrastructure.
Project Structure
terraform-serverless/
├── main.tf
├── variables.tf
├── outputs.tf
├── versions.tf
├── iam.tf          # IAM roles and policies
├── lambda.tf       # Lambda functions
├── api_gateway.tf  # API Gateway
├── dynamodb.tf     # DynamoDB tables
├── cloudwatch.tf   # Log groups and alarms
└── lambda_src/
    └── handler.py  # Lambda function code
Versions and Provider
# versions.tf
terraform {
  required_version = ">= 1.6"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
    archive = {
      source  = "hashicorp/archive"
      version = "~> 2.0"
    }
  }

  backend "s3" {
    bucket         = "my-terraform-state"
    key            = "serverless/terraform.tfstate"
    region         = "us-east-1"
    encrypt        = true
    dynamodb_table = "terraform-state-locks"
  }
}

provider "aws" {
  region = var.aws_region

  default_tags {
    tags = {
      Project     = var.project_name
      Environment = var.environment
      ManagedBy   = "terraform"
    }
  }
}
Variables
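The defaults declared in variables.tf below suit a dev setup; per environment you would typically override them with a tfvars file. A hypothetical dev.tfvars:

```hcl
# dev.tfvars (hypothetical) — overrides for the variables declared below
environment        = "dev"
lambda_memory_mb   = 128
log_retention_days = 7
```

Pass it at plan/apply time with `terraform plan -var-file=dev.tfvars`.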
# variables.tf
variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-east-1"
}

variable "environment" {
  description = "Deployment environment"
  type        = string
  default     = "dev"

  validation {
    condition     = contains(["dev", "staging", "prod"], var.environment)
    error_message = "Must be dev, staging, or prod."
  }
}

variable "project_name" {
  description = "Project name prefix"
  type        = string
  default     = "learnixo"
}

variable "lambda_runtime" {
  description = "Lambda runtime"
  type        = string
  default     = "python3.12"
}

variable "lambda_memory_mb" {
  description = "Lambda memory allocation in MB"
  type        = number
  default     = 256
}

variable "lambda_timeout_seconds" {
  description = "Lambda execution timeout"
  type        = number
  default     = 30
}

variable "log_retention_days" {
  description = "CloudWatch log retention period"
  type        = number
  default     = 14
}
DynamoDB Table
# dynamodb.tf
resource "aws_dynamodb_table" "items" {
  name         = "${var.project_name}-${var.environment}-items"
  billing_mode = "PAY_PER_REQUEST" # No capacity planning needed
  hash_key     = "pk"
  range_key    = "sk"

  attribute {
    name = "pk"
    type = "S"
  }

  attribute {
    name = "sk"
    type = "S"
  }

  attribute {
    name = "gsi1pk"
    type = "S"
  }

  # Global Secondary Index for alternate access patterns
  global_secondary_index {
    name            = "gsi1"
    hash_key        = "gsi1pk"
    projection_type = "ALL"
  }

  # Enable point-in-time recovery for production
  point_in_time_recovery {
    enabled = var.environment == "prod"
  }

  # Encryption at rest (AWS managed key by default)
  server_side_encryption {
    enabled = true
  }

  ttl {
    attribute_name = "expires_at"
    enabled        = true
  }

  tags = {
    Name = "${var.project_name}-${var.environment}-items"
  }
}

output "dynamodb_table_name" {
  value = aws_dynamodb_table.items.name
}

output "dynamodb_table_arn" {
  value = aws_dynamodb_table.items.arn
}
IAM Role for Lambda
Lambda needs an execution role — it must be able to write logs and access DynamoDB.
# iam.tf
data "aws_iam_policy_document" "lambda_assume_role" {
  statement {
    effect  = "Allow"
    actions = ["sts:AssumeRole"]

    principals {
      type        = "Service"
      identifiers = ["lambda.amazonaws.com"]
    }
  }
}

resource "aws_iam_role" "lambda_execution" {
  name               = "${var.project_name}-${var.environment}-lambda-role"
  assume_role_policy = data.aws_iam_policy_document.lambda_assume_role.json
}

# Attach the AWS managed policy for basic Lambda execution (CloudWatch Logs)
resource "aws_iam_role_policy_attachment" "lambda_basic_execution" {
  role       = aws_iam_role.lambda_execution.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}

# Custom policy: DynamoDB access (least privilege)
data "aws_iam_policy_document" "lambda_dynamodb" {
  statement {
    effect = "Allow"
    actions = [
      "dynamodb:GetItem",
      "dynamodb:PutItem",
      "dynamodb:UpdateItem",
      "dynamodb:DeleteItem",
      "dynamodb:Query",
      "dynamodb:Scan",
    ]
    resources = [
      aws_dynamodb_table.items.arn,
      "${aws_dynamodb_table.items.arn}/index/*",
    ]
  }
}

resource "aws_iam_role_policy" "lambda_dynamodb" {
  name   = "dynamodb-access"
  role   = aws_iam_role.lambda_execution.id
  policy = data.aws_iam_policy_document.lambda_dynamodb.json
}
Lambda Function
Terraform packages the Lambda code as a zip file using the archive provider.
# lambda.tf
# Package the Lambda source code
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_dir  = "${path.module}/lambda_src"
  output_path = "${path.module}/.terraform/lambda.zip"
}

resource "aws_lambda_function" "api" {
  function_name = "${var.project_name}-${var.environment}-api"
  role          = aws_iam_role.lambda_execution.arn
  runtime       = var.lambda_runtime
  handler       = "handler.lambda_handler"
  filename      = data.archive_file.lambda_zip.output_path

  # Redeploy only when code changes (content hash)
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256

  memory_size = var.lambda_memory_mb
  timeout     = var.lambda_timeout_seconds

  environment {
    variables = {
      ENVIRONMENT    = var.environment
      DYNAMODB_TABLE = aws_dynamodb_table.items.name
      LOG_LEVEL      = var.environment == "prod" ? "WARNING" : "DEBUG"
    }
  }

  depends_on = [
    aws_iam_role_policy_attachment.lambda_basic_execution,
    aws_cloudwatch_log_group.lambda,
  ]
}

# Lambda alias for stable reference (e.g., "live")
resource "aws_lambda_alias" "live" {
  name             = "live"
  function_name    = aws_lambda_function.api.function_name
  function_version = "$LATEST" # fine for dev; in prod, publish versions and point the alias at one
}
The Lambda handler code:
# lambda_src/handler.py
import json
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ["DYNAMODB_TABLE"])

def lambda_handler(event, context):
    # Support both REST API (v1) and HTTP API (v2) event shapes
    http_method = event.get("httpMethod") or event.get("requestContext", {}).get("http", {}).get("method")
    path = event.get("path") or event.get("rawPath", "/")
    try:
        if http_method == "GET" and path.startswith("/items/"):
            item_id = path.split("/items/")[1]
            return get_item(item_id)
        elif http_method == "POST" and path == "/items":
            body = json.loads(event.get("body") or "{}")
            return create_item(body)
        else:
            return respond(404, {"error": "Not found"})
    except Exception as e:
        print(f"Error: {e}")
        return respond(500, {"error": "Internal server error"})

def get_item(item_id):
    result = table.get_item(Key={"pk": f"ITEM#{item_id}", "sk": "METADATA"})
    item = result.get("Item")
    if not item:
        return respond(404, {"error": "Item not found"})
    return respond(200, item)

def create_item(body):
    item_id = str(uuid.uuid4())
    item = {"pk": f"ITEM#{item_id}", "sk": "METADATA", "id": item_id, **body}
    table.put_item(Item=item)
    return respond(201, item)

def respond(status_code, body):
    return {
        "statusCode": status_code,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body, default=str),
    }
API Gateway (HTTP API)
AWS offers two API Gateway variants: REST APIs (v1) and HTTP APIs (v2). HTTP APIs are simpler to configure, cheaper per request, and usually the better fit for straightforward Lambda proxy integrations.
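For a sense of the difference, here is a sketch of what the same catch-all Lambda proxy would need as a REST API (v1). Resource names here are illustrative and this block is not part of the lesson's stack:

```hcl
# Sketch only: REST API (v1) equivalent of a catch-all Lambda proxy.
resource "aws_api_gateway_rest_api" "v1_example" {
  name = "example-rest-api"
}

# A greedy {proxy+} resource to catch every path
resource "aws_api_gateway_resource" "proxy" {
  rest_api_id = aws_api_gateway_rest_api.v1_example.id
  parent_id   = aws_api_gateway_rest_api.v1_example.root_resource_id
  path_part   = "{proxy+}"
}

resource "aws_api_gateway_method" "proxy" {
  rest_api_id   = aws_api_gateway_rest_api.v1_example.id
  resource_id   = aws_api_gateway_resource.proxy.id
  http_method   = "ANY"
  authorization = "NONE"
}

resource "aws_api_gateway_integration" "lambda" {
  rest_api_id             = aws_api_gateway_rest_api.v1_example.id
  resource_id             = aws_api_gateway_resource.proxy.id
  http_method             = aws_api_gateway_method.proxy.http_method
  type                    = "AWS_PROXY"
  integration_http_method = "POST" # Lambda is always invoked with POST
  uri                     = aws_lambda_function.api.invoke_arn
}

# ...plus aws_api_gateway_deployment, aws_api_gateway_stage, and a
# separate aws_lambda_permission — roughly double the resources.
```

The HTTP API below collapses all of this into an api, an integration, a route, and a stage.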
# api_gateway.tf
resource "aws_apigatewayv2_api" "main" {
  name          = "${var.project_name}-${var.environment}-api"
  protocol_type = "HTTP"

  cors_configuration {
    allow_headers = ["content-type", "authorization"]
    allow_methods = ["GET", "POST", "PUT", "DELETE", "OPTIONS"]
    allow_origins = var.environment == "prod" ? ["https://learnixo.io"] : ["*"]
    max_age       = 300
  }
}

# Integration: connect API Gateway to Lambda
resource "aws_apigatewayv2_integration" "lambda" {
  api_id                 = aws_apigatewayv2_api.main.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.api.invoke_arn
  integration_method     = "POST"
  payload_format_version = "2.0"
}

# Route: catch-all proxy to Lambda
resource "aws_apigatewayv2_route" "proxy" {
  api_id    = aws_apigatewayv2_api.main.id
  route_key = "$default"
  target    = "integrations/${aws_apigatewayv2_integration.lambda.id}"
}

# Stage: auto-deploy
resource "aws_apigatewayv2_stage" "default" {
  api_id      = aws_apigatewayv2_api.main.id
  name        = "$default"
  auto_deploy = true

  access_log_settings {
    destination_arn = aws_cloudwatch_log_group.api_gateway.arn
    format = jsonencode({
      requestId      = "$context.requestId"
      sourceIp       = "$context.identity.sourceIp"
      httpMethod     = "$context.httpMethod"
      path           = "$context.path"
      status         = "$context.status"
      responseLength = "$context.responseLength"
      responseTime   = "$context.responseTimeInMillis"
    })
  }

  default_route_settings {
    throttling_burst_limit = 100
    throttling_rate_limit  = 50
  }
}

# Permission: allow API Gateway to invoke Lambda
resource "aws_lambda_permission" "api_gateway" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.api.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_apigatewayv2_api.main.execution_arn}/*/*"
}
CloudWatch Log Groups
# cloudwatch.tf
resource "aws_cloudwatch_log_group" "lambda" {
  name              = "/aws/lambda/${var.project_name}-${var.environment}-api"
  retention_in_days = var.log_retention_days
}

resource "aws_cloudwatch_log_group" "api_gateway" {
  name              = "/aws/api-gateway/${var.project_name}-${var.environment}"
  retention_in_days = var.log_retention_days
}

# Alarm: more than 5 Lambda errors across two consecutive 1-minute periods
resource "aws_cloudwatch_metric_alarm" "lambda_errors" {
  alarm_name          = "${var.project_name}-${var.environment}-lambda-errors"
  comparison_operator = "GreaterThanThreshold"
  evaluation_periods  = 2
  metric_name         = "Errors"
  namespace           = "AWS/Lambda"
  period              = 60
  statistic           = "Sum"
  threshold           = 5
  alarm_description   = "Lambda error count too high"
  treat_missing_data  = "notBreaching"

  dimensions = {
    FunctionName = aws_lambda_function.api.function_name
  }
}
Outputs
# outputs.tf
output "api_endpoint" {
  description = "API Gateway invoke URL"
  value       = aws_apigatewayv2_stage.default.invoke_url
}

output "lambda_function_name" {
  description = "Lambda function name"
  value       = aws_lambda_function.api.function_name
}

output "lambda_function_arn" {
  description = "Lambda function ARN"
  value       = aws_lambda_function.api.arn
}

output "lambda_log_group" {
  description = "CloudWatch log group for Lambda"
  value       = aws_cloudwatch_log_group.lambda.name
}
Deploy and Test
# Initialize
terraform init
# Preview
terraform plan -var="environment=dev"
# Deploy
terraform apply -var="environment=dev" -auto-approve
# Get the API URL
terraform output api_endpoint
# → https://abc123.execute-api.us-east-1.amazonaws.com
# Test endpoints
BASE_URL=$(terraform output -raw api_endpoint)
curl -X POST "$BASE_URL/items" \
-H "Content-Type: application/json" \
-d '{"name": "Test Item", "description": "Created via Terraform"}'
# Fetch it back (replace {id} with the id returned by the POST)
curl "$BASE_URL/items/{id}"
Updating Lambda Code
When you change your Lambda source code, Terraform detects the change via the content hash:
# Edit lambda_src/handler.py
# Then:
terraform apply -var="environment=dev"
# Terraform will show:
# ~ aws_lambda_function.api
# source_code_hash: "oldHash" → "newHash"
# Plan: 0 to add, 1 to change, 0 to destroy.
Cost Breakdown (dev workload, ~10k req/day)
| Service | Free Tier | Estimated Monthly |
|---------|-----------|-------------------|
| Lambda | 1M req/mo free | ~$0.00 |
| API Gateway HTTP | 1M req/mo free | ~$0.00 |
| DynamoDB | 25GB + 200M req free | ~$0.00 |
| CloudWatch Logs | 5GB free | ~$0.50 |
| Total | | ~$1/month |
Serverless is genuinely free at development scale.
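Free-tier pricing can still surprise you if traffic spikes, so a cost guardrail is cheap insurance. A minimal sketch of an AWS Budgets alert — the budget name, limit, and email address are all illustrative:

```hcl
# Illustrative monthly cost budget: email a warning at 80% of a $5 limit.
resource "aws_budgets_budget" "monthly" {
  name         = "${var.project_name}-${var.environment}-monthly"
  budget_type  = "COST"
  limit_amount = "5"
  limit_unit   = "USD"
  time_unit    = "MONTHLY"

  notification {
    comparison_operator        = "GREATER_THAN"
    threshold                  = 80
    threshold_type             = "PERCENTAGE"
    notification_type          = "ACTUAL"
    subscriber_email_addresses = ["you@example.com"] # placeholder
  }
}
```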
Production Checklist
- [ ] Enable DynamoDB point-in-time recovery (`enabled = true`)
- [ ] Set `log_retention_days = 90` or higher
- [ ] Configure reserved concurrency to cap Lambda scaling
- [ ] Add X-Ray tracing for distributed tracing
- [ ] Restrict CORS to your actual domain
- [ ] Store secrets in AWS Secrets Manager, not Lambda env vars
- [ ] Enable DynamoDB deletion protection
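Two of the checklist items map directly onto arguments of the Lambda resource. A sketch against the function from lambda.tf — the concurrency value is illustrative, and Active tracing also requires the AWSXRayDaemonWriteAccess managed policy on the execution role:

```hcl
resource "aws_lambda_function" "api" {
  # ...existing config from lambda.tf...

  # Cap how far this function can scale out
  reserved_concurrent_executions = 50

  # Send invocation traces to AWS X-Ray
  tracing_config {
    mode = "Active"
  }
}
```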
# Production: prevent accidental DynamoDB deletion
resource "aws_dynamodb_table" "items" {
  # ...existing config...
  deletion_protection_enabled = var.environment == "prod"
}
Summary
You've provisioned a complete AWS serverless stack with Terraform:
| Resource | Purpose |
|----------|---------|
| aws_dynamodb_table | Serverless NoSQL database |
| aws_iam_role | Lambda execution permissions |
| aws_lambda_function | Business logic (no servers) |
| aws_apigatewayv2_api | HTTP trigger with CORS |
| aws_apigatewayv2_integration | Wires API GW to Lambda |
| aws_lambda_permission | Grants API GW invoke rights |
| aws_cloudwatch_log_group | Centralised logs |
Next up: Terraform Modules — how to structure reusable, composable infrastructure components.