Use an LLM to Generate Terraform from Natural Language in 2026
Build a tool that converts plain English infrastructure descriptions into valid Terraform code using Claude AI — with validation, state awareness, and GitOps integration.
"Create an EKS cluster in us-east-1 with 3 t3.medium nodes, in a private VPC with a NAT gateway."
That sentence should produce valid, production-ready Terraform. Let's build the tool that does it.
What We're Building
A Python CLI that:
- Takes plain English infrastructure descriptions
- Sends them to Claude with context about your existing Terraform codebase
- Returns valid, opinionated Terraform code
- Validates the output with terraform validate
- Optionally creates a PR to your GitOps repo
Why This Is Useful
Writing Terraform for common patterns (VPC, EKS, RDS, IAM roles) is repetitive. Every team writes the same code with slightly different variable names. An LLM that knows your conventions and module patterns can accelerate this dramatically.
What separates this from "just asking ChatGPT for Terraform" is that this tool:
- Knows your existing module structure
- Follows your naming conventions
- References your existing variables and outputs
- Validates the generated code before you see it
Step 1: Set Up the Project
pip install anthropic boto3 python-dotenv gitpython

tf-generator/
├── main.py
├── context_builder.py
├── validator.py
├── .env
└── templates/
└── system_prompt.txt
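The .env file needs only one entry. ANTHROPIC_API_KEY is the variable the anthropic SDK reads automatically once python-dotenv has loaded the file (the key value shown is a placeholder):

```shell
# .env — loaded by python-dotenv at startup; never commit this file
ANTHROPIC_API_KEY=sk-ant-...
```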
Step 2: Build the Context Builder
The LLM needs context about your existing Terraform to generate code that fits:
# context_builder.py
import os
import glob
def build_context(tf_dir: str = ".") -> str:
    """Read existing Terraform files to give Claude context"""
    context_parts = []

    # Read all .tf files, skipping the .terraform cache directory
    for tf_file in glob.glob(f"{tf_dir}/**/*.tf", recursive=True):
        if "/.terraform/" in tf_file:
            continue
        with open(tf_file) as f:
            content = f.read()
        relative_path = os.path.relpath(tf_file, tf_dir)
        context_parts.append(f"### {relative_path}\n```hcl\n{content}\n```")

    # Read terraform.tfvars if it exists
    if os.path.exists(f"{tf_dir}/terraform.tfvars"):
        with open(f"{tf_dir}/terraform.tfvars") as f:
            context_parts.append(f"### terraform.tfvars\n```hcl\n{f.read()}\n```")

    return "\n\n".join(context_parts[:20])  # limit context size
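A throwaway check (not part of the tool) confirms provider-cache files never reach the context. Note that Python's glob wildcards skip dot-directories like .terraform/ by default, so the substring filter above acts as a second line of defense:

```python
import glob
import os
import tempfile
from pathlib import Path

# Build a fake project: one real .tf file plus a cached one under .terraform/
with tempfile.TemporaryDirectory() as tmp:
    Path(tmp, ".terraform", "modules").mkdir(parents=True)
    Path(tmp, "main.tf").write_text('resource "aws_s3_bucket" "logs" {}\n')
    Path(tmp, ".terraform", "modules", "cached.tf").write_text("# cache\n")

    # Same glob + filter as build_context
    kept = [
        os.path.relpath(p, tmp)
        for p in glob.glob(f"{tmp}/**/*.tf", recursive=True)
        if "/.terraform/" not in p
    ]
    print(kept)  # ['main.tf']
```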
def get_module_registry() -> str:
    """Common modules to prefer"""
    return """
Available Terraform Registry modules to use when appropriate:
- terraform-aws-modules/eks/aws (for EKS clusters)
- terraform-aws-modules/vpc/aws (for VPC)
- terraform-aws-modules/rds/aws (for RDS)
- terraform-aws-modules/s3-bucket/aws (for S3)
- terraform-aws-modules/iam/aws (for IAM roles/policies)
"""

Step 3: The System Prompt
# templates/system_prompt.txt
You are an expert Terraform engineer. Generate production-ready Terraform code based on user descriptions.
Rules:
1. Always use the latest stable provider versions
2. Always include required_providers block
3. Use variables for all configurable values
4. Include outputs for important resource IDs
5. Add descriptions to all variables and outputs
6. Use data sources to reference existing resources where appropriate
7. Follow these naming conventions: snake_case, descriptive names
8. Always include tags: {Environment, ManagedBy: "terraform", Team}
9. Never hardcode credentials or secrets
10. Prefer community modules from registry when available
Output format:
- Separate files by resource type (main.tf, variables.tf, outputs.tf)
- Mark each file with: # === FILE: filename.tf ===
- Include a brief comment explaining each major resource

Step 4: The Generator
# main.py
import anthropic
import os
import sys
import subprocess
import tempfile
from context_builder import build_context, get_module_registry
from dotenv import load_dotenv
load_dotenv()
def generate_terraform(description: str, tf_dir: str = ".") -> dict[str, str]:
    """Generate Terraform files from a natural language description"""
    client = anthropic.Anthropic()

    # Build context from existing code
    existing_context = build_context(tf_dir)
    module_registry = get_module_registry()

    with open("templates/system_prompt.txt") as f:
        system_prompt = f.read()

    user_message = f"""Generate Terraform code for the following infrastructure:

{description}

Existing project context (follow these conventions):
{existing_context if existing_context else "No existing Terraform files found."}

{module_registry}

Generate complete, valid Terraform code. Separate into main.tf, variables.tf, and outputs.tf."""

    print("🤖 Generating Terraform with Claude...")
    message = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=4096,
        system=system_prompt,
        messages=[{"role": "user", "content": user_message}]
    )
    return parse_files(message.content[0].text)

def strip_fences(content: str) -> str:
    """Remove surrounding markdown code fences if present"""
    content = content.strip()
    if content.startswith("```"):
        content = "\n".join(content.split("\n")[1:])
    if content.endswith("```"):
        content = "\n".join(content.split("\n")[:-1])
    return content.strip()

def parse_files(response_text: str) -> dict[str, str]:
    """Split the response on '# === FILE: name ===' markers into {filename: content}"""
    files = {}
    current_file = None
    current_content = []
    for line in response_text.split("\n"):
        if line.startswith("# === FILE:"):
            if current_file and current_content:
                files[current_file] = strip_fences("\n".join(current_content))
            current_file = line.replace("# === FILE:", "").replace("===", "").strip()
            current_content = []
        else:
            current_content.append(line)
    if current_file and current_content:
        files[current_file] = strip_fences("\n".join(current_content))
    return files
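The marker convention is easy to sanity-check offline. Here is the same splitting loop run on a canned two-file response, no API call needed:

```python
# Offline check of the "# === FILE: name ===" convention used above.
sample = "\n".join([
    "# === FILE: main.tf ===",
    'resource "aws_s3_bucket" "logs" {}',
    "# === FILE: variables.tf ===",
    'variable "bucket_name" {}',
])

files = {}
current_file, current_content = None, []
for line in sample.split("\n"):
    if line.startswith("# === FILE:"):
        if current_file and current_content:
            files[current_file] = "\n".join(current_content).strip()
        current_file = line.replace("# === FILE:", "").replace("===", "").strip()
        current_content = []
    else:
        current_content.append(line)
if current_file and current_content:
    files[current_file] = "\n".join(current_content).strip()

print(sorted(files))  # ['main.tf', 'variables.tf']
```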
def validate_terraform(files: dict[str, str]) -> tuple[bool, str]:
    """Validate generated Terraform using terraform validate"""
    with tempfile.TemporaryDirectory() as tmpdir:
        # Write generated files into a throwaway directory
        for filename, content in files.items():
            filepath = os.path.join(tmpdir, filename)
            with open(filepath, "w") as f:
                f.write(content)

        # terraform init without a backend — no state is needed for validation
        init_result = subprocess.run(
            ["terraform", "init", "-backend=false"],
            cwd=tmpdir,
            capture_output=True,
            text=True
        )
        if init_result.returncode != 0:
            return False, f"terraform init failed:\n{init_result.stderr}"

        # terraform validate
        validate_result = subprocess.run(
            ["terraform", "validate"],
            cwd=tmpdir,
            capture_output=True,
            text=True
        )
        if validate_result.returncode != 0:
            return False, f"terraform validate failed:\n{validate_result.stderr}"

        return True, "Validation passed ✅"
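validate_terraform assumes the terraform CLI is on PATH; if it isn't, subprocess.run raises FileNotFoundError halfway through a run. A small startup guard (a hypothetical helper, not part of the code above) fails faster with a clearer message:

```python
import shutil
import sys

def require_binary(name: str) -> str:
    """Exit with a clear message if `name` is missing from PATH; return its path."""
    path = shutil.which(name)
    if path is None:
        sys.exit(f"{name} not found on PATH — install it before running validation")
    return path

# In main(), call require_binary("terraform") once before generating.
# Demonstration with a binary name that certainly doesn't exist:
try:
    require_binary("definitely-not-installed-xyz")
except SystemExit as exc:
    print("guard fired:", exc)
```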
def main():
    if len(sys.argv) < 2:
        print("Usage: python main.py 'describe your infrastructure'")
        print("Example: python main.py 'Create an S3 bucket for static website hosting with CloudFront'")
        sys.exit(1)

    description = " ".join(sys.argv[1:])
    tf_dir = os.environ.get("TF_DIR", ".")
    print(f"📝 Request: {description}\n")

    # Generate
    files = generate_terraform(description, tf_dir)
    if not files:
        print("❌ Could not parse any files from the model response.")
        sys.exit(1)
    print(f"📦 Generated {len(files)} file(s): {', '.join(files.keys())}\n")

    # Validate
    print("🔍 Validating with terraform validate...")
    valid, message = validate_terraform(files)
    print(message)
    if not valid:
        print("\n⚠️ Validation failed. Saving files anyway for review.")

    # Save files
    output_dir = "generated"
    os.makedirs(output_dir, exist_ok=True)
    for filename, content in files.items():
        filepath = os.path.join(output_dir, filename)
        with open(filepath, "w") as f:
            f.write(content)
        print(f"✅ Saved: {filepath}")

    if valid:
        print("\n🚀 Ready to use! Run:")
        print(f"   cd {output_dir} && terraform init && terraform plan")

if __name__ == "__main__":
    main()

Using It
export ANTHROPIC_API_KEY="your-key"
# Simple S3 bucket
python main.py "Create an S3 bucket for application logs with lifecycle policy to delete objects after 90 days"
# Complex VPC + EKS
python main.py "Set up a production VPC with public and private subnets across 3 AZs, NAT gateway, and an EKS cluster with a managed node group of t3.medium instances"
# RDS with secrets
python main.py "PostgreSQL RDS instance in the existing VPC, Multi-AZ, t3.medium, 100GB storage, password stored in AWS Secrets Manager"

Sample output:
📝 Request: Create an S3 bucket for application logs...
🤖 Generating Terraform with Claude...
📦 Generated 3 file(s): main.tf, variables.tf, outputs.tf
🔍 Validating with terraform validate...
Validation passed ✅
✅ Saved: generated/main.tf
✅ Saved: generated/variables.tf
✅ Saved: generated/outputs.tf
🚀 Ready to use! Run:
cd generated && terraform init && terraform plan
Add Self-Healing: Auto-Fix Validation Errors
If validation fails, send the error back to Claude:
def generate_with_retry(description: str, max_retries: int = 2) -> dict[str, str]:
    files = generate_terraform(description)

    for attempt in range(max_retries):
        valid, error = validate_terraform(files)
        if valid:
            return files
        print(f"⚠️ Attempt {attempt + 1} failed. Asking Claude to fix...")

        # Send the validation error plus the current files back to Claude
        client = anthropic.Anthropic()
        fix_message = client.messages.create(
            model="claude-sonnet-4-6",
            max_tokens=4096,
            messages=[{
                "role": "user",
                "content": f"Fix this Terraform validation error:\n\nError:\n{error}\n\nFiles:\n{files}\n\nReturn corrected files in the same format."
            }]
        )
        # Re-parse the corrected files from the fix response
        files = parse_files(fix_message.content[0].text)

    return files  # Return best effort even if still failing

GitOps Integration
Automatically create a PR to your infra repo:
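One caveat first: the function below derives the branch name by naively truncating and lowercasing the description, so punctuation like commas or quotes would produce an invalid git ref. A sanitizing slug helper (a suggested substitution, not part of the original code) is safer:

```python
import re

def slugify(description: str, max_len: int = 30) -> str:
    """Reduce a free-text description to a safe git branch-name segment."""
    # Collapse every run of characters outside [a-z0-9] into a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return slug[:max_len].rstrip("-")

print(slugify("PostgreSQL RDS, Multi-AZ, 100GB!"))  # postgresql-rds-multi-az-100gb
```

Swap slugify(description) in for the inline expression when building branch_name.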
import git

def create_pr(files: dict[str, str], description: str, repo_path: str):
    repo = git.Repo(repo_path)
    branch_name = f"tf-gen/{description[:30].lower().replace(' ', '-')}"
    repo.git.checkout('-b', branch_name)

    output_dir = os.path.join(repo_path, "generated")
    os.makedirs(output_dir, exist_ok=True)
    for filename, content in files.items():
        filepath = os.path.join(output_dir, filename)
        with open(filepath, "w") as f:
            f.write(content)
        repo.index.add([filepath])

    repo.index.commit(f"feat: generated terraform for '{description}'")
    repo.remote('origin').push(branch_name)
    print(f"✅ Created branch: {branch_name}")
    print("Open a PR from this branch to review the generated Terraform")

Resources
- Anthropic Claude API Docs — model capabilities and pricing
- Terraform Registry — community modules to reference
- DevOpsBoys Terraform Remote State Guide — an essential complement
- Build AI DevOps Assistant — more AI + DevOps patterns
- LangChain Docs — if you want to build more complex agent workflows
The best Terraform is the Terraform you don't have to write from scratch. Let the LLM handle the boilerplate while you focus on architecture decisions.