Claude Agent Skill · by HashiCorp

Terraform Style Guide

Install the Terraform Style Guide skill for Claude Code from hashicorp/agent-skills.

Install
$ npx skills add https://github.com/hashicorp/agent-skills --skill terraform-style-guide
Source file
SKILL.md · 353 lines
---
name: terraform-style-guide
description: Generate Terraform HCL code following HashiCorp's official style conventions and best practices. Use when writing, reviewing, or generating Terraform configurations.
---

# Terraform Style Guide

Generate and maintain Terraform code following HashiCorp's official style conventions and best practices.

**Reference:** [HashiCorp Terraform Style Guide](https://developer.hashicorp.com/terraform/language/style)

## Code Generation Strategy

When generating Terraform code:

1. Start with provider configuration and version constraints
2. Create data sources before dependent resources
3. Build resources in dependency order
4. Add outputs for key resource attributes
5. Use variables for all configurable values

## File Organization

| File | Purpose |
|------|---------|
| `terraform.tf` | Terraform and provider version requirements |
| `providers.tf` | Provider configurations |
| `main.tf` | Primary resources and data sources |
| `variables.tf` | Input variable declarations (alphabetical) |
| `outputs.tf` | Output value declarations (alphabetical) |
| `locals.tf` | Local value declarations |

### Example Structure

```hcl
# terraform.tf
terraform {
  required_version = ">= 1.7"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

# variables.tf
variable "environment" {
  description = "Target deployment environment"
  type        = string

  validation {
    condition     = contains(["dev", "staging", "prod"], var.environment)
    error_message = "Environment must be dev, staging, or prod."
  }
}

# locals.tf
locals {
  common_tags = {
    Environment = var.environment
    ManagedBy   = "Terraform"
  }
}

# main.tf
resource "aws_vpc" "main" {
  cidr_block           = var.vpc_cidr
  enable_dns_hostnames = true

  tags = merge(local.common_tags, {
    Name = "${var.project_name}-${var.environment}-vpc"
  })
}

# outputs.tf
output "vpc_id" {
  description = "ID of the created VPC"
  value       = aws_vpc.main.id
}
```

## Code Formatting

### Indentation and Alignment

- Use **two spaces** per nesting level (no tabs)
- Align equals signs for consecutive arguments

```hcl
resource "aws_instance" "web" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"
  subnet_id     = "subnet-12345678"

  tags = {
    Name        = "web-server"
    Environment = "production"
  }
}
```

### Block Organization

Arguments precede blocks, with meta-arguments first:

```hcl
resource "aws_instance" "example" {
  # Meta-arguments
  count = 3

  # Arguments
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"

  # Blocks
  root_block_device {
    volume_size = 20
  }

  # Lifecycle last
  lifecycle {
    create_before_destroy = true
  }
}
```

## Naming Conventions

- Use **lowercase with underscores** for all names
- Use **descriptive nouns** excluding the resource type
- Be specific and meaningful
- Resource names must be singular, not plural
- Default to `main` for resources where a specific descriptive name is redundant or unavailable, provided only one instance exists

```hcl
# Bad
resource "aws_instance" "webAPI-aws-instance" {}
resource "aws_instance" "web_apis" {}
variable "name" {}

# Good
resource "aws_instance" "web_api" {}
resource "aws_vpc" "main" {}
variable "application_name" {}
```

## Variables

Every variable must include `type` and `description`:

```hcl
variable "instance_type" {
  description = "EC2 instance type for the web server"
  type        = string
  default     = "t2.micro"

  validation {
    condition     = contains(["t2.micro", "t2.small", "t2.medium"], var.instance_type)
    error_message = "Instance type must be t2.micro, t2.small, or t2.medium."
  }
}

variable "database_password" {
  description = "Password for the database admin user"
  type        = string
  sensitive   = true
}
```

## Outputs

Every output must include `description`:

```hcl
output "instance_id" {
  description = "ID of the EC2 instance"
  value       = aws_instance.web.id
}

output "database_password" {
  description = "Database administrator password"
  value       = aws_db_instance.main.password
  sensitive   = true
}
```

## Dynamic Resource Creation

### Prefer for_each over count

```hcl
# Bad - count for multiple resources
resource "aws_instance" "web" {
  count = var.instance_count
  tags  = { Name = "web-${count.index}" }
}

# Good - for_each with named instances
variable "instance_names" {
  type    = set(string)
  default = ["web-1", "web-2", "web-3"]
}

resource "aws_instance" "web" {
  for_each = var.instance_names
  tags     = { Name = each.key }
}
```

### count for Conditional Creation

```hcl
resource "aws_cloudwatch_metric_alarm" "cpu" {
  count = var.enable_monitoring ? 1 : 0

  alarm_name = "high-cpu-usage"
  threshold  = 80
}
```

## Security Best Practices

When generating code, apply security hardening:

- Enable encryption at rest by default
- Configure private networking where applicable
- Apply principle of least privilege for security groups
- Enable logging and monitoring
- Never hardcode credentials or secrets
- Mark sensitive outputs with `sensitive = true`

### Example: Secure S3 Bucket

```hcl
resource "aws_s3_bucket" "data" {
  bucket = "${var.project}-${var.environment}-data"
  tags   = local.common_tags
}

resource "aws_s3_bucket_versioning" "data" {
  bucket = aws_s3_bucket.data.id

  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_s3_bucket_server_side_encryption_configuration" "data" {
  bucket = aws_s3_bucket.data.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm     = "aws:kms"
      kms_master_key_id = aws_kms_key.s3.arn
    }
  }
}

resource "aws_s3_bucket_public_access_block" "data" {
  bucket = aws_s3_bucket.data.id

  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}
```

## Version Pinning

```hcl
terraform {
  required_version = ">= 1.7"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"  # Allow minor updates
    }
  }
}
```

**Version constraint operators:**

- `= 1.0.0` - Exact version
- `>= 1.0.0` - Greater than or equal
- `~> 1.0` - Allow rightmost component to increment
- `>= 1.0, < 2.0` - Version range

## Provider Configuration

```hcl
provider "aws" {
  region = "us-west-2"

  default_tags {
    tags = {
      ManagedBy = "Terraform"
      Project   = var.project_name
    }
  }
}

# Aliased provider for multi-region
provider "aws" {
  alias  = "east"
  region = "us-east-1"
}
```

## Version Control

**Never commit:**

- `terraform.tfstate`, `terraform.tfstate.backup`
- `.terraform/` directory
- `*.tfplan`
- `.tfvars` files with sensitive data

**Always commit:**

- All `.tf` configuration files
- `.terraform.lock.hcl` (dependency lock file)

## Validation Tools

Run before committing:

```bash
terraform fmt -recursive
terraform validate
```

Additional tools:

- `tflint` - Linting and best practices
- `checkov` / `tfsec` - Security scanning

## Code Review Checklist

- [ ] Code formatted with `terraform fmt`
- [ ] Configuration validated with `terraform validate`
- [ ] Files organized according to standard structure
- [ ] All variables have type and description
- [ ] All outputs have descriptions
- [ ] Resource names use descriptive nouns with underscores
- [ ] Version constraints pinned explicitly
- [ ] Sensitive values marked with `sensitive = true`
- [ ] No hardcoded credentials or secrets
- [ ] Security best practices applied

---

*Based on: [HashiCorp Terraform Style Guide](https://developer.hashicorp.com/terraform/language/style)*
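## Appendix: Example .gitignore

The version-control rules above translate directly into a repository `.gitignore`. A minimal sketch — note that the blanket `*.tfvars` patterns are an assumption stricter than the guide, which only warns against committing `.tfvars` files that hold sensitive data:

```gitignore
# Local state and backups (may contain secrets in plain text)
terraform.tfstate
terraform.tfstate.backup

# Provider plugin and module cache
.terraform/

# Saved execution plans
*.tfplan

# Variable files that may carry sensitive values (assumption: ignore all)
*.tfvars
*.tfvars.json

# The dependency lock file .terraform.lock.hcl matches none of the
# patterns above, so it stays tracked and committed, as required.
```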