Using CodePipeline to deploy Terraform and CloudFormation templates
Notice
AWS CodeCommit is no longer available to new customers. Existing customers of AWS CodeCommit can continue to use the service as normal.
In the DPA, you use building blocks for AWS CodePipeline to create accelerators for Terraform and CloudFormation IaC. This section describes the following for this use case:
- Standardized pipeline structure
- Reusable stages and jobs
- Integrated tools for security scans
The DPA repository contains the following folders for both Terraform and CloudFormation:
- pipeline-modules – This folder contains the code for deploying the standardized pipeline structure.
- shared – This folder contains ready-to-use buildspec files for the DPA stages and jobs.
Prerequisites
- An active AWS account
- Permissions to provision resources using IaC templates
- Permissions to create AWS CodeCommit repositories and CodePipeline components
Tools
- cfn-lint is a linter that checks CloudFormation YAML or JSON templates against the AWS CloudFormation resource specification. It also performs other checks, such as checking for valid values for resource properties and adherence to best practices.
- cfn_nag is an open source tool that identifies potential security issues in CloudFormation templates by searching for patterns.
- Checkov is a static code-analysis tool that checks IaC for security and compliance misconfigurations.
- TFLint is a linter that checks Terraform code for potential errors and adherence to best practices.
- tfsec is a static code-analysis tool that checks Terraform code for potential misconfigurations.
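As a rough sketch, the tools above can be run locally before you push changes. The file name `template.yaml` and the scanned directory are placeholders for your own IaC; each tool is skipped if it is not installed.

```shell
#!/bin/sh
# Run a tool only if it is installed; report findings without aborting the script.
run_if_present() {
  tool="$1"; shift
  if command -v "$tool" >/dev/null 2>&1; then
    "$tool" "$@" || echo "$tool reported findings"
  else
    echo "skipping $tool (not installed)"
  fi
}

run_if_present cfn-lint template.yaml                    # CloudFormation linting
run_if_present cfn_nag_scan --input-path template.yaml   # CloudFormation security patterns
run_if_present checkov -d .                              # IaC security and compliance scan
run_if_present tflint                                    # Terraform linting
run_if_present tfsec .                                   # Terraform static analysis
```

In the DPA pipelines, these same tools run automatically in the security-scan stages, so a local run is just a faster feedback loop.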
Instructions
Create CodeCommit repositories
- Create two separate CodeCommit repositories as follows:
  - common-repo – This repository contains the shared libraries, buildspec files, and dependencies.
  - app-repo – This repository contains the Terraform or CloudFormation templates to deploy your infrastructure.

  For instructions, see Create an AWS CodeCommit repository.
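If you prefer the AWS CLI to the console, the two repositories can be created as sketched below. The repository descriptions are illustrative; this assumes your account already has access to CodeCommit (the service is no longer available to new customers).

```shell
#!/bin/sh
# Repository names used throughout this guide.
COMMON_REPO="common-repo"
APP_REPO="app-repo"

# Create both repositories (requires the AWS CLI and CodeCommit permissions).
if command -v aws >/dev/null 2>&1; then
  aws codecommit create-repository --repository-name "$COMMON_REPO" \
      --repository-description "Shared libraries, buildspec files, and dependencies"
  aws codecommit create-repository --repository-name "$APP_REPO" \
      --repository-description "Terraform or CloudFormation IaC templates"
fi
```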
- In the common-repo repository, create a folder named shared. Copy the buildspec files from the Terraform or CloudFormation shared folder in the GitHub DPA repo to the new folder. For instructions, see Create or add a file to an AWS CodeCommit repository.
- In the app-repo repository, create a folder named entrypoint. Copy the file from the Terraform or CloudFormation entrypoint folder in the GitHub DPA repo to the new folder. For more information about these files, see Understanding the entry point JSON file.
- Review the Terraform or CloudFormation examples directory, and then structure your app-repo folder according to these examples. These directories contain examples for deploying an Amazon Elastic Compute Cloud (Amazon EC2) instance or an Amazon Simple Storage Service (Amazon S3) bucket.
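A minimal local layout for app-repo might look like the sketch below. Only the `entrypoint` folder name comes from this guide; the template folder names are assumptions, so mirror the DPA examples directory for the authoritative structure.

```shell
#!/bin/sh
# Hypothetical app-repo layout; "ec2" and "s3" are illustrative names
# matching the EC2 and S3 examples in the DPA examples directory.
mkdir -p app-repo/entrypoint   # holds the entry point JSON file
mkdir -p app-repo/ec2          # e.g., templates that deploy an EC2 instance
mkdir -p app-repo/s3           # e.g., templates that deploy an S3 bucket
ls -R app-repo
```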
Continue to one of the following two sections, depending on whether you use Terraform or CloudFormation.
Create the pipeline and define stages (Terraform)
- Clone the DevOps Pipeline Accelerator (DPA) repository from GitHub to your local workstation.
- In the cloned repository, navigate to the aws-codepipeline/terraform/pipeline-modules folder.
- In the terraform.tfvars file, update and validate the Terraform state and AWS Identity and Access Management (IAM) role-specific variables.
- Create a Docker image. For instructions, see Docker image creation for using in CodeBuild (GitHub).
- Update the builder_image variable that is defined in the terraform.tfvars file.
- Enter the following commands to initialize, preview, and deploy the infrastructure through Terraform:

  terraform init
  terraform plan
  terraform apply
- Sign in to the AWS account. In the CodePipeline console, confirm that the new pipeline has been created.

  Note: If the first run is in a failed state, repeat the previous step.
- When the new CodePipeline pipeline is created, a new IAM role for AWS CodeBuild is created automatically. The name of this automatically created role ends in -codebuild-role. Update this role with the permissions that are required to deploy your infrastructure.
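Updating the CodeBuild role can be done by attaching an inline policy, as sketched below. The role name and policy contents here are assumptions: the actual role name ends in -codebuild-role, and you should scope the actions to only the services your templates deploy (least privilege), rather than the broad example actions shown.

```shell
#!/bin/sh
# Hypothetical role name; look up the real one (ending in -codebuild-role)
# in the IAM console after the pipeline is created.
ROLE_NAME="my-pipeline-codebuild-role"

# Example policy allowing EC2 and S3 deployments; narrow this for production.
cat > codebuild-deploy-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ec2:RunInstances", "ec2:Describe*", "s3:*"],
      "Resource": "*"
    }
  ]
}
EOF

# Attach the policy to the auto-created CodeBuild role.
if command -v aws >/dev/null 2>&1; then
  aws iam put-role-policy \
      --role-name "$ROLE_NAME" \
      --policy-name deploy-permissions \
      --policy-document file://codebuild-deploy-policy.json
fi
```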
Create the pipeline and define stages (CloudFormation)
- Clone the DevOps Pipeline Accelerator (DPA) repository from GitHub to your local workstation.
- In the cloned repository, navigate to the aws-codepipeline/cloudformation/pipeline-modules folder.
- Deploy the pipeline-cft.yaml CloudFormation template. The following are the required parameters that you must pass to the stack:
  - ArtifactsBucket – Name of the S3 bucket that stores the pipeline artifacts
  - EcrDockerRepository – Uniform resource identifier (URI) of the Amazon ECR repository with the image tag
  - CodeCommitAppRepo – Name of the CodeCommit repository that contains the templates
  - CodeCommitBaseRepo – Name of the CodeCommit repository that contains the shared files
  - CodeCommitRepoBranch – Name of the CodeCommit repository branch
  - SNSMailAddress – Email address that will receive Amazon Simple Notification Service (Amazon SNS) notifications about pipeline status

  For instructions, see Working with stacks in the CloudFormation documentation.
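The deployment can also be done with the AWS CLI, as sketched below. Every parameter value is a placeholder for your own resources (the 123456789012 account ID, bucket, repository, branch, and email address are all illustrative); the stack name is likewise an assumption.

```shell
#!/bin/sh
# Hypothetical stack name for the DPA CloudFormation pipeline.
STACK_NAME="dpa-cfn-pipeline"

# Deploy pipeline-cft.yaml with the required parameters.
if command -v aws >/dev/null 2>&1; then
  aws cloudformation deploy \
      --template-file pipeline-cft.yaml \
      --stack-name "$STACK_NAME" \
      --capabilities CAPABILITY_NAMED_IAM \
      --parameter-overrides \
          ArtifactsBucket=my-artifacts-bucket \
          EcrDockerRepository=123456789012.dkr.ecr.us-east-1.amazonaws.com/dpa-builder:image \
          CodeCommitAppRepo=app-repo \
          CodeCommitBaseRepo=common-repo \
          CodeCommitRepoBranch=main \
          SNSMailAddress=devops-team@example.com
fi
```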
- Sign in to the AWS account. In the CodePipeline console, confirm that the new pipeline has been created.
- When the new CodePipeline pipeline is created, a new IAM role for AWS CodeBuild is created automatically. The name of this automatically created role ends in -codebuild-role. Update this role with the permissions that are required to deploy your infrastructure.
Understanding the entry point JSON file
Terraform entry point file
This is the main configuration file. In this file, you can customize and enable or disable a stage. If you disable a stage, it does not delete or remove the stage from the pipeline. Instead, the stage is skipped during runtime.
  {
    "build_stage_required" : "true",
    "test_stage_required" : "true",
    "predeploy_stage_required": "true",
    "deploy_stage_required": "true",
    "postdeploy_stage_required": "true",
    "destroy_stage_required": "true",
    "bucket":"tf-state-dpa",           # S3 bucket used for Terraform backend
    "key":"terraform_test.tfstate",    # S3 key to be used
    "region":"us-east-1",
    "dynamodb_table":"tf-state-dpa"    # DynamoDB table for Terraform backend
  }
CloudFormation entry point file
This is the main configuration file. In this file, you customize stages and enable or disable them. If you disable a stage, it does not delete or remove the stage from the pipeline. Instead, the pipeline skips the stage during runtime.
  {
    "init_stage_required" : "true",
    "test_stage_required" : "true",
    "createinfra_stage_required": "true",
    "envType" : "cloudformation",
    "stage_required" : "true",
    "cft_s3_bucket" : "pipeline-bucket",   # S3 bucket from the destination account to keep CFT templates
    "stack_name" : "aws-cft-poc",          # CloudFormation stack name
    "account" : "************",            # Destination AWS account to deploy the stack
    "roleName" : "codestack-poc-cross-account-role",   # Cross-account IAM role name
    "region" : "us-east-1",
    "destroy_stack" : "false"              # To destroy the provisioned stack, set this value to "true"
  }