CodePipeline
User Guide (API Version 2015-07-09)

Integrations with CodePipeline Action Types

The integrations information in this topic is organized by CodePipeline action type.

Source Action Integrations

The following information can help you configure CodePipeline to integrate with the source action providers you use.

Amazon Simple Storage Service (Amazon S3)

Amazon S3 is storage for the internet. You can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web. You can configure CodePipeline to use a versioned Amazon S3 bucket as the source stage for your code. You must first create the bucket and then enable versioning on it before you can create a pipeline that uses the bucket as part of a source action within a stage.
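
As an illustration only, the following sketch uses the AWS SDK for Python (Boto3) to create a bucket, enable versioning on it, and define the source action configuration that would reference it in a pipeline structure. The bucket name and object key are placeholders.

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical bucket name; S3 bucket names must be globally unique.
    # In Regions other than us-east-1, also pass CreateBucketConfiguration.
    bucket = "my-pipeline-source-bucket"
    s3.create_bucket(Bucket=bucket)

    # Versioning must be enabled before the bucket can be used as a pipeline source.
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

    # Amazon S3 source action as it appears in a pipeline definition.
    s3_source_action = {
        "name": "Source",
        "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "provider": "S3",
            "version": "1",
        },
        "configuration": {
            "S3Bucket": bucket,
            "S3ObjectKey": "SampleApp.zip",  # placeholder object key
        },
        "outputArtifacts": [{"name": "SourceOutput"}],
    }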

Note

Amazon S3 can also be included in a pipeline as a deploy action.

Learn more:

AWS CodeCommit

CodeCommit is a version control service hosted by AWS that you can use to privately store and manage assets (such as documents, source code, and binary files) in the cloud. You can configure CodePipeline to use a branch in a CodeCommit repository as the source stage for your code. You must first create the repository and associate it with a working directory on your local machine before you can create a pipeline that uses the branch as part of a source action within a stage. You can connect to the CodeCommit repository by either creating a new pipeline or editing an existing one.
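
For example, a repository can be created with the AWS SDK for Python (Boto3) and then referenced from a source action in the pipeline structure, as in the following sketch. The repository and branch names are placeholders.

    import boto3

    codecommit = boto3.client("codecommit")

    # Hypothetical repository name.
    repo = codecommit.create_repository(repositoryName="MyDemoRepo")
    print(repo["repositoryMetadata"]["cloneUrlHttp"])  # clone this URL to a local working directory

    # CodeCommit source action as it appears in a pipeline definition.
    codecommit_source_action = {
        "name": "Source",
        "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "provider": "CodeCommit",
            "version": "1",
        },
        "configuration": {
            "RepositoryName": "MyDemoRepo",
            "BranchName": "master",  # the branch the pipeline watches
        },
        "outputArtifacts": [{"name": "SourceOutput"}],
    }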

Learn more:

GitHub

You can configure CodePipeline to use a GitHub repository as the source stage for your code. You must have previously created a GitHub account and at least one GitHub repository. You can connect to the GitHub repository by either creating a new pipeline or editing an existing one.

Note

CodePipeline integration with GitHub Enterprise is not supported.

The first time you add a GitHub repository to a pipeline, you are asked to authorize CodePipeline access to your repositories. To integrate with GitHub, CodePipeline creates an OAuth application for your pipeline and, if the pipeline is created or updated in the console, also creates a GitHub webhook that starts the pipeline when a change occurs in the repository. The OAuth token and webhook require the following GitHub scopes:

  • The repo scope, which is used for full control to read and pull artifacts from public and private repositories into a pipeline.

  • The admin:repo_hook scope, which is used for full control of repository hooks.

For more information about GitHub scopes, see the GitHub Developer API Reference.

Access for CodePipeline is configured for all repositories to which that GitHub account has access; it cannot currently be configured for individual repositories. You can revoke this access from GitHub by choosing Settings, choosing Applications, and then, under Authorized applications, finding CodePipeline in the list of authorized applications and choosing Revoke. Revoking access will immediately prevent CodePipeline from accessing any GitHub repositories previously configured for access with that GitHub account.

If you want to limit the access CodePipeline has to a specific set of repositories, create a dedicated GitHub account, grant that account access to only the repositories you want to integrate with CodePipeline, and then use that account when configuring CodePipeline to use GitHub repositories for source stages in pipelines.
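
For reference, a GitHub source action created outside the console supplies the OAuth token in the action configuration, as in the following sketch of the pipeline structure. The owner, repository, branch, and token values are placeholders; the token must carry the scopes listed above.

    # GitHub (version 1) source action as it appears in a pipeline definition.
    github_source_action = {
        "name": "Source",
        "actionTypeId": {
            "category": "Source",
            "owner": "ThirdParty",
            "provider": "GitHub",
            "version": "1",
        },
        "configuration": {
            "Owner": "my-github-account",   # placeholder GitHub account or organization
            "Repo": "my-repository",        # placeholder repository name
            "Branch": "master",
            "OAuthToken": "****",           # never hard-code a real token; CodePipeline masks it
        },
        "outputArtifacts": [{"name": "SourceOutput"}],
    }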

Learn more:

Amazon ECR

Amazon ECR is an AWS Docker image repository service. You use Docker push and pull commands to upload Docker images to your repository. An Amazon ECR repository URI and image are used in Amazon ECS task definitions to reference source image information.
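
As a sketch, an Amazon ECR source action in a pipeline definition identifies the repository and image tag to watch. The repository name below is a placeholder.

    # Amazon ECR source action as it appears in a pipeline definition.
    ecr_source_action = {
        "name": "Source",
        "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "provider": "ECR",
            "version": "1",
        },
        "configuration": {
            "RepositoryName": "my-image-repo",  # placeholder ECR repository
            "ImageTag": "latest",
        },
        "outputArtifacts": [{"name": "SourceOutput"}],
    }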

Learn more:

Build Action Integrations

AWS CodeBuild

CodeBuild is a fully managed build service in the cloud. CodeBuild compiles your source code, runs unit tests, and produces artifacts that are ready to deploy.

You can add CodeBuild as a build action to the build stage of a pipeline. You can use an existing build project or create one in the CodePipeline console. The output of the build project can then be deployed as part of a pipeline.

Note

CodeBuild can also be included in a pipeline as a test action, with or without a build output.

You can configure your pipeline to manage a CodeBuild project with multiple input and output artifacts. For more information, see CodePipeline Integration with CodeBuild and Multiple Input Sources and Output Artifacts Sample.

Note

For multiple output artifacts, you can configure your pipeline to map output artifacts to the secondary artifacts in the build spec. For multiple input artifacts, use the optional PrimarySource parameter in your pipeline structure to designate the directory where CodeBuild looks for and runs your build spec file.
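
A sketch of a build action with two input artifacts follows; the project and artifact names are placeholders, and PrimarySource names the input artifact whose directory contains the build spec that CodeBuild runs.

    # CodeBuild build action with multiple input artifacts.
    build_action = {
        "name": "Build",
        "actionTypeId": {
            "category": "Build",
            "owner": "AWS",
            "provider": "CodeBuild",
            "version": "1",
        },
        "configuration": {
            "ProjectName": "my-build-project",  # placeholder CodeBuild project
            "PrimarySource": "SourceOutput",    # artifact whose directory holds the build spec
        },
        "inputArtifacts": [
            {"name": "SourceOutput"},
            {"name": "TemplateOutput"},         # secondary input artifact
        ],
        "outputArtifacts": [{"name": "BuildOutput"}],
    }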

Learn more:

CloudBees

You can configure CodePipeline to use CloudBees to build or test your code in one or more actions in a pipeline.

Learn more:

Jenkins

You can configure CodePipeline to use Jenkins CI to build or test your code in one or more actions in a pipeline. You must have previously created a Jenkins project and installed and configured the CodePipeline Plugin for Jenkins for that project. You can connect to the Jenkins project by either creating a new pipeline or editing an existing one.

Access for Jenkins is configured on a per-project basis. You must install the CodePipeline Plugin for Jenkins on every Jenkins instance you want to use with CodePipeline, and you must configure CodePipeline access to the Jenkins project. You should secure your Jenkins project by configuring it to accept HTTPS/SSL connections only. If your Jenkins project is installed on an Amazon EC2 instance, consider providing your AWS credentials by installing the AWS CLI on each instance and configuring an AWS profile on those instances with the IAM user and AWS credentials you want to use for connections between CodePipeline and Jenkins, rather than adding or storing them through the Jenkins web interface.

Learn more:

TeamCity

You can configure CodePipeline to use TeamCity to build and test your code in one or more actions in a pipeline.

Learn more:

Test Action Integrations

AWS CodeBuild

CodeBuild is a fully managed build service in the cloud. CodeBuild compiles your source code, runs unit tests, and produces artifacts that are ready to deploy.

You can add CodeBuild to a pipeline as a test action to run unit tests against your code, with or without a build output artifact. If you generate an output artifact for the test action, it can be deployed as part of a pipeline. You can use an existing build project or create one in the CodePipeline console.
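
A minimal sketch of such a test action follows; the project name is a placeholder, and the output artifact list is left empty because a test action does not have to produce one.

    # CodeBuild test action with no output artifact.
    test_action = {
        "name": "UnitTest",
        "actionTypeId": {
            "category": "Test",
            "owner": "AWS",
            "provider": "CodeBuild",
            "version": "1",
        },
        "configuration": {"ProjectName": "my-test-project"},  # placeholder project
        "inputArtifacts": [{"name": "BuildOutput"}],
        "outputArtifacts": [],  # optional for test actions
    }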

Note

CodeBuild can also be included in a pipeline as a build action, with a mandatory build output artifact.

Learn more:

AWS Device Farm

AWS Device Farm is an app testing service that you can use to test and interact with your Android, iOS, and web applications on real, physical phones and tablets that are hosted by Amazon Web Services (AWS). You can configure CodePipeline to use AWS Device Farm to test your code in one or more actions in a pipeline. AWS Device Farm allows you to upload your own tests or use built-in, script-free compatibility tests. Because testing is automatically performed in parallel, tests on multiple devices begin in minutes. A test report containing high-level results, low-level logs, pixel-to-pixel screenshots, and performance data is updated as tests are completed. AWS Device Farm supports testing of native and hybrid Android, iOS, and Fire OS apps, including those created with PhoneGap, Titanium, Xamarin, Unity, and other frameworks. It supports remote access of Android apps, which allows you to interact directly with test devices.

Learn more:

BlazeMeter

You can configure CodePipeline to use BlazeMeter to test your code in one or more actions in a pipeline.

Learn more:

Ghost Inspector

You can configure CodePipeline to use Ghost Inspector to test your code in one or more actions in a pipeline.

Learn more:

Micro Focus StormRunner Load

You can configure CodePipeline to use Micro Focus StormRunner Load in one or more actions in a pipeline.

Learn more:

Nouvola

You can configure CodePipeline to use Nouvola to test your code in one or more actions in a pipeline.

Learn more:

Runscope

You can configure CodePipeline to use Runscope to test your code in one or more actions in a pipeline.

Learn more:

Deploy Action Integrations

Amazon Simple Storage Service (Amazon S3)

Amazon S3 is storage for the internet. You can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web. You can add an action to a pipeline that uses Amazon S3 as a deployment provider.
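
A minimal sketch of an S3 deploy action follows; the bucket name is a placeholder, and Extract controls whether the input artifact ZIP file is unpacked before it is uploaded.

    # Amazon S3 deploy action as it appears in a pipeline definition.
    s3_deploy_action = {
        "name": "Deploy",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "S3",
            "version": "1",
        },
        "configuration": {
            "BucketName": "my-website-bucket",  # placeholder target bucket
            "Extract": "true",                  # unzip the artifact into the bucket
        },
        "inputArtifacts": [{"name": "BuildOutput"}],
    }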

Note

Amazon S3 can also be included in a pipeline as a source action.

Learn more:

AWS CloudFormation

AWS CloudFormation gives developers and systems administrators an easy way to create and manage a collection of related AWS resources, using templates to provision and update those resources. You can use AWS CloudFormation’s sample templates or create your own templates to describe the AWS resources, and any associated dependencies or runtime parameters, required to run your application.

The AWS Serverless Application Model (AWS SAM) extends AWS CloudFormation to provide a simplified way to define and deploy serverless applications. AWS SAM supports Amazon API Gateway APIs, AWS Lambda functions, and Amazon DynamoDB tables. You can use CodePipeline with AWS CloudFormation and the AWS Serverless Application Model to continuously deliver your serverless applications.

You can add an action to a pipeline that uses AWS CloudFormation as a deployment provider. The unique role of AWS CloudFormation as a deployment provider enables you to take action on AWS CloudFormation stacks and change sets as part of a pipeline execution. AWS CloudFormation can create, update, replace, and delete stacks and change sets when a pipeline runs. As a result, AWS and custom resources can be created, provisioned, updated, or terminated automatically during a pipeline execution according to the specifications you provide in AWS CloudFormation templates and parameter definitions.
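
For example, a deploy action that creates or updates a stack from a template in the build output might look like the following sketch; the stack name, template path, and role ARN are placeholders.

    # AWS CloudFormation deploy action that creates or updates a stack.
    cfn_deploy_action = {
        "name": "DeployStack",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "CloudFormation",
            "version": "1",
        },
        "configuration": {
            "ActionMode": "CREATE_UPDATE",
            "StackName": "my-application-stack",                    # placeholder
            "TemplatePath": "BuildOutput::packaged-template.yaml",  # ArtifactName::file
            "Capabilities": "CAPABILITY_IAM",
            "RoleArn": "arn:aws:iam::111111111111:role/cfn-deploy-role",  # placeholder
        },
        "inputArtifacts": [{"name": "BuildOutput"}],
    }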

Learn more:

AWS CodeDeploy

CodeDeploy coordinates application deployments to Amazon EC2 instances, on-premises instances, or both. You can configure CodePipeline to use CodeDeploy to deploy your code. You can create the CodeDeploy application and deployment group to use in a deploy action within a stage either before you create the pipeline or when you use the Create Pipeline wizard.
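
A sketch of a CodeDeploy deploy action follows; the application and deployment group names are placeholders.

    # CodeDeploy deploy action as it appears in a pipeline definition.
    codedeploy_action = {
        "name": "Deploy",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "CodeDeploy",
            "version": "1",
        },
        "configuration": {
            "ApplicationName": "MyApplication",          # placeholder
            "DeploymentGroupName": "MyDeploymentGroup",  # placeholder
        },
        "inputArtifacts": [{"name": "BuildOutput"}],
    }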

Learn more:

Amazon Elastic Container Service

Amazon ECS is a highly scalable, high-performance container management service that allows you to run container-based applications in the AWS Cloud. When you create a pipeline, you can select Amazon ECS as a deployment provider. A change to code in your source control repository triggers your pipeline to build a new Docker image, push it to your container registry, and then deploy the updated image to Amazon ECS. You can also use the ECS (Blue/Green) provider action in CodePipeline to deploy to Amazon ECS and route traffic with CodeDeploy.
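
A sketch of a standard Amazon ECS deploy action follows; the cluster and service names are placeholders, and FileName points to an image definitions file produced by the build stage.

    # Amazon ECS (standard) deploy action as it appears in a pipeline definition.
    ecs_deploy_action = {
        "name": "Deploy",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "ECS",
            "version": "1",
        },
        "configuration": {
            "ClusterName": "my-cluster",          # placeholder
            "ServiceName": "my-service",          # placeholder
            "FileName": "imagedefinitions.json",  # produced by the build action
        },
        "inputArtifacts": [{"name": "BuildOutput"}],
    }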

Learn more:

AWS Elastic Beanstalk

Elastic Beanstalk is an easy-to-use service for deploying and scaling web applications and services developed with Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS. You can configure CodePipeline to use Elastic Beanstalk to deploy your code. You can create the Elastic Beanstalk application and environment to use in a deploy action within a stage either before you create the pipeline or when you use the Create Pipeline wizard.
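
A sketch of an Elastic Beanstalk deploy action follows; the application and environment names are placeholders.

    # Elastic Beanstalk deploy action as it appears in a pipeline definition.
    eb_deploy_action = {
        "name": "Deploy",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "ElasticBeanstalk",
            "version": "1",
        },
        "configuration": {
            "ApplicationName": "my-application",  # placeholder
            "EnvironmentName": "my-environment",  # placeholder
        },
        "inputArtifacts": [{"name": "BuildOutput"}],
    }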

Learn more:

AWS OpsWorks Stacks

AWS OpsWorks is a configuration management service that helps you configure and operate applications of all shapes and sizes using Chef. With AWS OpsWorks Stacks, you can define the application's architecture and the specification of each component, including package installation, software configuration, and resources such as storage. You can configure CodePipeline to use AWS OpsWorks Stacks to deploy your code in conjunction with custom Chef cookbooks and applications in AWS OpsWorks.

  • Custom Chef Cookbooks – AWS OpsWorks uses Chef Cookbooks to handle tasks such as installing and configuring packages and deploying applications.

  • Applications – An AWS OpsWorks application consists of code that you want to run on an application server. The application code is stored in a repository, such as an Amazon S3 bucket.

You create the AWS OpsWorks stack and layer you want to use before you create the pipeline. You can create the AWS OpsWorks application to use in a deploy action within a stage either before you create the pipeline or when you use the Create Pipeline wizard.

CodePipeline support for AWS OpsWorks is currently available in the US East (N. Virginia) Region (us-east-1) only.

Learn more:

AWS Service Catalog

AWS Service Catalog enables organizations to create and manage catalogs of products that are approved for use on AWS.

You can configure CodePipeline to deploy updates and versions of your product templates to AWS Service Catalog. You can create the AWS Service Catalog product to use in a deployment action and then use the Create Pipeline wizard to create the pipeline.

Learn more:

Alexa Skills Kit

Amazon Alexa Skills Kit lets you build and distribute cloud-based skills to users of Alexa-enabled devices.

You can add an action to a pipeline that uses Alexa Skills Kit as a deployment provider. Source changes are detected by your pipeline, and then your pipeline deploys updates to your Alexa skill in the Alexa service.

Learn more:

XebiaLabs

You can configure CodePipeline to use XebiaLabs to deploy your code in one or more actions in a pipeline.

Learn more:

Approval Action Integrations

Amazon Simple Notification Service

Amazon SNS is a fast, flexible, fully managed push notification service that lets you send individual messages or fan out messages to large numbers of recipients. Amazon SNS makes it simple and cost-effective to send push notifications to mobile device users and email recipients, or to send messages to other distributed services.

When you create a manual approval request in CodePipeline, you can optionally publish to an Amazon SNS topic so that all IAM users subscribed to it are notified that the approval action is ready to be approved or rejected.
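
A sketch of a manual approval action that publishes to an SNS topic follows; the topic ARN and message text are placeholders.

    # Manual approval action that notifies an Amazon SNS topic.
    approval_action = {
        "name": "ProductionApproval",
        "actionTypeId": {
            "category": "Approval",
            "owner": "AWS",
            "provider": "Manual",
            "version": "1",
        },
        "configuration": {
            "NotificationArn": "arn:aws:sns:us-east-1:111111111111:approvals",  # placeholder
            "CustomData": "Review the staging site before approving.",          # placeholder
        },
    }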

Learn more:

Invoke Action Integrations

AWS Lambda

Lambda lets you run code without provisioning or managing servers. You can configure CodePipeline to use Lambda functions to add flexibility and functionality to your pipelines. You can create the Lambda function to add as an action in a stage either before you create the pipeline or when you use the Create Pipeline wizard.
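
A sketch of an invoke action that calls a Lambda function follows; the function name and user parameters are placeholders.

    # AWS Lambda invoke action as it appears in a pipeline definition.
    lambda_invoke_action = {
        "name": "InvokeFunction",
        "actionTypeId": {
            "category": "Invoke",
            "owner": "AWS",
            "provider": "Lambda",
            "version": "1",
        },
        "configuration": {
            "FunctionName": "my-pipeline-helper",           # placeholder Lambda function
            "UserParameters": "http://example.com/health",  # passed to the function's event
        },
        "inputArtifacts": [],
    }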

Learn more: