AWS Lambda
Developer Guide

Test Your Serverless Applications Locally Using SAM Local (Public Beta)

This feature is available as part of a public beta and is subject to change at any time.

As described previously, AWS SAM is a fast and easy way of deploying your serverless applications, allowing you to write simple templates to describe your functions and their event sources (Amazon API Gateway, Amazon S3, Kinesis, and so on). Based on AWS SAM, SAM Local is an AWS CLI tool that provides an environment for you to develop, test, and analyze your serverless applications locally before uploading them to the Lambda runtime. Whether you're developing on Linux, Mac, or Microsoft Windows, you can use SAM Local to create a local testing environment that simulates the AWS runtime environment, which helps you address issues such as performance early. Working with SAM Local also allows faster, iterative development of your Lambda function code because there is no need to redeploy your application package to the AWS Lambda runtime. For more information, see Building a Simple Application Using SAM Local.

SAM Local works with AWS SAM, allowing you to invoke functions defined in SAM templates, whether directly or through API Gateway endpoints. By using SAM Local features, you can analyze your serverless application's performance in your own testing environment and update accordingly. The following examples outline additional advantages of using SAM Local, with sample operation code. For instance, you can:

  • Generate sample function payloads (for example, an Amazon S3 event).

    $ sam local generate-event s3 --bucket bucket-name --key key-name > event_file.json
  • Test a sample function payload locally with your Lambda functions.

    $ sam local invoke function-name -e event_file.json
  • Spawn a local API Gateway to test HTTP request and response functionality. By using the hot reloading feature, you can test and iterate on your functions without having to restart them or redeploy to the AWS runtime.

    $ sam local start-api
  • Validate that any runtime constraints, such as maximum memory use or timeout limits of your Lambda function invocations, are honored.

  • Inspect AWS Lambda runtime logs, as well as any custom logging output specified in your Lambda function code (for example, console.log statements). SAM Local automatically displays this output. The following shows an example.

    START RequestId: 2137da9a-c79c-1d43-5716-406b4e6b5c0a Version: $LATEST
    2017-05-18T13:18:57.852Z 2137da9a-c79c-1d43-5716-406b4e6b5c0a Error: any error information
    END RequestId: 2137da9a-c79c-1d43-5716-406b4e6b5c0a
    REPORT RequestId: 2137da9a-c79c-1d43-5716-406b4e6b5c0a Duration: 12.78 ms Billed Duration: 100 ms Memory Size: 128 MB Max Memory Used: 29 MB
  • Honor security credentials that you've established by using the AWS CLI. Doing so means your Lambda function can make remote calls to the AWS services that make up your serverless application. If you have not installed the AWS CLI, see Installing the AWS Command Line Interface.

    As with the AWS CLI and SDKs, SAM Local looks for credentials in the following order:

    • Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)

    • The AWS credentials file, located at ~/.aws/credentials on Linux, macOS, or Unix, or at C:\Users\USERNAME\.aws\credentials on Windows

    • Instance profile credentials, if running on an Amazon EC2 instance with an assigned instance role
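
    For reference, the credentials file uses the standard AWS CLI profile format. The key values below are AWS's documented placeholder examples, not real credentials:

```ini
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```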

Supported Runtimes

SAM Local supports the following AWS runtimes:

  • nodejs4.3

  • nodejs6.10

  • python2.7

  • python3.6

  • java8

Requirements for Using SAM Local

To use SAM Local, you need to install Docker and SAM Local.

Installing Docker

Docker is an open-source software container platform that allows you to build, manage, and test applications, whether you're running on Linux, Mac, or Windows. For more information and download instructions, see Docker.

Once you have Docker installed, SAM Local automatically provides a customized Docker image called docker-lambda. This image is designed specifically by an AWS partner to simulate the live AWS Lambda execution environment. This environment includes installed software, libraries, security permissions, environment variables, and other features outlined at Lambda Execution Environment and Available Libraries.

Using docker-lambda, you can invoke your Lambda function locally. In this environment, your serverless applications execute and perform much as they do in the AWS Lambda runtime, without your having to redeploy them. Their execution and performance in this environment reflect such considerations as timeouts and memory use.


Because this is a simulated environment, there is no guarantee that your local testing results will exactly match those in the actual AWS runtime.

For more information, see Docker Lambda.

Installing SAM Local

You can run SAM Local on Linux, Mac, and Windows environments. The easiest way to install SAM Local is to use NPM.

npm install -g aws-sam-local

Then verify that the installation succeeded.

sam --version

If NPM doesn't work for you, you can download the latest binary and start using SAM Local immediately. You can find the binaries under the Releases section in the SAM CLI GitHub Repository.

Getting Started Using SAM Local

SAM Local consists of the following CLI operations:

  • start-api: Creates a local HTTP server hosting all of your Lambda functions. When accessed by using a browser or the CLI, this operation launches a Docker container locally to invoke your function. It reads the CodeUri property of the AWS::Serverless::Function resource to find the path in your file system containing the Lambda function code. This path can be the project's root directory for interpreted languages like Node.js or Python, a build directory that stores your compiled artifacts, or for Java, a .jar file.

    If you use an interpreted language, local changes are made available within the same Docker container. This approach means you can reinvoke your Lambda function with no need for redeployment. For compiled languages or projects requiring complex packing support, we recommend that you run your own build solution and point AWS SAM to the directory that contains the build dependency files needed.
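
    For example, a Java function might point CodeUri at the .jar file that your build produces. The resource name, handler, and path in this snippet are hypothetical:

```yaml
Resources:
  Ratings:
    Type: AWS::Serverless::Function
    Properties:
      Handler: com.example.RatingsHandler::handleRequest
      Runtime: java8
      CodeUri: ./target/ratings-1.0.jar
```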

  • invoke: Invokes a local Lambda function once and terminates after invocation completes.

    # Invoking function with event file
    $ sam local invoke "Ratings" -e event.json

    # Invoking function with event via stdin
    $ echo '{"message": "Hey, are you there?" }' | sam local invoke "Ratings"

    # For more options
    $ sam local invoke --help
  • generate-event: Generates mock serverless events. Using these, you can develop and test locally on functions that respond to asynchronous events such as those from Amazon S3, Kinesis, and DynamoDB. The following displays the command options available to the generate-event operation.

    $ sam local generate-event

    NAME:
       sam local generate-event - Generates Lambda events (e.g. for S3/Kinesis etc) that can be piped to 'sam local invoke'

    USAGE:
       sam local generate-event command [command options] [arguments...]

    COMMANDS:
        s3        Generates a sample Amazon S3 event
        sns       Generates a sample Amazon SNS event
        kinesis   Generates a sample Amazon Kinesis event
        dynamodb  Generates a sample Amazon DynamoDB event
        api       Generates a sample Amazon API Gateway event
        schedule  Generates a sample scheduled event

    OPTIONS:
        --help, -h  show help
  • validate: Validates your template against the official AWS Serverless Application Model specification. The following is an example.

    $ sam validate
    ERROR: Resource "HelloWorld", property "Runtime": Invalid value node. Valid values are "nodejs4.3", "nodejs6.10", "java8", "python2.7", "python3.6" (line: 11; col: 6)

    # Let's fix that error...
    $ sed -i 's/node/nodejs6.10/g' template.yaml

    $ sam validate
    Valid!
  • package and deploy: sam package and sam deploy implicitly call AWS CloudFormation's package and deploy commands. For more information on packaging and deployment of SAM applications, see Packaging and Deployment.

    The following demonstrates how to use the package and deploy commands in SAM Local.

    # Package SAM template
    $ sam package --template-file sam.yaml --s3-bucket mybucket --output-template-file packaged.yaml

    # Deploy packaged SAM template
    $ sam deploy --template-file ./packaged.yaml --stack-name mystack --capabilities CAPABILITY_IAM

Building a Simple Application Using SAM Local

Suppose you want to build a simple RESTful API operation that creates, reads, updates, and deletes a list of products. You begin by creating the following directory structure:

.
├── products.js
└── template.yaml

The template.yaml file is the AWS SAM template that describes a single Lambda function that handles all the API requests.


By default, the start-api and invoke commands search your working directory for the template.yaml file. If you reference a template.yaml file that is in a different directory, add the -t or --template parameter to these operations and pass an absolute or relative path to that file.

Copy and paste the following into the template.yaml file.

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: My first serverless application.

Resources:

  Products:
    Type: AWS::Serverless::Function
    Properties:
      Handler: products.handler
      Runtime: nodejs6.10
      Events:
        ListProducts:
          Type: Api
          Properties:
            Path: /products
            Method: get
        CreateProduct:
          Type: Api
          Properties:
            Path: /products
            Method: post
        Product:
          Type: Api
          Properties:
            Path: /products/{product}
            Method: any

The preceding example configures the following RESTful API endpoints:

  • Create a new product with a POST request to /products.

  • List all products with a GET request to /products.

  • Read, update, or delete a product with a GET, PUT, or DELETE request to /products/{product}.

Next, copy and paste the following code into the products.js file.

'use strict';

exports.handler = (event, context, callback) => {

    // pathParameters is null for requests to /products, so guard before reading it
    let id = (event.pathParameters || {}).product || false;

    switch(event.httpMethod){

        case "GET":
            if(id) {
                callback(null, "This is a READ operation on product ID " + id);
                return;
            }
            callback(null, "This is a LIST operation, return all products");
            break;

        case "POST":
            callback(null, "This is a CREATE operation");
            break;

        case "PUT":
            callback(null, "This is an UPDATE operation on product ID " + id);
            break;

        case "DELETE":
            callback(null, "This is a DELETE operation on product ID " + id);
            break;

        default:
            // Send HTTP 501: Not Implemented
            console.log("Error: unsupported HTTP method (" + event.httpMethod + ")");
            callback(null, { statusCode: 501 });
    }
};

Start a local copy of your API operations by calling the start-api command.

$ sam local start-api

2017/05/18 14:03:01 Successfully parsed template.yaml (AWS::Serverless-2016-10-31)
2017/05/18 14:03:01 Found 1 AWS::Serverless::Function
2017/05/18 14:03:01 Mounting products.handler (nodejs6.10) at /products [POST]
2017/05/18 14:03:01 Mounting products.handler (nodejs6.10) at /products/{product} [OPTIONS GET HEAD POST PUT DELETE TRACE CONNECT]
2017/05/18 14:03:01 Mounting products.handler (nodejs6.10) at /products [GET]
2017/05/18 14:03:01 Listening on http://localhost:3000

You can now browse to the above endpoints to invoke your functions.
You do not need to restart/reload while working on your functions;
changes will be reflected instantly/automatically. You only need to
restart if you update your AWS SAM template.

You can then test your API endpoint locally using either a browser or the CLI.

$ curl http://localhost:3000/products
"This is a LIST operation, return all products"

$ curl -XDELETE http://localhost:3000/products/1
"This is a DELETE operation on product ID 1"

To see more samples, see the samples in the aws-sam-local GitHub repository.

Local Logging

Using the invoke and start-api commands, you can pipe logs from your Lambda function's invocation into a file. This approach is useful if you run automated tests against SAM Local and want to capture logs for analysis. The following is an example.

$ sam local invoke --log-file ./output.log

Using an Environment Variables File

If your Lambda function uses environment variables, SAM Local provides an --env-vars argument for both the invoke and start-api commands. With this argument, you can use a JSON file that contains values for environment variables defined in your function. The JSON file's structure should be similar to the following.

{ "MyFunction1": { "TABLE_NAME": "localtable", "BUCKET_NAME": "testBucket" }, "MyFunction2": { "TABLE_NAME": "localtable", "STAGE": "dev" }, }

You then access the JSON file using the following command:

$ sam local start-api --env-vars env.json

Using a Shell Environment

Variables defined in your shell environment are passed to the Docker container if they map to a variable in your Lambda function. Shell variables are globally accessible to functions. For example, suppose you have two functions, MyFunction1 and MyFunction2, which have a variable called TABLE_NAME. In this case, the value for TABLE_NAME provided through your shell's environment is available to both functions.

The following command sets the value of TABLE_NAME to myTable for both functions.

$ TABLE_NAME=mytable sam local start-api


For greater flexibility, you can use a combination of shell variables and an external JSON file that holds environment variables. If a variable is defined in both places, the one from the external file overrides the shell version. The following is the order of priority, from highest to lowest:

  • Environment variable file

  • Shell environment

  • Hard-coded values contained in the SAM template
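
This precedence behaves like a simple merge in which later sources override earlier ones. The helper below is a hypothetical sketch of that rule, not SAM Local's actual implementation:

```javascript
'use strict';

// Hypothetical sketch of the documented precedence:
// template defaults < shell environment < --env-vars file.
function resolveEnvironment(templateDefaults, shellEnv, envVarsFile) {
    // Object.assign applies sources left to right, so later
    // sources (the env-vars file) win over earlier ones.
    return Object.assign({}, templateDefaults, shellEnv, envVarsFile);
}
```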

Debugging With SAM Local

Both sam local invoke and sam local start-api support local debugging of your functions. To run SAM Local with debugging support enabled, specify --debug-port or -d on the command line.

# Invoke a function locally in debug mode on port 5858
$ sam local invoke -d 5858 <function logical id>

# Start local API Gateway in debug mode on port 5858
$ sam local start-api -d 5858


If you use sam local start-api, the local API Gateway exposes all of your Lambda functions. But because you can specify only one debug port, you can only debug one function at a time.

Debugging Functions Written in Python

Unlike Node.js or Java, Python requires you to enable remote debugging in your Lambda function code. If you enable debugging (using the --debug-port or -d options mentioned above) for a function that uses one of the Python runtimes (2.7 or 3.6), SAM Local maps that port from your host machine through to the Lambda container. To enable remote debugging, use a Python package such as remote-pdb.


When configuring the host that the debugger listens on, make sure to use 0.0.0.0 and not 127.0.0.1. Because your function code runs inside a Docker container, a debugger bound to 127.0.0.1 inside the container is not reachable from your host machine.