
Tutorial: Using an Amazon S3 trigger to invoke a Lambda function

In this tutorial, you use the console to create a Lambda function and configure a trigger for an Amazon Simple Storage Service (Amazon S3) bucket. Every time that you add an object to your Amazon S3 bucket, your function runs and outputs the object type to Amazon CloudWatch Logs.


[Diagram: Flow of data between an Amazon S3 bucket, a Lambda function, and Amazon CloudWatch Logs]

You can use a Lambda function with an Amazon S3 trigger to perform many types of file processing tasks. For example, you can use a Lambda function to create a thumbnail whenever an image file is uploaded to your Amazon S3 bucket, or to convert uploaded documents into different formats. After you’ve completed this tutorial, you can try the Using an Amazon S3 trigger to create thumbnail images tutorial to perform an image processing task.

To complete this tutorial, you carry out the following steps:

  1. Create an Amazon S3 bucket.

  2. Create a Lambda function that returns the object type of objects in an Amazon S3 bucket.

  3. Configure a Lambda trigger that invokes your function when objects are uploaded to your bucket.

  4. Test your function, first with a dummy event, and then using the trigger.

By completing these steps, you’ll learn how to configure a Lambda function to run whenever objects are added to or deleted from an Amazon S3 bucket. You can complete this tutorial using only the AWS Management Console.

Prerequisites

If you do not have an AWS account, complete the following steps to create one.

To sign up for an AWS account
  1. Open https://portal.aws.amazon.com/billing/signup.

  2. Follow the online instructions.

    Part of the sign-up procedure involves receiving a phone call and entering a verification code on the phone keypad.

    When you sign up for an AWS account, an AWS account root user is created. The root user has access to all AWS services and resources in the account. As a security best practice, assign administrative access to an administrative user, and use only the root user to perform tasks that require root user access.

AWS sends you a confirmation email after the sign-up process is complete. At any time, you can view your current account activity and manage your account by going to https://aws.amazon.com/ and choosing My Account.

After you sign up for an AWS account, secure your AWS account root user, enable AWS IAM Identity Center, and create an administrative user so that you don't use the root user for everyday tasks.

Secure your AWS account root user
  1. Sign in to the AWS Management Console as the account owner by choosing Root user and entering your AWS account email address. On the next page, enter your password.

    For help signing in by using the root user, see Signing in as the root user in the AWS Sign-In User Guide.

  2. Turn on multi-factor authentication (MFA) for your root user.

    For instructions, see Enable a virtual MFA device for your AWS account root user (console) in the IAM User Guide.

Create an administrative user
  1. Enable IAM Identity Center.

    For instructions, see Enabling AWS IAM Identity Center in the AWS IAM Identity Center User Guide.

  2. In IAM Identity Center, grant administrative access to an administrative user.

    For a tutorial about using the IAM Identity Center directory as your identity source, see Configure user access with the default IAM Identity Center directory in the AWS IAM Identity Center User Guide.

Sign in as the administrative user
  • To sign in with your IAM Identity Center user, use the sign-in URL that was sent to your email address when you created the IAM Identity Center user.

    For help signing in using an IAM Identity Center user, see Signing in to the AWS access portal in the AWS Sign-In User Guide.

Create an Amazon S3 bucket



First, create an Amazon S3 bucket using the AWS Management Console.

To create an Amazon S3 bucket
  1. Open the Amazon S3 console and select the Buckets page.

  2. Choose Create bucket.

  3. Under General configuration, do the following:

    1. For Bucket name, enter a globally unique name that meets the Amazon S3 bucket naming rules. Bucket names can contain only lowercase letters, numbers, dots (.), and hyphens (-).

    2. For AWS Region, choose a Region. Later in the tutorial, you must create your Lambda function in the same Region.

  4. Leave all other options set to their default values and choose Create bucket.

Upload a test object to your bucket



Later in the tutorial, you’ll test your Lambda function in the Lambda console. To confirm that your function’s code is working correctly, your Amazon S3 bucket needs to contain a test object. This object can be any file you choose (for example, HappyFace.jpg).

To upload a test object
  1. Open the Buckets page of the Amazon S3 console and choose the bucket you created during the previous step.

  2. Choose Upload.

  3. Choose Add files and use the file selector to choose the object you want to upload.

  4. Choose Open, then choose Upload.

When you test your function code later in the tutorial, you pass it data containing the file name of the object you uploaded, so make a note of it now.
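
Your function code will read only two fields from the event that Amazon S3 sends it: the bucket name and the object key. As a rough sketch (the full test event template appears later in the tutorial), the slice of an S3 Put event that matters here looks like the following, where my-bucket and HappyFace.jpg stand in for your own bucket and object names:

{
  "Records": [
    {
      "s3": {
        "bucket": { "name": "my-bucket" },
        "object": { "key": "HappyFace.jpg" }
      }
    }
  ]
}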

Create a permissions policy



Before you can create an execution role for your Lambda function, you first create a permissions policy to give your function permission to access the required AWS resources. For this tutorial, the policy allows Lambda to get objects from an Amazon S3 bucket and to write to Amazon CloudWatch Logs.

To create the policy
  1. Open the Policies page of the IAM console.

  2. Choose Create policy.

  3. Choose the JSON tab, and then paste the following custom policy into the JSON editor.

    { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:PutLogEvents", "logs:CreateLogGroup", "logs:CreateLogStream" ], "Resource": "arn:aws:logs:*:*:*" }, { "Effect": "Allow", "Action": [ "s3:GetObject" ], "Resource": "arn:aws:s3:::*/*" } ] }
  4. Choose Next: Tags.

  5. Choose Next: Review.

  6. Under Review policy, for Name, enter s3-trigger-tutorial.

  7. Choose Create policy.
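
The policy you just created grants s3:GetObject on every object in every bucket (arn:aws:s3:::*/*), which keeps the tutorial simple. If you prefer to follow least privilege, you could scope that statement down to your own bucket instead; a sketch, assuming your bucket is named my-bucket:

{
  "Effect": "Allow",
  "Action": ["s3:GetObject"],
  "Resource": "arn:aws:s3:::my-bucket/*"
}

Note the /* suffix: s3:GetObject acts on objects, so the resource must match the objects in the bucket rather than the bucket ARN itself.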

Create an execution role



An execution role is an AWS Identity and Access Management (IAM) role that grants a Lambda function permission to access AWS services and resources. To enable your function to get objects from an Amazon S3 bucket, you attach the permissions policy you created in the previous step.
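
For reference, choosing Lambda as the use case in the following procedure sets up the role’s trust policy, which allows the Lambda service to assume the role. It looks like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}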

To create an execution role and attach your custom permissions policy
  1. Open the Roles page of the IAM console.

  2. Choose Create role.

  3. For the type of trusted entity, choose AWS service, then for the use case, choose Lambda.

  4. Choose Next.

  5. In the policy search box, enter s3-trigger-tutorial.

  6. In the search results, select the policy that you created (s3-trigger-tutorial), and then choose Next.

  7. Under Role details, for the Role name, enter lambda-s3-trigger-role, then choose Create role.

Create the Lambda function



In this example, you create a Lambda function in the console using the Node.js 16.x runtime. The function you create in the console contains some basic ‘Hello World’ code. In the next step, you’ll replace this with the function code to get an object from your Amazon S3 bucket.

To create the Lambda function
  1. Open the Functions page of the Lambda console.

  2. Make sure you're working in the same AWS Region you created your Amazon S3 bucket in. You can change your Region using the drop-down list at the top of the screen.

    
  3. Choose Create function.

  4. Choose Author from scratch.

  5. Under Basic information, do the following:

    1. For Function name, enter s3-trigger-tutorial.

    2. For Runtime, choose Node.js 16.x.

    3. For Architecture, choose x86_64.

  6. In the Change default execution role tab, do the following:

    1. Expand the tab, then choose Use an existing role.

    2. Select the lambda-s3-trigger-role you created earlier.

  7. Choose Create function.

Deploy the function code



Your Lambda function will retrieve the key name of the uploaded object and the name of the bucket from the event parameter it receives from Amazon S3. The function then uses the HeadObject API call in the AWS SDK for JavaScript to get the object type for the uploaded object.

This tutorial uses the Node.js 16.x runtime, but we’ve also provided example code files for other runtimes. You can select the tab in the following box to see the code for the runtime you’re interested in. The JavaScript code you’ll deploy is the first example shown in the tab labeled JavaScript.

.NET
AWS SDK for .NET
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using .NET.

using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.S3;
using System;
using Amazon.Lambda.S3Events;
using System.Web;

// Assembly attribute to enable the Lambda function's JSON input to be converted into a .NET class.
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace S3Integration
{
    public class Function
    {
        private static AmazonS3Client _s3Client;

        public Function() : this(null)
        {
        }

        internal Function(AmazonS3Client s3Client)
        {
            _s3Client = s3Client ?? new AmazonS3Client();
        }

        public async Task<string> Handler(S3Event evt, ILambdaContext context)
        {
            try
            {
                if (evt.Records.Count <= 0)
                {
                    context.Logger.LogLine("Empty S3 Event received");
                    return string.Empty;
                }

                var bucket = evt.Records[0].S3.Bucket.Name;
                var key = HttpUtility.UrlDecode(evt.Records[0].S3.Object.Key);

                context.Logger.LogLine($"Request is for {bucket} and {key}");

                var objectResult = await _s3Client.GetObjectAsync(bucket, key);

                context.Logger.LogLine($"Returning {objectResult.Key}");

                return objectResult.Key;
            }
            catch (Exception e)
            {
                context.Logger.LogLine($"Error processing request - {e.Message}");
                return string.Empty;
            }
        }
    }
}
Go
SDK for Go V2
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using Go.

package main

import (
    "context"
    "log"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/service/s3"
)

func handler(ctx context.Context, s3Event events.S3Event) error {
    sdkConfig, err := config.LoadDefaultConfig(ctx)
    if err != nil {
        log.Printf("failed to load default config: %s", err)
        return err
    }
    s3Client := s3.NewFromConfig(sdkConfig)

    for _, record := range s3Event.Records {
        bucket := record.S3.Bucket.Name
        key := record.S3.Object.URLDecodedKey
        headOutput, err := s3Client.HeadObject(ctx, &s3.HeadObjectInput{
            Bucket: &bucket,
            Key:    &key,
        })
        if err != nil {
            log.Printf("error getting head of object %s/%s: %s", bucket, key, err)
            return err
        }
        log.Printf("successfully retrieved %s/%s of type %s", bucket, key, *headOutput.ContentType)
    }

    return nil
}

func main() {
    lambda.Start(handler)
}
Java
SDK for Java 2.x
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using Java.

package example;

import software.amazon.awssdk.services.s3.model.HeadObjectRequest;
import software.amazon.awssdk.services.s3.model.HeadObjectResponse;
import software.amazon.awssdk.services.s3.S3Client;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.lambda.runtime.events.models.s3.S3EventNotification.S3EventNotificationRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Handler implements RequestHandler<S3Event, String> {
    private static final Logger logger = LoggerFactory.getLogger(Handler.class);

    @Override
    public String handleRequest(S3Event s3event, Context context) {
        try {
            S3EventNotificationRecord record = s3event.getRecords().get(0);
            String srcBucket = record.getS3().getBucket().getName();
            String srcKey = record.getS3().getObject().getUrlDecodedKey();

            S3Client s3Client = S3Client.builder().build();
            HeadObjectResponse headObject = getHeadObject(s3Client, srcBucket, srcKey);

            logger.info("Successfully retrieved " + srcBucket + "/" + srcKey + " of type " + headObject.contentType());
            return "Ok";
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    private HeadObjectResponse getHeadObject(S3Client s3Client, String bucket, String key) {
        HeadObjectRequest headObjectRequest = HeadObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .build();
        return s3Client.headObject(headObjectRequest);
    }
}
JavaScript
SDK for JavaScript (v2)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using JavaScript.

const aws = require('aws-sdk');

const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = async (event, context) => {
    // Get the object from the event and show its content type
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key,
    };
    try {
        const { ContentType } = await s3.headObject(params).promise();
        console.log('CONTENT TYPE:', ContentType);
        return ContentType;
    } catch (err) {
        console.log(err);
        const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
        console.log(message);
        throw new Error(message);
    }
};

Consuming an S3 event with Lambda using TypeScript.

import { S3Event } from 'aws-lambda';
import { S3Client, HeadObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: process.env.AWS_REGION });

export const handler = async (event: S3Event): Promise<string | undefined> => {
    // Get the object from the event and show its content type
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key,
    };

    try {
        const { ContentType } = await s3.send(new HeadObjectCommand(params));
        console.log('CONTENT TYPE:', ContentType);
        return ContentType;
    } catch (err) {
        console.log(err);
        const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
        console.log(message);
        throw new Error(message);
    }
};
Python
SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using Python.

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
Rust
SDK for Rust
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using Rust.

use aws_lambda_events::event::s3::S3Event;
use aws_sdk_s3::Client;
use lambda_runtime::{run, service_fn, Error, LambdaEvent};

/// Main function
#[tokio::main]
async fn main() -> Result<(), Error> {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::INFO)
        .with_target(false)
        .without_time()
        .init();

    // Initialize the AWS SDK for Rust
    let config = aws_config::load_from_env().await;
    let s3_client = Client::new(&config);

    let res = run(service_fn(|request: LambdaEvent<S3Event>| {
        function_handler(&s3_client, request)
    }))
    .await;

    res
}

async fn function_handler(s3_client: &Client, evt: LambdaEvent<S3Event>) -> Result<(), Error> {
    tracing::info!(records = ?evt.payload.records.len(), "Received request from S3");

    if evt.payload.records.is_empty() {
        tracing::info!("Empty S3 event received");
    }

    let bucket = evt.payload.records[0]
        .s3
        .bucket
        .name
        .as_ref()
        .expect("Bucket name to exist");
    let key = evt.payload.records[0]
        .s3
        .object
        .key
        .as_ref()
        .expect("Object key to exist");

    tracing::info!("Request is for {} and object {}", bucket, key);

    let s3_get_object_result = s3_client
        .get_object()
        .bucket(bucket)
        .key(key)
        .send()
        .await;

    match s3_get_object_result {
        Ok(_) => tracing::info!("S3 Get Object success, the s3GetObjectResult contains a 'body' property of type ByteStream"),
        Err(_) => tracing::info!("Failure with S3 Get Object request"),
    }

    Ok(())
}
To deploy the function code
  1. Open the Functions page of the Lambda console.

  2. Choose the function you created in the previous step (s3-trigger-tutorial).

  3. Choose the Code tab.

  4. Copy and paste the provided JavaScript code into the index.js tab in the Code source pane.

  5. Choose Deploy.

Create the Amazon S3 trigger



Now that you’ve deployed your function code, create the Amazon S3 trigger that will invoke your function.

To create the Amazon S3 trigger
  1. In the Function overview pane of your function’s console page, choose Add trigger.

  2. Select S3.

  3. Under Bucket, select the bucket you created earlier in the tutorial.

  4. Under Event types, select All object create events. You can also configure a trigger to invoke Lambda when an object is deleted, but we won’t be using that option in this tutorial.

  5. Under Recursive invocation, select the check box to acknowledge that using the same Amazon S3 bucket for input and output is not recommended. You can learn more about recursive invocation patterns in Lambda by reading Recursive patterns that cause run-away Lambda functions in Serverless Land.

  6. Choose Add.
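
Behind the scenes, adding the trigger stores an event notification configuration on your bucket. Roughly, that stored configuration looks like the following sketch, where the Region, account ID (123456789012), and configuration ID are placeholders:

{
  "LambdaFunctionConfigurations": [
    {
      "Id": "example-trigger-id",
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:s3-trigger-tutorial",
      "Events": [
        "s3:ObjectCreated:*"
      ]
    }
  ]
}

Lambda also adds a resource-based policy to your function so that Amazon S3 is allowed to invoke it.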

Test your Lambda function with a dummy event



Now that you’ve created and configured your Lambda function, you’re ready to test it. You first test your function by sending it a dummy Amazon S3 event to confirm it’s working correctly.

To test the Lambda function with a dummy event
  1. In the Lambda console page for your function, choose the Code tab.

  2. In the Code source pane, choose Test.

  3. In the Configure test event box, do the following:

    1. For Event name, enter MyTestEvent.

    2. For Template, choose S3 Put.

    3. In the Event JSON, replace the following values:

      • Replace us-east-1 with the AWS Region you created your Amazon S3 bucket in.

      • Replace both instances of my-bucket with the name of your own Amazon S3 bucket.

      • Replace test%2Fkey with the name of the test object you uploaded to your bucket earlier (for example, HappyFace.jpg).

      { "Records": [ { "eventVersion": "2.0", "eventSource": "aws:s3", "awsRegion": "us-east-1", "eventTime": "1970-01-01T00:00:00.000Z", "eventName": "ObjectCreated:Put", "userIdentity": { "principalId": "EXAMPLE" }, "requestParameters": { "sourceIPAddress": "127.0.0.1" }, "responseElements": { "x-amz-request-id": "EXAMPLE123456789", "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH" }, "s3": { "s3SchemaVersion": "1.0", "configurationId": "testConfigRule", "bucket": { "name": "my-bucket", "ownerIdentity": { "principalId": "EXAMPLE" }, "arn": "arn:aws:s3:::my-bucket" }, "object": { "key": "test%2Fkey", "size": 1024, "eTag": "0123456789abcdef0123456789abcdef", "sequencer": "0A1B2C3D4E5F678901" } } } ] }
    4. Choose Save.

  4. In the Code source pane, choose Test.

  5. If your function runs successfully, you’ll see output similar to the following in the Execution results tab.

    Response "image/jpeg" Function Logs START RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 Version: $LATEST 2021-02-18T21:40:59.280Z 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 INFO INPUT BUCKET AND KEY: { Bucket: 'my-bucket', Key: 'HappyFace.jpg' } 2021-02-18T21:41:00.215Z 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 INFO CONTENT TYPE: image/jpeg END RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 REPORT RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 Duration: 976.25 ms Billed Duration: 977 ms Memory Size: 128 MB Max Memory Used: 90 MB Init Duration: 430.47 ms Request ID 12b3cae7-5f4e-415e-93e6-416b8f8b66e6

Test the Lambda function with the Amazon S3 trigger



To test your function with the configured trigger, you upload an object to your Amazon S3 bucket using the console. To verify that your Lambda function has been invoked correctly, you then use CloudWatch Logs to view your function’s output.

To upload an object to your Amazon S3 bucket
  1. Open the Buckets page of the Amazon S3 console and choose the bucket you created earlier.

  2. Choose Upload.

  3. Choose Add files and use the file selector to choose an object you want to upload. This object can be any file you choose.

  4. Choose Open, then choose Upload.

To verify correct operation using CloudWatch Logs
  1. Open the CloudWatch console.

  2. Make sure you're working in the same AWS Region you created your Lambda function in. You can change your Region using the drop-down list at the top of the screen.

    
  3. Choose Logs, then choose Log groups.

  4. Choose the log group for your function (/aws/lambda/s3-trigger-tutorial).

  5. Under Log streams, choose the most recent log stream.

  6. If your function has been invoked correctly in response to your Amazon S3 trigger, you’ll see output similar to the following. The CONTENT TYPE you see depends on the type of file you uploaded to your bucket.

    2022-05-09T23:17:28.702Z 0cae7f5a-b0af-4c73-8563-a3430333cc10 INFO CONTENT TYPE: image/jpeg

Clean up your resources

You can now delete the resources that you created for this tutorial, unless you want to retain them. By deleting AWS resources that you're no longer using, you prevent unnecessary charges to your AWS account.

To delete the Lambda function
  1. Open the Functions page of the Lambda console.

  2. Select the function that you created.

  3. Choose Actions, Delete.

  4. Type delete in the text input field and choose Delete.

To delete the execution role
  1. Open the Roles page of the IAM console.

  2. Select the execution role that you created.

  3. Choose Delete.

  4. Enter the name of the role in the text input field and choose Delete.

To delete the S3 bucket
  1. Open the Amazon S3 console.

  2. Select the bucket you created.

  3. Choose Delete.

  4. Enter the name of the bucket in the text input field.

  5. Choose Delete bucket.

Next steps

Try the more advanced tutorial. In this tutorial, the Amazon S3 trigger invokes a function to create a thumbnail image for each image file that is uploaded to your bucket. This tutorial requires a moderate level of AWS and Lambda domain knowledge. You use the AWS Command Line Interface (AWS CLI) to create resources, and you create a .zip file archive deployment package for your function and its dependencies.