Tutorial: Using an Amazon S3 trigger to invoke a Lambda function

In this tutorial, you use the console to create a Lambda function and configure a trigger for Amazon Simple Storage Service (Amazon S3). The trigger invokes your function every time that you add an object to your Amazon S3 bucket.

Prerequisites

If you do not have an AWS account, complete the following steps to create one.

To sign up for an AWS account
  1. Open https://portal.aws.amazon.com/billing/signup.

  2. Follow the online instructions.

    Part of the sign-up procedure involves receiving a phone call and entering a verification code on the phone keypad.

    When you sign up for an AWS account, an AWS account root user is created. The root user has access to all AWS services and resources in the account. As a security best practice, assign administrative access to an administrative user, and use only the root user to perform tasks that require root user access.

This tutorial assumes that you have some knowledge of basic Lambda operations and the Lambda console. If you haven't already, follow the instructions in Create a Lambda function with the console to create your first Lambda function.

Create an Amazon S3 bucket and upload a sample object

Follow these steps to create an Amazon S3 bucket and upload an object.

  1. Open the Amazon S3 console.

  2. Choose Create bucket.

  3. Under General configuration, do the following:

    1. For Bucket name, enter a unique name.

    2. For AWS Region, choose a Region. Note that you must create your Lambda function in the same Region.

  4. Choose Create bucket.

  5. Upload an object to the bucket (for example, HappyFace.jpg).

    You must create this sample object before you test your Lambda function. When you test the function manually later in this tutorial, you pass it sample event data that specifies the bucket name and object name.
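
If you prefer to script this step, the following minimal sketch does the same thing with the AWS SDK for Python (Boto3). It assumes that you have Boto3 installed and AWS credentials configured, and that HappyFace.jpg exists in your working directory. The bucket name my-s3-trigger-bucket is a placeholder; replace it with your own globally unique name.

# Minimal sketch: create the bucket and upload the sample object.
import boto3

region = 'us-east-1'                 # must match the Region of your Lambda function
bucket = 'my-s3-trigger-bucket'      # placeholder; bucket names must be globally unique

s3 = boto3.client('s3', region_name=region)

# us-east-1 is the default location and must not be passed as a location constraint
if region == 'us-east-1':
    s3.create_bucket(Bucket=bucket)
else:
    s3.create_bucket(
        Bucket=bucket,
        CreateBucketConfiguration={'LocationConstraint': region},
    )

# Upload the sample object that the test event references later in this tutorial
s3.upload_file('HappyFace.jpg', bucket, 'HappyFace.jpg')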

Create the Lambda function

Use a function blueprint to create the Lambda function. A blueprint provides a sample function that demonstrates how to use Lambda with other AWS services, along with sample code and function configuration presets for a specific runtime. For this tutorial, choose the blueprint for either the Node.js or the Python runtime.

To create a Lambda function from a blueprint in the console
  1. Open the Functions page of the Lambda console.

  2. Choose Create function.

  3. On the Create function page, choose Use a blueprint.

  4. Under Blueprints, enter s3 in the search box.

  5. In the search results, do one of the following:

    • For a Node.js function, choose s3-get-object.

    • For a Python function, choose s3-get-object-python.

  6. Choose Configure.

  7. Under Basic information, do the following:

    1. For Function name, enter my-s3-function.

    2. For Execution role, choose Create a new role from AWS policy templates.

    3. For Role name, enter my-s3-function-role.

  8. Under S3 trigger, choose the S3 bucket that you created previously.

    When you configure an S3 trigger using the Lambda console, the console modifies your function's resource-based policy to allow Amazon S3 to invoke the function. (A sketch of the equivalent API call follows this procedure.)

  9. Choose Create function.
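
The following sketch shows approximately the permission that the console grants on your behalf, expressed as an explicit AWS SDK for Python (Boto3) call. The function and bucket names are the placeholders used in this tutorial, and 123456789012 stands in for your AWS account ID.

# Minimal sketch of the resource-based policy statement the console adds.
import boto3

lambda_client = boto3.client('lambda')

lambda_client.add_permission(
    FunctionName='my-s3-function',
    StatementId='s3-invoke-permission',             # any unique statement ID
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::my-s3-trigger-bucket',  # limits invocation to your bucket
    SourceAccount='123456789012',                   # guards against bucket name reuse
)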

Review the function code

The Lambda function retrieves the source S3 bucket name and the key name of the uploaded object from the event parameter that it receives. The function uses the Amazon S3 getObject API to retrieve the content type of the object.

While viewing your function in the Lambda console, you can review the function code on the Code tab, under Code source. The code looks like the following:

Node.js
Example index.js
console.log('Loading function');

const aws = require('aws-sdk');

const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = async (event, context) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));

    // Get the object from the event and show its content type
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key,
    };
    try {
        const { ContentType } = await s3.getObject(params).promise();
        console.log('CONTENT TYPE:', ContentType);
        return ContentType;
    } catch (err) {
        console.log(err);
        const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
        console.log(message);
        throw new Error(message);
    }
};
Python
Example lambda-function.py
import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
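
Because the handler reads only the bucket name and object key from the event, you can exercise it locally before testing in the console. The following minimal sketch assumes that the Python example above is saved as lambda_function.py (Lambda's default module name uses an underscore) and that your local AWS credentials can read the bucket.

# Minimal local harness: call the handler directly with a hand-built event.
from lambda_function import lambda_handler

event = {
    'Records': [
        {
            's3': {
                'bucket': {'name': 'my-s3-trigger-bucket'},  # placeholder bucket name
                'object': {'key': 'HappyFace.jpg'},
            }
        }
    ]
}

# The handler ignores the context argument here, so None is sufficient
print(lambda_handler(event, None))  # prints the content type, e.g. image/jpeg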

Test in the console

Invoke the Lambda function manually using sample Amazon S3 event data.

To test the Lambda function using the console
  1. On the Code tab, under Code source, choose the arrow next to Test, and then choose Configure test events from the dropdown list.

  2. In the Configure test event window, do the following:

    1. Choose Create new test event.

    2. For Event template, choose S3 Put (s3-put).

    3. For Event name, enter a name for the test event. For example, mys3testevent.

    4. In the Event JSON, replace the following values:

      • us-east-1 – The AWS Region where you created the Amazon S3 bucket and the Lambda function.

      • example-bucket – The Amazon S3 bucket that you created earlier.

      • test%2Fkey – The name of the sample object that you uploaded to the bucket (for example, HappyFace.jpg).

      { "Records": [ { "eventVersion": "2.0", "eventSource": "aws:s3", "awsRegion": "us-east-1", "eventTime": "1970-01-01T00:00:00.000Z", "eventName": "ObjectCreated:Put", "userIdentity": { "principalId": "EXAMPLE" }, "requestParameters": { "sourceIPAddress": "127.0.0.1" }, "responseElements": { "x-amz-request-id": "EXAMPLE123456789", "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH" }, "s3": { "s3SchemaVersion": "1.0", "configurationId": "testConfigRule", "bucket": { "name": "example-bucket", "ownerIdentity": { "principalId": "EXAMPLE" }, "arn": "arn:aws:s3:::example-bucket" }, "object": { "key": "test%2Fkey", "size": 1024, "eTag": "0123456789abcdef0123456789abcdef", "sequencer": "0A1B2C3D4E5F678901" } } } ] }
    5. Choose Save.

  3. To invoke the function with your test event, under Code source, choose Test.

    The Execution results tab displays the response, function logs, and request ID, similar to the following:

    Response "image/jpeg" Function Logs START RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 Version: $LATEST 2021-02-18T21:40:59.280Z 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 INFO INPUT BUCKET AND KEY: { Bucket: 'my-s3-bucket', Key: 'HappyFace.jpg' } 2021-02-18T21:41:00.215Z 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 INFO CONTENT TYPE: image/jpeg END RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 REPORT RequestId: 12b3cae7-5f4e-415e-93e6-416b8f8b66e6 Duration: 976.25 ms Billed Duration: 977 ms Memory Size: 128 MB Max Memory Used: 90 MB Init Duration: 430.47 ms Request ID 12b3cae7-5f4e-415e-93e6-416b8f8b66e6

Test the S3 trigger

Invoke your function when you upload a file to the Amazon S3 source bucket.

To test the Lambda function using the S3 trigger
  1. On the Buckets page of the Amazon S3 console, choose the name of the source bucket that you created earlier.

  2. On the Upload page, upload a few more .jpg or .png image files to the bucket. (To script the uploads instead, see the sketch after this procedure.)

  3. Open the Functions page of the Lambda console.

  4. Choose the name of your function (my-s3-function).

  5. To verify that the function ran once for each file that you uploaded, choose the Monitor tab. This page shows graphs for the metrics that Lambda sends to CloudWatch. The count in the Invocations graph should match the number of files that you uploaded to the Amazon S3 bucket.

    For more information on these graphs, see Monitoring functions on the Lambda console.

  6. (Optional) To view the logs in the CloudWatch console, choose View logs in CloudWatch. Choose a log stream to view the logs output for one of the function invocations.
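
To generate several invocations quickly, a minimal sketch like the following uploads a batch of local image files to the source bucket with Boto3, firing the trigger once per object. The file and bucket names are placeholders.

# Minimal sketch: upload several images to fire the trigger once per object.
import boto3

s3 = boto3.client('s3')
bucket = 'my-s3-trigger-bucket'   # placeholder; use your source bucket name

for name in ['photo1.jpg', 'photo2.png']:   # hypothetical local files
    s3.upload_file(name, bucket, name)
    print(f'Uploaded {name}; expect one Lambda invocation for it')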

Clean up your resources

You can now delete the resources that you created for this tutorial, unless you want to retain them. By deleting AWS resources that you're no longer using, you prevent unnecessary charges to your AWS account.

To delete the Lambda function
  1. Open the Functions page of the Lambda console.

  2. Select the function that you created.

  3. Choose Actions, Delete.

  4. Type delete in the text input field and choose Delete.

To delete the IAM policy
  1. Open the Policies page of the AWS Identity and Access Management (IAM) console.

  2. Select the policy that Lambda created for you. The policy name begins with AWSLambdaS3ExecutionRole-.

  3. Choose Policy actions, Delete.

  4. Choose Delete.

To delete the execution role
  1. Open the Roles page of the IAM console.

  2. Select the execution role that you created.

  3. Choose Delete.

  4. Enter the name of the role in the text input field and choose Delete.

To delete the S3 bucket
  1. Open the Amazon S3 console.

  2. Select the bucket that you created.

  3. Choose Delete.

  4. Enter the name of the bucket in the text input field.

  5. Choose Delete bucket.
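
If you prefer to script the cleanup, the following minimal sketch mirrors the four procedures above with the AWS SDK for Python (Boto3). It assumes the placeholder names used in this tutorial and that the only policy attached to the role is the customer managed policy that Lambda created; the policy must be detached before the role can be deleted, and the bucket must be empty before it can be deleted.

# Minimal cleanup sketch, mirroring the console procedures above.
import boto3

lambda_client = boto3.client('lambda')
iam = boto3.client('iam')
s3 = boto3.resource('s3')

# Delete the Lambda function
lambda_client.delete_function(FunctionName='my-s3-function')

# Detach and delete the policy, then delete the execution role
role = 'my-s3-function-role'
for attached in iam.list_attached_role_policies(RoleName=role)['AttachedPolicies']:
    iam.detach_role_policy(RoleName=role, PolicyArn=attached['PolicyArn'])
    iam.delete_policy(PolicyArn=attached['PolicyArn'])   # the AWSLambdaS3ExecutionRole- policy
iam.delete_role(RoleName=role)

# Empty, then delete, the S3 bucket
bucket = s3.Bucket('my-s3-trigger-bucket')   # placeholder bucket name
bucket.objects.all().delete()
bucket.delete()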

Next steps

Try the more advanced tutorial. In this tutorial, the S3 trigger invokes a function to create a thumbnail image for each image file that is uploaded to your S3 bucket. This tutorial requires a moderate level of AWS and Lambda domain knowledge. You use the AWS Command Line Interface (AWS CLI) to create resources, and you create a .zip file archive deployment package for your function and its dependencies.