Solution Components

The Distributed Load Testing on AWS solution consists of two high-level components: a front end and a backend.

Front End

The front end consists of a load testing API and a web console that you use to interact with the solution’s backend.

Load Testing API

Distributed Load Testing on AWS configures Amazon API Gateway to host the solution’s RESTful API. Users can interact with testing data securely through the included web console and RESTful API. The API acts as a “front door” for access to testing data stored in Amazon DynamoDB. You can also use the APIs to access any extended functionality you build into the solution.
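
For example, a client can retrieve testing data with a plain HTTPS request against the API. The following TypeScript sketch is illustrative only: the API base URL and the /scenarios resource path are placeholder assumptions, and Appendix D documents the actual API.

  // Illustrative only: the base URL and /scenarios path are assumptions;
  // see Appendix D for the solution's actual API definition.
  const API_BASE = "https://example1234.execute-api.us-east-1.amazonaws.com/prod";

  async function listScenarios(idToken: string): Promise<unknown> {
    const response = await fetch(`${API_BASE}/scenarios`, {
      // The Cognito-issued JWT described in the next paragraph.
      headers: { Authorization: idToken },
    });
    if (!response.ok) {
      throw new Error(`API request failed with status ${response.status}`);
    }
    return response.json();
  }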

This solution takes advantage of the user authentication features of Amazon Cognito User Pools. After successfully authenticating a user, Amazon Cognito issues a JSON Web Token (JWT) that allows the console to submit requests to the solution’s APIs (Amazon API Gateway endpoints). The console sends HTTPS requests to the APIs with an authorization header that includes the token.
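
As a rough sketch of that token flow, the following example uses the amazon-cognito-identity-js library to authenticate a user and extract the JWT; the user pool ID and client ID are placeholders.

  import {
    AuthenticationDetails,
    CognitoUser,
    CognitoUserPool,
  } from "amazon-cognito-identity-js";

  const pool = new CognitoUserPool({
    UserPoolId: "us-east-1_EXAMPLE", // placeholder
    ClientId: "exampleclientid",     // placeholder
  });

  function getIdToken(userName: string, password: string): Promise<string> {
    const user = new CognitoUser({ Username: userName, Pool: pool });
    const details = new AuthenticationDetails({
      Username: userName,
      Password: password,
    });
    return new Promise((resolve, reject) => {
      user.authenticateUser(details, {
        // On success, the ID token is the JWT sent in the authorization header.
        onSuccess: (session) => resolve(session.getIdToken().getJwtToken()),
        onFailure: reject,
      });
    });
  }

The returned token can then be passed to an API call such as the listScenarios sketch shown earlier.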

Based on the request, API Gateway invokes the appropriate AWS Lambda function to perform the necessary tasks on the data stored in the DynamoDB tables, store test scenarios as JSON objects in Amazon Simple Storage Service (Amazon S3), retrieve Amazon CloudWatch metrics images, and submit test scenarios to the Amazon Simple Queue Service (Amazon SQS) queue.
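
The sketch below shows the general shape such a Lambda function might take when a new test is submitted: it stores the scenario JSON in Amazon S3, records metadata in DynamoDB, and queues the scenario in Amazon SQS. The environment variable names, object keys, and item layout are assumptions for illustration, not the solution’s actual identifiers.

  import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
  import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
  import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
  import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";
  import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

  const s3 = new S3Client({});
  const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
  const sqs = new SQSClient({});

  export const handler = async (
    event: APIGatewayProxyEvent
  ): Promise<APIGatewayProxyResult> => {
    const scenario = JSON.parse(event.body ?? "{}");
    const testId = scenario.testId;

    // Persist the full scenario definition as a JSON object in S3.
    await s3.send(new PutObjectCommand({
      Bucket: process.env.SCENARIO_BUCKET!,      // assumed env var
      Key: `scenarios/${testId}.json`,
      Body: JSON.stringify(scenario),
    }));

    // Record test metadata in the scenario table.
    await ddb.send(new PutCommand({
      TableName: process.env.SCENARIO_TABLE!,    // assumed env var
      Item: { testId, status: "running" },
    }));

    // Queue the scenario so the task-runner function can pick it up.
    await sqs.send(new SendMessageCommand({
      QueueUrl: process.env.SCENARIO_QUEUE_URL!, // assumed env var
      MessageBody: JSON.stringify({ testId }),
    }));

    return { statusCode: 200, body: JSON.stringify({ testId }) };
  };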

For more information on the solution’s API, see Appendix D.

Web Console

This solution includes a simple web console you can use to configure and run tests, monitor running tests, and view detailed test results. The console is a ReactJS application hosted in Amazon S3 and accessed through Amazon CloudFront. The application leverages AWS Amplify to integrate with Amazon Cognito to authenticate users.
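
A minimal sketch of that wiring, using an Amplify v5-style configuration with placeholder values, might look like the following.

  import { Amplify } from "aws-amplify";

  Amplify.configure({
    Auth: {
      region: "us-east-1",                    // placeholder
      userPoolId: "us-east-1_EXAMPLE",        // placeholder
      userPoolWebClientId: "exampleclientid", // placeholder
    },
    API: {
      endpoints: [
        {
          name: "dlts", // assumed API name
          endpoint: "https://example1234.execute-api.us-east-1.amazonaws.com/prod",
        },
      ],
    },
  });

In a deployed stack, these values would typically come from a configuration file generated at deployment time rather than being hard-coded.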

The web console is designed to demonstrate how you can interact with this load testing solution. In a production environment, we recommend customizing the web console to meet your specific needs or building your own console.

Backend

The backend consists of a Docker image pipeline and a load testing engine you use to generate load for the tests. You interact with the backend through the front end.

Docker Image Pipeline

This solution leverages a Docker image of the Taurus load testing framework. During launch, a Dockerfile is uploaded to an Amazon S3 bucket. AWS CodePipeline uses AWS CodeBuild to create a new image from the Dockerfile and registers the image with Amazon Elastic Container Registry (Amazon ECR). The image is used to run tasks in the AWS Fargate cluster.

CodePipeline is used only to register the container image during initial deployment, but you can also use it to deploy updates to the image. For more information, see Appendix A.
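
Because the pipeline’s source is an object in the Amazon S3 bucket, one plausible way to deploy such an update is to upload a revised archive containing the Dockerfile to that source location; the bucket name and object key below are assumptions, not the solution’s actual identifiers.

  import { readFileSync } from "node:fs";
  import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

  const s3 = new S3Client({});

  // Hedged sketch: re-upload the pipeline's source archive to trigger a
  // rebuild and re-registration of the container image.
  async function updateImageSource(): Promise<void> {
    await s3.send(new PutObjectCommand({
      Bucket: "dlt-pipeline-source-bucket", // assumed bucket name
      Key: "container.zip",                 // assumed source object key
      Body: readFileSync("container.zip"),  // archive containing the Dockerfile
    }));
  }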

Load Testing Engine

The Distributed Load Testing solution uses Amazon Elastic Container Service (Amazon ECS) and AWS Fargate to simulate thousands of connected users generating a specified number of transactions per second.

You define the parameters for the tasks that will be run as part of the test using the included web console. The solution uses these parameters to generate a JSON test scenario and stores it in Amazon S3. The test scenario is then added to an Amazon SQS queue, where it is stored until it is processed.
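
The exact schema of the generated scenario is internal to the solution, but a hedged TypeScript sketch of the kind of parameters involved might look like this; all field names are illustrative assumptions.

  // Illustrative shape only; the solution's actual schema may differ.
  interface TestScenario {
    testId: string;
    taskCount: number;   // number of Fargate tasks to launch
    concurrency: number; // simulated users per task
    rampUp: string;      // time to reach full load
    holdFor: string;     // time to hold full load
    endpoint: string;    // target under test
    method: "GET" | "POST" | "PUT" | "DELETE";
  }

  const scenario: TestScenario = {
    testId: "example-test-id",
    taskCount: 10,
    concurrency: 200,    // 10 tasks x 200 users = 2,000 simulated users
    rampUp: "1m",
    holdFor: "5m",
    endpoint: "https://example.com/api",
    method: "GET",
  };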

A task-runner AWS Lambda function is subscribed to the Amazon SQS queue. The function runs each task in the Amazon ECS cluster on AWS Fargate. During the test, the output from each task is logged in Amazon CloudWatch.
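
A hedged sketch of that task-runner pattern follows; the cluster, task definition, container name, and environment variable names are assumptions rather than the solution’s actual identifiers.

  import { SQSEvent } from "aws-lambda";
  import { ECSClient, RunTaskCommand } from "@aws-sdk/client-ecs";

  const ecs = new ECSClient({});

  export const handler = async (event: SQSEvent): Promise<void> => {
    for (const record of event.Records) {
      const { testId, taskCount } = JSON.parse(record.body);

      await ecs.send(new RunTaskCommand({
        cluster: process.env.ECS_CLUSTER!,            // assumed env var
        taskDefinition: process.env.TASK_DEFINITION!, // assumed env var
        launchType: "FARGATE",
        count: taskCount, // RunTask launches at most 10 tasks per call
        networkConfiguration: {
          awsvpcConfiguration: {
            subnets: (process.env.SUBNETS ?? "").split(","),
            assignPublicIp: "ENABLED",
          },
        },
        overrides: {
          containerOverrides: [{
            name: "load-test-container", // assumed container name
            environment: [{ name: "TEST_ID", value: testId }],
          }],
        },
      }));
    }
  };

Because RunTask launches at most 10 tasks per call, a larger test would typically be split across multiple calls or queue messages.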

When a task completes, an XML file is generated and stored in Amazon S3. This S3 event triggers the results-parser Lambda function, which processes the XML file and stores the results in a test results DynamoDB table. When all tasks are complete, the function consolidates all the results into a final report, which is then stored in a test scenario DynamoDB table. For more information on the workflow, see Appendix B. For more information on test results, see Appendix C.
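
The following sketch shows the general shape of such a results parser; the XML library choice (fast-xml-parser), table name, and item layout are illustrative assumptions.

  import { S3Event } from "aws-lambda";
  import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
  import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
  import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";
  import { XMLParser } from "fast-xml-parser";

  const s3 = new S3Client({});
  const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

  export const handler = async (event: S3Event): Promise<void> => {
    for (const record of event.Records) {
      const { bucket, object } = record.s3;

      // Read the XML results object that triggered the event.
      const response = await s3.send(new GetObjectCommand({
        Bucket: bucket.name,
        Key: decodeURIComponent(object.key.replace(/\+/g, " ")),
      }));
      const xml = await response.Body!.transformToString();
      const results = new XMLParser().parse(xml);

      // Store the parsed results in the test results table.
      await ddb.send(new PutCommand({
        TableName: process.env.RESULTS_TABLE!, // assumed env var
        Item: { resultId: object.key, results },
      }));
    }
  };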