
Reducing Operational Complexity

The architecture described so far already uses managed services, but you still have to operate Amazon EC2 instances. You can further reduce the operational effort needed to run, maintain, and monitor microservices by adopting a fully serverless architecture.

API Implementation

Architecting, continuously improving, deploying, monitoring, and maintaining an API can be a time-consuming task. Sometimes different versions of an API must run in parallel to ensure backward compatibility for all clients. The different stages of the development cycle (development, testing, and production) further multiply the operational effort.

Access authorization is a critical feature for all APIs, but it is usually complex to build and involves repetitive work. When an API is published and becomes successful, the next challenge is to manage, monitor, and monetize the ecosystem of third-party developers utilizing the API.

Other important features and challenges include throttling requests to protect the backend, caching API responses, request and response transformation, and generating API definitions and documentation with tools such as Swagger.

Amazon API Gateway addresses those challenges and reduces the operational complexity of creating and maintaining RESTful APIs.

Note

API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale.

API Gateway allows you to create your APIs programmatically by importing Swagger definitions, by using the AWS API, or by using the AWS Management Console. API Gateway serves as a front door to any web application running on Amazon EC2, Amazon ECS, AWS Lambda, or in any on-premises environment. In a nutshell, it allows you to run APIs without managing servers.
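As a minimal sketch of the programmatic path, the following example uses the AWS SDK for Python (boto3) to create a REST API from an existing Swagger definition; the file name and region are assumptions made for illustration.

```python
import boto3

# Hypothetical example: create a REST API from a Swagger/OpenAPI definition
# stored in swagger.json (file name and region are assumptions).
apigateway = boto3.client("apigateway", region_name="us-east-1")

with open("swagger.json", "rb") as spec:
    response = apigateway.import_rest_api(
        failOnWarnings=True,  # reject the import if the definition has warnings
        body=spec.read(),     # raw bytes of the Swagger definition
    )

print("Created API with id:", response["id"])
```

The returned API id can then be used to create deployments and stages for the different stages of the development cycle.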

The following figure illustrates how API Gateway handles API calls and interacts with other components. Requests from mobile devices, websites, or other backend services are routed to the closest CloudFront Point of Presence (PoP) to minimize latency and provide an optimal user experience. Additionally, CloudFront offers Regional Edge Caches. These locations are deployed globally, in close proximity to your viewers, and sit between your origin server and the global edge locations that serve traffic directly to your viewers. API Gateway first checks whether the request is cached at either an edge location or a Regional Edge Cache location and, if no cached record is available, forwards it to the backend for processing. This only applies to GET requests; all other request methods are automatically passed through. After the backend has processed the request, API call metrics are logged in Amazon CloudWatch, and content is returned to the client.
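Related to the response caching mentioned earlier, API Gateway also offers an optional stage-level response cache. The following boto3 sketch enables it when creating a deployment; the API id and stage name are hypothetical.

```python
import boto3

apigateway = boto3.client("apigateway")

# Hypothetical REST API id and stage name; both values are assumptions.
apigateway.create_deployment(
    restApiId="a1b2c3d4e5",
    stageName="prod",
    cacheClusterEnabled=True,  # provision a dedicated response cache for the stage
    cacheClusterSize="0.5",    # cache size in GB
)
```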

Serverless Microservices

“No server is easier to manage than no server.” Getting rid of servers is the ultimate way to eliminate operational complexity.

Note

AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume – there is no charge when your code is not running. With Lambda, you can run code for virtually any type of application or backend service—all with zero administration.

You simply upload your code and let Lambda take care of everything required to run and scale the execution to meet your actual demand curve with high availability. Lambda supports several programming languages and can be triggered from other AWS services or be called directly from any web or mobile application.

Lambda is highly integrated with API Gateway. The ability to make synchronous calls from API Gateway to Lambda enables the creation of fully serverless applications and is described in detail in our documentation.
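As a minimal sketch of such an integration, the following Python handler follows the API Gateway proxy-integration contract; the path parameter and response payload are purely illustrative.

```python
import json

# Minimal sketch of a Python Lambda handler behind an API Gateway proxy
# integration. The event shape follows the proxy-integration contract;
# the "customer" payload below is illustrative only.
def lambda_handler(event, context):
    customer_id = (event.get("pathParameters") or {}).get("id", "unknown")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"customerId": customer_id, "status": "active"}),
    }
```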

The following figure shows the architecture of a serverless microservice where the complete service is built out of managed services. This removes the architectural burden of designing for scale and high availability, and eliminates the operational effort of running and monitoring the microservice's underlying infrastructure.

Deploying Lambda-Based Applications

You can use AWS CloudFormation to specify, deploy, and configure serverless applications.

Note

CloudFormation is a service that helps you model and set up your AWS resources so that you can spend less time managing those resources and more time focusing on your applications that run in AWS.

The AWS Serverless Application Model (AWS SAM) is a convenient way to define serverless applications. AWS SAM is natively supported by CloudFormation and defines a simplified syntax for expressing serverless resources. To deploy your application, specify the resources you need, along with their associated permission policies, in a CloudFormation template, package your deployment artifacts, and deploy the template.
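As a hedged sketch of the final deployment step, the following boto3 calls create and execute a CloudFormation change set from an already-packaged template; the stack name and template URL are assumptions, and in practice the AWS SAM or AWS CLI tooling typically wraps these steps for you.

```python
import boto3

cfn = boto3.client("cloudformation")

# Hypothetical stack name and S3 URL of a packaged SAM template;
# both values are assumptions made for illustration.
cfn.create_change_set(
    StackName="my-serverless-app",
    ChangeSetName="initial-deploy",
    ChangeSetType="CREATE",
    TemplateURL="https://s3.amazonaws.com/my-bucket/packaged-template.yaml",
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_AUTO_EXPAND"],
)

# Wait for the change set to be created, then execute it to deploy the stack.
waiter = cfn.get_waiter("change_set_create_complete")
waiter.wait(StackName="my-serverless-app", ChangeSetName="initial-deploy")
cfn.execute_change_set(StackName="my-serverless-app", ChangeSetName="initial-deploy")
```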