.NET functions with native AOT compilation

.NET 7 supports native ahead-of-time (AOT) compilation. With native AOT, you can compile your Lambda function code to a native runtime format, which removes the need to compile .NET code at runtime. Native AOT compilation can reduce the cold start time for Lambda functions that you write in .NET. For more information, see Building serverless .NET applications on AWS Lambda using .NET 7 on the AWS Compute Blog.

Lambda runtime

To deploy a Lambda function built with native AOT compilation, use the provided.al2023 or provided.al2 OS-only runtime with the x86_64 architecture. When you use a managed .NET Lambda runtime, your application is compiled into Intermediate Language (IL) code, and the just-in-time (JIT) compiler in the runtime compiles the IL code into machine code as needed. With a Lambda function that is compiled ahead of time with native AOT, the runtime environment doesn't include the .NET SDK or .NET runtime, because you compile your code into machine code before it runs.

One limitation of native AOT is that your application code must be compiled in an environment with the same operating system and processor architecture as the Amazon Linux 2 (AL2) runtime environment. The .NET Lambda CLI provides functionality to compile your application in a Docker container using an AL2 image.
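
Native AOT publishing is controlled through standard MSBuild properties in the project file. The following is a minimal, illustrative .csproj sketch. PublishAot and StripSymbols are standard .NET 7 MSBuild settings; the exact contents of the project file that the Lambda template generates may differ, and the bootstrap assembly name shown here is an assumption based on the executable name that OS-only runtimes look for.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net7.0</TargetFramework>
    <!-- Assumption: OS-only runtimes run an executable named bootstrap. -->
    <AssemblyName>bootstrap</AssemblyName>
    <!-- Compile to a native binary at publish time. -->
    <PublishAot>true</PublishAot>
    <!-- Strip debug symbols to reduce the size of the native binary. -->
    <StripSymbols>true</StripSymbols>
  </PropertyGroup>
</Project>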

Prerequisites

Docker

To use native AOT, your function code must be compiled in an environment with the same Amazon Linux 2 operating system as the OS-only runtime. The .NET CLI commands in the following sections use Docker to develop and build Lambda functions in an Amazon Linux 2 environment.

.NET 7 SDK

Native AOT compilation is a feature of .NET 7. You must install the .NET 7 SDK on your build machine, not only the runtime.
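
To confirm which SDK versions are installed on your build machine, you can list them with the following command.

dotnet --list-sdks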

Amazon.Lambda.Tools

To create your Lambda functions, you use the Amazon.Lambda.Tools .NET Global Tools extension. To install Amazon.Lambda.Tools, run the following command:

dotnet tool install -g Amazon.Lambda.Tools

For more information about the Amazon.Lambda.Tools .NET CLI extension, see the AWS Extensions for .NET CLI repository on GitHub.

Amazon.Lambda.Templates

To generate your Lambda function code, use the Amazon.Lambda.Templates NuGet package. To install this template package, run the following command:

dotnet new install Amazon.Lambda.Templates

Getting started

Both the .NET Global CLI and the AWS Serverless Application Model (AWS SAM) provide getting started templates for building applications with native AOT. To build your first native AOT Lambda function, follow the steps below.

To initialize and deploy a native AOT compiled Lambda function
  1. Initialize a new project using the native AOT template and then navigate into the directory containing the created .cs and .csproj files. In this example, we name our function NativeAotSample.

    dotnet new lambda.NativeAOT -n NativeAotSample
    cd ./NativeAotSample/src/NativeAotSample

    The Function.cs file created by the native AOT template contains the following function code.

    using Amazon.Lambda.Core;
    using Amazon.Lambda.RuntimeSupport;
    using Amazon.Lambda.Serialization.SystemTextJson;
    using System.Text.Json.Serialization;

    namespace NativeAotSample;

    public class Function
    {
        private static async Task Main()
        {
            Func<string, ILambdaContext, string> handler = FunctionHandler;
            await LambdaBootstrapBuilder.Create(handler, new SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>())
                .Build()
                .RunAsync();
        }

        public static string FunctionHandler(string input, ILambdaContext context)
        {
            return input.ToUpper();
        }
    }

    [JsonSerializable(typeof(string))]
    public partial class LambdaFunctionJsonSerializerContext : JsonSerializerContext
    {
    }

    Native AOT compiles your application into a single, native binary. The entry point of that binary is the static Main method. Within Main, the Lambda runtime is bootstrapped and the FunctionHandler method is set up. As part of the runtime bootstrap, a source generated serializer is configured using new SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>().

  2. To deploy your application to Lambda, ensure that Docker is running in your local environment and run the following command.

    dotnet lambda deploy-function

    Behind the scenes, the .NET Lambda CLI downloads an Amazon Linux 2 Docker image and compiles your application code inside a running container. The compiled binary is output back to your local filesystem before being deployed to Lambda.

  3. Test your function by running the following command. Replace <FUNCTION_NAME> with the name you chose for your function in the deployment wizard.

    dotnet lambda invoke-function <FUNCTION_NAME> --payload "hello world"

    The response from the CLI includes performance details for the cold start (initialization duration) and total run time for your function invocation.
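
    Because the sample handler returns input.ToUpper(), a payload of "hello world" should return "HELLO WORLD".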

  4. To delete the AWS resources you created by following the preceding steps, run the following command. Replace <FUNCTION_NAME> with the name you chose for your function in the deployment wizard. By deleting AWS resources that you're no longer using, you prevent unnecessary charges from being billed to your AWS account.

    dotnet lambda delete-function <FUNCTION_NAME>

Serialization

To deploy functions to Lambda using native AOT, your function code must use source generated serialization. Instead of using run-time reflection to gather the metadata needed to access object properties for serialization, source generators generate C# source files that are compiled when you build your application. To configure your source generated serializer correctly, ensure that you include any input and output objects your function uses, as well as any custom types. For example, a Lambda function that receives events from an API Gateway REST API and returns a custom Product type would include a serializer defined as follows.

[JsonSerializable(typeof(APIGatewayProxyRequest))]
[JsonSerializable(typeof(APIGatewayProxyResponse))]
[JsonSerializable(typeof(Product))]
public partial class CustomSerializer : JsonSerializerContext
{
}
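
To show where a custom serializer context plugs in, the following sketch bootstraps an API Gateway handler with CustomSerializer, following the same pattern as the template's Function.cs shown earlier. It assumes the CustomSerializer and Product definitions above; the handler body is a hypothetical placeholder.

using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

public class Function
{
    private static async Task Main()
    {
        Func<APIGatewayProxyRequest, ILambdaContext, APIGatewayProxyResponse> handler = FunctionHandler;

        // Pass the source generated context so that no run-time reflection is needed for serialization.
        await LambdaBootstrapBuilder.Create(handler, new SourceGeneratorLambdaJsonSerializer<CustomSerializer>())
            .Build()
            .RunAsync();
    }

    // Hypothetical handler: a real function would build a Product and return it in the response body.
    public static APIGatewayProxyResponse FunctionHandler(APIGatewayProxyRequest request, ILambdaContext context)
    {
        return new APIGatewayProxyResponse { StatusCode = 200, Body = "{}" };
    }
}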

Trimming

Native AOT trims your application code as part of compilation to ensure that the binary is as small as possible. Much of the .NET ecosystem is not yet trim-friendly, which means that parts of libraries your function relies on may be trimmed out during compilation. You can avoid this by defining TrimmerRootAssembly items in your .csproj file, which tell the trimmer to preserve the named assemblies in full.

<ItemGroup>
    <TrimmerRootAssembly Include="AWSSDK.Core" />
    <TrimmerRootAssembly Include="AWSXRayRecorder.Core" />
    <TrimmerRootAssembly Include="AWSXRayRecorder.Handlers.AwsSdk" />
    <TrimmerRootAssembly Include="Amazon.Lambda.APIGatewayEvents" />
    <TrimmerRootAssembly Include="bootstrap" />
    <TrimmerRootAssembly Include="Shared" />
</ItemGroup>

Troubleshooting

Error: Cross-OS native compilation is not supported.

Your version of the Amazon.Lambda.Tools .NET Core global tool is out of date. Update to the latest version and try again.

Docker: image operating system "linux" cannot be used on this platform.

Docker on your system is configured to use Windows containers. Swap to Linux containers to run the native AOT build environment.

Unhandled Exception: System.ApplicationException: The serializer NativeAoT.MyCustomJsonSerializerContext is missing a constructor that takes in JsonSerializerOptions object

If you encounter this error when you invoke your Lambda function, add a runtime directive (rd.xml) file to your project, and then redeploy.

For more information about common errors, see the AWS NativeAOT for .NET repository on GitHub.