ResultWriter

The ResultWriter field is a JSON object that specifies the Amazon S3 location where Step Functions writes the results of the child workflow executions started by a Distributed Map state. By default, Step Functions doesn't export these results.

Important

Make sure that the Amazon S3 bucket you use to export the results of a Map Run is in the same AWS account and AWS Region as your state machine. Otherwise, your state machine execution will fail with the States.ResultWriterFailed error.

Exporting the results to an Amazon S3 bucket is helpful if your output payload size exceeds 256 KB. Step Functions consolidates all child workflow execution data, such as execution input and output, ARN, and execution status. It then exports executions with the same status to their respective files in the specified Amazon S3 location. The following example shows the syntax of the ResultWriter field if you export the child workflow execution results. In this example, you store the results in a bucket named myOutputBucket within a prefix called csvProcessJobs.

{ "ResultWriter": { "Resource": "arn:aws:states:::s3:putObject", "Parameters": { "Bucket": "myOutputBucket", "Prefix": "csvProcessJobs" } } }
Tip

In Workflow Studio, you can export the child workflow execution results by selecting Export Map state results to Amazon S3. Then, provide the name of the Amazon S3 bucket and the prefix to which you want to export the results.

Step Functions needs appropriate permissions to access the bucket and folder where you want to export the results. For information about the required IAM policy, see IAM policies for ResultWriter.

If you export the child workflow execution results, the Distributed Map state execution returns the Map Run ARN and data about the Amazon S3 export location in the following format:

{ "MapRunArn": "arn:aws:states:us-east-2:123456789012:mapRun:csvProcess/Map:ad9b5f27-090b-3ac6-9beb-243cd77144a7", "ResultWriterDetails": { "Bucket": "myOutputBucket", "Key": "csvProcessJobs/ad9b5f27-090b-3ac6-9beb-243cd77144a7/manifest.json" } }

Step Functions exports executions with the same status to their respective files. For example, if your child workflow executions resulted in 500 success and 200 failure results, Step Functions creates two files in the specified Amazon S3 location for the success and failure results. In this example, the success results file contains the 500 success results, while the failure results file contains the 200 failure results.

For a given execution attempt, Step Functions creates the following files in the specified Amazon S3 location depending on your execution output:

  • manifest.json – Contains Map Run metadata, such as the export location, the Map Run ARN, and information about the result files. An example manifest appears after this list.

    If you've redriven a Map Run, the manifest.json file contains references to all the successful child workflow executions across all the attempts of a Map Run. However, this file contains references to the failed and pending executions only for a specific redrive.

  • SUCCEEDED_n.json – Contains the consolidated data for all successful child workflow executions. n represents the index number of the file. The index number starts from 0. For example, SUCCEEDED_1.json.

  • FAILED_n.json – Contains the consolidated data for all failed, timed out, and aborted child workflow executions. Use this file to recover from failed executions. n represents the index of the file. The index number starts from 0. For example, FAILED_1.json.

  • PENDING_n.json – Contains the consolidated data for all child workflow executions that weren’t started because the Map Run failed or aborted. n represents the index of the file. The index number starts from 0. For example, PENDING_1.json.
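As a rough illustration, a manifest for the csvProcessJobs example might look like the following. Treat the file sizes, the number of entries, and any values you don't recognize as placeholders, and consult the manifest.json written by your own Map Run for the authoritative layout.

{
  "DestinationBucket": "myOutputBucket",
  "MapRunArn": "arn:aws:states:us-east-2:123456789012:mapRun:csvProcess/Map:ad9b5f27-090b-3ac6-9beb-243cd77144a7",
  "ResultFiles": {
    "FAILED": [
      {
        "Key": "csvProcessJobs/ad9b5f27-090b-3ac6-9beb-243cd77144a7/FAILED_0.json",
        "Size": 262144
      }
    ],
    "PENDING": [],
    "SUCCEEDED": [
      {
        "Key": "csvProcessJobs/ad9b5f27-090b-3ac6-9beb-243cd77144a7/SUCCEEDED_0.json",
        "Size": 1048576
      }
    ]
  }
}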

Step Functions supports individual result files of up to 5 GB. If a file exceeds 5 GB, Step Functions creates another file with the next index number to record the remaining execution results. For example, if the SUCCEEDED_0.json file exceeds 5 GB, Step Functions creates SUCCEEDED_1.json to record the remaining results.

If you don't export the child workflow execution results, the state machine execution returns an array of child workflow execution results, as shown in the following example:

Note

If the returned output size exceeds 256 KB, the state machine execution fails and returns a States.DataLimitExceeded error.

[ { "statusCode": 200, "inputReceived": { "show_id": "s1", "release_year": "2020", "rating": "PG-13", "type": "Movie" } }, { "statusCode": 200, "inputReceived": { "show_id": "s2", "release_year": "2021", "rating": "TV-MA", "type": "TV Show" } }, ... ]

IAM policies for ResultWriter

When you create workflows with the Step Functions console, Step Functions can automatically generate IAM policies based on the resources in your workflow definition. These policies include the least privileges necessary to allow the state machine role to invoke the StartExecution API action for the Distributed Map state. They also include the least privileges necessary for Step Functions to access AWS resources, such as Amazon S3 buckets and objects, and Lambda functions. We highly recommend that you include only the permissions that are necessary in your IAM policies. For example, if your workflow includes a Map state in Distributed mode, scope your policies down to the specific Amazon S3 bucket and folder that contains your dataset.

Important

If you specify an Amazon S3 bucket and object, or prefix, with a reference path to an existing key-value pair in your Distributed Map state input, make sure that you update the IAM policies for your workflow. Scope the policies down to the bucket and object names the path resolves to at runtime.

The following IAM policy example grants the least privileges required to write your child workflow execution results to a folder named csvJobs in an Amazon S3 bucket using the PutObject API action.

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:PutObject", "s3:GetObject", "s3:ListMultipartUploadParts", "s3:AbortMultipartUpload" ], "Resource": [ "arn:aws:s3:::resultBucket/csvJobs/*" ] } ] }

If the Amazon S3 bucket to which you're writing the child workflow execution results is encrypted using an AWS Key Management Service (AWS KMS) key, you must include the necessary AWS KMS permissions in your IAM policy. For more information, see IAM permissions for AWS KMS key encrypted Amazon S3 bucket.
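As a sketch, and assuming the bucket uses server-side encryption with a customer managed AWS KMS key, a statement along the following lines is typically added alongside the Amazon S3 permissions above. The key ARN is a placeholder; see the linked topic for the authoritative list of required actions.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kms:GenerateDataKey",
        "kms:Decrypt"
      ],
      "Resource": [
        "arn:aws:kms:us-east-2:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab"
      ]
    }
  ]
}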