AddJobFlowSteps


AddJobFlowSteps adds new steps to a running cluster. A maximum of 256 steps are allowed in each job flow.

If your cluster is long-running (such as a Hive data warehouse) or complex, you may require more than 256 steps to process your data. You can bypass the 256-step limitation in various ways, including using SSH to connect to the master node and submitting queries directly to the software running on the master node, such as Hive and Hadoop. For more information on how to do this, see Add More than 256 Steps to a Cluster in the Amazon EMR Management Guide.
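For example, a minimal sketch of the SSH approach in Python, assuming the third-party paramiko library; the master node address, key file path, and Hive query are placeholders, not values from this reference:

import paramiko

MASTER_DNS = "ec2-xx-xx-xx-xx.compute-1.amazonaws.com"  # placeholder master node address
KEY_FILE = "/path/to/mykey.pem"                         # placeholder EC2 key pair file

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# EMR master nodes accept SSH connections as the "hadoop" user.
client.connect(MASTER_DNS, username="hadoop", key_filename=KEY_FILE)

# Queries submitted directly to Hive on the master node do not count
# against the 256-step limit.
stdin, stdout, stderr = client.exec_command('hive -e "SHOW TABLES;"')
print(stdout.read().decode())
client.close()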

A step specifies the location of a JAR file stored either on the master node of the cluster or in Amazon S3. Each step is performed by the main function of the main class of the JAR file. The main class can be specified either in the manifest of the JAR or by using the MainClass parameter of the step.

Amazon EMR executes each step in the order listed. For a step to be considered complete, the main function must exit with a zero exit code and all Hadoop jobs started while the step was running must have completed and run successfully.

You can only add steps to a cluster that is in one of the following states: STARTING, BOOTSTRAPPING, RUNNING, or WAITING.
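A sketch of a guard for this precondition, assuming the boto3 SDK and a placeholder cluster ID:

import boto3

emr = boto3.client("emr")
CLUSTER_ID = "j-3TS0OIYO4NFN"  # placeholder cluster ID

# DescribeCluster reports the current state of the cluster.
state = emr.describe_cluster(ClusterId=CLUSTER_ID)["Cluster"]["Status"]["State"]
if state in ("STARTING", "BOOTSTRAPPING", "RUNNING", "WAITING"):
    print(f"Cluster is {state}; AddJobFlowSteps may be called.")
else:
    print(f"Cluster is {state}; AddJobFlowSteps would be rejected.")

Note that the cluster state can change between the check and the call, so callers should still handle a rejected request.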

Request Syntax

{ "JobFlowId": "string", "Steps": [ { "ActionOnFailure": "string", "HadoopJarStep": { "Args": [ "string" ], "Jar": "string", "MainClass": "string", "Properties": [ { "Key": "string", "Value": "string" } ] }, "Name": "string" } ] }

Request Parameters

For information about the parameters that are common to all actions, see Common Parameters.

The request accepts the following data in JSON format.


JobFlowId

A string that uniquely identifies the job flow. This identifier is returned by RunJobFlow and can also be obtained from ListClusters.

Type: String

Length Constraints: Minimum length of 0. Maximum length of 256.

Pattern: [\u0020-\uD7FF\uE000-\uFFFD\uD800\uDC00-\uDBFF\uDFFF\r\n\t]*

Required: Yes


Steps

A list of StepConfig to be executed by the job flow.

Type: Array of StepConfig objects

Required: Yes
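Taken together, a sketch of a complete request using the boto3 SDK; the cluster ID is the one from the sample request below, and the JAR location, main class, and arguments are placeholders:

import boto3

emr = boto3.client("emr")

response = emr.add_job_flow_steps(
    JobFlowId="j-3TS0OIYO4NFN",  # placeholder cluster ID
    Steps=[
        {
            "Name": "Example Jar Step",
            "ActionOnFailure": "CANCEL_AND_WAIT",
            "HadoopJarStep": {
                "Jar": "s3://examples-bucket/lib/example.jar",  # placeholder JAR location
                "MainClass": "com.example.Main",  # optional if set in the JAR manifest
                "Args": ["arg1", "arg2"],         # placeholder arguments
            },
        }
    ],
)
print(response["StepIds"])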

Response Syntax

{ "StepIds": [ "string" ] }

Response Elements

If the action is successful, the service sends back an HTTP 200 response.

The following data is returned in JSON format by the service.


StepIds

The identifiers of the list of steps added to the job flow.

Type: Array of strings

Length Constraints: Minimum length of 0. Maximum length of 256.

Pattern: [\u0020-\uD7FF\uE000-\uFFFD\uD800\uDC00-\uDBFF\uDFFF\r\n\t]*
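The returned identifiers can be fed to DescribeStep, or to a waiter, to confirm that each step ran to completion as described above; a sketch assuming boto3 and placeholder IDs:

import boto3

emr = boto3.client("emr")
CLUSTER_ID = "j-3TS0OIYO4NFN"   # placeholder cluster ID
step_ids = ["s-EXAMPLESTEPID"]  # placeholder; use the StepIds from the response

waiter = emr.get_waiter("step_complete")
for step_id in step_ids:
    # Raises a WaiterError if the step fails or is cancelled.
    waiter.wait(ClusterId=CLUSTER_ID, StepId=step_id)
    print(f"Step {step_id} completed.")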


Errors

For information about the errors that are common to all actions, see Common Errors.


InternalServerError

Indicates that an error occurred while processing the request and that the request was not completed.

HTTP Status Code: 500


Sample Request

POST / HTTP/1.1
Content-Type: application/x-amz-json-1.1
X-Amz-Target: ElasticMapReduce.AddJobFlowSteps
Content-Length: 426
User-Agent: aws-sdk-ruby/1.9.2 ruby/1.9.3 i386-mingw32
Host:
X-Amz-Date: 20130716T210948Z
X-Amz-Content-Sha256: 9e5ad0a93c22224947ce98eea94f766103d91b28fa82eb60d0cb8b6f9555a6b2
Authorization: AWS4-HMAC-SHA256 Credential=AKIAIOSFODNN7EXAMPLE/20130716/us-east-1/elasticmapreduce/aws4_request, SignedHeaders=content-length;content-type;host;user-agent;x-amz-content-sha256;x-amz-date;x-amz-target, Signature=2a2393390760ae85eb74ee3a539e1d758bfdd8815a1a6d6f14d4a2fbcfdcd5b7
Accept: */*

{
   "JobFlowId": "j-3TS0OIYO4NFN",
   "Steps": [{
      "Name": "Example Jar Step",
      "ActionOnFailure": "CANCEL_AND_WAIT",
      "HadoopJarStep": {
         "Jar": "s3n://elasticmapreduce/samples/cloudburst/cloudburst.jar",
         "Args": [
            "s3n://elasticmapreduce/samples/cloudburst/input/",
            "s3n://elasticmapreduce/samples/cloudburst/input/",
            "s3n://examples-bucket/cloudburst/output",
            "36", "3", "0", "1", "240", "48", "24", "24", "128", "16"
         ]
      }
   }]
}

Sample Response

HTTP/1.1 200 OK
x-amzn-RequestId: 6514261f-ee5b-11e2-9345-5332e9ab2e6d
Content-Type: application/x-amz-json-1.1
Content-Length: 0
Date: Tue, 16 Jul 2013 21:05:07 GMT

See Also

For more information about using this API in one of the language-specific AWS SDKs, see the following: