Amazon Personalize
Developer Guide

DescribeBatchInferenceJob

Gets the properties of a batch inference job, including its name, Amazon Resource Name (ARN), status, input and output configurations, and the ARN of the solution version used to generate the recommendations.

Request Syntax

{
   "batchInferenceJobArn": "string"
}

Request Parameters

The request accepts the following data in JSON format.

batchInferenceJobArn

The ARN of the batch inference job to describe.

Type: String

Length Constraints: Maximum length of 256.

Pattern: arn:([a-z\d-]+):personalize:.*:.*:.+

Required: Yes
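The length and pattern constraints above can be checked client-side before making the request. The sketch below is a minimal illustration in Python; the example ARN is hypothetical.

```python
import re

# Pattern and length constraint copied from the batchInferenceJobArn parameter above
ARN_PATTERN = re.compile(r"arn:([a-z\d-]+):personalize:.*:.*:.+")
MAX_LEN = 256

def is_valid_batch_inference_job_arn(arn: str) -> bool:
    """Sanity-check an ARN against the documented constraints."""
    return len(arn) <= MAX_LEN and ARN_PATTERN.fullmatch(arn) is not None

# Hypothetical ARN, for illustration only
example = "arn:aws:personalize:us-west-2:123456789012:batch-inference-job/MyJob"
print(is_valid_batch_inference_job_arn(example))  # True
```

A check like this only catches malformed input; a well-formed ARN that names no existing job still produces a ResourceNotFoundException from the service.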

Response Syntax

{
   "batchInferenceJob": {
      "batchInferenceJobArn": "string",
      "creationDateTime": number,
      "failureReason": "string",
      "jobInput": {
         "s3DataSource": {
            "kmsKeyArn": "string",
            "path": "string"
         }
      },
      "jobName": "string",
      "jobOutput": {
         "s3DataDestination": {
            "kmsKeyArn": "string",
            "path": "string"
         }
      },
      "lastUpdatedDateTime": number,
      "numResults": number,
      "roleArn": "string",
      "solutionVersionArn": "string",
      "status": "string"
   }
}

Response Elements

If the action is successful, the service sends back an HTTP 200 response.

The following data is returned in JSON format by the service.

batchInferenceJob

Information on the specified batch inference job.

Type: BatchInferenceJob object
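As a minimal sketch of consuming the response shape above, the Python function below pulls the job name and status out of a deserialized response. The sample response dict is hypothetical and includes only a subset of the documented fields.

```python
def summarize_batch_inference_job(response: dict) -> str:
    """Return a one-line summary of the described job's name and status."""
    job = response["batchInferenceJob"]
    summary = f"{job['jobName']}: {job['status']}"
    # failureReason is only meaningful when the job did not complete successfully
    if "failureReason" in job:
        summary += f" ({job['failureReason']})"
    return summary

# Hypothetical response illustrating the documented fields
sample = {
    "batchInferenceJob": {
        "batchInferenceJobArn": "arn:aws:personalize:us-west-2:123456789012:batch-inference-job/MyJob",
        "jobName": "MyJob",
        "status": "ACTIVE",
        "solutionVersionArn": "arn:aws:personalize:us-west-2:123456789012:solution/MySolution/1",
        "numResults": 25,
    }
}
print(summarize_batch_inference_job(sample))  # MyJob: ACTIVE
```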

Errors

InvalidInputException

Provide a valid value for the field or parameter.

HTTP Status Code: 400

ResourceNotFoundException

Could not find the specified resource.

HTTP Status Code: 400
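Both documented errors return HTTP 400; a caller distinguishes them by error code (with boto3, the code appears in the ClientError's response metadata). The dispatcher below is a hypothetical sketch; the hint strings are illustrative assumptions, not service output.

```python
def describe_error_hint(error_code: str) -> str:
    """Map the documented DescribeBatchInferenceJob error codes to a next step.

    The hint text is an illustrative assumption, not part of the API response.
    """
    hints = {
        "InvalidInputException": "Provide a valid batchInferenceJobArn and retry.",
        "ResourceNotFoundException": "Check that the job exists in this account and Region.",
    }
    return hints.get(error_code, "Unhandled error; inspect the HTTP response.")
```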

See Also

For more information about using this API in one of the language-specific AWS SDKs, see the following: