Get details about a batch inference job - Amazon Bedrock


Note

Batch inference is in preview and is subject to change. Batch inference is currently only available through the API. Access the batch APIs through the AWS SDKs.

We recommend that you create a virtual environment to use the SDK. Because batch inference APIs aren't available in the latest SDKs, we recommend that you uninstall the latest version of the SDK from the virtual environment before installing the version with the batch inference APIs. For a guided example, see Code samples.
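The setup described above can be sketched as follows. This is a hedged example: the environment name is arbitrary, and the final install step is a placeholder because the exact SDK archive with the batch inference APIs comes from the Code samples, not from this page.

```shell
# Create and activate an isolated virtual environment
python3 -m venv bedrock-batch-env
. bedrock-batch-env/bin/activate

# Remove the latest SDK so it doesn't shadow the batch inference build
pip uninstall -y boto3 botocore

# Install the SDK build that includes the batch inference APIs
# (placeholder path; substitute the archive referenced in Code samples)
# pip install ./path/to/preview-sdk.whl
```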

Request format
GET /model-invocation-job/jobIdentifier HTTP/1.1
Response format
HTTP/1.1 200
Content-type: application/json

{
    "clientRequestToken": "string",
    "endTime": "string",
    "inputDataConfig": {
        "s3InputDataConfig": {
            "s3Uri": "string",
            "s3InputFormat": "JSONL"
        }
    },
    "jobArn": "string",
    "jobName": "string",
    "lastModifiedTime": "string",
    "message": "string",
    "modelId": "string",
    "outputDataConfig": {
        "s3OutputDataConfig": {
            "s3Uri": "string"
        }
    },
    "roleArn": "string",
    "status": "Submitted | InProgress | Completed | Failed | Stopping | Stopped",
    "submitTime": "string"
}

To get information about a batch inference job, send a GetModelInvocationJob request, providing the ARN of the job as the jobIdentifier path parameter.
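A minimal Python sketch of this call, assuming the preview SDK exposes a `bedrock` client with a `get_model_invocation_job` method (names may differ in your SDK build). The `summarize_job` helper and the example ARN are illustrative, not part of the API.

```python
def summarize_job(response: dict) -> str:
    """Build a one-line summary from a GetModelInvocationJob response body."""
    return f"{response['jobName']}: {response['status']} (model {response['modelId']})"

def get_job_details(job_arn: str) -> dict:
    """Fetch details for a batch inference job by its ARN.

    Assumes the SDK build with the batch inference APIs is installed;
    the client and method names here follow the API operation and may
    differ in the preview SDK.
    """
    import boto3  # SDK build with the batch inference APIs
    client = boto3.client("bedrock")
    # jobIdentifier takes the ARN of the batch inference job
    return client.get_model_invocation_job(jobIdentifier=job_arn)
```

In practice you might poll `get_job_details` and check the `status` field until it reaches a terminal value such as `Completed` or `Failed`.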

See the GetModelInvocationJob page for details about the information provided in the response.