Get details about a batch inference job
Note
Batch inference is in preview and is subject to change. Batch inference is currently only available through the API. Access batch APIs through the following SDKs.
We recommend that you create a virtual environment to use the SDK. Because batch inference APIs aren't available in the latest SDKs, we recommend that you uninstall the latest version of the SDK from the virtual environment before installing the version with the batch inference APIs. For a guided example, see Code samples.
To get information about a batch inference job, send a GetModelInvocationJob request and provide the ARN of the job in the jobIdentifier field.
See the GetModelInvocationJob page for details about the information provided in the response.