@aws-sdk/client-bedrock

CreateModelInvocationJobCommandInput Interface

Members

inputDataConfig (required): ModelInvocationJobInputDataConfig | undefined

Details about the location of the input to the batch inference job.

jobName (required): string | undefined

A name to give the batch inference job.

modelId (required): string | undefined

The unique identifier of the foundation model to use for the batch inference job.

outputDataConfig (required): ModelInvocationJobOutputDataConfig | undefined

Details about the location of the output of the batch inference job.

roleArn (required): string | undefined

The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps in Create a service role for batch inference.

clientRequestToken: string | undefined

A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.

tags: Tag[] | undefined

Any tags to associate with the batch inference job. For more information, see Tagging Amazon Bedrock resources.

timeoutDurationInHours: number | undefined

The number of hours after which to force the batch inference job to time out.

vpcConfig: VpcConfig | undefined

The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.

Full Signature

export interface CreateModelInvocationJobCommandInput extends CreateModelInvocationJobRequest {}
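
Example

A minimal usage sketch, assuming the input records and output location live in Amazon S3 so that the two data-config members take their S3 variants (s3InputDataConfig and s3OutputDataConfig). The Region, model ID, role ARN, bucket names, token, and tag values are placeholders; substitute your own.

import {
  BedrockClient,
  CreateModelInvocationJobCommand,
} from "@aws-sdk/client-bedrock";

// All resource identifiers below are placeholders.
const client = new BedrockClient({ region: "us-east-1" });

const command = new CreateModelInvocationJobCommand({
  // Required members
  jobName: "my-batch-inference-job",
  modelId: "anthropic.claude-3-haiku-20240307-v1:0",
  roleArn: "arn:aws:iam::111122223333:role/MyBatchInferenceRole",
  inputDataConfig: {
    // S3 variant of the ModelInvocationJobInputDataConfig union
    s3InputDataConfig: {
      s3Uri: "s3://amzn-s3-demo-bucket/input/records.jsonl",
    },
  },
  outputDataConfig: {
    // S3 variant of the ModelInvocationJobOutputDataConfig union
    s3OutputDataConfig: {
      s3Uri: "s3://amzn-s3-demo-bucket/output/",
    },
  },
  // Optional members
  clientRequestToken: "demo-idempotency-token-001", // retries with the same token are not duplicated
  timeoutDurationInHours: 72,
  tags: [{ key: "project", value: "batch-demo" }],
});

const response = await client.send(command);
console.log(response.jobArn); // ARN of the newly created batch inference job

Because the interface extends CreateModelInvocationJobRequest, the object literal passed to the command is exactly the request shape described in the members list above.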