
Create a service role for batch inference

To use a custom service role for batch inference instead of the one Amazon Bedrock automatically creates, create an IAM role and attach the following permissions by following the steps at Creating a role to delegate permissions to an AWS service:

  • Trust policy

  • A policy containing the following identity-based permissions

    • Access to the Amazon S3 bucket that contains the input data for your batch inference jobs and to the bucket where the output data is written.


Trust relationship

The following trust policy allows Amazon Bedrock to assume this role and to submit and manage batch inference jobs. Replace the values as necessary. As a security best practice, we recommend that you use the optional condition keys (see Condition keys for Amazon Bedrock and AWS global condition context keys) in the Condition field.

Note

As a best practice for security purposes, replace the * with specific batch inference job IDs after you have created them.

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "bedrock.amazonaws.com" }, "Action": "sts:AssumeRole", "Condition": { "StringEquals": { "aws:SourceAccount": "account-id" }, "ArnEquals": { "aws:SourceArn": "arn:aws:bedrock:region:account-id:model-invocation-job/*" } } } ] }

Identity-based permissions for the batch inference service role

Attach the following policy to provide permissions for the service role, replacing values as necessary. The policy contains the following statements; omit a statement if it isn't applicable to your use case. As a security best practice, we recommend that you use the optional condition keys (see Condition keys for Amazon Bedrock and AWS global condition context keys) in the Condition field.

  • Permissions to access the S3 bucket that contains your input data and the S3 bucket to which your output data is written.

{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:PutObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::my_input_bucket", "arn:aws:s3:::my_input_bucket/*", "arn:aws:s3:::my_output_bucket", "arn:aws:s3:::my_output_bucket/*" ], "Condition": { "StringEquals": { "aws:ResourceAccount": [ "account-id" ] } } } ] }