Model inference using foundation models is supported in all Regions and with all models supported by Amazon Bedrock. To see the Regions and models supported by Amazon Bedrock, refer to Supported foundation models in Amazon Bedrock.
You can also run model inference with Amazon Bedrock resources other than foundation models. Refer to the following pages to see Region and model availability for different resources:
- Supported Regions and models for Prompt management
Note
InvokeModel and InvokeModelWithResponseStream work only with prompts from Prompt management whose configuration specifies an Anthropic Claude or Meta Llama model.
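As a minimal sketch of the note above, the following shows how a prompt from Prompt management can be invoked by passing its ARN as the `modelId` of an InvokeModel request with the AWS SDK for Python (boto3). The prompt ARN and Region here are placeholders, not real resources, and this assumes the prompt's configuration specifies an Anthropic Claude or Meta Llama model:

```python
def build_invoke_params(prompt_arn: str) -> dict:
    # For a prompt from Prompt management, the prompt ARN itself is
    # supplied as the modelId of the InvokeModel request.
    return {"modelId": prompt_arn}


def invoke_prompt(prompt_arn: str, region: str = "us-east-1") -> str:
    # boto3 is imported lazily so the helpers above can be used
    # without the AWS SDK installed.
    import boto3

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model(**build_invoke_params(prompt_arn))
    return response["body"].read().decode("utf-8")


# Placeholder ARN; substitute the ARN of your own prompt.
PROMPT_ARN = "arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT_ID"
```

Calling `invoke_prompt(PROMPT_ARN)` requires AWS credentials with permission to invoke the prompt; if the prompt's configured model is not an Anthropic Claude or Meta Llama model, the call fails.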