The Mistral AI chat completion API lets you create conversational applications. You can also use the Amazon Bedrock Converse API with these models. The API supports tool use for making function calls.
Tip
You can use the Mistral AI chat completion API with the base inference operations (InvokeModel or InvokeModelWithResponseStream). However, we recommend that you use the Converse API to implement messages in your application. The Converse API provides a unified set of parameters that work across all models that support messages. For more information, see Carry out a conversation with the Converse API operations.
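As a minimal sketch of the recommended Converse approach (the Region, maxTokens, and temperature values here are illustrative, not required defaults):

```python
# Unified Converse message format; the same shape works across Bedrock models.
messages = [
    {'role': 'user', 'content': [{'text': 'which llm are you?'}]}
]

def chat(region='us-west-2', model_id='mistral.mistral-large-2407-v1:0'):
    # boto3 is imported here so the message payload above can be
    # inspected without AWS credentials or dependencies.
    import boto3
    client = boto3.client('bedrock-runtime', region_name=region)
    response = client.converse(
        modelId=model_id,
        messages=messages,
        inferenceConfig={'maxTokens': 512, 'temperature': 0.5},
    )
    # The assistant reply is in the first content block of the output message.
    return response['output']['message']['content'][0]['text']

if __name__ == '__main__':
    print(chat())
```

Because Converse normalizes the request and response shapes, this same function works with other Bedrock models that support messages by changing only `model_id`.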
Mistral AI models are available under the Apache 2.0 license.
Supported models
You can use the following Mistral AI models with the code examples on this page.
- Mistral Large 2 (24.07)
You need the model ID for the model that you want to use. To get the model ID, see Supported foundation models in Amazon Bedrock.
Request and Response Examples
The following example invokes Mistral Large 2 (24.07) with the InvokeModel operation.
import boto3
import json

# Create an Amazon Bedrock Runtime client in the AWS Region of your choice.
bedrock = boto3.client('bedrock-runtime', region_name='us-west-2')

response = bedrock.invoke_model(
    modelId='mistral.mistral-large-2407-v1:0',
    body=json.dumps({
        'messages': [
            {
                'role': 'user',
                'content': 'which llm are you?'
            }
        ]
    })
)

# The response body is a streaming object; read it before decoding the JSON.
print(json.dumps(json.loads(response['body'].read()), indent=4))
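The same request body also works with InvokeModelWithResponseStream, which returns the completion incrementally as a stream of JSON chunks. A hedged sketch (the exact chunk fields are model-specific, so this prints each raw chunk rather than assuming a schema):

```python
import json

# Identical request body to the invoke_model example above.
body = json.dumps({
    'messages': [
        {'role': 'user', 'content': 'which llm are you?'}
    ]
})

def stream_chat(region='us-west-2', model_id='mistral.mistral-large-2407-v1:0'):
    # boto3 is imported here so the request body above can be built
    # and inspected without AWS credentials.
    import boto3
    client = boto3.client('bedrock-runtime', region_name=region)
    response = client.invoke_model_with_response_stream(
        modelId=model_id,
        body=body,
    )
    # Each event in the stream carries a JSON-encoded partial completion.
    for event in response['body']:
        chunk = json.loads(event['chunk']['bytes'])
        print(json.dumps(chunk))

if __name__ == '__main__':
    stream_chat()
```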
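For the tool use mentioned above, the Converse API accepts a toolConfig describing the functions the model may call. The sketch below uses a hypothetical `get_weather` tool (its name, description, and schema are illustrative):

```python
# Hypothetical tool definition; the name and JSON schema are illustrative.
tool_config = {
    'tools': [
        {
            'toolSpec': {
                'name': 'get_weather',
                'description': 'Get the current weather for a city.',
                'inputSchema': {
                    'json': {
                        'type': 'object',
                        'properties': {'city': {'type': 'string'}},
                        'required': ['city'],
                    }
                },
            }
        }
    ]
}

def ask_with_tools(region='us-west-2', model_id='mistral.mistral-large-2407-v1:0'):
    # boto3 is imported here so the tool schema above can be inspected offline.
    import boto3
    client = boto3.client('bedrock-runtime', region_name=region)
    response = client.converse(
        modelId=model_id,
        messages=[{'role': 'user',
                   'content': [{'text': 'What is the weather in Paris?'}]}],
        toolConfig=tool_config,
    )
    # If the model decides to call the tool, stopReason is 'tool_use' and the
    # requested invocation appears in the output message's content blocks.
    return response

if __name__ == '__main__':
    print(ask_with_tools()['stopReason'])
```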