AWS Tools for Windows PowerShell
Command Reference

AWS services or capabilities described in AWS documentation may vary by Region. See Getting Started with Amazon AWS to learn about the differences applicable to the China (Beijing) Region.

Synopsis

Calls the Amazon Bedrock Runtime ConverseStream API operation.

Syntax

Invoke-BDRRConverseStream
-ModelId <String>
-AdditionalModelRequestField <PSObject>
-AdditionalModelResponseFieldPath <String[]>
-ToolChoice_Any <AnyToolChoice>
-ToolChoice_Auto <AutoToolChoice>
-GuardrailConfig_GuardrailIdentifier <String>
-GuardrailConfig_GuardrailVersion <String>
-PerformanceConfig_Latency <PerformanceConfigLatency>
-InferenceConfig_MaxToken <Int32>
-Message <Message[]>
-Tool_Name <String>
-PromptVariable <Hashtable>
-RequestMetadata <Hashtable>
-InferenceConfig_StopSequence <String[]>
-GuardrailConfig_StreamProcessingMode <GuardrailStreamProcessingMode>
-System <SystemContentBlock[]>
-InferenceConfig_Temperature <Single>
-ToolConfig_Tool <Tool[]>
-InferenceConfig_TopP <Single>
-GuardrailConfig_Trace <GuardrailTrace>
-Select <String>
-Force <SwitchParameter>
-ClientConfig <AmazonBedrockRuntimeConfig>

Description

Sends messages to the specified Amazon Bedrock model and returns the response in a stream. ConverseStream provides a consistent API that works with all Amazon Bedrock models that support messages. This allows you to write code once and use it with different models. Should a model have unique inference parameters, you can also pass those unique parameters to the model.

To find out if a model supports streaming, call GetFoundationModel and check the responseStreamingSupported field in the response. The AWS CLI doesn't support streaming operations in Amazon Bedrock, including ConverseStream.

Amazon Bedrock doesn't store any text, images, or documents that you provide as content. The data is only used to generate the response.

You can submit a prompt by including it in the messages field, specifying the modelId of a foundation model or inference profile to run inference on it, and including any other fields that are relevant to your use case.

You can also submit a prompt from Prompt management by specifying the ARN of the prompt version and including a map of variables to values in the promptVariables field. You can append more messages to the prompt by using the messages field. If you use a prompt from Prompt management, you can't include the following fields in the request: additionalModelRequestFields, inferenceConfig, system, or toolConfig. Instead, these fields must be defined through Prompt management. For more information, see Use a prompt from Prompt management.

For information about the Converse API, see Use the Converse API in the Amazon Bedrock User Guide. To use a guardrail, see Use a guardrail with the Converse API in the Amazon Bedrock User Guide. To use a tool with a model, see Tool use (Function calling) in the Amazon Bedrock User Guide. For example code, see Conversation streaming example in the Amazon Bedrock User Guide.

This operation requires permission for the bedrock:InvokeModelWithResponseStream action. To deny all inference access to resources that you specify in the modelId field, you need to deny access to the bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream actions. Doing this also denies access to the resource through the base inference actions (InvokeModel and InvokeModelWithResponseStream). For more information, see Deny access for inference on specific models.

For troubleshooting some of the common errors you might encounter when using the ConverseStream API, see Troubleshooting Amazon Bedrock API Error Codes in the Amazon Bedrock User Guide.
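For orientation, the following is a minimal sketch of a streaming call from PowerShell. It assumes the modular AWS.Tools.BedrockRuntime module is installed, that credentials and a Region are already configured, and that the Anthropic Claude model ID shown is available to your account; the Message and ContentBlock type names are the AWS SDK for .NET model classes that the cmdlet parameters expect. Later sketches on this page assume a $message built this way.

Import-Module AWS.Tools.BedrockRuntime

# Build a single user message from the SDK model types.
$block = New-Object Amazon.BedrockRuntime.Model.ContentBlock
$block.Text = 'Write a haiku about streaming APIs.'

$message = New-Object Amazon.BedrockRuntime.Model.Message
$message.Role = [Amazon.BedrockRuntime.ConversationRole]::User
$message.Content = New-Object 'System.Collections.Generic.List[Amazon.BedrockRuntime.Model.ContentBlock]'
$message.Content.Add($block)

# Illustrative model ID; substitute a streaming-capable model available in your Region.
$stream = Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message

By default the cmdlet returns the Stream property of the service response; see Outputs below.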

Parameters

-AdditionalModelRequestField <PSObject>
Additional inference parameters that the model supports, beyond the base set of inference parameters that Converse and ConverseStream support in the inferenceConfig field. For more information, see Model parameters.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AdditionalModelRequestFields
-AdditionalModelResponseFieldPath <String[]>
Additional model parameters field paths to return in the response. Converse and ConverseStream return the requested fields as a JSON Pointer object in the additionalModelResponseFields field. The following is example JSON for additionalModelResponseFieldPaths: [ "/stop_sequence" ]. For information about the JSON Pointer syntax, see the Internet Engineering Task Force (IETF) documentation. Converse and ConverseStream reject an empty JSON Pointer or incorrectly structured JSON Pointer with a 400 error code. If the JSON Pointer is valid, but the requested field is not in the model response, it is ignored by Converse. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AdditionalModelResponseFieldPaths
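As a sketch, the following call requests the /stop_sequence path shown above. Where the extra fields surface in a streamed response depends on the model and SDK version, so the full service response is captured with -Select '*' for inspection. The model ID is illustrative and $message is assumed to be built as in the sketch under Description.

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -AdditionalModelResponseFieldPath '/stop_sequence' -Select '*'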
-ClientConfig <AmazonBedrockRuntimeConfig>
Amazon.PowerShell.Cmdlets.BDRR.AmazonBedrockRuntimeClientCmdlet.ClientConfig
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-Force <SwitchParameter>
This parameter overrides confirmation prompts to force the cmdlet to continue its operation. This parameter should always be used with caution.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-GuardrailConfig_GuardrailIdentifier <String>
The identifier for the guardrail.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-GuardrailConfig_GuardrailVersion <String>
The version of the guardrail.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-GuardrailConfig_StreamProcessingMode <GuardrailStreamProcessingMode>
The processing mode. For more information, see Configure streaming response behavior in the Amazon Bedrock User Guide.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-GuardrailConfig_Trace <GuardrailTrace>
The trace behavior for the guardrail.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
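The guardrail parameters are typically supplied together. The following sketch uses a placeholder guardrail ARN and version; the sync and enabled values reflect the GuardrailStreamProcessingMode and GuardrailTrace enumerations as understood here, the model ID is illustrative, and $message is assumed to be built as in the sketch under Description.

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -GuardrailConfig_GuardrailIdentifier 'arn:aws:bedrock:us-east-1:111122223333:guardrail/EXAMPLEID' `
    -GuardrailConfig_GuardrailVersion '1' `
    -GuardrailConfig_StreamProcessingMode sync `
    -GuardrailConfig_Trace enabled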
-InferenceConfig_MaxToken <Int32>
The maximum number of tokens to allow in the generated response. The default value is the maximum allowed value for the model that you are using. For more information, see Inference parameters for foundation models.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases InferenceConfig_MaxTokens
-InferenceConfig_StopSequence <String[]>
A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases InferenceConfig_StopSequences
-InferenceConfig_Temperature <Single>
The likelihood of the model selecting higher-probability options while generating a response. A lower value makes the model more likely to choose higher-probability options, while a higher value makes the model more likely to choose lower-probability options. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-InferenceConfig_TopP <Single>
The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
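The inference configuration parameters map to the inferenceConfig field of the Converse API and can be combined on one call, as in the following sketch. The model ID is illustrative and $message is assumed to be built as in the sketch under Description.

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -InferenceConfig_MaxToken 512 `
    -InferenceConfig_Temperature 0.5 `
    -InferenceConfig_TopP 0.9 `
    -InferenceConfig_StopSequence '###','END'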
-Message <Message[]>
The messages that you want to send to the model. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Messages
-ModelId <String>
Specifies the model or throughput with which to run inference, or the prompt resource to use in inference. The value depends on the resource that you use: for example, the ID or ARN of a foundation model, the ID or ARN of an inference profile, the ARN of a Provisioned Throughput, or the ARN of a prompt from Prompt management. The Converse API doesn't support imported models.
Required? True
Position? 1
Accept pipeline input? True (ByValue, ByPropertyName)
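For illustration, two common forms of the value are shown below: a foundation model ID and a cross-Region inference profile ID. Both identifiers are examples only; confirm what is available to your account, for example with Get-BDRFoundationModelList or in the Amazon Bedrock console. A $message built as in the sketch under Description is assumed.

# Foundation model, referenced by model ID (example ID).
Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-sonnet-20240229-v1:0' -Message $message

# Cross-Region inference profile, referenced by inference profile ID (example ID).
Invoke-BDRRConverseStream -ModelId 'us.anthropic.claude-3-5-sonnet-20240620-v1:0' -Message $message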
-PerformanceConfig_Latency <PerformanceConfigLatency>
To use a latency-optimized version of the model, set to optimized.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-PromptVariable <Hashtable>
Contains a map of variables in a prompt from Prompt management to objects containing the values to fill in for them when running model invocation. This field is ignored if you don't specify a prompt resource in the modelId field. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases PromptVariables
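A sketch of invoking a prompt from Prompt management. The prompt version ARN and the variable name are placeholders, and the hashtable values are assumed to be Amazon.BedrockRuntime.Model.PromptVariableValues objects, mirroring the promptVariables shape in the API.

# Value to substitute for a {{topic}} variable defined in the managed prompt (hypothetical variable name).
$value = New-Object Amazon.BedrockRuntime.Model.PromptVariableValues
$value.Text = 'quarterly sales report'

Invoke-BDRRConverseStream `
    -ModelId 'arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT12345:1' `
    -PromptVariable @{ topic = $value }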
-RequestMetadata <Hashtable>
Key-value pairs that you can use to filter invocation logs. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
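For example, plain string key-value pairs can be attached to the request and later used to filter model invocation logs. The keys below are hypothetical, the model ID is illustrative, and $message is assumed to be built as in the sketch under Description.

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -RequestMetadata @{ project = 'docs-demo'; costCenter = '1234' }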
-Select <String>
Use the -Select parameter to control the cmdlet output. The default value is 'Stream'. Specifying -Select '*' will result in the cmdlet returning the whole service response (Amazon.BedrockRuntime.Model.ConverseStreamResponse). Specifying the name of a property of type Amazon.BedrockRuntime.Model.ConverseStreamResponse will result in that property being returned. Specifying -Select '^ParameterName' will result in the cmdlet returning the selected cmdlet parameter value.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
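For example, to capture the whole service response rather than only the stream (model ID illustrative, $message as in the Description sketch):

$response = Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' `
    -Message $message -Select '*'
# $response is an Amazon.BedrockRuntime.Model.ConverseStreamResponse; $response.Stream holds the event stream.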
-System <SystemContentBlock[]>
A prompt that provides instructions or context to the model about the task it should perform, or the persona it should adopt during the conversation. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
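A sketch of supplying a system prompt, assuming the SystemContentBlock model type exposes a Text property. The model ID is illustrative and $message is assumed to be built as in the sketch under Description.

$system = New-Object Amazon.BedrockRuntime.Model.SystemContentBlock
$system.Text = 'You are a terse assistant that answers in one sentence.'

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message -System $system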
-Tool_Name <String>
The name of the tool that the model must request.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ToolConfig_ToolChoice_Tool_Name
-ToolChoice_Any <AnyToolChoice>
The model must request at least one tool (no text is generated).
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ToolConfig_ToolChoice_Any
-ToolChoice_Auto <AutoToolChoice>
(Default). The model automatically decides if a tool should be called or whether to generate text instead.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ToolConfig_ToolChoice_Auto
-ToolConfig_Tool <Tool[]>
An array of tools that you want to pass to a model. Starting with version 4 of the SDK this property will default to null. If no data for this property is returned from the service the property will also be null. This was changed to improve performance and allow the SDK and caller to distinguish between a property not set or a property being empty to clear out a value. To retain the previous SDK behavior set the AWSConfigs.InitializeCollections static property to true.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ToolConfig_Tools
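A sketch of tool use (function calling). The tool name and schema are hypothetical; the ToolSpecification and ToolInputSchema type names follow the Amazon.BedrockRuntime.Model classes, and converting the hashtable schema with Amazon.Runtime.Documents.Document.FromObject is an assumption that may need adjusting for your SDK version. The model ID is illustrative and $message is assumed to be built as in the sketch under Description.

$spec = New-Object Amazon.BedrockRuntime.Model.ToolSpecification
$spec.Name = 'get_weather'                                    # hypothetical tool name
$spec.Description = 'Returns the current weather for a city.'
$spec.InputSchema = New-Object Amazon.BedrockRuntime.Model.ToolInputSchema
$schema = @{ type = 'object'; properties = @{ city = @{ type = 'string' } }; required = @('city') }
$spec.InputSchema.Json = [Amazon.Runtime.Documents.Document]::FromObject($schema)   # assumption: FromObject accepts a hashtable

$tool = New-Object Amazon.BedrockRuntime.Model.Tool
$tool.ToolSpec = $spec

# Let the model decide whether to call the tool; use -Tool_Name to force a specific tool instead.
Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -ToolConfig_Tool $tool `
    -ToolChoice_Auto (New-Object Amazon.BedrockRuntime.Model.AutoToolChoice)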

Common Credential and Region Parameters

-AccessKey <String>
The AWS access key for the user account. This can be a temporary access key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AK
-Credential <AWSCredentials>
An AWSCredentials object instance containing access and secret key information, and optionally a token for session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-EndpointUrl <String>
The endpoint to make the call against. Note: This parameter is primarily for internal AWS use and is not required/should not be specified for normal usage. The cmdlets normally determine which endpoint to call based on the region specified to the -Region parameter or set as default in the shell (via Set-DefaultAWSRegion). Only specify this parameter if you must direct the call to a specific custom endpoint.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-NetworkCredential <PSCredential>
Used with SAML-based authentication when ProfileName references a SAML role profile. Contains the network credentials to be supplied during authentication with the configured identity provider's endpoint. This parameter is not required if the user's default network identity can or should be used during authentication.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-ProfileLocation <String>
Used to specify the name and location of the ini-format credential file (shared with the AWS CLI and other AWS SDKs). If this optional parameter is omitted, this cmdlet will first search the encrypted credential file used by the AWS SDK for .NET and AWS Toolkit for Visual Studio. If the profile is not found, the cmdlet will then search the ini-format credential file at the default location: (user's home directory)\.aws\credentials. If this parameter is specified, this cmdlet will only search the ini-format credential file at the location given. As the current folder can vary in a shell or during script execution, it is advised that you specify a fully qualified path instead of a relative path.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AWSProfilesLocation, ProfilesLocation
-ProfileName <String>
The user-defined name of an AWS credentials or SAML-based role profile containing credential information. The profile is expected to be found in the secure credential file shared with the AWS SDK for .NET and AWS Toolkit for Visual Studio. You can also specify the name of a profile stored in the .ini-format credential file used with the AWS CLI and other AWS SDKs.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases StoredCredentials, AWSProfileName
-Region <Object>
The system name of an AWS region or an AWSRegion instance. This governs the endpoint that will be used when calling service operations. Note that the AWS resources referenced in a call are usually region-specific.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases RegionToCall
-SecretKey <String>
The AWS secret key for the user account. This can be a temporary secret key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases SK, SecretAccessKey
-SessionToken <String>
The session token if the access and secret keys are temporary session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ST
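These parameters behave the same as on other AWS Tools for PowerShell cmdlets. For example, to run the call under a named profile in a specific Region (profile name is a placeholder, model ID illustrative, $message as in the Description sketch):

Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message `
    -ProfileName my-bedrock-profile -Region us-east-1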

Outputs

This cmdlet returns an Amazon.BedrockRuntime.Model.ConverseStreamOutput object. The service call response (type Amazon.BedrockRuntime.Model.ConverseStreamResponse) can be returned by specifying '-Select *'.
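As a rough sketch of consuming the stream: with recent versions of the underlying AWS SDK for .NET, the ConverseStreamOutput event stream can be enumerated, and ContentBlockDeltaEvent items carry incremental text. The exact event types and enumeration behavior depend on the SDK version bundled with your module, so treat this as an assumption to verify; the model ID is illustrative and $message is assumed to be built as in the sketch under Description.

$stream = Invoke-BDRRConverseStream -ModelId 'anthropic.claude-3-haiku-20240307-v1:0' -Message $message
foreach ($evt in $stream) {
    if ($evt -is [Amazon.BedrockRuntime.Model.ContentBlockDeltaEvent]) {
        Write-Host $evt.Delta.Text -NoNewline      # print text deltas as they arrive
    }
}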

Supported Version

AWS Tools for PowerShell: 2.x.y.z