interface PromptModelInferenceConfigurationProperty
Language | Type name |
---|---|
.NET | Amazon.CDK.AWS.Bedrock.CfnFlow.PromptModelInferenceConfigurationProperty |
Go | github.com/aws/aws-cdk-go/awscdk/v2/awsbedrock#CfnFlow_PromptModelInferenceConfigurationProperty |
Java | software.amazon.awscdk.services.bedrock.CfnFlow.PromptModelInferenceConfigurationProperty |
Python | aws_cdk.aws_bedrock.CfnFlow.PromptModelInferenceConfigurationProperty |
TypeScript | aws-cdk-lib » aws_bedrock » CfnFlow » PromptModelInferenceConfigurationProperty |
Contains inference configurations related to model inference for a prompt.
For more information, see Inference parameters.
Example

```ts
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

const promptModelInferenceConfigurationProperty: bedrock.CfnFlow.PromptModelInferenceConfigurationProperty = {
  maxTokens: 123,
  stopSequences: ['stopSequences'],
  temperature: 123,
  topP: 123,
};
```
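This type is rarely used on its own in a flow definition; it is typically supplied as the `text` field of a `CfnFlow.PromptInferenceConfigurationProperty` on a prompt node. The sketch below assumes that nesting and uses illustrative values in each parameter's usual range; check the `CfnFlow` reference for the full prompt-node structure.

```ts
import { aws_bedrock as bedrock } from 'aws-cdk-lib';

// A minimal sketch: the inference configuration for a text prompt.
// The wrapping PromptInferenceConfigurationProperty and all values
// below are illustrative assumptions, not required settings.
const inferenceConfiguration: bedrock.CfnFlow.PromptInferenceConfigurationProperty = {
  text: {
    maxTokens: 512,                // cap the length of the generated response
    stopSequences: ['\n\nHuman:'], // halt generation at this marker
    temperature: 0.7,              // moderate randomness
    topP: 0.9,                     // sample from the top 90% probability mass
  },
};
```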
Properties
Name | Type | Description |
---|---|---|
maxTokens? | number | The maximum number of tokens to return in the response. |
stopSequences? | string[] | A list of strings that define sequences after which the model will stop generating. |
temperature? | number | Controls the randomness of the response. |
topP? | number | The percentage of most-likely candidates that the model considers for the next token. |
maxTokens?
Type: number (optional)
The maximum number of tokens to return in the response.
stopSequences?
Type: string[] (optional)
A list of strings that define sequences after which the model will stop generating.
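For example, a sketch that halts generation at a closing tag or a turn marker (both strings are illustrative, not values the service requires):

```ts
const withStops: bedrock.CfnFlow.PromptModelInferenceConfigurationProperty = {
  // Generation stops as soon as the model emits either sequence.
  stopSequences: ['</answer>', 'User:'],
};
```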
temperature?
Type: number (optional)
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
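For most models temperature runs from 0 to 1, though the exact range is model-specific. As an illustration, a low value suits extraction-style prompts and a high value suits open-ended generation:

```ts
// Near-deterministic output, e.g. for classification or extraction.
const focused: bedrock.CfnFlow.PromptModelInferenceConfigurationProperty = {
  temperature: 0.1,
};

// More varied output, e.g. for brainstorming or copywriting.
const creative: bedrock.CfnFlow.PromptModelInferenceConfigurationProperty = {
  temperature: 0.9,
};
```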
topP?
Type: number (optional)
The percentage of most-likely candidates that the model considers for the next token.
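For example, topP: 0.9 restricts sampling to the smallest set of candidate tokens whose cumulative probability reaches 90% (nucleus sampling). Model providers commonly recommend tuning either temperature or topP, not both; a sketch under that assumption:

```ts
// Nucleus sampling only: constrain the candidate pool with topP and
// leave temperature at the model's default.
const nucleusSampling: bedrock.CfnFlow.PromptModelInferenceConfigurationProperty = {
  maxTokens: 256,
  topP: 0.9,
};
```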