Class CfnAgent.InferenceConfigurationProperty.Builder
java.lang.Object
software.amazon.awscdk.services.bedrock.CfnAgent.InferenceConfigurationProperty.Builder
- All Implemented Interfaces:
software.amazon.jsii.Builder<CfnAgent.InferenceConfigurationProperty>
- Enclosing interface:
CfnAgent.InferenceConfigurationProperty
@Stability(Stable)
public static final class CfnAgent.InferenceConfigurationProperty.Builder
extends Object
implements software.amazon.jsii.Builder<CfnAgent.InferenceConfigurationProperty>
A builder for CfnAgent.InferenceConfigurationProperty.
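A minimal sketch of configuring inference settings with this builder. The parameter values below are illustrative assumptions, not recommended defaults; consult the inference-parameter documentation for your foundation model.

```java
import java.util.List;
import software.amazon.awscdk.services.bedrock.CfnAgent;

public class InferenceConfigExample {
    // Builds an InferenceConfigurationProperty using the fluent builder.
    // All numeric values here are placeholders for demonstration.
    public static CfnAgent.InferenceConfigurationProperty buildConfig() {
        return new CfnAgent.InferenceConfigurationProperty.Builder()
                .maximumLength(512)              // cap on generated tokens
                .stopSequences(List.of("User:")) // stop generating at this sequence
                .temperature(0.7)                // higher = more randomness
                .topK(50)                        // sample from the 50 most likely tokens
                .topP(0.9)                       // sample from the top 90% probability mass
                .build();
    }

    public static void main(String[] args) {
        CfnAgent.InferenceConfigurationProperty config = buildConfig();
        System.out.println("maximumLength = " + config.getMaximumLength());
        System.out.println("topK = " + config.getTopK());
    }
}
```

All builder methods return `this`, so calls can be chained in any order before `build()`; `build()` throws `NullPointerException` if a required attribute was not provided.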
-
Constructor Summary
Constructors
Method Summary
CfnAgent.InferenceConfigurationProperty build()
Builds the configured instance.
CfnAgent.InferenceConfigurationProperty.Builder maximumLength(Number maximumLength)
Sets the value of CfnAgent.InferenceConfigurationProperty.getMaximumLength()
CfnAgent.InferenceConfigurationProperty.Builder stopSequences(List<String> stopSequences)
Sets the value of CfnAgent.InferenceConfigurationProperty.getStopSequences()
CfnAgent.InferenceConfigurationProperty.Builder temperature(Number temperature)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTemperature()
CfnAgent.InferenceConfigurationProperty.Builder topK(Number topK)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTopK()
CfnAgent.InferenceConfigurationProperty.Builder topP(Number topP)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTopP()
-
Constructor Details
-
Builder
public Builder()
-
-
Method Details
-
maximumLength
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty.Builder maximumLength(Number maximumLength)
Sets the value of CfnAgent.InferenceConfigurationProperty.getMaximumLength()
- Parameters:
maximumLength - The maximum number of tokens allowed in the generated response.
- Returns:
this
-
stopSequences
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty.Builder stopSequences(List<String> stopSequences)
Sets the value of CfnAgent.InferenceConfigurationProperty.getStopSequences()
- Parameters:
stopSequences - A list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response.
- Returns:
this
-
temperature
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty.Builder temperature(Number temperature)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTemperature()
- Parameters:
temperature - The likelihood of the model selecting higher-probability options while generating a response. A lower value makes the model more likely to choose higher-probability options, while a higher value makes the model more likely to choose lower-probability options. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
- Returns:
this
-
topK
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty.Builder topK(Number topK)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTopK()
- Parameters:
topK - While generating a response, the model determines the probability of the following token at each point of generation. The value that you set for topK is the number of most-likely candidates from which the model chooses the next token in the sequence. For example, if you set topK to 50, the model selects the next token from among the top 50 most likely choices.
- Returns:
this
-
topP
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty.Builder topP(Number topP)
Sets the value of CfnAgent.InferenceConfigurationProperty.getTopP()
- Parameters:
topP - The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default value for the model that you are using. For more information, see Inference parameters for foundation models.
- Returns:
this
-
build
@Stability(Stable)
public CfnAgent.InferenceConfigurationProperty build()
Builds the configured instance.
- Specified by:
build in interface software.amazon.jsii.Builder<CfnAgent.InferenceConfigurationProperty>
- Returns:
a new instance of CfnAgent.InferenceConfigurationProperty
- Throws:
NullPointerException - if any required attribute was not provided
-