AWS SDK Version 3 for .NET
API Reference

Container for the parameters to the ListInferenceProfiles operation. Returns a list of inference profiles that you can use. For more information, see Increase throughput and resilience with cross-region inference in Amazon Bedrock in the Amazon Bedrock User Guide.
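The request is typically constructed with an object initializer and passed to the Amazon Bedrock client. The following is a minimal sketch, assuming the AWSSDK.Bedrock package, default credentials, and a configured Region; the ListInferenceProfilesAsync method and the InferenceProfileSummaries, InferenceProfileName, and InferenceProfileArn members follow the SDK's standard naming conventions and should be confirmed against your installed version.

C#

using System;
using System.Threading.Tasks;
using Amazon.Bedrock;
using Amazon.Bedrock.Model;

public static class ListInferenceProfilesExample
{
    public static async Task RunAsync()
    {
        // Assumes credentials and Region come from the environment or a profile.
        using var client = new AmazonBedrockClient();

        var request = new ListInferenceProfilesRequest
        {
            MaxResults = 10 // optional; the service default applies if omitted
        };

        ListInferenceProfilesResponse response = await client.ListInferenceProfilesAsync(request);

        // InferenceProfileSummaries is assumed to contain one summary per returned profile.
        foreach (var summary in response.InferenceProfileSummaries)
        {
            Console.WriteLine($"{summary.InferenceProfileName} ({summary.InferenceProfileArn})");
        }
    }
}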

Inheritance Hierarchy

System.Object
  Amazon.Runtime.AmazonWebServiceRequest
    Amazon.Bedrock.AmazonBedrockRequest
      Amazon.Bedrock.Model.ListInferenceProfilesRequest

Namespace: Amazon.Bedrock.Model
Assembly: AWSSDK.Bedrock.dll
Version: 3.x.y.z

Syntax

C#
public class ListInferenceProfilesRequest : AmazonBedrockRequest,
         IAmazonWebServiceRequest

The ListInferenceProfilesRequest type exposes the following members.

Constructors

Properties

Name    Type    Description
Public Property MaxResults System.Int32

Gets and sets the property MaxResults.

The maximum number of results to return in the response. If the total number of results is greater than this value, use the token returned in the response in the nextToken field when making another request to return the next batch of results.

Public Property NextToken System.String

Gets and sets the property NextToken.

If the total number of results is greater than the maxResults value provided in the request, enter the token returned in the response's nextToken field here to return the next batch of results.

Public Property TypeEquals Amazon.Bedrock.InferenceProfileType

Gets and sets the property TypeEquals.

Filters for inference profiles that match the type you specify.

  • SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles.

  • APPLICATION – The inference profile was created by a user. This type of inference profile tracks metrics and costs when you invoke the model in it. The inference profile may route requests to one or multiple regions.
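Taken together, MaxResults, NextToken, and TypeEquals support paging through a filtered list of profiles. The sketch below makes the same assumptions as the earlier example about the client method and response members, and assumes InferenceProfileType.APPLICATION is one of the constant values accepted by TypeEquals; it keeps requesting pages until the response no longer returns a nextToken.

C#

using System;
using System.Threading.Tasks;
using Amazon.Bedrock;
using Amazon.Bedrock.Model;

public static class PagedInferenceProfilesExample
{
    public static async Task ListApplicationProfilesAsync()
    {
        using var client = new AmazonBedrockClient();
        string nextToken = null;

        do
        {
            var request = new ListInferenceProfilesRequest
            {
                MaxResults = 25,
                NextToken = nextToken,                        // null on the first request
                TypeEquals = InferenceProfileType.APPLICATION // use SYSTEM_DEFINED to list Bedrock-defined profiles instead
            };

            var response = await client.ListInferenceProfilesAsync(request);

            foreach (var summary in response.InferenceProfileSummaries)
            {
                Console.WriteLine($"{summary.InferenceProfileId}: {summary.InferenceProfileName}");
            }

            // A non-empty NextToken in the response means another page is available.
            nextToken = response.NextToken;
        }
        while (!string.IsNullOrEmpty(nextToken));
    }
}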

Version Information

.NET:
Supported in: 8.0 and newer, Core 3.1

.NET Standard:
Supported in: 2.0

.NET Framework:
Supported in: 4.5 and newer, 3.5