interface InferenceSpecificationProperty
| Language | Type name |
|---|---|
| .NET | Amazon.CDK.Mixins.Preview.AWS.SageMaker.Mixins.CfnModelPackagePropsMixin.InferenceSpecificationProperty |
| Go | github.com/aws/aws-cdk-go/awscdkmixinspreview/v2/awssagemaker/mixins#CfnModelPackagePropsMixin_InferenceSpecificationProperty |
| Java | software.amazon.awscdk.mixins.preview.services.sagemaker.mixins.CfnModelPackagePropsMixin.InferenceSpecificationProperty |
| Python | aws_cdk.mixins_preview.aws_sagemaker.mixins.CfnModelPackagePropsMixin.InferenceSpecificationProperty |
| TypeScript | @aws-cdk/mixins-preview » aws_sagemaker » mixins » CfnModelPackagePropsMixin » InferenceSpecificationProperty |
Defines how to perform inference generation after a training job is run.
Example
```ts
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import { mixins as sagemaker_mixins } from '@aws-cdk/mixins-preview/aws-sagemaker';

declare const modelInput: any;
const inferenceSpecificationProperty: sagemaker_mixins.CfnModelPackagePropsMixin.InferenceSpecificationProperty = {
  containers: [{
    containerHostname: 'containerHostname',
    environment: {
      environmentKey: 'environment',
    },
    framework: 'framework',
    frameworkVersion: 'frameworkVersion',
    image: 'image',
    imageDigest: 'imageDigest',
    modelDataSource: {
      s3DataSource: {
        compressionType: 'compressionType',
        modelAccessConfig: {
          acceptEula: false,
        },
        s3DataType: 's3DataType',
        s3Uri: 's3Uri',
      },
    },
    modelDataUrl: 'modelDataUrl',
    modelInput: modelInput,
    nearestModelName: 'nearestModelName',
  }],
  supportedContentTypes: ['supportedContentTypes'],
  supportedRealtimeInferenceInstanceTypes: ['supportedRealtimeInferenceInstanceTypes'],
  supportedResponseMimeTypes: ['supportedResponseMimeTypes'],
  supportedTransformInstanceTypes: ['supportedTransformInstanceTypes'],
};
```
Properties
| Name | Type | Description |
|---|---|---|
| containers? | IResolvable \| (IResolvable \| Model)[] | The Amazon ECR registry path of the Docker image that contains the inference code. |
| supportedContentTypes? | string[] | The supported MIME types for the input data. |
| supportedRealtimeInferenceInstanceTypes? | string[] | A list of the instance types that are used to generate inferences in real time. |
| supportedResponseMimeTypes? | string[] | The supported MIME types for the output data. |
| supportedTransformInstanceTypes? | string[] | A list of the instance types on which a transformation job can be run or on which an endpoint can be deployed. |
containers?
Type: IResolvable | (IResolvable | Model)[] (optional)
The Amazon ECR registry path of the Docker image that contains the inference code.

supportedContentTypes?
Type: string[] (optional)
The supported MIME types for the input data.

supportedRealtimeInferenceInstanceTypes?
Type: string[] (optional)
A list of the instance types that are used to generate inferences in real time.
This parameter is required for unversioned models, and optional for versioned models.

supportedResponseMimeTypes?
Type: string[] (optional)
The supported MIME types for the output data.

supportedTransformInstanceTypes?
Type: string[] (optional)
A list of the instance types on which a transformation job can be run or on which an endpoint can be deployed.
This parameter is required for unversioned models, and optional for versioned models.
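Since the instance-type lists are optional for versioned model packages, a spec for one can omit them entirely. A minimal sketch of that case follows; the interface below is a local stand-in declared purely for illustration (in real CDK code you would use `sagemaker_mixins.CfnModelPackagePropsMixin.InferenceSpecificationProperty` itself), and the image URI is a placeholder, not a real repository:

```typescript
// Local stand-in for the property's shape, for illustration only.
interface MinimalInferenceSpec {
  containers?: Array<{ image: string; modelDataUrl?: string }>;
  supportedContentTypes?: string[];
  supportedResponseMimeTypes?: string[];
  supportedRealtimeInferenceInstanceTypes?: string[];
  supportedTransformInstanceTypes?: string[];
}

// Versioned model package: the two instance-type lists are optional,
// so only the container image and MIME-type lists are supplied.
const versionedSpec: MinimalInferenceSpec = {
  containers: [{
    // Placeholder ECR image URI; replace with your own.
    image: '111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest',
  }],
  supportedContentTypes: ['text/csv'],
  supportedResponseMimeTypes: ['application/json'],
};
```

For an unversioned model package, `supportedRealtimeInferenceInstanceTypes` and `supportedTransformInstanceTypes` would additionally be required, per the notes above.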