@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class DescribeInferenceExperimentResult extends AmazonWebServiceResult<ResponseMetadata> implements Serializable, Cloneable
| Constructor and Description |
| --- |
| DescribeInferenceExperimentResult() |
| Modifier and Type | Method and Description |
| --- | --- |
| DescribeInferenceExperimentResult | clone() |
| boolean | equals(Object obj) |
| String | getArn() - The ARN of the inference experiment being described. |
| Date | getCompletionTime() - The timestamp at which the inference experiment was completed. |
| Date | getCreationTime() - The timestamp at which you created the inference experiment. |
| InferenceExperimentDataStorageConfig | getDataStorageConfig() - The Amazon S3 location and configuration for storing inference request and response data. |
| String | getDescription() - The description of the inference experiment. |
| EndpointMetadata | getEndpointMetadata() - The metadata of the endpoint on which the inference experiment ran. |
| String | getKmsKey() - The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. |
| Date | getLastModifiedTime() - The timestamp at which you last modified the inference experiment. |
| List<ModelVariantConfigSummary> | getModelVariants() - An array of ModelVariantConfigSummary objects. |
| String | getName() - The name of the inference experiment. |
| String | getRoleArn() - The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment. |
| InferenceExperimentSchedule | getSchedule() - The duration for which the inference experiment ran or will run. |
| ShadowModeConfig | getShadowModeConfig() - The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. |
| String | getStatus() - The status of the inference experiment. |
| String | getStatusReason() - The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment. |
| String | getType() - The type of the inference experiment. |
| int | hashCode() |
| void | setArn(String arn) - The ARN of the inference experiment being described. |
| void | setCompletionTime(Date completionTime) - The timestamp at which the inference experiment was completed. |
| void | setCreationTime(Date creationTime) - The timestamp at which you created the inference experiment. |
| void | setDataStorageConfig(InferenceExperimentDataStorageConfig dataStorageConfig) - The Amazon S3 location and configuration for storing inference request and response data. |
| void | setDescription(String description) - The description of the inference experiment. |
| void | setEndpointMetadata(EndpointMetadata endpointMetadata) - The metadata of the endpoint on which the inference experiment ran. |
| void | setKmsKey(String kmsKey) - The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. |
| void | setLastModifiedTime(Date lastModifiedTime) - The timestamp at which you last modified the inference experiment. |
| void | setModelVariants(Collection<ModelVariantConfigSummary> modelVariants) - An array of ModelVariantConfigSummary objects. |
| void | setName(String name) - The name of the inference experiment. |
| void | setRoleArn(String roleArn) - The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment. |
| void | setSchedule(InferenceExperimentSchedule schedule) - The duration for which the inference experiment ran or will run. |
| void | setShadowModeConfig(ShadowModeConfig shadowModeConfig) - The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. |
| void | setStatus(String status) - The status of the inference experiment. |
| void | setStatusReason(String statusReason) - The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment. |
| void | setType(String type) - The type of the inference experiment. |
| String | toString() - Returns a string representation of this object. |
| DescribeInferenceExperimentResult | withArn(String arn) - The ARN of the inference experiment being described. |
| DescribeInferenceExperimentResult | withCompletionTime(Date completionTime) - The timestamp at which the inference experiment was completed. |
| DescribeInferenceExperimentResult | withCreationTime(Date creationTime) - The timestamp at which you created the inference experiment. |
| DescribeInferenceExperimentResult | withDataStorageConfig(InferenceExperimentDataStorageConfig dataStorageConfig) - The Amazon S3 location and configuration for storing inference request and response data. |
| DescribeInferenceExperimentResult | withDescription(String description) - The description of the inference experiment. |
| DescribeInferenceExperimentResult | withEndpointMetadata(EndpointMetadata endpointMetadata) - The metadata of the endpoint on which the inference experiment ran. |
| DescribeInferenceExperimentResult | withKmsKey(String kmsKey) - The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. |
| DescribeInferenceExperimentResult | withLastModifiedTime(Date lastModifiedTime) - The timestamp at which you last modified the inference experiment. |
| DescribeInferenceExperimentResult | withModelVariants(Collection<ModelVariantConfigSummary> modelVariants) - An array of ModelVariantConfigSummary objects. |
| DescribeInferenceExperimentResult | withModelVariants(ModelVariantConfigSummary... modelVariants) - An array of ModelVariantConfigSummary objects. |
| DescribeInferenceExperimentResult | withName(String name) - The name of the inference experiment. |
| DescribeInferenceExperimentResult | withRoleArn(String roleArn) - The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment. |
| DescribeInferenceExperimentResult | withSchedule(InferenceExperimentSchedule schedule) - The duration for which the inference experiment ran or will run. |
| DescribeInferenceExperimentResult | withShadowModeConfig(ShadowModeConfig shadowModeConfig) - The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. |
| DescribeInferenceExperimentResult | withStatus(InferenceExperimentStatus status) - The status of the inference experiment. |
| DescribeInferenceExperimentResult | withStatus(String status) - The status of the inference experiment. |
| DescribeInferenceExperimentResult | withStatusReason(String statusReason) - The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment. |
| DescribeInferenceExperimentResult | withType(InferenceExperimentType type) - The type of the inference experiment. |
| DescribeInferenceExperimentResult | withType(String type) - The type of the inference experiment. |
Methods inherited from class com.amazonaws.AmazonWebServiceResult:
getSdkHttpMetadata, getSdkResponseMetadata, setSdkHttpMetadata, setSdkResponseMetadata
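A minimal sketch of how this result object is typically obtained and read. The AmazonSageMaker client, AmazonSageMakerClientBuilder, DescribeInferenceExperimentRequest (with its Name field), the package names in the imports, and the experiment name "my-shadow-test" are assumptions not documented on this page.

```java
import com.amazonaws.services.sagemaker.AmazonSageMaker;
import com.amazonaws.services.sagemaker.AmazonSageMakerClientBuilder;
import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentRequest;
import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentResult;

public class DescribeInferenceExperimentExample {
    public static void main(String[] args) {
        // Build a default SageMaker client (credentials and region from the default chain).
        AmazonSageMaker sageMaker = AmazonSageMakerClientBuilder.defaultClient();

        // Describe the inference experiment by name; "my-shadow-test" is a placeholder.
        DescribeInferenceExperimentResult result = sageMaker.describeInferenceExperiment(
                new DescribeInferenceExperimentRequest().withName("my-shadow-test"));

        // Read a few of the fields documented below.
        System.out.println("ARN:    " + result.getArn());
        System.out.println("Type:   " + result.getType());
        System.out.println("Status: " + result.getStatus());
    }
}
```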
public void setArn(String arn)
The ARN of the inference experiment being described.
Parameters:
arn - The ARN of the inference experiment being described.

public String getArn()
The ARN of the inference experiment being described.

public DescribeInferenceExperimentResult withArn(String arn)
The ARN of the inference experiment being described.
Parameters:
arn - The ARN of the inference experiment being described.

public void setName(String name)
The name of the inference experiment.
Parameters:
name - The name of the inference experiment.

public String getName()
The name of the inference experiment.

public DescribeInferenceExperimentResult withName(String name)
The name of the inference experiment.
Parameters:
name - The name of the inference experiment.

public void setType(String type)
The type of the inference experiment.
Parameters:
type - The type of the inference experiment.
See Also:
InferenceExperimentType

public String getType()
The type of the inference experiment.
See Also:
InferenceExperimentType

public DescribeInferenceExperimentResult withType(String type)
The type of the inference experiment.
Parameters:
type - The type of the inference experiment.
See Also:
InferenceExperimentType

public DescribeInferenceExperimentResult withType(InferenceExperimentType type)
The type of the inference experiment.
Parameters:
type - The type of the inference experiment.
See Also:
InferenceExperimentType

public void setSchedule(InferenceExperimentSchedule schedule)
The duration for which the inference experiment ran or will run.
Parameters:
schedule - The duration for which the inference experiment ran or will run.

public InferenceExperimentSchedule getSchedule()
The duration for which the inference experiment ran or will run.

public DescribeInferenceExperimentResult withSchedule(InferenceExperimentSchedule schedule)
The duration for which the inference experiment ran or will run.
Parameters:
schedule - The duration for which the inference experiment ran or will run.
public void setStatus(String status)
The status of the inference experiment. The following are the possible statuses for an inference experiment:
- Creating - Amazon SageMaker is creating your experiment.
- Created - Amazon SageMaker has finished the creation of your experiment and will begin the experiment at the scheduled time.
- Updating - When you make changes to your experiment, your experiment shows as updating.
- Starting - Amazon SageMaker is beginning your experiment.
- Running - Your experiment is in progress.
- Stopping - Amazon SageMaker is stopping your experiment.
- Completed - Your experiment has completed.
- Cancelled - When you conclude your experiment early using the StopInferenceExperiment API, or if any operation fails with an unexpected error, it shows as cancelled.
Parameters:
status - The status of the inference experiment; one of the statuses listed above.
See Also:
InferenceExperimentStatus

public String getStatus()
The status of the inference experiment. The possible statuses are listed under setStatus(String) above.
See Also:
InferenceExperimentStatus

public DescribeInferenceExperimentResult withStatus(String status)
The status of the inference experiment. The possible statuses are listed under setStatus(String) above.
Parameters:
status - The status of the inference experiment.
See Also:
InferenceExperimentStatus

public DescribeInferenceExperimentResult withStatus(InferenceExperimentStatus status)
The status of the inference experiment. The possible statuses are listed under setStatus(String) above.
Parameters:
status - The status of the inference experiment.
See Also:
InferenceExperimentStatus
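Because getStatus() returns the raw status string, callers typically compare it against the InferenceExperimentStatus enum named under See Also. A minimal sketch, assuming the enum constants Completed and Cancelled mirror the status strings listed above and that the model package name is as shown:

```java
import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentResult;
import com.amazonaws.services.sagemaker.model.InferenceExperimentStatus;

final class StatusCheck {
    // Returns true once the experiment has reached a terminal state.
    // The enum constant names and the package are assumptions not shown on this page.
    static boolean isFinished(DescribeInferenceExperimentResult result) {
        String status = result.getStatus();
        return InferenceExperimentStatus.Completed.toString().equals(status)
                || InferenceExperimentStatus.Cancelled.toString().equals(status);
    }
}
```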
public void setStatusReason(String statusReason)
The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.
Parameters:
statusReason - The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.

public String getStatusReason()
The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.
Returns:
The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.

public DescribeInferenceExperimentResult withStatusReason(String statusReason)
The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.
Parameters:
statusReason - The error message or client-specified Reason from the StopInferenceExperiment API that explains the status of the inference experiment.

public void setDescription(String description)
The description of the inference experiment.
Parameters:
description - The description of the inference experiment.

public String getDescription()
The description of the inference experiment.

public DescribeInferenceExperimentResult withDescription(String description)
The description of the inference experiment.
Parameters:
description - The description of the inference experiment.
public void setCreationTime(Date creationTime)
The timestamp at which you created the inference experiment.
Parameters:
creationTime - The timestamp at which you created the inference experiment.

public Date getCreationTime()
The timestamp at which you created the inference experiment.

public DescribeInferenceExperimentResult withCreationTime(Date creationTime)
The timestamp at which you created the inference experiment.
Parameters:
creationTime - The timestamp at which you created the inference experiment.

public void setCompletionTime(Date completionTime)
The timestamp at which the inference experiment was completed.
Parameters:
completionTime - The timestamp at which the inference experiment was completed.

public Date getCompletionTime()
The timestamp at which the inference experiment was completed.

public DescribeInferenceExperimentResult withCompletionTime(Date completionTime)
The timestamp at which the inference experiment was completed.
Parameters:
completionTime - The timestamp at which the inference experiment was completed.

public void setLastModifiedTime(Date lastModifiedTime)
The timestamp at which you last modified the inference experiment.
Parameters:
lastModifiedTime - The timestamp at which you last modified the inference experiment.

public Date getLastModifiedTime()
The timestamp at which you last modified the inference experiment.

public DescribeInferenceExperimentResult withLastModifiedTime(Date lastModifiedTime)
The timestamp at which you last modified the inference experiment.
Parameters:
lastModifiedTime - The timestamp at which you last modified the inference experiment.
public void setRoleArn(String roleArn)
The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
Parameters:
roleArn - The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.

public String getRoleArn()
The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.

public DescribeInferenceExperimentResult withRoleArn(String roleArn)
The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
Parameters:
roleArn - The ARN of the IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.

public void setEndpointMetadata(EndpointMetadata endpointMetadata)
The metadata of the endpoint on which the inference experiment ran.
Parameters:
endpointMetadata - The metadata of the endpoint on which the inference experiment ran.

public EndpointMetadata getEndpointMetadata()
The metadata of the endpoint on which the inference experiment ran.

public DescribeInferenceExperimentResult withEndpointMetadata(EndpointMetadata endpointMetadata)
The metadata of the endpoint on which the inference experiment ran.
Parameters:
endpointMetadata - The metadata of the endpoint on which the inference experiment ran.
public List<ModelVariantConfigSummary> getModelVariants()
An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.
Returns:
An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.

public void setModelVariants(Collection<ModelVariantConfigSummary> modelVariants)
An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.
Parameters:
modelVariants - An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.

public DescribeInferenceExperimentResult withModelVariants(ModelVariantConfigSummary... modelVariants)
An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.
NOTE: This method appends the values to the existing list (if any). Use setModelVariants(java.util.Collection) or withModelVariants(java.util.Collection) if you want to override the existing values. See the sketch after this method group.
Parameters:
modelVariants - An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.

public DescribeInferenceExperimentResult withModelVariants(Collection<ModelVariantConfigSummary> modelVariants)
An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.
Parameters:
modelVariants - An array of ModelVariantConfigSummary objects. There is one for each variant in the inference experiment. Each ModelVariantConfigSummary object in the array describes the infrastructure configuration for deploying the corresponding variant.
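The NOTE above is the main trap in this method group: the varargs overload appends rather than replaces. A small sketch of the difference, using empty ModelVariantConfigSummary instances purely for illustration; the no-arg ModelVariantConfigSummary constructor and the package name are assumptions not shown on this page.

```java
import java.util.Arrays;

import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentResult;
import com.amazonaws.services.sagemaker.model.ModelVariantConfigSummary;

final class ModelVariantsDemo {
    static void demo() {
        DescribeInferenceExperimentResult result = new DescribeInferenceExperimentResult()
                .withModelVariants(new ModelVariantConfigSummary());     // list size: 1

        // The varargs overload appends to the existing list.
        result.withModelVariants(new ModelVariantConfigSummary());        // list size: 2

        // The Collection overloads replace the existing list.
        result.setModelVariants(Arrays.asList(new ModelVariantConfigSummary())); // list size: 1

        System.out.println(result.getModelVariants().size());             // prints 1
    }
}
```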
public void setDataStorageConfig(InferenceExperimentDataStorageConfig dataStorageConfig)
The Amazon S3 location and configuration for storing inference request and response data.
Parameters:
dataStorageConfig - The Amazon S3 location and configuration for storing inference request and response data.

public InferenceExperimentDataStorageConfig getDataStorageConfig()
The Amazon S3 location and configuration for storing inference request and response data.

public DescribeInferenceExperimentResult withDataStorageConfig(InferenceExperimentDataStorageConfig dataStorageConfig)
The Amazon S3 location and configuration for storing inference request and response data.
Parameters:
dataStorageConfig - The Amazon S3 location and configuration for storing inference request and response data.
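A short, hedged sketch of reading the capture location from this configuration. The getDestination() and getKmsKey() accessors on InferenceExperimentDataStorageConfig, and the package name, are assumptions based on the service API and are not documented on this page.

```java
import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentResult;
import com.amazonaws.services.sagemaker.model.InferenceExperimentDataStorageConfig;

final class DataStorageInfo {
    // Prints where captured request/response data lands, if capture is configured.
    // getDestination()/getKmsKey() are assumed accessors; the package is assumed too.
    static void print(DescribeInferenceExperimentResult result) {
        InferenceExperimentDataStorageConfig config = result.getDataStorageConfig();
        if (config != null) {
            System.out.println("Capture S3 destination: " + config.getDestination());
            System.out.println("Capture KMS key: " + config.getKmsKey());
        }
    }
}
```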
public void setShadowModeConfig(ShadowModeConfig shadowModeConfig)
The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
Parameters:
shadowModeConfig - The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.

public ShadowModeConfig getShadowModeConfig()
The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
Returns:
The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.

public DescribeInferenceExperimentResult withShadowModeConfig(ShadowModeConfig shadowModeConfig)
The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
Parameters:
shadowModeConfig - The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
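A minimal sketch of reading which variant takes live traffic and how much of it is mirrored. The ShadowModelVariantConfig type and the accessors getSourceModelVariantName(), getShadowModelVariants(), getShadowModelVariantName(), and getSamplingPercentage(), along with the package name, are assumptions based on the service API and are not documented on this page.

```java
import com.amazonaws.services.sagemaker.model.DescribeInferenceExperimentResult;
import com.amazonaws.services.sagemaker.model.ShadowModeConfig;
import com.amazonaws.services.sagemaker.model.ShadowModelVariantConfig;

final class ShadowModeInfo {
    // Prints the production (source) variant and each shadow variant with its
    // sampling percentage. All accessors used here are assumed, not documented above.
    static void print(DescribeInferenceExperimentResult result) {
        ShadowModeConfig shadow = result.getShadowModeConfig();
        if (shadow == null) {
            return;
        }
        System.out.println("Production variant: " + shadow.getSourceModelVariantName());
        for (ShadowModelVariantConfig variant : shadow.getShadowModelVariants()) {
            System.out.println("Shadow variant " + variant.getShadowModelVariantName()
                    + " receives " + variant.getSamplingPercentage() + "% of requests");
        }
    }
}
```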
public void setKmsKey(String kmsKey)
The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. For more information, see CreateInferenceExperiment.
Parameters:
kmsKey - The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. For more information, see CreateInferenceExperiment.

public String getKmsKey()
The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. For more information, see CreateInferenceExperiment.

public DescribeInferenceExperimentResult withKmsKey(String kmsKey)
The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. For more information, see CreateInferenceExperiment.
Parameters:
kmsKey - The Amazon Web Services Key Management Service (Amazon Web Services KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint. For more information, see CreateInferenceExperiment.
public String toString()
Returns a string representation of this object.
Overrides:
toString in class Object
See Also:
Object.toString()

public DescribeInferenceExperimentResult clone()