@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class InferenceExecutionSummary extends Object implements Serializable, Cloneable, StructuredPojo
Contains information about the specific inference execution, including input and output data configuration, inference scheduling information, status, and so on.
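Instances of this class are typically returned by a ListInferenceExecutions call and read through the getters. The sketch below illustrates that flow; it assumes the aws-java-sdk-lookoutequipment module is on the classpath, the scheduler name is hypothetical, and the client/request class names should be checked against your SDK version before use.

```java
import com.amazonaws.services.lookoutequipment.AmazonLookoutEquipment;
import com.amazonaws.services.lookoutequipment.AmazonLookoutEquipmentClientBuilder;
import com.amazonaws.services.lookoutequipment.model.InferenceExecutionSummary;
import com.amazonaws.services.lookoutequipment.model.ListInferenceExecutionsRequest;
import com.amazonaws.services.lookoutequipment.model.ListInferenceExecutionsResult;

public class ListExecutionsSketch {
    public static void main(String[] args) {
        AmazonLookoutEquipment client = AmazonLookoutEquipmentClientBuilder.defaultClient();

        // List executions for one scheduler; the result carries the summaries.
        ListInferenceExecutionsResult result = client.listInferenceExecutions(
                new ListInferenceExecutionsRequest()
                        .withInferenceSchedulerName("my-scheduler")); // hypothetical name

        for (InferenceExecutionSummary summary : result.getInferenceExecutionSummaries()) {
            System.out.printf("%s %s -> %s%n",
                    summary.getScheduledStartTime(),
                    summary.getModelName(),
                    summary.getStatus());
            // FailedReason is only meaningful when the execution failed.
            if ("FAILED".equals(summary.getStatus())) {
                System.out.println("  reason: " + summary.getFailedReason());
            }
        }
    }
}
```

Running this requires AWS credentials and network access, so it is shown as an unverified sketch rather than a tested snippet.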
| Constructor and Description |
|---|
| InferenceExecutionSummary() |
| Modifier and Type | Method and Description |
|---|---|
| InferenceExecutionSummary | clone() |
| boolean | equals(Object obj) |
| S3Object | getCustomerResultObject() - The S3 object that the inference execution results were uploaded to. |
| Date | getDataEndTime() - Indicates the time reference in the dataset at which the inference execution stopped. |
| InferenceInputConfiguration | getDataInputConfiguration() - Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location. |
| InferenceOutputConfiguration | getDataOutputConfiguration() - Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location. |
| Date | getDataStartTime() - Indicates the time reference in the dataset at which the inference execution began. |
| String | getFailedReason() - Specifies the reason for failure when an inference execution has failed. |
| String | getInferenceSchedulerArn() - The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution. |
| String | getInferenceSchedulerName() - The name of the inference scheduler being used for the inference execution. |
| String | getModelArn() - The Amazon Resource Name (ARN) of the machine learning model used for the inference execution. |
| String | getModelName() - The name of the machine learning model being used for the inference execution. |
| Long | getModelVersion() - The model version used for the inference execution. |
| String | getModelVersionArn() - The Amazon Resource Name (ARN) of the model version used for the inference execution. |
| Date | getScheduledStartTime() - Indicates the start time at which the inference scheduler began the specific inference execution. |
| String | getStatus() - Indicates the status of the inference execution. |
| int | hashCode() |
| void | marshall(ProtocolMarshaller protocolMarshaller) - Marshalls this structured data using the given ProtocolMarshaller. |
| void | setCustomerResultObject(S3Object customerResultObject) - The S3 object that the inference execution results were uploaded to. |
| void | setDataEndTime(Date dataEndTime) - Indicates the time reference in the dataset at which the inference execution stopped. |
| void | setDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration) - Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location. |
| void | setDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration) - Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location. |
| void | setDataStartTime(Date dataStartTime) - Indicates the time reference in the dataset at which the inference execution began. |
| void | setFailedReason(String failedReason) - Specifies the reason for failure when an inference execution has failed. |
| void | setInferenceSchedulerArn(String inferenceSchedulerArn) - The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution. |
| void | setInferenceSchedulerName(String inferenceSchedulerName) - The name of the inference scheduler being used for the inference execution. |
| void | setModelArn(String modelArn) - The Amazon Resource Name (ARN) of the machine learning model used for the inference execution. |
| void | setModelName(String modelName) - The name of the machine learning model being used for the inference execution. |
| void | setModelVersion(Long modelVersion) - The model version used for the inference execution. |
| void | setModelVersionArn(String modelVersionArn) - The Amazon Resource Name (ARN) of the model version used for the inference execution. |
| void | setScheduledStartTime(Date scheduledStartTime) - Indicates the start time at which the inference scheduler began the specific inference execution. |
| void | setStatus(String status) - Indicates the status of the inference execution. |
| String | toString() - Returns a string representation of this object. |
| InferenceExecutionSummary | withCustomerResultObject(S3Object customerResultObject) - The S3 object that the inference execution results were uploaded to. |
| InferenceExecutionSummary | withDataEndTime(Date dataEndTime) - Indicates the time reference in the dataset at which the inference execution stopped. |
| InferenceExecutionSummary | withDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration) - Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location. |
| InferenceExecutionSummary | withDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration) - Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location. |
| InferenceExecutionSummary | withDataStartTime(Date dataStartTime) - Indicates the time reference in the dataset at which the inference execution began. |
| InferenceExecutionSummary | withFailedReason(String failedReason) - Specifies the reason for failure when an inference execution has failed. |
| InferenceExecutionSummary | withInferenceSchedulerArn(String inferenceSchedulerArn) - The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution. |
| InferenceExecutionSummary | withInferenceSchedulerName(String inferenceSchedulerName) - The name of the inference scheduler being used for the inference execution. |
| InferenceExecutionSummary | withModelArn(String modelArn) - The Amazon Resource Name (ARN) of the machine learning model used for the inference execution. |
| InferenceExecutionSummary | withModelName(String modelName) - The name of the machine learning model being used for the inference execution. |
| InferenceExecutionSummary | withModelVersion(Long modelVersion) - The model version used for the inference execution. |
| InferenceExecutionSummary | withModelVersionArn(String modelVersionArn) - The Amazon Resource Name (ARN) of the model version used for the inference execution. |
| InferenceExecutionSummary | withScheduledStartTime(Date scheduledStartTime) - Indicates the start time at which the inference scheduler began the specific inference execution. |
| InferenceExecutionSummary | withStatus(InferenceExecutionStatus status) - Indicates the status of the inference execution. |
| InferenceExecutionSummary | withStatus(String status) - Indicates the status of the inference execution. |
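Each with* method above sets its field and returns the same InferenceExecutionSummary, so fields can be populated fluently. A hedged sketch (assumes the SDK on the classpath; all values, and the InferenceExecutionStatus constant, are illustrative assumptions):

```java
import java.util.Date;

import com.amazonaws.services.lookoutequipment.model.InferenceExecutionStatus;
import com.amazonaws.services.lookoutequipment.model.InferenceExecutionSummary;

public class SummarySketch {
    public static void main(String[] args) {
        // Each with* call returns this, so the setters chain.
        InferenceExecutionSummary summary = new InferenceExecutionSummary()
                .withInferenceSchedulerName("my-scheduler")   // hypothetical values
                .withModelName("my-model")
                .withModelVersion(3L)
                .withScheduledStartTime(new Date())
                .withStatus(InferenceExecutionStatus.SUCCESS); // enum overload; assumed constant

        System.out.println(summary.getStatus());
    }
}
```

This is the pattern generated POJOs in the SDK follow; in application code the summaries usually arrive pre-populated from the service rather than being built by hand.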
public void setModelName(String modelName)

The name of the machine learning model being used for the inference execution.

Parameters:
modelName - The name of the machine learning model being used for the inference execution.

public String getModelName()

The name of the machine learning model being used for the inference execution.

public InferenceExecutionSummary withModelName(String modelName)

The name of the machine learning model being used for the inference execution.

Parameters:
modelName - The name of the machine learning model being used for the inference execution.

public void setModelArn(String modelArn)

The Amazon Resource Name (ARN) of the machine learning model used for the inference execution.

Parameters:
modelArn - The Amazon Resource Name (ARN) of the machine learning model used for the inference execution.

public String getModelArn()

The Amazon Resource Name (ARN) of the machine learning model used for the inference execution.

public InferenceExecutionSummary withModelArn(String modelArn)

The Amazon Resource Name (ARN) of the machine learning model used for the inference execution.

Parameters:
modelArn - The Amazon Resource Name (ARN) of the machine learning model used for the inference execution.

public void setInferenceSchedulerName(String inferenceSchedulerName)

The name of the inference scheduler being used for the inference execution.

Parameters:
inferenceSchedulerName - The name of the inference scheduler being used for the inference execution.

public String getInferenceSchedulerName()

The name of the inference scheduler being used for the inference execution.

public InferenceExecutionSummary withInferenceSchedulerName(String inferenceSchedulerName)

The name of the inference scheduler being used for the inference execution.

Parameters:
inferenceSchedulerName - The name of the inference scheduler being used for the inference execution.

public void setInferenceSchedulerArn(String inferenceSchedulerArn)

The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

Parameters:
inferenceSchedulerArn - The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

public String getInferenceSchedulerArn()

The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

public InferenceExecutionSummary withInferenceSchedulerArn(String inferenceSchedulerArn)

The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

Parameters:
inferenceSchedulerArn - The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

public void setScheduledStartTime(Date scheduledStartTime)

Indicates the start time at which the inference scheduler began the specific inference execution.

Parameters:
scheduledStartTime - Indicates the start time at which the inference scheduler began the specific inference execution.

public Date getScheduledStartTime()

Indicates the start time at which the inference scheduler began the specific inference execution.

public InferenceExecutionSummary withScheduledStartTime(Date scheduledStartTime)

Indicates the start time at which the inference scheduler began the specific inference execution.

Parameters:
scheduledStartTime - Indicates the start time at which the inference scheduler began the specific inference execution.

public void setDataStartTime(Date dataStartTime)

Indicates the time reference in the dataset at which the inference execution began.

Parameters:
dataStartTime - Indicates the time reference in the dataset at which the inference execution began.

public Date getDataStartTime()

Indicates the time reference in the dataset at which the inference execution began.

public InferenceExecutionSummary withDataStartTime(Date dataStartTime)

Indicates the time reference in the dataset at which the inference execution began.

Parameters:
dataStartTime - Indicates the time reference in the dataset at which the inference execution began.

public void setDataEndTime(Date dataEndTime)

Indicates the time reference in the dataset at which the inference execution stopped.

Parameters:
dataEndTime - Indicates the time reference in the dataset at which the inference execution stopped.

public Date getDataEndTime()

Indicates the time reference in the dataset at which the inference execution stopped.

public InferenceExecutionSummary withDataEndTime(Date dataEndTime)

Indicates the time reference in the dataset at which the inference execution stopped.

Parameters:
dataEndTime - Indicates the time reference in the dataset at which the inference execution stopped.

public void setDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration)

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

Parameters:
dataInputConfiguration - Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

public InferenceInputConfiguration getDataInputConfiguration()

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

public InferenceExecutionSummary withDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration)

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

Parameters:
dataInputConfiguration - Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

public void setDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration)

Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location.

Parameters:
dataOutputConfiguration - Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location.

public InferenceOutputConfiguration getDataOutputConfiguration()

Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location.

public InferenceExecutionSummary withDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration)

Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location.

Parameters:
dataOutputConfiguration - Specifies configuration information for the output results from the inference execution, including the output Amazon S3 location.

public void setCustomerResultObject(S3Object customerResultObject)

The S3 object that the inference execution results were uploaded to.

Parameters:
customerResultObject - The S3 object that the inference execution results were uploaded to.

public S3Object getCustomerResultObject()

The S3 object that the inference execution results were uploaded to.

public InferenceExecutionSummary withCustomerResultObject(S3Object customerResultObject)

The S3 object that the inference execution results were uploaded to.

Parameters:
customerResultObject - The S3 object that the inference execution results were uploaded to.

public void setStatus(String status)

Indicates the status of the inference execution.

Parameters:
status - Indicates the status of the inference execution.
See Also:
InferenceExecutionStatus
public String getStatus()

Indicates the status of the inference execution.

See Also:
InferenceExecutionStatus

public InferenceExecutionSummary withStatus(String status)

Indicates the status of the inference execution.

Parameters:
status - Indicates the status of the inference execution.
See Also:
InferenceExecutionStatus

public InferenceExecutionSummary withStatus(InferenceExecutionStatus status)

Indicates the status of the inference execution.

Parameters:
status - Indicates the status of the inference execution.
See Also:
InferenceExecutionStatus

public void setFailedReason(String failedReason)

Specifies the reason for failure when an inference execution has failed.

Parameters:
failedReason - Specifies the reason for failure when an inference execution has failed.

public String getFailedReason()

Specifies the reason for failure when an inference execution has failed.

public InferenceExecutionSummary withFailedReason(String failedReason)

Specifies the reason for failure when an inference execution has failed.

Parameters:
failedReason - Specifies the reason for failure when an inference execution has failed.

public void setModelVersion(Long modelVersion)

The model version used for the inference execution.

Parameters:
modelVersion - The model version used for the inference execution.

public Long getModelVersion()

The model version used for the inference execution.

public InferenceExecutionSummary withModelVersion(Long modelVersion)

The model version used for the inference execution.

Parameters:
modelVersion - The model version used for the inference execution.

public void setModelVersionArn(String modelVersionArn)

The Amazon Resource Name (ARN) of the model version used for the inference execution.

Parameters:
modelVersionArn - The Amazon Resource Name (ARN) of the model version used for the inference execution.

public String getModelVersionArn()

The Amazon Resource Name (ARN) of the model version used for the inference execution.

public InferenceExecutionSummary withModelVersionArn(String modelVersionArn)

The Amazon Resource Name (ARN) of the model version used for the inference execution.

Parameters:
modelVersionArn - The Amazon Resource Name (ARN) of the model version used for the inference execution.

public String toString()

Returns a string representation of this object.

Overrides:
toString in class Object
See Also:
Object.toString()
public InferenceExecutionSummary clone()
public void marshall(ProtocolMarshaller protocolMarshaller)

Description copied from interface: StructuredPojo
Marshalls this structured data using the given ProtocolMarshaller.

Specified by:
marshall in interface StructuredPojo
Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.
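The two withStatus overloads above follow a common generated-SDK convention: the enum overload stores the enum's string form, so both overloads converge on the single String-typed status field. A minimal self-contained sketch of that convention (this stand-in class and its Status enum are hypothetical, not the real SDK types):

```java
// Hypothetical stand-in illustrating the String/enum overload convention.
public class StatusOverloadSketch {
    // Stand-in for InferenceExecutionStatus.
    enum Status { IN_PROGRESS, SUCCESS, FAILED }

    private String status;

    public void setStatus(String status) { this.status = status; }

    public String getStatus() { return status; }

    // String overload: value is stored verbatim.
    public StatusOverloadSketch withStatus(String status) {
        setStatus(status);
        return this; // fluent: returns this for chaining
    }

    // Enum overload: stores the enum's string form in the same field.
    public StatusOverloadSketch withStatus(Status status) {
        this.status = status.toString();
        return this;
    }

    public static void main(String[] args) {
        StatusOverloadSketch a = new StatusOverloadSketch().withStatus("FAILED");
        StatusOverloadSketch b = new StatusOverloadSketch().withStatus(Status.FAILED);
        System.out.println(a.getStatus().equals(b.getStatus())); // prints "true"
    }
}
```

Because both overloads write the same field, callers can pass either form and getStatus() always returns a plain String.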