@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class DescribeInferenceSchedulerResult extends AmazonWebServiceResult<ResponseMetadata> implements Serializable, Cloneable
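Example usage, a minimal sketch only: it assumes the v1 client builder (AmazonLookoutEquipmentClientBuilder) and request class (DescribeInferenceSchedulerRequest) from the aws-java-sdk-lookoutequipment module, plus a placeholder scheduler name.

```java
import com.amazonaws.services.lookoutequipment.AmazonLookoutEquipment;
import com.amazonaws.services.lookoutequipment.AmazonLookoutEquipmentClientBuilder;
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerRequest;
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;

public class DescribeSchedulerExample {
    public static void main(String[] args) {
        // Client built from the default credentials and region provider chain.
        AmazonLookoutEquipment client = AmazonLookoutEquipmentClientBuilder.defaultClient();

        // "my-scheduler" is a placeholder for an existing inference scheduler name.
        DescribeInferenceSchedulerResult result = client.describeInferenceScheduler(
                new DescribeInferenceSchedulerRequest().withInferenceSchedulerName("my-scheduler"));

        System.out.println("Scheduler ARN: " + result.getInferenceSchedulerArn());
        System.out.println("Model name:    " + result.getModelName());
        System.out.println("Status:        " + result.getStatus());
        System.out.println("Created at:    " + result.getCreatedAt());
    }
}
```

The with* overloads mutate the same object and return it, so they can be chained when constructing a result by hand, for example in tests.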
Constructor and Description |
---|
DescribeInferenceSchedulerResult() |
Modifier and Type | Method and Description
---|---
DescribeInferenceSchedulerResult | clone()
boolean | equals(Object obj)
Date | getCreatedAt(): Specifies the time at which the inference scheduler was created.
Long | getDataDelayOffsetInMinutes(): A period of time (in minutes) by which inference on the data is delayed after the data starts.
InferenceInputConfiguration | getDataInputConfiguration(): Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
InferenceOutputConfiguration | getDataOutputConfiguration(): Specifies information for the output results for the inference scheduler, including the output S3 location.
String | getDataUploadFrequency(): Specifies how often data is uploaded to the source S3 bucket for the input data.
String | getInferenceSchedulerArn(): The Amazon Resource Name (ARN) of the inference scheduler being described.
String | getInferenceSchedulerName(): The name of the inference scheduler being described.
String | getLatestInferenceResult(): Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
String | getModelArn(): The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
String | getModelName(): The name of the machine learning model of the inference scheduler being described.
String | getRoleArn(): The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
String | getServerSideKmsKeyId(): Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
String | getStatus(): Indicates the status of the inference scheduler.
Date | getUpdatedAt(): Specifies the time at which the inference scheduler was last updated, if it was.
int | hashCode()
void | setCreatedAt(Date createdAt): Specifies the time at which the inference scheduler was created.
void | setDataDelayOffsetInMinutes(Long dataDelayOffsetInMinutes): A period of time (in minutes) by which inference on the data is delayed after the data starts.
void | setDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration): Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
void | setDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration): Specifies information for the output results for the inference scheduler, including the output S3 location.
void | setDataUploadFrequency(String dataUploadFrequency): Specifies how often data is uploaded to the source S3 bucket for the input data.
void | setInferenceSchedulerArn(String inferenceSchedulerArn): The Amazon Resource Name (ARN) of the inference scheduler being described.
void | setInferenceSchedulerName(String inferenceSchedulerName): The name of the inference scheduler being described.
void | setLatestInferenceResult(String latestInferenceResult): Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
void | setModelArn(String modelArn): The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
void | setModelName(String modelName): The name of the machine learning model of the inference scheduler being described.
void | setRoleArn(String roleArn): The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
void | setServerSideKmsKeyId(String serverSideKmsKeyId): Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
void | setStatus(String status): Indicates the status of the inference scheduler.
void | setUpdatedAt(Date updatedAt): Specifies the time at which the inference scheduler was last updated, if it was.
String | toString(): Returns a string representation of this object.
DescribeInferenceSchedulerResult | withCreatedAt(Date createdAt): Specifies the time at which the inference scheduler was created.
DescribeInferenceSchedulerResult | withDataDelayOffsetInMinutes(Long dataDelayOffsetInMinutes): A period of time (in minutes) by which inference on the data is delayed after the data starts.
DescribeInferenceSchedulerResult | withDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration): Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
DescribeInferenceSchedulerResult | withDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration): Specifies information for the output results for the inference scheduler, including the output S3 location.
DescribeInferenceSchedulerResult | withDataUploadFrequency(DataUploadFrequency dataUploadFrequency): Specifies how often data is uploaded to the source S3 bucket for the input data.
DescribeInferenceSchedulerResult | withDataUploadFrequency(String dataUploadFrequency): Specifies how often data is uploaded to the source S3 bucket for the input data.
DescribeInferenceSchedulerResult | withInferenceSchedulerArn(String inferenceSchedulerArn): The Amazon Resource Name (ARN) of the inference scheduler being described.
DescribeInferenceSchedulerResult | withInferenceSchedulerName(String inferenceSchedulerName): The name of the inference scheduler being described.
DescribeInferenceSchedulerResult | withLatestInferenceResult(LatestInferenceResult latestInferenceResult): Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
DescribeInferenceSchedulerResult | withLatestInferenceResult(String latestInferenceResult): Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
DescribeInferenceSchedulerResult | withModelArn(String modelArn): The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
DescribeInferenceSchedulerResult | withModelName(String modelName): The name of the machine learning model of the inference scheduler being described.
DescribeInferenceSchedulerResult | withRoleArn(String roleArn): The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
DescribeInferenceSchedulerResult | withServerSideKmsKeyId(String serverSideKmsKeyId): Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
DescribeInferenceSchedulerResult | withStatus(InferenceSchedulerStatus status): Indicates the status of the inference scheduler.
DescribeInferenceSchedulerResult | withStatus(String status): Indicates the status of the inference scheduler.
DescribeInferenceSchedulerResult | withUpdatedAt(Date updatedAt): Specifies the time at which the inference scheduler was last updated, if it was.
Methods inherited from class com.amazonaws.AmazonWebServiceResult: getSdkHttpMetadata, getSdkResponseMetadata, setSdkHttpMetadata, setSdkResponseMetadata
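These inherited members expose response metadata rather than scheduler fields. A short, illustrative sketch, assuming a result obtained as in the example near the top of this page:

```java
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;

static void printResponseMetadata(DescribeInferenceSchedulerResult result) {
    // HTTP-level details of the DescribeInferenceScheduler call.
    int httpStatus = result.getSdkHttpMetadata().getHttpStatusCode();

    // AWS request ID, useful when correlating logs or contacting AWS Support.
    String requestId = result.getSdkResponseMetadata().getRequestId();

    System.out.println("HTTP " + httpStatus + ", request ID " + requestId);
}
```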
public void setModelArn(String modelArn)
The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
modelArn
- The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
public String getModelArn()
The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
public DescribeInferenceSchedulerResult withModelArn(String modelArn)
The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
modelArn
- The Amazon Resource Name (ARN) of the machine learning model of the inference scheduler being described.
public void setModelName(String modelName)
The name of the machine learning model of the inference scheduler being described.
modelName
- The name of the machine learning model of the inference scheduler being described.
public String getModelName()
The name of the machine learning model of the inference scheduler being described.
public DescribeInferenceSchedulerResult withModelName(String modelName)
The name of the machine learning model of the inference scheduler being described.
modelName
- The name of the machine learning model of the inference scheduler being described.
public void setInferenceSchedulerName(String inferenceSchedulerName)
The name of the inference scheduler being described.
inferenceSchedulerName
- The name of the inference scheduler being described.
public String getInferenceSchedulerName()
The name of the inference scheduler being described.
public DescribeInferenceSchedulerResult withInferenceSchedulerName(String inferenceSchedulerName)
The name of the inference scheduler being described.
inferenceSchedulerName
- The name of the inference scheduler being described.
public void setInferenceSchedulerArn(String inferenceSchedulerArn)
The Amazon Resource Name (ARN) of the inference scheduler being described.
inferenceSchedulerArn
- The Amazon Resource Name (ARN) of the inference scheduler being described.
public String getInferenceSchedulerArn()
The Amazon Resource Name (ARN) of the inference scheduler being described.
public DescribeInferenceSchedulerResult withInferenceSchedulerArn(String inferenceSchedulerArn)
The Amazon Resource Name (ARN) of the inference scheduler being described.
inferenceSchedulerArn
- The Amazon Resource Name (ARN) of the inference scheduler being described.
public void setStatus(String status)
Indicates the status of the inference scheduler.
status
- Indicates the status of the inference scheduler.
See also: InferenceSchedulerStatus
public String getStatus()
Indicates the status of the inference scheduler.
See also: InferenceSchedulerStatus
public DescribeInferenceSchedulerResult withStatus(String status)
Indicates the status of the inference scheduler.
status
- Indicates the status of the inference scheduler.
See also: InferenceSchedulerStatus
public DescribeInferenceSchedulerResult withStatus(InferenceSchedulerStatus status)
Indicates the status of the inference scheduler.
status
- Indicates the status of the inference scheduler.
See also: InferenceSchedulerStatus
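Since getStatus() returns the raw string form, a common pattern is to convert it with the InferenceSchedulerStatus enum before branching. Illustrative sketch only; the specific constants checked (RUNNING, STOPPING, STOPPED) are assumed from that enum and are not listed on this page:

```java
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;
import com.amazonaws.services.lookoutequipment.model.InferenceSchedulerStatus;

static void reportStatus(DescribeInferenceSchedulerResult result) {
    // Convert the wire string into the typed enum before branching on it.
    // fromValue throws IllegalArgumentException for values this SDK version does not know.
    InferenceSchedulerStatus status = InferenceSchedulerStatus.fromValue(result.getStatus());

    if (status == InferenceSchedulerStatus.RUNNING) {
        System.out.println("Scheduler is running.");
    } else if (status == InferenceSchedulerStatus.STOPPING || status == InferenceSchedulerStatus.STOPPED) {
        System.out.println("Scheduler is not active: " + status);
    } else {
        System.out.println("Scheduler status: " + status);
    }
}
```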
public void setDataDelayOffsetInMinutes(Long dataDelayOffsetInMinutes)
A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.
dataDelayOffsetInMinutes
- A period of time (in minutes) by which inference on the data is delayed after the data starts. For
instance, if you select an offset delay time of five minutes, inference will not begin on the data until
the first data measurement after the five minute mark. For example, if five minutes is selected, the
inference scheduler will wake up at the configured frequency with the additional five minute delay time to
check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to
stop and restart the scheduler when uploading new data.
public Long getDataDelayOffsetInMinutes()
A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.
public DescribeInferenceSchedulerResult withDataDelayOffsetInMinutes(Long dataDelayOffsetInMinutes)
A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.
dataDelayOffsetInMinutes
- A period of time (in minutes) by which inference on the data is delayed after the data starts. For
instance, if you select an offset delay time of five minutes, inference will not begin on the data until
the first data measurement after the five minute mark. For example, if five minutes is selected, the
inference scheduler will wake up at the configured frequency with the additional five minute delay time to
check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to
stop and restart the scheduler when uploading new data.
public void setDataUploadFrequency(String dataUploadFrequency)
Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
dataUploadFrequency
- Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length
of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will
upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how
often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts
once every 5 minutes.
See also: DataUploadFrequency
public String getDataUploadFrequency()
Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
See also: DataUploadFrequency
public DescribeInferenceSchedulerResult withDataUploadFrequency(String dataUploadFrequency)
Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
dataUploadFrequency
- Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length
of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will
upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how
often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts
once every 5 minutes.
See also: DataUploadFrequency
public DescribeInferenceSchedulerResult withDataUploadFrequency(DataUploadFrequency dataUploadFrequency)
Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
dataUploadFrequency
- Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length
of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will
upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how
often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts
once every 5 minutes.
See also: DataUploadFrequency
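The frequency comes back as a string (for example "PT5M"; the exact constants are defined by DataUploadFrequency and are assumed here, not confirmed by this page). Together with the data delay offset it determines when each scheduled run checks the input bucket. An illustrative sketch:

```java
import com.amazonaws.services.lookoutequipment.model.DataUploadFrequency;
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;
import java.time.Duration;

static void reportTiming(DescribeInferenceSchedulerResult result) {
    // Raw value from the service, e.g. "PT5M" (assumed format; see DataUploadFrequency).
    String frequency = result.getDataUploadFrequency();
    DataUploadFrequency typed = DataUploadFrequency.fromValue(frequency);

    // If the value is an ISO-8601 duration, java.time can parse it directly.
    long frequencyMinutes = Duration.parse(frequency).toMinutes();
    Long delayMinutes = result.getDataDelayOffsetInMinutes(); // may be null when no offset is configured

    long totalWait = frequencyMinutes + (delayMinutes != null ? delayMinutes : 0L);
    System.out.println("Upload frequency " + typed + "; each run checks the input bucket roughly "
            + totalWait + " minutes after its data window opens.");
}
```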
public void setCreatedAt(Date createdAt)
Specifies the time at which the inference scheduler was created.
createdAt
- Specifies the time at which the inference scheduler was created.
public Date getCreatedAt()
Specifies the time at which the inference scheduler was created.
public DescribeInferenceSchedulerResult withCreatedAt(Date createdAt)
Specifies the time at which the inference scheduler was created.
createdAt
- Specifies the time at which the inference scheduler was created.
public void setUpdatedAt(Date updatedAt)
Specifies the time at which the inference scheduler was last updated, if it was.
updatedAt
- Specifies the time at which the inference scheduler was last updated, if it was.
public Date getUpdatedAt()
Specifies the time at which the inference scheduler was last updated, if it was.
public DescribeInferenceSchedulerResult withUpdatedAt(Date updatedAt)
Specifies the time at which the inference scheduler was last updated, if it was.
updatedAt
- Specifies the time at which the inference scheduler was last updated, if it was.
public void setDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration)
Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
dataInputConfiguration
- Specifies configuration information for the input data for the inference scheduler, including delimiter,
format, and dataset location.
public InferenceInputConfiguration getDataInputConfiguration()
Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
public DescribeInferenceSchedulerResult withDataInputConfiguration(InferenceInputConfiguration dataInputConfiguration)
Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.
dataInputConfiguration
- Specifies configuration information for the input data for the inference scheduler, including delimiter,
format, and dataset location.
public void setDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration)
Specifies information for the output results for the inference scheduler, including the output S3 location.
dataOutputConfiguration
- Specifies information for the output results for the inference scheduler, including the output S3
location.
public InferenceOutputConfiguration getDataOutputConfiguration()
Specifies information for the output results for the inference scheduler, including the output S3 location.
public DescribeInferenceSchedulerResult withDataOutputConfiguration(InferenceOutputConfiguration dataOutputConfiguration)
Specifies information for the output results for the inference scheduler, including the output S3 location.
dataOutputConfiguration
- Specifies information for the output results for the inference scheduler, including the output S3 location.
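Reading the input and output configurations usually means drilling into their S3 members. The nested accessors below (getS3InputConfiguration(), getS3OutputConfiguration(), getBucket(), getPrefix()) are assumptions based on the corresponding API shapes and are not confirmed by this page:

```java
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;
import com.amazonaws.services.lookoutequipment.model.InferenceInputConfiguration;
import com.amazonaws.services.lookoutequipment.model.InferenceOutputConfiguration;

static void printDataLocations(DescribeInferenceSchedulerResult result) {
    InferenceInputConfiguration input = result.getDataInputConfiguration();
    InferenceOutputConfiguration output = result.getDataOutputConfiguration();

    // Assumed nested accessors; verify them against the InferenceS3InputConfiguration /
    // InferenceS3OutputConfiguration model classes in your SDK version.
    if (input != null && input.getS3InputConfiguration() != null) {
        System.out.println("Reads input from  s3://" + input.getS3InputConfiguration().getBucket()
                + "/" + input.getS3InputConfiguration().getPrefix());
    }
    if (output != null && output.getS3OutputConfiguration() != null) {
        System.out.println("Writes results to s3://" + output.getS3OutputConfiguration().getBucket()
                + "/" + output.getS3OutputConfiguration().getPrefix());
    }
}
```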
public void setRoleArn(String roleArn)
The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
roleArn
- The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference
scheduler being described.
public String getRoleArn()
The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
public DescribeInferenceSchedulerResult withRoleArn(String roleArn)
The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
roleArn
- The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference
scheduler being described.
public void setServerSideKmsKeyId(String serverSideKmsKeyId)
Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
serverSideKmsKeyId
- Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for
Equipment.
public String getServerSideKmsKeyId()
Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
public DescribeInferenceSchedulerResult withServerSideKmsKeyId(String serverSideKmsKeyId)
Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.
serverSideKmsKeyId
- Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for
Equipment.
public void setLatestInferenceResult(String latestInferenceResult)
Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
latestInferenceResult
- Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found)
or Normal (no anomalous events found).
See also: LatestInferenceResult
public String getLatestInferenceResult()
Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
See also: LatestInferenceResult
public DescribeInferenceSchedulerResult withLatestInferenceResult(String latestInferenceResult)
Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
latestInferenceResult
- Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found)
or Normal (no anomalous events found).
See also: LatestInferenceResult
public DescribeInferenceSchedulerResult withLatestInferenceResult(LatestInferenceResult latestInferenceResult)
Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found) or Normal (no anomalous events found).
latestInferenceResult
- Indicates whether the latest execution for the inference scheduler was Anomalous (anomalous events found)
or Normal (no anomalous events found).
See also: LatestInferenceResult
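A typical check on the most recent run. The enum constants ANOMALOUS and NORMAL are assumed from LatestInferenceResult; this page only confirms the Anomalous/Normal meanings:

```java
import com.amazonaws.services.lookoutequipment.model.DescribeInferenceSchedulerResult;
import com.amazonaws.services.lookoutequipment.model.LatestInferenceResult;

static void reportLatestResult(DescribeInferenceSchedulerResult result) {
    String latest = result.getLatestInferenceResult();

    // Compare against the enum's wire value rather than hard-coding the string.
    if (LatestInferenceResult.ANOMALOUS.toString().equals(latest)) {
        System.out.println("The most recent inference execution found anomalous events.");
    } else if (LatestInferenceResult.NORMAL.toString().equals(latest)) {
        System.out.println("The most recent inference execution found no anomalous events.");
    } else {
        // Handles a null or unrecognized value.
        System.out.println("No inference result reported yet (value: " + latest + ").");
    }
}
```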
public String toString()
Returns a string representation of this object.
Overrides: toString in class Object
See also: Object.toString()
public DescribeInferenceSchedulerResult clone()