Container for the parameters to the CreateBatchPrediction operation.
Generates predictions for a group of observations. The observations to process exist in one or more data files referenced by a DataSource. This operation creates a new BatchPrediction, and uses an MLModel and the data files referenced by the DataSource as information sources.
CreateBatchPrediction is an asynchronous operation. In response to CreateBatchPrediction, Amazon Machine Learning (Amazon ML) immediately returns and sets the BatchPrediction status to PENDING. After the BatchPrediction completes, Amazon ML sets the status to COMPLETED.

You can poll for status updates by using the GetBatchPrediction operation and checking the Status parameter of the result. After the COMPLETED status appears, the results are available in the location specified by the OutputUri parameter.
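As an illustrative sketch of the create-then-poll flow described above (using the synchronous client methods available in the .NET Framework versions of the SDK; every ID, the bucket name, and the polling interval below are hypothetical placeholders, not values from this reference):

```csharp
using System;
using System.Threading;
using Amazon.MachineLearning;
using Amazon.MachineLearning.Model;

class BatchPredictionExample
{
    static void Main()
    {
        var client = new AmazonMachineLearningClient();

        // All IDs and the S3 location below are hypothetical placeholders.
        var request = new CreateBatchPredictionRequest
        {
            BatchPredictionId = "bp-example-001",            // user-supplied, must be unique
            BatchPredictionName = "Example batch prediction",
            MLModelId = "ml-example-model",                  // an existing MLModel
            BatchPredictionDataSourceId = "ds-example-obs",  // DataSource with the observations
            OutputUri = "s3://example-bucket/results/"       // where Amazon ML writes the results
        };

        // Returns immediately; the BatchPrediction starts in the PENDING state.
        client.CreateBatchPrediction(request);

        // Poll GetBatchPrediction until the job leaves the in-flight states.
        string status;
        do
        {
            Thread.Sleep(TimeSpan.FromSeconds(30));
            var result = client.GetBatchPrediction(new GetBatchPredictionRequest
            {
                BatchPredictionId = request.BatchPredictionId
            });
            status = result.Status.Value;   // e.g. PENDING, INPROGRESS, COMPLETED, FAILED
        } while (status == "PENDING" || status == "INPROGRESS");

        Console.WriteLine("Batch prediction finished with status: " + status);
    }
}
```

Once the status reaches COMPLETED, the prediction results can be read from the S3 location given in OutputUri.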
public class CreateBatchPredictionRequest : AmazonMachineLearningRequest, IAmazonWebServiceRequest
The CreateBatchPredictionRequest type exposes the following members.
Gets and sets the property BatchPredictionDataSourceId.
The ID of the DataSource that points to the group of observations to predict.
Gets and sets the property BatchPredictionId.
A user-supplied ID that uniquely identifies the BatchPrediction.
Gets and sets the property BatchPredictionName.
A user-supplied name or description of the BatchPrediction.
Gets and sets the property MLModelId.
The ID of the MLModel that will generate predictions for the group of observations.
Gets and sets the property OutputUri.
The location of an Amazon Simple Storage Service (Amazon S3) bucket or directory to store the batch prediction results. The following substrings are not allowed in the s3 key portion of the outputURI field: ':', '//', '/./', '/../'.
Amazon ML needs permissions to store and retrieve the logs on your behalf. For information about how to set permissions, see the Amazon Machine Learning Developer Guide.
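The documented way to grant these permissions is an S3 bucket policy that allows the Amazon ML service to write to the output location. A minimal sketch follows; the bucket name and key prefix are placeholders, and the Developer Guide remains the authoritative source for the exact policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAmazonMLOutput",
      "Effect": "Allow",
      "Principal": { "Service": "machinelearning.amazonaws.com" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-bucket/results/*"
    }
  ]
}
```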
.NET Standard:
Supported in: 1.3
.NET Framework:
Supported in: 4.5, 4.0, 3.5
Portable Class Library:
Supported in: Windows Store Apps
Supported in: Windows Phone 8.1
Supported in: Xamarin Android
Supported in: Xamarin iOS (Unified)
Supported in: Xamarin.Forms