Preprocess data and train a machine learning model with Amazon SageMaker

This sample project demonstrates how to use SageMaker and AWS Step Functions to preprocess data and train a machine learning model.

In this project, Step Functions uses a Lambda function to seed an Amazon S3 bucket with a test dataset and a Python script for data processing. It then trains a machine learning model and performs a batch transform, using the SageMaker service integration.

For more information about the SageMaker and Step Functions service integrations, see the SageMaker service integration documentation for Step Functions.

Note

This sample project may incur charges.

For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see SageMaker Pricing.

Step 1: Create the state machine and provision resources

  1. Open the Step Functions console and choose Create state machine.

  2. Type Preprocess data and train a machine learning model in the search box, and then choose Preprocess data and train a machine learning model from the search results that are returned.

  3. Choose Next to continue.

  4. Step Functions lists the AWS services used in the sample project you selected. It also shows a workflow graph for the sample project. You can deploy this project to your AWS account or use it as a starting point for building your own projects. Based on how you want to proceed, choose Run a demo or Build on it.

    This sample project deploys the following resources:

    • An AWS Lambda function

    • An Amazon S3 bucket

    • An AWS Step Functions state machine

    • Related AWS Identity and Access Management (IAM) roles

    The following image shows the workflow graph for the Preprocess data and train a machine learning model sample project:

    [Image: Workflow graph of the Preprocess data and train a machine learning model sample project]
  5. Choose Use template to continue with your selection.

  6. Do one of the following:

    • If you selected Build on it, Step Functions creates the workflow prototype for the sample project you selected. Step Functions doesn't deploy the resources listed in the workflow definition.

      In Workflow Studio's Design mode, drag and drop states from the States browser to continue building your workflow prototype. Or, switch to Code mode, which provides an integrated code editor similar to VS Code for updating the Amazon States Language (ASL) definition of your state machine within the Step Functions console. For more information about using Workflow Studio to build your state machines, see Creating a workflow with Workflow Studio in Step Functions.

      Important

      Remember to update the placeholder Amazon Resource Name (ARN) for the resources used in the sample project before you run your workflow.

    • If you selected Run a demo, Step Functions creates a read-only sample project which uses an AWS CloudFormation template to deploy the AWS resources listed in that template to your AWS account.

      Tip

      To view the state machine definition of the sample project, choose Code.

      When you're ready, choose Deploy and run to deploy the sample project and create the resources.

      It can take up to 10 minutes for these resources and related IAM permissions to be created. While your resources are being deployed, you can open the CloudFormation Stack ID link to see which resources are being provisioned.

      After all the resources in the sample project are created, you can see the new sample project listed on the State machines page.

      Important

      Standard charges may apply for each service used in the CloudFormation template.

Step 2: Run the state machine

  1. On the State machines page, choose your sample project.

  2. On the sample project page, choose Start execution.

  3. In the Start execution dialog box, do the following:

    1. (Optional) To identify your execution, you can specify a name for it in the Name box. By default, Step Functions generates a unique execution name automatically.

      Note

      Step Functions lets you create names for state machines, executions, activities, and labels that contain non-ASCII characters. However, these non-ASCII names don't work with Amazon CloudWatch. To make sure you can track your CloudWatch metrics, choose a name that uses only ASCII characters.

    2. (Optional) In the Input box, enter input values in JSON format to run your workflow.

      If you chose Run a demo, you don't need to provide any execution input.

      Note

      If the demo project you deployed contains prepopulated execution input data, use that input to run the state machine.

    3. Choose Start execution.

    4. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.

      To review the execution results, choose individual states in the Graph view, and then choose the tabs on the Step details pane to view each state's input, output, and definition. For details about the execution information you can view on the Execution Details page, see Execution Details page – Interface overview.
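You can also start executions programmatically. The sketch below is a hypothetical helper (not part of the sample project) that validates an execution name against the ASCII guidance above and assembles the keyword arguments you would pass to boto3's `start_execution` call; the state machine ARN is a placeholder.

```python
import json

def build_start_execution_kwargs(state_machine_arn, name, payload=None):
    """Validate an execution name and assemble StartExecution parameters.

    Hypothetical helper: Step Functions accepts non-ASCII execution names,
    but CloudWatch metrics work best with ASCII-only names of 1-80 characters.
    """
    if not (1 <= len(name) <= 80):
        raise ValueError("Execution names must be 1-80 characters long")
    if not name.isascii():
        raise ValueError("Use ASCII-only names so CloudWatch metrics work")
    return {
        "stateMachineArn": state_machine_arn,
        "name": name,
        "input": json.dumps(payload or {}),
    }

# Example: kwargs ready to pass as sfn_client.start_execution(**kwargs)
kwargs = build_start_execution_kwargs(
    "arn:aws:states:sa-east-1:123456789012:stateMachine:Example",  # placeholder ARN
    "my-demo-run-001",
)
```

Because the Run a demo workflow needs no input, an empty JSON object (`{}`) is a reasonable default payload.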

Example State Machine Code

The state machine in this sample project integrates with SageMaker and AWS Lambda by passing parameters directly to those resources, and uses an Amazon S3 bucket for the training data source and output.

Browse through this example state machine to see how Step Functions controls Lambda and SageMaker.

For more information about how AWS Step Functions can control other AWS services, see Integrating other services with Step Functions.
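One quick way to see which services a workflow calls is to load its ASL definition with any JSON parser and list each Task state's Resource field. The sketch below uses a trimmed-down, hypothetical version of the sample's definition (ARNs shortened for illustration):

```python
import json

# Trimmed-down ASL definition (hypothetical, shortened ARNs)
asl = """
{
  "StartAt": "Generate dataset",
  "States": {
    "Generate dataset": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:sa-east-1:123456789012:function:DataGen",
      "Next": "Train model (XGBoost)"
    },
    "Train model (XGBoost)": {
      "Type": "Task",
      "Resource": "arn:aws:states:::sagemaker:createTrainingJob.sync",
      "End": true
    }
  }
}
"""

definition = json.loads(asl)
# Map each Task state to the resource it invokes
resources = {
    name: state["Resource"]
    for name, state in definition["States"].items()
    if state.get("Type") == "Task"
}
for name, arn in resources.items():
    print(f"{name}: {arn}")
```

Note the `arn:aws:states:::sagemaker:...` form: that is a Step Functions service integration, and the `.sync` suffix tells Step Functions to wait for the job to finish before moving to the next state.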

{
  "StartAt": "Generate dataset",
  "States": {
    "Generate dataset": {
      "Resource": "arn:aws:lambda:sa-east-1:1234567890:function:FeatureTransform-LambaForDataGeneration-17M8LX7IO9LUW",
      "Type": "Task",
      "Next": "Standardization: x' = (x - x̄) / σ"
    },
    "Standardization: x' = (x - x̄) / σ": {
      "Resource": "arn:aws:states:::sagemaker:createProcessingJob.sync",
      "Parameters": {
        "ProcessingResources": {
          "ClusterConfig": {
            "InstanceCount": 1,
            "InstanceType": "ml.m5.xlarge",
            "VolumeSizeInGB": 10
          }
        },
        "ProcessingInputs": [
          {
            "InputName": "input-1",
            "S3Input": {
              "S3Uri": "s3://featuretransform-bucketforcodeanddata-1jn1le6gadwfz/input/raw.csv",
              "LocalPath": "/opt/ml/processing/input",
              "S3DataType": "S3Prefix",
              "S3InputMode": "File",
              "S3DataDistributionType": "FullyReplicated",
              "S3CompressionType": "None"
            }
          },
          {
            "InputName": "code",
            "S3Input": {
              "S3Uri": "s3://featuretransform-bucketforcodeanddata-1jn1le6gadwfz/code/transform.py",
              "LocalPath": "/opt/ml/processing/input/code",
              "S3DataType": "S3Prefix",
              "S3InputMode": "File",
              "S3DataDistributionType": "FullyReplicated",
              "S3CompressionType": "None"
            }
          }
        ],
        "ProcessingOutputConfig": {
          "Outputs": [
            {
              "OutputName": "train_data",
              "S3Output": {
                "S3Uri": "s3://featuretransform-bucketforcodeanddata-1jn1le6gadwfz/train",
                "LocalPath": "/opt/ml/processing/output/train",
                "S3UploadMode": "EndOfJob"
              }
            }
          ]
        },
        "AppSpecification": {
          "ImageUri": "737474898029.dkr.ecr.sa-east-1.amazonaws.com/sagemaker-scikit-learn:0.20.0-cpu-py3",
          "ContainerEntrypoint": [
            "python3",
            "/opt/ml/processing/input/code/transform.py"
          ]
        },
        "StoppingCondition": {
          "MaxRuntimeInSeconds": 300
        },
        "RoleArn": "arn:aws:iam::1234567890:role/SageMakerAPIExecutionRole-AIDACKCEVSQ6C2EXAMPLE",
        "ProcessingJobName.$": "$$.Execution.Name"
      },
      "Type": "Task",
      "Next": "Train model (XGBoost)"
    },
    "Train model (XGBoost)": {
      "Resource": "arn:aws:states:::sagemaker:createTrainingJob.sync",
      "Parameters": {
        "AlgorithmSpecification": {
          "TrainingImage": "855470959533.dkr.ecr.sa-east-1.amazonaws.com/xgboost:latest",
          "TrainingInputMode": "File"
        },
        "OutputDataConfig": {
          "S3OutputPath": "s3://featuretransform-bucketforcodeanddata-1jn1le6gadwfz/models"
        },
        "StoppingCondition": {
          "MaxRuntimeInSeconds": 86400
        },
        "ResourceConfig": {
          "InstanceCount": 1,
          "InstanceType": "ml.m5.xlarge",
          "VolumeSizeInGB": 30
        },
        "RoleArn": "arn:aws:iam::1234567890:role/SageMakerAPIExecutionRole-AIDACKCEVSQ6C2EXAMPLE",
        "InputDataConfig": [
          {
            "DataSource": {
              "S3DataSource": {
                "S3DataDistributionType": "ShardedByS3Key",
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://featuretransform-bucketforcodeanddata-1jn1le6gadwfz"
              }
            },
            "ChannelName": "train",
            "ContentType": "text/csv"
          }
        ],
        "HyperParameters": {
          "objective": "reg:logistic",
          "eval_metric": "rmse",
          "num_round": "5"
        },
        "TrainingJobName.$": "$$.Execution.Name"
      },
      "Type": "Task",
      "End": true
    }
  }
}
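The processing state's name spells out the transformation it applies: standardization, x' = (x - x̄) / σ, which rescales each feature to zero mean and unit variance. A minimal pure-Python illustration of the formula (the actual sample runs a scikit-learn container with `transform.py`, whose contents are not shown here):

```python
import statistics

def standardize(values):
    """Return z-scores: subtract the mean, then divide by the population std dev."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    return [(x - mean) / stdev for x in values]

# mean = 4.0, population std dev = sqrt(8/3) ≈ 1.633
scores = standardize([2.0, 4.0, 6.0])
```

After standardization, the middle value maps to 0 and the outer values to symmetric z-scores of about ±1.22.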

For information about how to configure IAM when using Step Functions with other AWS services, see How Step Functions generates IAM policies for integrated services.

IAM Example

These example AWS Identity and Access Management (IAM) policies generated by the sample project include the least privilege necessary to execute the state machine and related resources. We recommend that you include only those permissions that are necessary in your IAM policies.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "cloudwatch:PutMetricData",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:CreateLogGroup",
        "logs:DescribeLogStreams",
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket",
        "ecr:GetAuthorizationToken",
        "ecr:BatchCheckLayerAvailability",
        "ecr:GetDownloadUrlForLayer",
        "ecr:BatchGetImage"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}
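As a rough illustration of how such a policy is matched against a request, the sketch below checks whether an action is covered by any Allow statement, including the trailing-`*` wildcard style used in IAM actions. This is a deliberately simplified model of IAM evaluation (it ignores Deny statements, resources, and conditions), and the policy shown is a made-up example, not the one above:

```python
import fnmatch

# Hypothetical policy for illustration only
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": ["s3:GetObject", "s3:PutObject", "logs:*"],
            "Resource": "*",
            "Effect": "Allow",
        }
    ],
}

def is_allowed(policy, action):
    """Simplified check: does any Allow statement's Action pattern match?"""
    for stmt in policy["Statement"]:
        if stmt["Effect"] != "Allow":
            continue
        actions = stmt["Action"]
        if isinstance(actions, str):
            actions = [actions]
        if any(fnmatch.fnmatchcase(action, pattern) for pattern in actions):
            return True
    return False

print(is_allowed(policy, "s3:GetObject"))       # True
print(is_allowed(policy, "logs:PutLogEvents"))  # True (matches logs:*)
print(is_allowed(policy, "ecr:BatchGetImage"))  # False
```

Keeping the action list this explicit is what "least privilege" means in practice: the state machine's role can do exactly what the workflow needs and nothing more.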

The following policy allows the Lambda function to seed the Amazon S3 bucket with sample data.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::featuretransform-bucketforcodeanddata-1jn1le6gadwfz/*",
      "Effect": "Allow"
    }
  ]
}
