Step 1: Launch the stack
Follow the step-by-step instructions in this section to configure and deploy the solution into your account.
Time to deploy: Approximately three minutes
-
Sign in to the AWS Management Console and select the button to launch the mlops-workload-orchestrator-multi-account.template AWS CloudFormation template.
The template launches in the US East (N. Virginia) Region by default. To launch the solution in a different AWS Region, use the Region selector in the console navigation bar.
Note
This solution uses the AWS CodePipeline and Amazon SageMaker services, which are not currently available in all AWS Regions. You must launch this solution in an AWS Region where AWS CodePipeline and Amazon SageMaker are available. For the most current availability by Region, see the Supported AWS Regions table.
-
On the Create stack page, verify that the correct template URL is in the Amazon S3 URL text box and choose Next.
-
On the Specify stack details page, assign a name to your solution stack. For information about naming character limitations, see IAM and AWS STS quotas, name requirements, and character limits in the AWS Identity and Access Management User Guide.
-
Under Parameters, review the parameters for this solution template and modify them as necessary. This solution uses the following default values.

Notification Email
Default: <Requires input>
Specify an email to receive Amazon SNS notifications about pipeline outcomes.

CodeCommit Repo URL Address
Default: <Optional input>
Only provide a value if you want to provision a pipeline from a CodeCommit repository. For example: https://git-codecommit.us-east-1.amazonaws.com/v1/repos/<repository-name>

Name of an Existing S3 Bucket
Default: <Optional input>
Optionally, provide the name of an existing S3 bucket to be used as the S3 assets bucket. If an existing bucket is not provided, the solution creates a new S3 bucket.
Note: If you use an existing S3 bucket, the bucket must meet the following requirements: 1) the bucket must be in the same Region as the MLOps Workload Orchestrator stack, 2) the bucket must allow reading and writing objects, and 3) versioning must be enabled on the bucket. We recommend blocking public access, enabling S3 server-side encryption, access logging, and secure transport (for example, an HTTPS-only bucket policy) on your existing S3 bucket.

Name of an Existing Amazon ECR repository
Default: <Optional input>
Optionally, provide the name of an existing Amazon ECR repository to be used for custom algorithm images. If you do not specify an existing repository, the solution creates a new Amazon ECR repository.
Note: The Amazon ECR repository must be in the same Region where the solution is deployed.

Do you want to use SageMaker Model Registry?
Default: No
By default, this value is No, and you must provide the algorithm and model artifact location. If you want to use Amazon SageMaker Model Registry, set the value to Yes and provide the model version ARN in the API call. For more details, refer to API operations. The solution expects that the model artifact is stored in the S3 assets bucket.

Do you want the solution to create a SageMaker model package group?
Default: No
By default, this value is No. If you are using Amazon SageMaker Model Registry, you can set this value to Yes to instruct the solution to create a model registry (that is, a model package group). Otherwise, you can use your own model registry created outside the solution.
Note: If you choose to use a model registry that was not created by this solution, you must set up access permissions for other accounts to access the model registry. For more information, refer to Deploy a Model Version from a Different Account in the Amazon SageMaker Developer Guide.

Do you want to allow detailed error message in the APIs response?
Default: Yes
By default, this value is Yes. If allowed, the API response returns a detailed message for any server-side error or exception. If you set this parameter to No, the API response returns a general error message.

Are you using a delegated administrator account (AWS Organizations)?
Default: Yes
By default, this value is Yes. The solution expects that the orchestrator account is an AWS Organizations delegated administrator account, which follows the best practice of limiting access to the AWS Organizations management account. However, if you want to use the management account as the orchestrator account, you can change this value to No.

Development Account ID
Default: <Requires input>
The development account's number.

Development Account Organizational Unit ID
Default: <Requires input>
The AWS Organizations unit ID for the development account (for example, o-a1ss2d3g4).

Staging Account ID
Default: <Requires input>
The staging account's number.

Staging Account Organizational Unit ID
Default: <Requires input>
The AWS Organizations unit ID for the staging account.

Production Account ID
Default: <Requires input>
The production account's number.

Production Account Organizational Unit ID
Default: <Requires input>
The AWS Organizations unit ID for the production account.

For more information about creating an Amazon SageMaker model registry, setting permissions, and registering models, refer to Register and deploy models with model registry in the Amazon SageMaker Developer Guide.
Note
To connect a GitHub or Bitbucket code repository to this solution, launch the solution and use the process in the source stage of the pipeline to create GitHubSourceAction and BitBucketSourceAction.
-
Select Next.
-
On the Configure stack options page, choose Next.
-
On the Review and create page, review and confirm the settings. Select the box acknowledging that the template will create AWS Identity and Access Management (IAM) resources.
-
Choose Submit to deploy the stack.
You can view the status of the stack in the AWS CloudFormation console in the Status column. You should receive a CREATE_COMPLETE status in approximately three minutes.
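If you prefer to script the deployment instead of clicking through the console, the same stack can be created with the AWS SDK. The following is a minimal sketch using boto3: the template URL, stack name, email, account IDs, and organizational unit IDs are placeholders, and the ParameterKey names are assumptions inferred from the parameter labels above, so confirm the exact keys in the template's Parameters section before running it.

```python
"""Minimal sketch: deploy the multi-account template with boto3.

The template URL, stack name, and all parameter values below are
placeholders. The ParameterKey names are assumptions inferred from the
parameter labels in this guide; verify them against the template before use.
"""
import boto3

REGION = "us-east-1"  # must be a Region where AWS CodePipeline and Amazon SageMaker are available
TEMPLATE_URL = (
    "https://<bucket>.s3.amazonaws.com/mlops-workload-orchestrator-multi-account.template"
)  # placeholder: use the Amazon S3 URL shown on the Create stack page

cfn = boto3.client("cloudformation", region_name=REGION)

response = cfn.create_stack(
    StackName="mlops-workload-orchestrator",  # your stack name
    TemplateURL=TEMPLATE_URL,
    Parameters=[  # keys are assumed; values are placeholders
        {"ParameterKey": "NotificationEmail", "ParameterValue": "you@example.com"},
        {"ParameterKey": "DevAccountId", "ParameterValue": "111111111111"},
        {"ParameterKey": "DevOrgId", "ParameterValue": "ou-xxxx-xxxxxxxx"},
        {"ParameterKey": "StagingAccountId", "ParameterValue": "222222222222"},
        {"ParameterKey": "StagingOrgId", "ParameterValue": "ou-yyyy-yyyyyyyy"},
        {"ParameterKey": "ProdAccountId", "ParameterValue": "333333333333"},
        {"ParameterKey": "ProdOrgId", "ParameterValue": "ou-zzzz-zzzzzzzz"},
    ],
    # Acknowledge that the template creates IAM resources (the Review and create step).
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
)
print("Stack ID:", response["StackId"])

# Block until the stack reaches CREATE_COMPLETE (approximately three minutes for this stack).
waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="mlops-workload-orchestrator")
print("Stack created successfully.")
```

The waiter replaces manually watching the Status column; if creation fails, it raises an error and you can inspect the stack events in the AWS CloudFormation console.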
Note
In addition to the primary AWSMLOpsFrameworkPipelineOrchestration AWS Lambda function, this solution includes the solution-helper Lambda function, which runs only during initial configuration or when resources are updated or deleted.
When you run this solution, you will notice both Lambda functions in the AWS console. Only the AWSMLOpsFrameworkPipelineOrchestration Lambda function is regularly active. However, you must not delete the solution-helper Lambda function, because it is necessary to manage associated resources.
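To confirm that both functions were created, you can list the stack's Lambda resources. This is a minimal sketch using boto3; the stack name and Region are placeholders matching the earlier example, and the physical function names carry CloudFormation-generated prefixes and suffixes, so it matches on resource type rather than exact names.

```python
"""Minimal sketch: list the Lambda functions created by the stack.

Assumes the placeholder stack name and Region used in the earlier deployment
sketch; adjust both to match your deployment.
"""
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

# Walk the stack's resources and print every Lambda function it created.
paginator = cfn.get_paginator("list_stack_resources")
for page in paginator.paginate(StackName="mlops-workload-orchestrator"):
    for resource in page["StackResourceSummaries"]:
        if resource["ResourceType"] == "AWS::Lambda::Function":
            print(resource["LogicalResourceId"], "->", resource["PhysicalResourceId"])
```

You should see both the orchestration function and the solution-helper function in the output; leave both in place, since the helper manages associated resources during updates and deletion.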