This section includes information about an optional feature for collecting unique metrics for this solution, pointers to related resources, and a list of builders who contributed to this solution.
Anonymized data collection
This solution includes an option to send anonymized operational metrics to AWS. We use this data to better understand how customers use this solution and related services and products. When invoked, the following information is collected and sent to AWS:
- Solution ID - The AWS solution identifier
- Version - The AWS solution version
- Unique ID (UUID) - Randomly generated, unique identifier for each MLOps Workload Orchestrator deployment
- Timestamp - Data-collection timestamp
- configBucketProvided - Whether or not an S3 bucket for the MLOps pipeline configuration is provided
- Region - The AWS Region where the solution was deployed
- IsMultiAccount - Which template option was deployed (multi-account or single-account)
- IsDelegatedAccount - Whether an AWS Organizations delegated administrator account or a management account is used to deploy the solution's multi-account deployment option
- UseModelRegistry - Whether the Amazon SageMaker AI model registry is used or not
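For illustration only, the sketch below shows how a payload carrying the fields listed above might be assembled. The endpoint URL, example values, and helper names are assumptions for this sketch, not the solution's actual implementation.

```python
# Illustrative sketch only: field names mirror the metrics listed above;
# the endpoint URL, example values, and structure are assumptions, not the
# solution's actual implementation.
import json
import urllib.request
import uuid
from datetime import datetime, timezone

METRICS_ENDPOINT = "https://example.com/generic-metrics"  # placeholder URL


def build_anonymized_payload() -> dict:
    """Assemble an example payload with the metric fields described above."""
    return {
        "Solution": "SO0000",  # placeholder solution ID
        "Version": "v0.0.0",   # placeholder solution version
        "UUID": str(uuid.uuid4()),  # randomly generated per deployment
        "TimeStamp": datetime.now(timezone.utc).isoformat(),
        "Data": {
            "configBucketProvided": True,
            "Region": "us-east-1",
            "IsMultiAccount": False,
            "IsDelegatedAccount": False,
            "UseModelRegistry": True,
        },
    }


def send_metrics(payload: dict) -> None:
    """POST the JSON payload to the metrics endpoint (not invoked below)."""
    request = urllib.request.Request(
        METRICS_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


if __name__ == "__main__":
    # Print the example payload instead of sending it anywhere.
    print(json.dumps(build_anonymized_payload(), indent=2))
```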
AWS owns the data gathered through this survey. Data collection is subject to the Privacy Notice. To opt out of this feature, complete the following steps before launching the AWS CloudFormation template.
- Download the AWS CloudFormation template to your local hard drive.
- Open the AWS CloudFormation template with a text editor.
- Modify the AWS CloudFormation template mapping section from:

    AnonymizedData:
        SendAnonymizedData:
            Data: Yes

  to:

    AnonymizedData:
        SendAnonymizedData:
            Data: No

- Sign in to the AWS CloudFormation console.
- Select Create stack.
- On the Create stack page, Specify template section, select Upload a template file.
- Under Upload a template file, choose Choose file and select the edited template from your local drive.
- Choose Next and follow the steps in Launch the stack for the relevant deployment option in the Deploy the solution section of this guide.
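As an alternative to the console steps above, the edited template can also be deployed programmatically. The sketch below uses boto3's CloudFormation create_stack call; the stack name, template file name, Region, and parameter keys are assumptions and should be adjusted to your deployment option. Note that TemplateBody accepts an inline template only up to 51,200 bytes; larger templates must be uploaded to Amazon S3 and referenced with TemplateURL.

```python
# Sketch of deploying the locally edited template with boto3 instead of the
# console. The stack name, template file path, Region, and parameters are
# assumptions; adjust them to match your deployment option.
import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

# Assumed local file name for the edited template (SendAnonymizedData: No).
with open("mlops-workload-orchestrator-single-account.template") as template_file:
    template_body = template_file.read()

response = cloudformation.create_stack(
    StackName="mlops-workload-orchestrator",  # assumed stack name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],    # IAM capabilities are typically required
    Parameters=[
        # Supply the same parameters you would enter in the console wizard,
        # for example (parameter key shown is an assumption):
        # {"ParameterKey": "NotificationEmail", "ParameterValue": "you@example.com"},
    ],
)
print("Stack creation started:", response["StackId"])
```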
Related resources
- The Cognizant case study describes how Cognizant built its MLOps Model Lifecycle Orchestrator on top of the MLOps Workload Orchestrator solution to speed deployment of machine learning models from weeks to hours.
Contributors
- Tarek Abdunabi
- Mohsen Ansari
- Zain Kabani
- Dylan Tong