AWS services in this solution

The following AWS services are included in this solution:

| AWS service | Description |
| --- | --- |
| AWS Lambda | Core. Provides the logic for the solution's orchestration component and for ML workload tasks such as triggering model training jobs, creating baseline jobs, and deploying StackSets. |
| Amazon API Gateway | Core. Provides the interface to the solution's orchestration component and to the SageMaker AI real-time inference endpoint. |
| Amazon SageMaker AI | Core. Provides model training, batch inference, real-time inference, and model monitoring. |
| AWS CodePipeline | Core. Automatically deploys the different ML workloads offered by the solution. |
| AWS CloudFormation | Core. Deploys the main solution orchestrator and the different ML workloads, which are built as CloudFormation templates. |
| AWS Identity and Access Management | Core. Provides the scoped permissions required to provision the different ML workloads. |
| AWS CodeBuild | Supporting. Builds custom SageMaker AI algorithms. |
| Amazon Simple Notification Service | Supporting. Notifies the admin user about ML workload status. |
| AWS Systems Manager | Supporting. Provides application-level resource monitoring and visualization of resource operations and cost data. |
| Amazon ECR | Optional. Hosts custom SageMaker AI algorithm images for use by ML workloads. Required only when custom algorithms are used. |
| AWS Key Management Service | Optional. Stores customers' encryption keys when they choose to use their own keys to encrypt model artifacts and the outputs of ML workloads. |
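To illustrate how the core and supporting services in the table interact, the following minimal sketch shows how a Lambda orchestration function might trigger a SageMaker AI training job and then publish the status to an SNS topic. This is not the solution's published code or API; the handler signature, event fields, environment variables, and resource values are assumptions for illustration only.

```python
import os
import boto3

# Illustrative clients; the solution's orchestration Lambda may structure this differently.
sagemaker = boto3.client("sagemaker")
sns = boto3.client("sns")


def handler(event, context):
    """Hypothetical Lambda handler: start a training job, then notify the admin topic."""
    job_name = event["job_name"]  # assumed event field
    sagemaker.create_training_job(
        TrainingJobName=job_name,
        AlgorithmSpecification={
            "TrainingImage": event["training_image"],  # e.g., a custom image hosted in Amazon ECR
            "TrainingInputMode": "File",
        },
        RoleArn=os.environ["SAGEMAKER_ROLE_ARN"],  # assumed environment variable
        InputDataConfig=[
            {
                "ChannelName": "train",
                "DataSource": {
                    "S3DataSource": {
                        "S3DataType": "S3Prefix",
                        "S3Uri": event["training_data_s3_uri"],  # assumed event field
                    }
                },
            }
        ],
        OutputDataConfig={"S3OutputPath": event["output_s3_uri"]},  # assumed event field
        ResourceConfig={
            "InstanceType": "ml.m5.large",
            "InstanceCount": 1,
            "VolumeSizeInGB": 20,
        },
        StoppingCondition={"MaxRuntimeInSeconds": 3600},
    )
    # Notify the admin user about the ML workload status (see the Amazon SNS row above).
    sns.publish(
        TopicArn=os.environ["NOTIFICATION_TOPIC_ARN"],  # assumed environment variable
        Subject="ML workload status",
        Message=f"Training job {job_name} started.",
    )
    return {"statusCode": 200, "body": f"Started {job_name}"}
```

In the deployed solution, permissions for calls like these come from the scoped IAM roles provisioned with each ML workload, and the pipeline that invokes such functions is deployed through CloudFormation and CodePipeline as described in the table.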