
Automatic model tuning with SageMaker AI


Amazon SageMaker AI automatic model tuning (AMT), also known as hyperparameter tuning, finds the best version of a model by running many training jobs on your dataset. To do this, AMT uses the algorithm and the ranges of hyperparameters that you specify. It then chooses the hyperparameter values that create the model that performs best, as measured by a metric that you choose.

For example, suppose that you want to solve a binary classification problem on a marketing dataset. Your goal is to maximize the area under the curve (AUC) metric by training an XGBoost model with Amazon SageMaker AI. You want to find the values of the eta, alpha, min_child_weight, and max_depth hyperparameters that train the best model. Specify a range of values for each of these hyperparameters. SageMaker AI hyperparameter tuning then searches within those ranges for a combination that produces the training job with the highest AUC. To conserve resources or meet a specific model quality expectation, you can set up completion criteria that stop tuning after the criteria have been met.
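As a rough sketch, the XGBoost/AUC example above could be expressed as the `HyperParameterTuningJobConfig` portion of a `CreateHyperParameterTuningJob` API request. The ranges, resource limits, and target metric value below are illustrative assumptions, not recommendations:

```python
# Sketch of a HyperParameterTuningJobConfig for the XGBoost/AUC example.
# All numeric values are illustrative, not tuned recommendations.
tuning_job_config = {
    "Strategy": "Bayesian",
    "HyperParameterTuningJobObjective": {
        "Type": "Maximize",
        "MetricName": "validation:auc",  # metric emitted by built-in XGBoost
    },
    "ResourceLimits": {
        "MaxNumberOfTrainingJobs": 20,
        "MaxParallelTrainingJobs": 3,
    },
    "ParameterRanges": {
        "ContinuousParameterRanges": [
            {"Name": "eta", "MinValue": "0.1", "MaxValue": "0.5"},
            {"Name": "alpha", "MinValue": "0", "MaxValue": "2"},
            {"Name": "min_child_weight", "MinValue": "1", "MaxValue": "10"},
        ],
        "IntegerParameterRanges": [
            {"Name": "max_depth", "MinValue": "1", "MaxValue": "10"},
        ],
    },
    # Optional completion criteria: stop tuning once a target AUC is reached.
    "TuningJobCompletionCriteria": {
        "TargetObjectiveMetricValue": 0.95,  # illustrative target
    },
}
```

Note that the API expresses parameter range bounds as strings, and that completion criteria are optional; omitting them lets tuning run until the resource limits are exhausted.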

You can use SageMaker AI AMT with built-in algorithms, custom algorithms, or SageMaker AI pre-built containers for machine learning frameworks.
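When you bring a custom algorithm or use a framework container, AMT extracts the objective metric from your training logs using regex-based metric definitions. A minimal sketch, where the metric name and the log-line format are assumptions about what a custom training script might print:

```python
import re

# Metric definitions tell AMT how to parse the objective metric from the
# training container's log output. The regex and log format below are
# illustrative assumptions for a hypothetical custom script.
metric_definitions = [
    {"Name": "validation:auc", "Regex": r"validation-auc:([0-9\.]+)"},
]

# A log line the custom training script might emit:
log_line = "epoch 7 validation-auc:0.8731"
match = re.search(metric_definitions[0]["Regex"], log_line)
print(match.group(1))  # -> "0.8731"
```

Built-in algorithms already emit metrics that AMT knows how to parse, so metric definitions are only needed for custom training code.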

SageMaker AI AMT can use Amazon EC2 Spot Instances to optimize costs when running training jobs. For more information, see Managed Spot Training in Amazon SageMaker AI.
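As a sketch, managed Spot training is enabled through a few fields of the `CreateTrainingJob` request. The timeout values are illustrative and the S3 URI is a placeholder:

```python
# Sketch of the CreateTrainingJob fields that enable managed Spot training.
# Values are illustrative; the S3 URI is a placeholder, not a real bucket.
spot_training_settings = {
    "EnableManagedSpotTraining": True,
    "StoppingCondition": {
        "MaxRuntimeInSeconds": 3600,   # cap on actual training time
        "MaxWaitTimeInSeconds": 7200,  # total wait, including Spot interruptions
    },
    # Checkpointing lets an interrupted Spot job resume where it left off.
    "CheckpointConfig": {
        "S3Uri": "s3://example-bucket/checkpoints/",  # placeholder
    },
}

# MaxWaitTimeInSeconds must be at least MaxRuntimeInSeconds.
stop = spot_training_settings["StoppingCondition"]
assert stop["MaxWaitTimeInSeconds"] >= stop["MaxRuntimeInSeconds"]
```

The gap between `MaxWaitTimeInSeconds` and `MaxRuntimeInSeconds` is the extra time you are willing to wait for Spot capacity; a larger gap improves the chance the job completes at Spot pricing.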

Before you start using hyperparameter tuning, you should have a well-defined machine learning problem, including the following:

  • A dataset

  • An understanding of the type of algorithm that you need to train

  • A clear understanding of how you measure success

Prepare your dataset and algorithm so that they work in SageMaker AI and successfully run a training job at least once. For information about setting up and running a training job, see Guide to getting set up with Amazon SageMaker AI.
