Tune Multiple Algorithms with Hyperparameter Optimization to Find the Best Model - Amazon SageMaker

To create a new hyperparameter optimization (HPO) job with Amazon SageMaker that tunes multiple algorithms, you must provide job settings that apply to all of the algorithms to be tested and a training definition for each of these algorithms. You must also specify the resources you want to use for the tuning job.

  • The job settings to configure include warm starting, early stopping, and the tuning strategy. Warm starting and early stopping are available only when tuning a single algorithm.

  • A training job definition specifies the name, algorithm source, objective metric, and, where required, the ranges of values that configure the set of hyperparameter values for each training job. It also configures the channels for data inputs, the data output location, and any checkpoint storage location for each training job, as well as the resources to deploy for each training job, including instance types and counts, managed spot training, and stopping conditions.

  • The tuning job resources to deploy, including the maximum number of training jobs that the tuning job can run concurrently and the maximum total number of training jobs that it can run.
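These pieces map onto the `CreateHyperParameterTuningJob` API: the job settings go in `HyperParameterTuningJobConfig`, and each algorithm gets an entry in `TrainingJobDefinitions`. The following is a minimal sketch of such a request for two algorithms; the job name, role ARN, S3 paths, image URIs, metric names, and hyperparameter ranges are all placeholder assumptions, not values from this page.

```python
# Job-level settings shared by all algorithms: the tuning strategy and
# resource limits. (Warm start and early stopping would also be set at
# this level, but they apply only to single-algorithm tuning jobs.)
tuning_job_config = {
    "Strategy": "Bayesian",
    "ResourceLimits": {
        "MaxNumberOfTrainingJobs": 20,  # total jobs across all definitions
        "MaxParallelTrainingJobs": 4,   # jobs running concurrently
    },
}

def make_definition(name, image_uri, ranges):
    """Build one training job definition: algorithm source, objective
    metric, hyperparameter ranges, data channels, output location,
    compute resources, and stopping condition (all values illustrative)."""
    return {
        "DefinitionName": name,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,  # placeholder container image URI
            "TrainingInputMode": "File",
        },
        "TuningObjective": {"Type": "Maximize", "MetricName": "validation:auc"},
        "HyperParameterRanges": ranges,
        "RoleArn": "arn:aws:iam::111122223333:role/SageMakerRole",  # placeholder
        "InputDataConfig": [{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://amzn-s3-demo-bucket/train/",  # placeholder
                "S3DataDistributionType": "FullyReplicated",
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": "s3://amzn-s3-demo-bucket/output/"},
        "ResourceConfig": {
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }

# One definition per algorithm being tuned.
definitions = [
    make_definition("xgboost", "<xgboost-image-uri>", {
        "ContinuousParameterRanges": [
            {"Name": "eta", "MinValue": "0.01", "MaxValue": "0.3"},
        ],
    }),
    make_definition("linear-learner", "<linear-learner-image-uri>", {
        "ContinuousParameterRanges": [
            {"Name": "learning_rate", "MinValue": "0.001", "MaxValue": "0.1"},
        ],
    }),
]

# Uncomment to submit the job (requires AWS credentials and real values):
# import boto3
# sm = boto3.client("sagemaker")
# sm.create_hyper_parameter_tuning_job(
#     HyperParameterTuningJobName="multi-algo-tuning",
#     HyperParameterTuningJobConfig=tuning_job_config,
#     TrainingJobDefinitions=definitions,
# )
```

Note that `MaxNumberOfTrainingJobs` caps the total across all definitions; SageMaker allocates training jobs among the algorithms as the tuning strategy explores them.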

Get Started

You can create a new hyperparameter tuning job, clone an existing job, and add or edit a job's tags from the console. You can also use the search feature to find jobs by name, creation time, or status. Alternatively, you can manage hyperparameter tuning jobs with the SageMaker API.
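With the API, the same name, creation-time, and status filters are available through `ListHyperParameterTuningJobs`. A minimal sketch, where the name substring and the seven-day window are assumed example values:

```python
from datetime import datetime, timedelta

# Filter parameters for ListHyperParameterTuningJobs: find completed jobs
# whose name contains a substring, created within the last week, newest first.
list_params = {
    "NameContains": "multi-algo",          # assumed example substring
    "StatusEquals": "Completed",
    "CreationTimeAfter": datetime.now() - timedelta(days=7),
    "SortBy": "CreationTime",
    "SortOrder": "Descending",
}

# Uncomment to run against your account (requires AWS credentials):
# import boto3
# sm = boto3.client("sagemaker")
# resp = sm.list_hyper_parameter_tuning_jobs(**list_params)
# for job in resp["HyperParameterTuningJobSummaries"]:
#     print(job["HyperParameterTuningJobName"], job["HyperParameterTuningJobStatus"])
```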