Meta Llama 2 model customization hyperparameters

The Meta Llama 2 13B and 70B models support the following hyperparameters for model customization. For more information, see Customize your model to improve its performance for your use case.

For information about fine-tuning Meta Llama models, see the Meta documentation at https://ai.meta.com/llama/get-started/#fine-tuning.

Note

The epochCount quota is adjustable.

| Hyperparameter (console) | Hyperparameter (API) | Definition | Type | Minimum | Maximum | Default |
| --- | --- | --- | --- | --- | --- | --- |
| Epochs | epochCount | The number of iterations through the entire training dataset | integer | 1 | 10 | 5 |
| Batch size | batchSize | The number of samples processed before updating model parameters | integer | 1 | 1 | 1 |
| Learning rate | learningRate | The rate at which model parameters are updated after each batch | float | 5.00E-6 | 0.1 | 1.00E-4 |
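These hyperparameters are supplied as string values in the hyperParameters field of a CreateModelCustomizationJob request. The following sketch, using the AWS SDK for Python (Boto3), shows one way to submit such a job; the job name, custom model name, role ARN, S3 URIs, and base model identifier are placeholder values for illustration and must be replaced with your own.

```python
# Minimal sketch: submit a fine-tuning job for a Meta Llama 2 model on Amazon
# Bedrock using the hyperparameters described in the table above.
# All names, ARNs, S3 URIs, and the base model identifier below are placeholders.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.create_model_customization_job(
    jobName="llama2-13b-finetune-job",                # placeholder job name
    customModelName="my-llama2-13b-custom",           # placeholder custom model name
    roleArn="arn:aws:iam::111122223333:role/BedrockCustomizationRole",  # placeholder IAM role
    baseModelIdentifier="meta.llama2-13b-v1:0:4k",    # example base model ID; verify in the console
    trainingDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/train.jsonl"},  # placeholder
    outputDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/output/"},        # placeholder
    # Hyperparameter values are passed as strings; keys match the API names in the table.
    hyperParameters={
        "epochCount": "5",         # integer 1-10, default 5
        "batchSize": "1",          # fixed at 1 for Llama 2
        "learningRate": "0.0001",  # float, 5.00E-6 to 0.1, default 1.00E-4
    },
)
print(response["jobArn"])
```

You can then track the job with GetModelCustomizationJob using the returned job ARN.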