# Hyperparameter Tuning

Layer helps you perform **hyperparameter tuning** to find the best version of your model by running multiple distributed training jobs using the search algorithm (Bayesian, Random, Grid, or Manual) and the parameter ranges you provide. Once the training jobs complete, Layer picks the best parameter combination and saves the corresponding model in the **Layer Model Catalog**.

## How Hyperparameter Tuning Works

**Hyperparameter tuning** works by running multiple distributed training runs of your model in a single tuning job. Each run is a complete execution of the `train_model()` function implemented in your model's source code with a sampled parameter combination. Layer keeps track of these executions and finds the best trained model based on the target metric you defined.
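Conceptually, a tuning job behaves like the following random-search sketch. This is only an illustration of the idea, not Layer's actual implementation; the `train_model` function here is a stand-in that returns a fake metric instead of training a real model.

```python
# Conceptual sketch of a tuning job: run train_model() once per sampled
# parameter combination and keep the best-scoring run. Not Layer's internals.
import random

def train_model(n_estimators):
    # Stand-in for a real training function; returns a fake "accuracy"
    # that the real function would compute and log via train.log_metric().
    return 0.5 + 0.4 * (n_estimators / 200)

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {"n_estimators": rng.randint(10, 200)}  # sample a combination
        score = train_model(**params)                    # one complete run
        if score > best_score:                           # maximize the metric
            best_params, best_score = params, score
    return best_params, best_score
```

Each iteration corresponds to one distributed training run; the loop's bookkeeping is what lets the tuner report the best parameter combination at the end.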

Once the **hyperparameter tuning** job is completed, Layer saves the best model in the model catalog. In the model details page, you can view the best version of your model along with its metrics and hyperparameter values.

## Configuration

You can declaratively define your **hyperparameter tuning** configuration in the `model.yml` file under the `training` key.

Let's take a look at a simple tuning configuration:
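A minimal sketch of what such a configuration might look like (the exact field names below are assumptions for illustration, not verified against Layer's schema):

```yaml
# Illustrative model.yml fragment — field names are assumed, not verified.
training:
  tuning:
    strategy: Random          # search algorithm: Manual, Random, Grid, or Bayesian
    maximize: accuracy        # target metric logged with train.log_metric()
    parameters:
      - name: n_estimators
        type: integer
        range:
          min: 10
          max: 200
```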

Here we are trying to `maximize` our `accuracy` metric by tuning the `n_estimators` parameter with a `Random` search strategy.

### Choosing the Strategy

Layer supports four search algorithms to tune your parameters: Manual, Random, Grid, and Bayesian search.

**Manual** search enables you to pass a predefined set of hyperparameter combinations, each of which is tested automatically and evaluated against the target objective.

**Random** search defines a uniform distribution from which sample parameter values are drawn, whereas **Grid** search builds a grid of model hyperparameters and runs the training function in a loop automatically. Keep in mind that these search algorithms are completely **uninformed** by past evaluations, so they can spend time testing suboptimal hyperparameters in later search iterations.

On the other hand, **Bayesian** search draws samples from a probabilistic model mapping hyperparameter values to values of the objective function, basing this model on results from past evaluations.
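Switching between these algorithms comes down to the strategy value in the tuning configuration; a sketch under an assumed schema (field names are illustrative):

```yaml
# Illustrative fragment — field names are assumed, not verified.
training:
  tuning:
    strategy: Bayesian   # informed by past evaluations; Manual, Random, Grid also valid
```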

### Defining the Objective

**Hyperparameter tuning** requires an objective (a target metric) to evaluate the tuning executions. In your model source code, you can calculate and define a metric with `train.log_metric()` and pass this metric as a target to your tuning job.

For example, you can calculate and define the `accuracy` metric like the following:
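A minimal sketch of that step. The `train` object and its `log_metric()` call follow the API named above; the accuracy calculation and function signature are illustrative:

```python
# Illustrative helper for the metric-logging step inside train_model().
# `train` is Layer's training context object (assumed to be passed in).
def evaluate_and_log(train, y_true, y_pred):
    # accuracy = fraction of predictions that match the true labels
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    # Log the metric so Layer can use it as the tuning objective
    train.log_metric("accuracy", accuracy)
    return accuracy
```

The name passed to `log_metric()` ("accuracy" here) is what you later reference as the objective in `model.yml`.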

Then, in the `model.yml`, we can pass this metric as the objective to be maximized. You can also use the `minimize` key to set the objective.
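For instance, the objective might be declared as follows (field layout is illustrative, not verified against Layer's schema):

```yaml
# Illustrative fragment — schema assumed.
training:
  tuning:
    maximize: accuracy
    # or, to minimize a metric instead:
    # minimize: loss
```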

### Defining the Hyperparameters

You can pass the hyperparameters to the tuning job in the `model.yml` file. Hyperparameters must have a type (`float` by default) and a value/range definition. For **Grid search**, you can also pass a `step` parameter.

**Examples:**

Defining an integer parameter with a range of 1-20:
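A sketch under an assumed schema (the parameter name and field layout are illustrative):

```yaml
parameters:
  - name: n_estimators   # illustrative name
    type: integer
    range:
      min: 1
      max: 20
```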

Defining a float parameter with predefined values of 1.2, 5.5 and 10:
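Again under an assumed schema (parameter name and field layout are illustrative):

```yaml
parameters:
  - name: learning_rate  # illustrative name
    type: float
    values: [1.2, 5.5, 10]
```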

Defining an integer parameter with a single value:
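A sketch with the same caveats (name and layout are illustrative):

```yaml
parameters:
  - name: max_depth      # illustrative name
    type: integer
    value: 8
```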

Defining a float parameter with a step value:
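A sketch with the same caveats (name and layout are illustrative); per the section above, `step` applies to Grid search:

```yaml
parameters:
  - name: subsample      # illustrative name
    type: float
    range:
      min: 0.1
      max: 1.0
    step: 0.25           # grid points sampled at this interval
```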