Unable to tune hyperparameters for CatBoostRegressor

I am trying to fit a CatBoostRegressor to my data. When I perform K-fold CV for the baseline model, everything works fine. But when I use Optuna for hyperparameter tuning, something odd happens: it runs the first trial and then throws the following error:

[I 2021-08-26 08:00:56,865] Trial 0 finished with value: 0.7219653113910736 and parameters: {'model__depth': 2, 'model__iterations': 1715, 'model__subsample': 0.5627211605250965, 'model__learning_rate': 0.15601805222619286}. Best is trial 0 with value: 0.7219653113910736.
[W 2021-08-26 08:00:56,869] Trial 1 failed because of the following error: CatBoostError(You can't change params of fitted model.)
Traceback (most recent call last):
I used a similar approach for XGBRegressor and LGBMRegressor and both worked fine. So why am I getting an error for CatBoost?

Below is my Optuna code:

import optuna
import numpy as np
from sklearn.metrics import mean_squared_error

def objective(trial):

    model__depth = trial.suggest_int('model__depth', 2, 10)
    model__iterations = trial.suggest_int('model__iterations', 100, 2000)
    model__subsample = trial.suggest_float('model__subsample', 0.0, 1.0)
    model__learning_rate = trial.suggest_float('model__learning_rate', 0.001, 0.3, log=True)

    params = {'model__depth' : model__depth,
              'model__iterations' : model__iterations,
              'model__subsample' : model__subsample, 
              'model__learning_rate' : model__learning_rate}

    # pipe is a scikit-learn Pipeline whose final step is named 'model'
    pipe.set_params(**params)
    pipe.fit(train_x, train_y)
    pred = pipe.predict(test_x)

    # minimise RMSE on the hold-out set
    return np.sqrt(mean_squared_error(test_y, pred))

cbr_study = optuna.create_study(direction='minimize')
cbr_study.optimize(objective, n_trials=10)

Topic catboost hyperparameter-tuning

Category Data Science


This seems to be an issue with CatBoost itself: once the CatBoostRegressor inside the pipeline has been fitted in the first trial, CatBoost refuses set_params on it, and there is a (now closed) issue about this on GitHub. Consider opening a new issue to let the developers know it still occurs.
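
As a minimal sketch of one possible workaround (not from the original answer): rebuild an unfitted copy of the pipeline each trial with sklearn.base.clone, so set_params is never called on a fitted model. It reuses pipe, train_x/test_x and train_y/test_y from the question; the subsample lower bound is raised to 0.1 here since subsample must be strictly positive.

import optuna
import numpy as np
from sklearn.base import clone
from sklearn.metrics import mean_squared_error

def objective(trial):
    params = {
        'model__depth': trial.suggest_int('model__depth', 2, 10),
        'model__iterations': trial.suggest_int('model__iterations', 100, 2000),
        # subsample = 0.0 would be invalid, so start the range at 0.1
        'model__subsample': trial.suggest_float('model__subsample', 0.1, 1.0),
        'model__learning_rate': trial.suggest_float('model__learning_rate', 0.001, 0.3, log=True),
    }

    # clone() returns an unfitted copy with the same settings, so CatBoost
    # never sees set_params on an already fitted model
    trial_pipe = clone(pipe)
    trial_pipe.set_params(**params)
    trial_pipe.fit(train_x, train_y)
    pred = trial_pipe.predict(test_x)
    return np.sqrt(mean_squared_error(test_y, pred))

cbr_study = optuna.create_study(direction='minimize')
cbr_study.optimize(objective, n_trials=10)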

I tuned CatBoost in the past using the BayesianOptimization class from the bayes_opt package (Bayesian optimization, as the package name says). Find the main part of the code below and a full example here.

import catboost as cb
from bayes_opt import BayesianOptimization

def cbfunc(border_count, l2_leaf_reg, depth, learning_rate):
    params = {
        'eval_metric': 'MAE',  # using MAE here, could also be RMSE or MSE
        'early_stopping_rounds': esrounds,  # defined elsewhere
        'num_boost_round': brounds,         # defined elsewhere
        'use_best_model': True,
        'task_type': "GPU"
    }

    # bayes_opt samples floats, so cast the integer-valued params
    params['border_count'] = int(round(border_count))
    params['l2_leaf_reg'] = l2_leaf_reg
    params['depth'] = int(round(depth))
    params['learning_rate'] = learning_rate

    # Cross validation
    cv_results = cb.cv(cb.Pool(xtrain, ytrain, cat_features=cat_features),
                       params=params, fold_count=3, inverted=False,
                       partition_random_seed=5, shuffle=True,
                       logging_level='Silent')
    # bayes_opt MAXIMISES: in order to minimise MAE, use 1/MAE as the target value
    return 1/cv_results['test-MAE-mean'].min()

pbounds = {
    'border_count': (1, 255),   # integer, 1-255
    'l2_leaf_reg': (0, 20),     # any positive value
    'depth': (1, 16),           # integer, up to 16
    'learning_rate': (0.01, 0.2),
}

optimizer = BayesianOptimization(
    f=cbfunc,
    pbounds=pbounds,
    verbose=2, # verbose = 1 prints only when a maximum is observed, verbose = 0 is silent
    random_state=5
)

optimizer.maximize(
    init_points=2,
    n_iter=500,
)
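
Once optimize finishes, the best result is available via optimizer.max. Remember to undo the 1/MAE transform used in cbfunc and to round the integer-valued parameters again, e.g.:

# optimizer.max holds the best target value and the raw float params
best = optimizer.max
best_mae = 1 / best['target']  # undo the 1/MAE transform from cbfunc
best_params = best['params']
best_params['border_count'] = int(round(best_params['border_count']))
best_params['depth'] = int(round(best_params['depth']))
print(best_mae, best_params)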
