Counting the number of trainable parameters in a gradient boosted tree

I recently ran the gradient boosted tree regressor using scikit-learn via:

from sklearn.ensemble import GradientBoostingRegressor
model = GradientBoostingRegressor()

This model depends on the following hyperparameters:

  • Estimators ($N_1$)
  • Min Samples Leaf ($N_2$)
  • Max Depth ($N_3$)

These in turn determine the number of trainable parameters in the model. My question is: how can I count the number of parameters (trainable or otherwise randomly assigned) that determined the final model, as a function of the above?

My guess is $N_1 \times N_2 \times N_3$, but is this correct?



That product might be the size of your hyperparameter search grid, but it is not the number of trainable parameters of scikit-learn's GradientBoostingRegressor. Note that n_estimators, min_samples_leaf, and max_depth are hyperparameters: they constrain the fit, while the values actually learned from the data are the split features, split thresholds, and leaf predictions stored in each tree.
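To make this concrete, here is a minimal sketch of how you could count what was actually learned (the helper name count_learned_parameters is my own, and the dataset is synthetic). Each internal node of a fitted tree stores one split feature and one threshold, and each leaf stores one predicted value, so a tree of depth $d$ holds at most $2(2^d - 1) + 2^d = 3 \cdot 2^d - 2$ learned values, giving an upper bound of $N_1 (3 \cdot 2^{N_3} - 2)$ for the whole ensemble rather than $N_1 \times N_2 \times N_3$:

from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

def count_learned_parameters(model):
    # One (feature, threshold) pair per internal node,
    # plus one predicted value per leaf, summed over all trees.
    total = 0
    for tree in model.estimators_.ravel():  # one DecisionTreeRegressor per boosting stage
        n_leaves = tree.get_n_leaves()
        n_internal = tree.tree_.node_count - n_leaves
        total += 2 * n_internal + n_leaves
    return total

X, y = make_regression(n_samples=200, n_features=5, random_state=0)  # synthetic data
model = GradientBoostingRegressor(n_estimators=100, max_depth=3,
                                  min_samples_leaf=1).fit(X, y)
print(count_learned_parameters(model))

The exact count depends on the data, because min_samples_leaf and the training sample decide how far each tree actually grows; the formula above is only an upper bound. (Strictly, the default init_ estimator contributes one more learned constant.)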

Here are some more hyperparameters of the GBDT regressor:

  • loss ∈ {'ls', 'lad', 'huber', 'quantile'}
  • learning_rate
  • criterion ∈ {'friedman_mse', 'mse', 'mae'}
  • max_features ∈ {'auto', 'sqrt', 'log2'}

These are only some of the available hyperparameters; the full list is in the documentation: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.GradientBoostingRegressor.html
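As a quick sketch of how these are passed to the constructor (the specific values below are arbitrary illustrations, not recommendations; note that newer scikit-learn releases rename 'ls' to 'squared_error' and 'lad' to 'absolute_error'):

from sklearn.ensemble import GradientBoostingRegressor

# Arbitrary illustrative settings, not tuned values
model = GradientBoostingRegressor(
    loss="huber",              # robust loss, valid across scikit-learn versions
    learning_rate=0.05,
    criterion="friedman_mse",
    max_features="sqrt",
)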
