Does Gradient Boosting perform n-ary splits where n > 2?

I wonder whether algorithms such as GBM, XGBoost, CatBoost, and LightGBM ever split a node in their decision trees into more than two branches. Can a node be split into three or more branches instead of only binary splits? Can more than one feature be used to decide how to split a node? Can a feature be re-used when splitting a descendant node?

Topic: natural-gradient-boosting, catboost, lightgbm, xgboost, gbm

Category: Data Science


Gradient boosting can be applied to any base model, so using a Quinlan-family decision tree (which allows such higher-arity splits for categorical features) as the base learner would make this possible. However, every implementation of gradient-boosted trees that I know of (certainly XGBoost, CatBoost, and LightGBM) uses CART as its tree model, so you won't get anything but binary trees. (These GBMs do modify CART a little, e.g. by using histogram binning to speed up the split search, but nothing as drastic as n-ary splits for categoricals.) As for your other questions: CART splits are axis-aligned, using a single feature per node, and nothing prevents the same feature from being re-used at a descendant node.
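You can check the binary-split claim empirically. A minimal sketch using scikit-learn's `GradientBoostingClassifier` (whose per-stage trees expose the CART structure via the public `tree_` attribute): every node is either a leaf or has exactly two children, and each split tests a single feature.

```python
# Sketch: confirm scikit-learn's gradient-boosted trees use only
# binary (CART-style) splits. Assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_iris(return_X_y=True)
model = GradientBoostingClassifier(n_estimators=5, max_depth=3, random_state=0)
model.fit(X, y)

for stage in model.estimators_:      # one row of trees per boosting stage
    for tree in stage:               # one regression tree per class
        t = tree.tree_
        for node in range(t.node_count):
            left, right = t.children_left[node], t.children_right[node]
            # Each node is a leaf (both children are -1) or an internal
            # node with exactly two children -- never three or more.
            assert (left == -1) == (right == -1)
print("all nodes are leaves or binary splits")
```

XGBoost, CatBoost, and LightGBM expose their trees differently (e.g. dumped model text or DataFrame views), but inspecting them shows the same binary structure.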
