If a feature has already been used for a split, is it unlikely to be selected again in subsequent trees of a Gradient Boosting Tree?
I asked this question here, but it seems no one was interested in it:
If a feature has already been used for a split, is it unlikely to be selected again in subsequent trees of a Gradient Boosting Tree? The question is motivated by the fact that, among heavily correlated features in a single tree, usually only one of them is selected for splitting: once one of them has split, the others carry little remaining uncertainty (information gain). In a Gradient Boosting Tree, does the residual play a role similar to that uncertainty?
More specifically, I am wondering how heavily correlated features affect the feature importances computed by a Gradient Boosting Tree. My guess is that, much like LASSO, the Gradient Boosting Tree will assign importance to only one of the correlated features.
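One way I tried to probe this is a toy experiment: a minimal, self-contained boosting loop with depth-1 stumps and squared error, run on two nearly identical features, counting which feature each round's stump splits on. This is only a sketch of my own (the function names `fit_stump`/`boost` and the tie-breaking behavior are mine, not from any library), not how a real implementation like scikit-learn works:

```python
import random

def fit_stump(X, r):
    """Fit a depth-1 regression stump to residuals r by exhaustive search.
    Returns (feature_index, threshold, left_mean, right_mean)."""
    best = None
    for j in range(len(X[0])):
        vals = sorted(set(row[j] for row in X))
        for a, b in zip(vals, vals[1:]):
            thr = (a + b) / 2
            left = [r[i] for i in range(len(X)) if X[i][j] <= thr]
            right = [r[i] for i in range(len(X)) if X[i][j] > thr]
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - lm) ** 2 for v in left)
                   + sum((v - rm) ** 2 for v in right))
            # Strict < means ties go to the earlier feature index.
            if best is None or sse < best[0]:
                best = (sse, j, thr, lm, rm)
    return best[1:]

def boost(X, y, rounds=20, lr=0.5):
    """Gradient boosting for squared error: each stump fits the
    current residuals; record which feature each stump split on."""
    r = list(y)
    chosen = []
    for _ in range(rounds):
        j, thr, lm, rm = fit_stump(X, r)
        chosen.append(j)
        for i in range(len(X)):
            r[i] -= lr * (lm if X[i][j] <= thr else rm)
    return chosen

random.seed(0)
n = 40
x1 = [i / n for i in range(n)]
x2 = [v + 0.001 * random.random() for v in x1]  # near-duplicate of x1
y = [2 * v for v in x1]                         # target depends only on x1
X = list(zip(x1, x2))
chosen = boost(X, y)
counts = [chosen.count(0), chosen.count(1)]
print(counts)  # how often each feature was chosen across rounds
```

In this setup the perturbation is so small that both features induce the same partitions, so every round is a tie and the tie-break concentrates all splits on one feature, which mirrors the LASSO-like behavior I am asking about. Whether that still holds with realistic correlation (rather than near-duplication) is exactly the open part of the question.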