Why would GradientBoostingClassifier do better than XGBClassifier?
I am working on the Kaggle home loan model and, interestingly enough, GradientBoostingClassifier has a considerably better score than XGBClassifier. At the same time it seems to overfit less. (Note: I am running both algorithms with default settings.) From what I've been reading, XGBClassifier is essentially the same as GradientBoostingClassifier, just much faster and more robust. So I am confused about why XGB would overfit so much more than GradientBoostingClassifier, when it should be the other way around. What would be a good reason why this is happening?
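For context, a minimal sketch of the kind of default-settings comparison described above, run on synthetic data rather than the Kaggle home loan set. The default hyperparameters noted in the comments are for recent versions of scikit-learn and xgboost and are one plausible explanation for the gap: XGBClassifier's defaults (deeper trees, larger learning rate) are more aggressive out of the box.

```python
# Sketch: compare sklearn's GradientBoostingClassifier and xgboost's
# XGBClassifier, both at default settings, on a synthetic binary task.
# Key default differences (recent sklearn / xgboost versions):
#   GradientBoostingClassifier: learning_rate=0.1, max_depth=3
#   XGBClassifier:              learning_rate=0.3, max_depth=6
# Deeper trees plus a larger learning rate can overfit more by default.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

gbc = GradientBoostingClassifier(random_state=0)  # defaults
gbc_auc = cross_val_score(gbc, X, y, cv=3, scoring="roc_auc").mean()
print(f"GradientBoostingClassifier CV AUC: {gbc_auc:.3f}")

try:
    from xgboost import XGBClassifier
    xgb = XGBClassifier(random_state=0)  # defaults
    xgb_auc = cross_val_score(xgb, X, y, cv=3, scoring="roc_auc").mean()
    print(f"XGBClassifier CV AUC: {xgb_auc:.3f}")
except ImportError:
    pass  # xgboost not installed; sklearn part still runs
```

Comparing cross-validated scores like this (rather than train-set scores) makes the overfitting difference visible directly.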
Topic gradient-boosting xgboost python
Category Data Science