Does tuning a Decision Tree's hyperparameters individually and then using it in AdaBoost yield the same results as tuning both simultaneously?
My predicament is as follows. I performed hyperparameter tuning on a standalone Decision Tree classifier and got the best results. Now comes the turn of standalone AdaBoost, and here is where my problem lies: if I take the tuned Decision Tree from earlier, use it as the base_estimator in AdaBoost, and then tune only AdaBoost's hyperparameters, will that yield the same results as tuning an untuned AdaBoost with an untuned Decision Tree base_estimator simultaneously, i.e., searching over the hyperparameters of both AdaBoost and the Decision Tree together in a single grid?
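For concreteness, here is a minimal sketch of the simultaneous approach I mean, using GridSearchCV's double-underscore syntax to route parameters to the inner tree (the dataset and grid values are just illustrative; in scikit-learn < 1.2 the AdaBoost argument is base_estimator rather than estimator, so the prefix would be base_estimator__ instead):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Joint (simultaneous) tuning: "estimator__"-prefixed names go to the
# inner Decision Tree, plain names go to AdaBoost itself.
ada = AdaBoostClassifier(estimator=DecisionTreeClassifier())

param_grid = {
    "estimator__max_depth": [1, 2, 3],      # Decision Tree hyperparameter
    "estimator__min_samples_leaf": [1, 5],  # Decision Tree hyperparameter
    "n_estimators": [50, 100],              # AdaBoost hyperparameter
    "learning_rate": [0.1, 1.0],            # AdaBoost hyperparameter
}

search = GridSearchCV(ada, param_grid, cv=5, n_jobs=-1)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```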
Topic hyperparameter-tuning adaboost gridsearchcv decision-trees scikit-learn
Category Data Science