Are linear models better when dealing with too many features? If so, why?
I had to build a classification model to predict what a user's rating would be from his/her review. (I was working with this dataset: Trip Advisor Hotel Reviews.)
After some preprocessing, I compared the results of a Logistic Regression with a CatBoost Classifier, both with default hyperparameters. The Logistic Regression gave me a better AUC and F1-score.
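For context, here is a minimal sketch of what the comparison looked like (the file name, the TF-IDF step, and the 500-feature cap are illustrative stand-ins for my actual preprocessing, not the exact code I ran):

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from catboost import CatBoostClassifier

# Assumed local copy of the Kaggle dataset with "Review" and "Rating" columns
df = pd.read_csv("tripadvisor_hotel_reviews.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["Review"], df["Rating"],
    test_size=0.2, random_state=42, stratify=df["Rating"],
)

# Sparse TF-IDF features, capped at 500 to match the feature count in question
vec = TfidfVectorizer(max_features=500)
Xtr = vec.fit_transform(X_train)
Xte = vec.transform(X_test)

# Both models with (essentially) default hyperparameters
for model in (LogisticRegression(max_iter=1000), CatBoostClassifier(verbose=0)):
    model.fit(Xtr, y_train)
    proba = model.predict_proba(Xte)          # class probabilities for AUC
    pred = model.predict(Xte).ravel()          # hard labels for F1
    print(
        type(model).__name__,
        "AUC:", roc_auc_score(y_test, proba, multi_class="ovr"),
        "F1:", f1_score(y_test, pred, average="macro"),
    )
```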
I've heard some colleagues say that this happened because linear models are better when dealing with too many features (they mentioned 500). Is that correct? Is there any relation between the number of features and model performance? I think I'm missing something in the theory.