Forecasting: Multiple Linear Regression (OLS) outperforms Random Forests / Gradient Boosting / AdaBoost
I'm applying several forecasting methods to the same dataset to compare their accuracy.
For some reason, multiple linear regression (OLS) is outperforming Random Forests, Gradient Boosting, and AdaBoost on every metric I'm comparing: MAE, RMSE, R², and MAPE. This is very surprising to me.
Is there any general reason that could explain this outperformance?
I know that ML methods don't perform well on datasets with a small number of samples, but that should not be the case here.
I'm a beginner in this area, so I hope this is not a stupid question and somebody is able to help me! Thanks!
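For context, here is a simplified sketch of the kind of comparison I'm running. The data here is synthetic and stands in for my actual dataset, and I'm only showing OLS vs. Random Forest with scikit-learn; my real setup uses more models and metrics:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic data with a mostly linear signal (placeholder for the real dataset).
# On data like this, OLS matches the true generating process, so it tends to
# beat tree ensembles -- one possible explanation for the question above.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "OLS": LinearRegression(),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
}
results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    results[name] = {
        "MAE": mean_absolute_error(y_test, pred),
        "RMSE": mean_squared_error(y_test, pred) ** 0.5,
        "R2": r2_score(y_test, pred),
    }

for name, metrics in results.items():
    print(name, metrics)
```

When the underlying relationship really is close to linear, the comparison above typically comes out in favor of OLS on the held-out set, even with plenty of samples.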
Topic adaboost forecasting regression random-forest
Category Data Science