Do the benefits of ridge regression diminish with larger datasets?
I have a question about the benefits of ridge regression relative to OLS as dataset size grows. Do those benefits disappear when the dataset is large (e.g. 50,000 observations vs. 1,000)? With enough data, wouldn't plain OLS already estimate the coefficients accurately on its own, reducing the need for the penalty term? Ridge regression makes sense when the sample is small and there is scope for high variance in the estimates, but should we expect its intended advantage over OLS to vanish for large datasets?
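To make my intuition concrete, here is a small NumPy simulation I put together (a toy setup of my own choosing: 40 Gaussian features, a noisy linear response, and an arbitrary penalty of alpha = 10, so the exact numbers are illustrative only). It compares average test MSE of OLS and closed-form ridge at a small and a large sample size:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 40            # number of features (chosen so n=60 is a "small" sample)
noise_sd = 5.0    # noise standard deviation
beta_true = rng.normal(size=p)

def ridge_fit(X, y, alpha):
    # Closed-form ridge estimate: (X'X + alpha*I)^{-1} X'y.
    # alpha = 0 reduces to ordinary least squares.
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

def avg_test_mse(n_train, alpha, reps=20, n_test=5_000):
    # Average held-out MSE over several independent train/test draws,
    # so the comparison isn't driven by one lucky sample.
    mses = []
    for _ in range(reps):
        X = rng.normal(size=(n_train, p))
        y = X @ beta_true + rng.normal(scale=noise_sd, size=n_train)
        b = ridge_fit(X, y, alpha)
        Xt = rng.normal(size=(n_test, p))
        yt = Xt @ beta_true + rng.normal(scale=noise_sd, size=n_test)
        mses.append(np.mean((yt - Xt @ b) ** 2))
    return float(np.mean(mses))

results = {}
for n in (60, 50_000):
    results[n] = {
        "ols": avg_test_mse(n, alpha=0.0),
        "ridge": avg_test_mse(n, alpha=10.0),
    }
    print(n, results[n])
```

In runs like this I see ridge clearly beat OLS at n = 60, while at n = 50,000 the two test errors are essentially identical (both near the noise floor), which is what prompted the question.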
Topic ridge-regression regression python
Category Data Science