Do non-parametric models always overfit without regularization?

Let's scope this to just classification.

It's clear that if you fully grow a decision tree with no regularization (e.g. no max depth, no pruning), it will overfit the training data, driving training error down to zero (or to the Bayes error* when identical inputs appear with different labels).

Is this universally true for all non-parametric methods?

*Assuming the model has access to the right features.
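As a concrete illustration of the claim, here is a minimal sketch (the synthetic dataset and scikit-learn estimators are my choice, not part of the question) showing an unregularized tree memorizing its training set while generalizing worse:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# flip_y injects label noise, so perfect test accuracy is impossible
X, y = make_classification(n_samples=1000, n_features=20, flip_y=0.1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# No max_depth, no pruning: the tree grows until every leaf is pure
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(tree.score(X_tr, y_tr))  # 1.0: the training data is fit perfectly
print(tree.score(X_te, y_te))  # noticeably lower: the tree has overfit
```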

Tags: non-parametric, bias, overfitting, regularization, machine-learning



No - "non-parametric" only means that the method does not assume a functional form for the data; it says nothing about overfitting. There are non-parametric methods, such as Random Forest, that do not always overfit. In fact, a non-parametric method can also underfit, i.e. lack the capacity to fit the training data. An example of this is a decision stump (a decision tree with a single split).
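A rough sketch of that underfitting point (dataset and parameters are illustrative assumptions, not from the answer): a depth-1 stump cannot even fit the training set on a problem whose decision boundary needs both features, while a Random Forest with no depth limit still generalizes reasonably.

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single axis-aligned split: non-parametric, yet it underfits
stump = DecisionTreeClassifier(max_depth=1).fit(X_tr, y_tr)
# No depth limit or pruning on the individual trees
forest = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

print(stump.score(X_tr, y_tr))   # well below 1.0: cannot fit the training set
print(forest.score(X_te, y_te))  # high test accuracy despite no explicit regularization
```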
