Do non-parametric models always overfit without regularization?
Let's scope this to just classification.
It's clear that if you fully grow a decision tree with no regularization (e.g., no max-depth limit, no pruning), it will overfit the training data, reaching perfect training accuracy, limited only by the Bayes error when identical inputs carry conflicting labels* (see the sketch below).
Is this universally true for all non-parametric methods?
*Assuming the model has access to the right features.
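For concreteness, here's a minimal sketch of the behaviour I mean, assuming scikit-learn and a synthetic noisy dataset (the dataset, parameters, and random seeds are illustrative, not from any particular problem): an unregularized tree memorizes the training set even though the labels contain noise.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with label noise (flip_y), so the Bayes error is nonzero.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Defaults mean no regularization: max_depth=None and ccp_alpha=0.0,
# so the tree grows until every leaf is pure (or contains identical points).
tree = DecisionTreeClassifier(random_state=0)
tree.fit(X_train, y_train)

print("train accuracy:", tree.score(X_train, y_train))  # ~1.0: memorized
print("test accuracy: ", tree.score(X_test, y_test))    # noticeably lower
```

With continuous features, duplicate inputs are vanishingly rare, so the fully grown tree hits exactly 100% training accuracy while its test accuracy reflects the injected label noise.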
Topic: non-parametric, bias, overfitting, regularization, machine-learning
Category: Data Science