Should the output of regression models, like SVR, be normalized?

I have a regression problem which I solved using SVR. By accident, I normalized my output along with the inputs: for each feature (and for the target) I subtracted the mean and divided by the standard deviation.

Surprisingly, the R² score increased by 10%.

How can one explain the impact of output normalization for SVM regression?

Topic svr regression svm machine-learning

Category Data Science


In regression problems it is customary to normalize the output too, because the scale of the output and of the input features may differ. After getting a prediction from the SVR model you have to undo the normalization: multiply the prediction by the standard deviation and then add the mean, i.e. the inverse of what you did when normalizing.
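A minimal sketch of this round trip, assuming scikit-learn is available; the dataset here is synthetic and purely for illustration:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 2))
y = 100.0 * X[:, 0] + 50.0 * X[:, 1] + 1000.0   # target on a large scale

# Normalize inputs and output separately
x_scaler = StandardScaler()
y_scaler = StandardScaler()
X_norm = x_scaler.fit_transform(X)
y_norm = y_scaler.fit_transform(y.reshape(-1, 1)).ravel()

model = SVR(kernel="rbf")
model.fit(X_norm, y_norm)

pred_norm = model.predict(X_norm)

# Undo the target normalization: multiply by the standard deviation
# FIRST, then add the mean back (the reverse of the forward transform)
pred = pred_norm * y_scaler.scale_[0] + y_scaler.mean_[0]
```

scikit-learn's `TransformedTargetRegressor` (in `sklearn.compose`) can wrap the regressor and apply this inverse transform for you automatically.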

How can one explain the impact of output normalization for SVM regression?

If you normalize your data, you get a cost function that is well behaved, which means the optimizer can find a good solution more easily. The reason is that in a regression problem the model has to construct the output from the input features. It is difficult to produce large output values from small, normalized input features; a normalized output is on the same scale as the inputs, so it is easier to fit and can be learned faster. For SVR in particular, the hyperparameters C and ε are expressed in the units of the target, so the defaults (e.g. C = 1.0 and ε = 0.1 in scikit-learn) are only sensible for a roughly unit-scale output.
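A rough illustration of this scale effect, again assuming scikit-learn and synthetic data: with the default C, the dual coefficients are bounded, so the model cannot express large deviations in an unscaled target, while the same defaults work fine once the target is normalized.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(300, 2))
y = 100.0 * X[:, 0] + 50.0 * X[:, 1]   # wide-range target, values in the hundreds

X_norm = StandardScaler().fit_transform(X)

# Raw target: C=1.0 and epsilon=0.1 are tiny relative to y's scale
r2_raw = r2_score(y, SVR().fit(X_norm, y).predict(X_norm))

# Normalized target: the same defaults are now on a sensible scale
y_scaler = StandardScaler()
y_norm = y_scaler.fit_transform(y.reshape(-1, 1)).ravel()
pred_norm = SVR().fit(X_norm, y_norm).predict(X_norm)
# Map predictions back to the original scale before scoring
pred = pred_norm * y_scaler.scale_[0] + y_scaler.mean_[0]
r2_norm = r2_score(y, pred)
```

On data like this, `r2_norm` comes out clearly higher than `r2_raw`, mirroring the improvement described in the question.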
