Constraining linear regressor parameters in scikit-learn?
I'm using sklearn.linear_model.Ridge
to perform ridge regression and extract the coefficients of a polynomial.
However, some of the coefficients have physical constraints that require them to be negative. Is there a way to impose a constraint on those parameters? I haven't spotted one in the documentation...
As a workaround of sorts, I have tried making many fits with different complexity parameters (see toy code below) and selecting the one whose coefficients satisfy the physical constraint, but this is too unreliable to use in production.
# Preliminaries
import numpy as np
from sklearn.linear_model import Ridge

n_alphas = 2000
alphas = np.logspace(-15, 3, n_alphas)

# Fit one ridge model per complexity parameter and record its test R^2
fits = {}
for alpha in alphas:
    temp_ridge = Ridge(alpha=alpha, fit_intercept=False)
    temp_ridge.fit(indep_training_data, dep_training_data)
    temp_ridge_R2 = temp_ridge.score(indep_test_data, dep_test_data)
    fits[alpha] = [temp_ridge, temp_ridge_R2]
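For concreteness, the selection step I described (keep only fits whose constrained coefficients come out negative, then take the best test score among those) looks roughly like this. The data here is synthetic, standing in for my indep/dep training and test arrays, and the choice of which coefficient indices are constrained is just an illustrative assumption:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-in data: 3 polynomial features, true coefficients
# chosen so the first two are negative (the "physical constraint").
X = rng.uniform(-1, 1, size=(200, 3))
true_coef = np.array([-2.0, -0.5, 1.5])
y = X @ true_coef + 0.01 * rng.normal(size=200)
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

constrained = [0, 1]  # indices that must be negative (assumed for illustration)

best_fit, best_r2 = None, -np.inf
for alpha in np.logspace(-15, 3, 200):
    ridge = Ridge(alpha=alpha, fit_intercept=False)
    ridge.fit(X_train, y_train)
    # Discard any fit that violates the sign constraint
    if np.all(ridge.coef_[constrained] < 0):
        r2 = ridge.score(X_test, y_test)
        if r2 > best_r2:
            best_fit, best_r2 = ridge, r2

print(best_fit.coef_, best_r2)
```

The problem is that nothing guarantees any alpha in the grid yields constraint-satisfying coefficients, which is why this is unreliable in production.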
Is there a way to impose a sign constraint on some of the parameters using ridge regression? Thanks!
Topic ridge-regression linear-regression regression scikit-learn
Category Data Science