MinMaxScaler returned values greater than one
I was looking for a normalization function in sklearn, to prepare features for logistic regression later on. Since my data contains negative values, I chose MinMaxScaler with feature_range=(0, 1) as a parameter:
x = MinMaxScaler(feature_range=(0, 1)).fit_transform(x)
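For reference, a minimal sketch (on made-up toy data, not my actual dataset) of how I expected the scaler to behave: calling fit_transform on a single array should map every column into [0, 1]:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy data with negative values (made up for illustration)
x = np.array([[-5.0, 10.0],
              [ 0.0, 20.0],
              [ 5.0, 30.0]])

# Each column is independently mapped to the [0, 1] interval
scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(x)
print(scaled.min(), scaled.max())  # expected: 0.0 1.0
```

Note that values can fall outside the requested range if the scaler is fit on one array and then used to transform a different one, since the min/max come from the fitting data.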
Then, fitting with sm.Logit, I got an error:
import statsmodels.api as sm
logit_model=sm.Logit(train_data_numeric_final,target)
result=logit_model.fit()
print(result.summary())
ValueError: endog must be in the unit interval.
I presume some of my values are outside the (0, 1) range, which is indeed the case:
np.unique(np.less_equal(train_data_numeric_final.values, 1))
array([False, True])
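To narrow this down, I could also check which columns actually contain out-of-range values (a sketch on a toy DataFrame standing in for train_data_numeric_final):

```python
import pandas as pd

# Toy stand-in for train_data_numeric_final (made up for illustration)
df = pd.DataFrame({"a": [0.1, 0.5, 0.9],
                   "b": [0.2, 1.3, 0.7]})  # column "b" has a value > 1

# Boolean mask of values outside [0, 1], reduced to one flag per column
out_of_range = ((df < 0) | (df > 1)).any()
print(out_of_range[out_of_range].index.tolist())  # columns outside [0, 1]
```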
How can this happen, and how should I proceed?
Topic numerical normalization feature-scaling logistic-regression python
Category Data Science