Multicollinearity vs. perfect multicollinearity in linear regression
I have been trying to understand how multicollinearity among the independent variables affects a linear regression model. The Wikipedia page suggests that only when there is perfect multicollinearity does one of the independent variables have to be removed before training.
My question is: should we remove one of the columns only when the correlation is exactly +/- 1, or should we use a threshold (say 0.90) beyond which the relationship is treated as perfect multicollinearity?
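To make the distinction concrete, here is a minimal NumPy sketch I put together (the variable names and the noise scale are my own illustration): an exactly collinear column makes the design matrix rank-deficient, so ordinary least squares has no unique solution, while a merely highly correlated column leaves the matrix full rank.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)

# Perfect multicollinearity: x2 is an exact linear function of x1,
# so the design matrix loses a rank and X'X becomes singular.
x2_perfect = 2.0 * x1 + 1.0
X_perfect = np.column_stack([np.ones(n), x1, x2_perfect])

# Strong but imperfect collinearity: x2 is x1 plus small noise,
# so the matrix stays full rank and OLS is still computable
# (though the coefficient estimates have inflated variance).
x2_noisy = x1 + rng.normal(scale=0.1, size=n)
X_noisy = np.column_stack([np.ones(n), x1, x2_noisy])

print(np.linalg.matrix_rank(X_perfect))   # rank-deficient: 2 < 3 columns
print(np.linalg.matrix_rank(X_noisy))     # full rank: 3

print(np.corrcoef(x1, x2_perfect)[0, 1])  # correlation of 1 (up to rounding)
print(np.corrcoef(x1, x2_noisy)[0, 1])    # high, but strictly below 1
```

So, as I understand it, only the first case forces a column to be dropped for the math to work at all; in the second case the model is estimable, and the 0.90-style threshold would be a practical judgment call rather than a mathematical necessity. Is that reading correct?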
Topic collinearity linear-regression
Category Data Science