Is there a way to output feature importance based on the predicted class?
I'm running a random forest classifier in Python (two classes). I am using the feature_importances_ attribute of RandomForestClassifier to get the feature importances.
Plotting these gives a nice overview of the overall importances, but it offers no insight into which features were most important for each class. For example, for class 1 some feature values may have been important, whereas for class 2 other feature values may have mattered more.
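Roughly, my setup looks like this (make_classification here is just a stand-in for my real data):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hypothetical two-class data standing in for my real dataset
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           n_classes=2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y)

# Global (class-agnostic) importances -- this is what I currently plot
for i, imp in enumerate(clf.feature_importances_):
    print(f"feature {i}: {imp:.3f}")
```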
Is it possible to split the feature importances based on the predicted class?
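To make the question concrete, the following sketch shows the kind of per-class breakdown I have in mind, using scikit-learn's permutation_importance restricted to the samples predicted as each class. I'm not sure whether this is a sound way to do it, which is part of the question:

```python
import numpy as np
from sklearn.inspection import permutation_importance

# clf, X, y come from the snippet above
pred = clf.predict(X)

# Idea (possibly naive): compute permutation importance separately on the
# samples predicted as each class, to see which features matter per class
for cls in np.unique(pred):
    mask = pred == cls
    result = permutation_importance(clf, X[mask], y[mask],
                                    n_repeats=10, random_state=0)
    top = result.importances_mean.argmax()
    print(f"class {cls}: most important feature = {top}")
```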
Topic explainable-ai random-forest feature-selection python
Category Data Science