Feature Interactions vs Feature Importances
What are the differences between Feature Interactions and Feature Importances?
The most intuitive explanation I found is in this Nature paper: Explainable AI for Trees: From Local Explanations to Global Understanding:
Feature Importance: the contribution of a feature to the final prediction. For a linear regression this would be the coefficient; for a decision tree you have the mean decrease in accuracy or, as the paper proposes, Shapley values (see the SHAP GitHub repository and the sketch after these two definitions).
Feature Interaction: when a model makes a prediction based on two features, we can decompose the prediction into four terms: a constant term, a term for the first feature, a term for the second feature, and a term for the interaction between the two features. The interaction between two features is the change in the prediction that occurs by varying both features together, after accounting for their individual effects.
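As a hedged sketch (not the paper's exact code), SHAP's TreeExplainer can produce both per-feature attributions (importances) and pairwise interaction values for a tree model. The data, model, and parameters below are illustrative assumptions, generated from the toy model discussed further down:

```python
import numpy as np
import shap
import xgboost

# Toy data following y = a + b*x1 + c*x2 + d*x1*x2 (the model used below).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(1000, 2))
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] + 4.0 * X[:, 0] * X[:, 1]

model = xgboost.XGBRegressor(n_estimators=200, max_depth=3).fit(X, y)
explainer = shap.TreeExplainer(model)

# Shapley values: one additive contribution per feature per prediction.
# Averaging their absolute values gives a global feature importance.
shap_values = explainer.shap_values(X)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))

# SHAP interaction values: a matrix per prediction whose off-diagonal
# entries attribute the part of the prediction driven by each feature pair.
interaction_values = explainer.shap_interaction_values(X)
print("mean |interaction| (x1, x2):", np.abs(interaction_values[:, 0, 1]).mean())
```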
Let's say our model is:
y = a + b·x_1 + c·x_2 + d·x_1·x_2
The feature importance terms are b and c (the individual contributions of x_1 and x_2).
The feature interaction is captured by the d·x_1·x_2 term: the effect of x_1 on the prediction depends on the value of x_2, as the sketch below shows.
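To make the decomposition concrete, here is a minimal numeric sketch (coefficient values chosen arbitrarily) showing that the change in prediction from moving x_1 depends on x_2 only through the interaction term:

```python
# Arbitrary illustrative coefficients for y = a + b*x1 + c*x2 + d*x1*x2
a, b, c, d = 1.0, 2.0, 3.0, 4.0

def predict(x1, x2):
    return a + b * x1 + c * x2 + d * x1 * x2

# The effect of increasing x1 by 1 changes with x2 because of the d term:
print(predict(1, 0) - predict(0, 0))  # 2.0 -> just b when x2 = 0
print(predict(1, 1) - predict(0, 1))  # 6.0 -> b + d when x2 = 1
```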
So one is a per-feature contribution and the other is the relationship between the two features. In the following graph you can see that the relationship between Age and Sex changes as Age increases.
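A plot of that kind can be reproduced with SHAP's dependence plot. The dataset and model below are assumptions for illustration (SHAP's bundled census data, which has Age and Sex columns), not the paper's original data:

```python
import shap
import xgboost

# Census income data bundled with SHAP; it includes "Age" and "Sex" columns.
X, y = shap.datasets.adult()
model = xgboost.XGBClassifier(n_estimators=100, max_depth=4).fit(X, y)

shap_values = shap.TreeExplainer(model).shap_values(X)

# Scatter Age against its SHAP value, colored by Sex: if the vertical spread
# between the two colors changes as Age grows, Age and Sex interact.
shap.dependence_plot("Age", shap_values, X, interaction_index="Sex")
```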