Interpretation of SHAP summary plots in a multi-class context
I'm performing multi-class classification and using SHAP values to interpret the features. I have 3 classes. I have tested XGBoost and multinomial logistic regression. When I use XGBoost, I can get a summary plot showing each feature's effect on all three classes, and I can also get a separate plot per class showing how small/large feature values push the prediction towards that class. It seems like this is only possible with XGBoost. Why is that? Is it because XGBoost fits one tree ensemble per class?
See the pictures below for the difference in the summary plots.
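For context, this is roughly how I generate the XGBoost plots (a minimal sketch on a toy dataset, since I can't share my real data; exact output shapes depend on the shap version):

```python
import shap
import xgboost as xgb
from sklearn.datasets import make_classification

# Toy 3-class dataset as a stand-in for my real data
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)

# Multi-class XGBoost model (softprob -> one tree ensemble per class)
model = xgb.XGBClassifier(objective="multi:softprob")
model.fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Depending on the shap version this is either a list of per-class arrays
# or a single (n_samples, n_features, n_classes) array
if isinstance(shap_values, list):
    per_class = shap_values
else:
    per_class = [shap_values[:, :, k] for k in range(shap_values.shape[2])]

# Combined summary plot: mean |SHAP| per feature, split by class
shap.summary_plot(per_class, X)

# Separate beeswarm plot for a single class, e.g. class 0
shap.summary_plot(per_class[0], X)
```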