In "A Unified Approach to Interpreting Model Predictions", the authors define SHAP values "as a unified measure of feature importance". That is, SHAP values are one of many approaches to estimating feature importance.

This e-book provides a good explanation, too:

The goal of SHAP is to explain the prediction of an instance x by computing the contribution of each feature to the prediction. [...] SHAP feature importance is an alternative to permutation feature importance. There is a big difference between both importance measures: Permutation feature importance is based on the decrease in model performance. SHAP is based on magnitude of feature attributions.

In short: SHAP values estimate the impact of a feature on the model's predictions, whereas permutation feature importance estimates the impact of a feature on model performance.
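A minimal sketch of that contrast, assuming the shap and scikit-learn packages are available; the dataset and model here are only illustrative choices:

```python
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# SHAP: per-prediction feature attributions; a global score is the mean
# absolute attribution per feature (magnitude of contribution to predictions).
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap_importance = np.abs(shap_values).mean(axis=0)

# Permutation importance: drop in model performance (here R^2 on held-out data)
# when a feature's values are shuffled, i.e. impact on model fit.
perm = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

for name, s, p in zip(X.columns, shap_importance, perm.importances_mean):
    print(f"{name:>6}  mean|SHAP| = {s:8.3f}   permutation importance = {p:8.3f}")
```

The two rankings often agree broadly but need not match, since one measures attribution magnitude on predictions and the other measures loss in predictive performance.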
