Understanding hierarchical clustering feature importance

I made a hierarchical clustering with scikit-learn:

from sklearn.cluster import AgglomerativeClustering

selected_model = AgglomerativeClustering(n_clusters=8)
hierarchical_clustering8 = selected_model.fit_predict(answers)

This classification was done on the basis of 50 features and led me to 8 clusters.

How can I proceed to determine the importance of each feature in this classification?

My goal is to determine the most important and least important features for each cluster, and to be able to explain each cluster.

Topic agglomerative explainable-ai scikit-learn python clustering

Category Data Science


Since you have a trained estimator, you can take the cluster assignments it produced, treat them as class labels, and train a classification model on them. I would try a Random Forest classifier, which has a built-in feature_importances_ attribute. This attribute reflects the impurity-based information gain each feature contributes across the trees.
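A minimal sketch of that idea, using synthetic data in place of your answers matrix (the make_blobs call, the n_estimators value, and the one-vs-rest loop for per-cluster importances are my assumptions, not part of the original question):

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the 50-feature "answers" matrix from the question
answers, _ = make_blobs(n_samples=400, n_features=50, centers=8, random_state=0)

# Reproduce the clustering from the question
labels = AgglomerativeClustering(n_clusters=8).fit_predict(answers)

# Train a classifier to predict the cluster labels, then read its importances
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(answers, labels)
global_importance = clf.feature_importances_  # one value per feature, sums to 1

# Per-cluster importances: fit a one-vs-rest forest for each cluster, so you
# can see which features separate that specific cluster from all the others
per_cluster = {}
for c in np.unique(labels):
    ovr = RandomForestClassifier(n_estimators=200, random_state=0)
    ovr.fit(answers, labels == c)  # boolean target: "is in cluster c"
    per_cluster[c] = ovr.feature_importances_

# Most and least important features for cluster 0
order = np.argsort(per_cluster[0])[::-1]
print("Top 5 features for cluster 0:", order[:5])
print("Bottom 5 features for cluster 0:", order[-5:])
```

The one-vs-rest loop addresses your goal of explaining each cluster separately: the global importances tell you which features drive the overall partition, while each per-cluster vector tells you which features distinguish that cluster from the rest.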
