SHAP DeepExplainer for an LSTM with Keras' TimeseriesGenerator

I have this data in the form:

X_train shape: (2724, 10), y_train shape: (2724,)
X_test shape: (682, 10), y_test shape: (682,)

which I feed into Keras' TimeseriesGenerator:

from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

window_length = 63
batch_size = 32

train_generator = TimeseriesGenerator(X_train, y_train, length=window_length, sampling_rate=1,
                                      batch_size=batch_size, stride=1)
test_generator = TimeseriesGenerator(X_test, y_test, length=window_length, sampling_rate=1,
                                     batch_size=batch_size, stride=1)

The type of train_generator is <class 'tensorflow.python.keras.preprocessing.sequence.TimeseriesGenerator'>.

Iterating over the generator, each batch looks like this:

for i in range(len(train_generator)):
    x, y = train_generator[i]
    print(x.shape, y.shape)

(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
       ... 
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(32, 63, 10) (32,)
(5, 63, 10) (5,)
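
For reference, the model I train on these generators looks roughly like this (the layer sizes, loss and number of epochs here are placeholders, not my exact setup):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_features = X_train.shape[1]  # 10

# Rough outline of the LSTM trained on the generator
model = Sequential([
    LSTM(32, input_shape=(window_length, n_features)),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')
model.fit(train_generator, epochs=10, validation_data=test_generator)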

I want to use the shap library's DeepExplainer to gain insight into my LSTM model. What is the correct way to pass the train_generator and/or test_generator data into DeepExplainer? When I simply pass train_generator, I get:

    if self.data[0].shape[0] > 5000:
AttributeError: 'TimeseriesGenerator' object has no attribute 'shape'
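
For clarity, the call that triggers this error is simply something along these lines, where model is my trained LSTM:

import shap

# Passing the generator object itself as the background data --
# this is what raises the AttributeError above
explainer = shap.DeepExplainer(model, train_generator)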

The online SHAP documentation gives an example:

explainer = shap.KernelExplainer(model, X.iloc[:50,:])
shap_values = explainer.shap_values(X.iloc[299,:], nsamples=500)
shap.force_plot(explainer.expected_value, shap_values, X_display.iloc[299,:])

My question is: after training my LSTM model, what exactly should I pass to SHAP's DeepExplainer, and how, when my data comes from a TimeseriesGenerator?
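
Is the right approach something like the following sketch, where I first concatenate the generator's batches into a single NumPy array and then hand a slice of it to DeepExplainer? (Here model is the trained LSTM; the slice sizes of 100 and 10 are arbitrary choices, not values I know to be correct.)

import numpy as np
import shap

# Stack every batch from the generators into single 3-D arrays of shape
# (num_sequences, window_length, num_features), e.g. (2661, 63, 10) for training.
X_train_seq = np.concatenate([train_generator[i][0] for i in range(len(train_generator))])
X_test_seq = np.concatenate([test_generator[i][0] for i in range(len(test_generator))])

# Use a subset of the training sequences as the background distribution
# (100 is an arbitrary choice to keep DeepExplainer's runtime manageable).
background = X_train_seq[:100]

explainer = shap.DeepExplainer(model, background)
# Explain the first 10 test sequences; the returned values should have the
# same (samples, window_length, features) shape as the input.
shap_values = explainer.shap_values(X_test_seq[:10])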

Topic shap keras deep-learning time-series python

Category Data Science
