How to visualize attention weights in an attention-based encoder-decoder network for time series forecasting

Below is an example of an attention-based encoder-decoder network for a multivariate time series forecasting task. I want to visualize the attention weights.

from tensorflow.keras.layers import Input, LSTM, RepeatVector, TimeDistributed, Dense
from tensorflow.keras.models import Model

input_ = Input(shape=(TIME_STEPS, N))
x = attention_block(input_)
x = LSTM(512, return_sequences=True)(x)
x = LSTM(512)(x)                          # encoder output collapsed to one vector
x = RepeatVector(n_future)(x)             # repeat the vector for each future step
x = LSTM(128, activation='relu', return_sequences=True)(x)
x = TimeDistributed(Dense(128, activation='relu'))(x)
x = Dense(1)(x)                           # one target value per future step
model = Model(input_, x)
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['acc'])
model.summary()
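
For concreteness, here is a minimal sketch of the data shapes the model expects; I am assuming, purely for illustration, that TIME_STEPS = 30, N = 8, and n_future = 5 were defined before building the model:

import numpy as np

# dummy data used only to illustrate the expected shapes
X = np.random.rand(100, TIME_STEPS, N)   # 100 samples, TIME_STEPS past steps, N features
y = np.random.rand(100, n_future, 1)     # n_future steps of the single target variable

model.fit(X, y, epochs=2, batch_size=16)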

Here is the implementation of my attention block:

from tensorflow.keras.layers import Permute, Dense, multiply

def attention_block(inputs):
    # inputs: (batch, TIME_STEPS, N)
    x = Permute((2, 1))(inputs)                     # -> (batch, N, TIME_STEPS)
    x = Dense(TIME_STEPS, activation='softmax')(x)  # softmax over the time axis
    x = Permute((2, 1), name='attention_prob')(x)   # -> (batch, TIME_STEPS, N)
    x = multiply([inputs, x])                       # reweight inputs by attention
    return x
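
To make the question concrete, this rough sketch is the kind of thing I have in mind: build a second Model that exposes the output of the layer named 'attention_prob' above, run a sample through it, and plot the weights as a heatmap (matplotlib and the dummy X from the sketch above are assumptions for illustration). I am not sure this is the right way to extract the weights, which is why I am asking:

from tensorflow.keras.models import Model
import matplotlib.pyplot as plt

# sub-model that returns the attention weights instead of the forecast
attn_model = Model(inputs=model.input,
                   outputs=model.get_layer('attention_prob').output)
attn = attn_model.predict(X[:1])         # shape: (1, TIME_STEPS, N)

plt.imshow(attn[0].T, aspect='auto', cmap='viridis')
plt.xlabel('time step')
plt.ylabel('input feature')
plt.colorbar(label='attention weight')
plt.show()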

I would highly appreciate it if a fresh implementation of the attention model could be provided.
