How are multi-layer LSTMs interconnected?

I am trying to understand the layers of an LSTM for my own Python implementation. I started with Keras to get familiar with the layer flow.

I tried the code below in Keras and made the following observations.

# LSTM MODEL
from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation

step_size = 3
model = Sequential()
# First LSTM layer: 32 units, input of 2 timesteps with 3 features each
model.add(LSTM(32, input_shape=(2, step_size), return_sequences=True))
# Second LSTM layer: 18 units
model.add(LSTM(18))
# Final regression head
model.add(Dense(1))
model.add(Activation('linear'))

Calling model.summary() gave me the summary details below for this implementation.
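(A minimal sketch of reproducing that summary; the shapes and parameter counts in the comments are what Keras reports for this architecture, assuming the standard per-layer LSTM formula 4 * units * (input_dim + units + 1) and the classic Keras layer naming.)

# Print the layer-by-layer summary
model.summary()
# Expected output, per layer:
#   lstm_1 (LSTM)              (None, 2, 32)   4608 params  -> 4 * 32 * (3 + 32 + 1)
#   lstm_2 (LSTM)              (None, 18)      3672 params  -> 4 * 18 * (32 + 18 + 1)
#   dense_1 (Dense)            (None, 1)       19 params    -> 18 * 1 + 1
#   activation_1 (Activation)  (None, 1)       0 params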

I tried to understand the internal structure of these layers and how the layer sizes (32, 18, 1) map onto weight matrices. I got the matrix sizes below from the weight matrices.

I cannot see any relation between the output sizes and the weight matrix shapes. :(

Matrix Order

Layer 1: (3, 128), (32, 128), (128,)
Layer 2: (32, 72), (18, 72), (72,)
Layer 3: (18, 1), (1,)
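(For reproducibility, this is a minimal sketch of how these shapes can be listed, assuming model is the Sequential model defined above.)

# List the weight-matrix shapes of every layer
for layer in model.layers:
    print(layer.name, [w.shape for w in layer.get_weights()])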

Now I am confused about the implementation logic inside the layers, because I get three weight matrices in each LSTM layer and I cannot see how the first LSTM(32) connects to the next LSTM(18) layer.
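My current guess, which I would like confirmed, is that each Keras LSTM layer stores [kernel, recurrent_kernel, bias], with the second dimension stacking the four gates (input, forget, cell, output), so that 128 = 4 * 32 and 72 = 4 * 18. A quick check of this guess:

# Sanity check: do the shapes follow kernel=(input_dim, 4*units),
# recurrent_kernel=(units, 4*units), bias=(4*units,)?
for layer, (input_dim, units) in zip(model.layers[:2], [(3, 32), (32, 18)]):
    kernel, recurrent, bias = layer.get_weights()
    assert kernel.shape == (input_dim, 4 * units)
    assert recurrent.shape == (units, 4 * units)
    assert bias.shape == (4 * units,)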

I would be thankful for any live implementation or flow diagram of this. Thanks.

Topic stacked-lstm lstm keras neural-network machine-learning

Category Data Science
