Many-to-one LSTM where some sequence values are known at the prediction step
I have a time series problem, which I am modelling with an RNN (using LSTMs).
The input contains a sequence of four correlated values, x_1 to x_4, over the last k positions (where k is a configurable parameter, the length of the sequence), i.e. the input shape is (k, 4). This is a regression problem, where the four (correlated) sequences are mapped to a prediction of the value at the (n+1)th position, y_(n+1). However, this is an unusual problem, in that the values of two of the inputs, x_1(n+1) and x_2(n+1), are already known at position n+1.
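For concreteness, here is a minimal sketch of the arrays involved (numpy; the sample count, k, and variable names are placeholders, not my real data):

```python
import numpy as np

n_samples = 1000  # hypothetical dataset size
k = 10            # sequence length (configurable)

# Past window: k timesteps of the four correlated series x_1..x_4
X_seq = np.random.rand(n_samples, k, 4)   # shape (n_samples, k, 4)

# Values already known at position n+1: x_1(n+1) and x_2(n+1)
X_known = np.random.rand(n_samples, 2)    # shape (n_samples, 2)

# Regression target: y_(n+1)
y = np.random.rand(n_samples, 1)          # shape (n_samples, 1)
```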
Currently I do not use these known values, but I believe that the correlations of x_1(n+1) and x_2(n+1) with y_(n+1) could improve the predictions. However, I don't know how best to incorporate them into my network.
I had the idea of using a multi-branch network, where one branch contains sequential LSTM layers over the (k, 4) input, and the other branch contains dense layers with inputs x_1(n+1) and x_2(n+1), merged before the output (see the sketch below). However, I don't know whether this allows the network to learn the correlations of x_1(n+1) and x_2(n+1) with y_(n+1). In theory, the LSTM branch can learn the correlations among the k-step input sequences of x_1 to x_4; however, I don't know how to make the network associate those correlations with the inputs of the other branch.
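Concretely, something like the following sketch is what I have in mind (Keras functional API; the layer sizes and names are arbitrary choices for illustration, and concatenation is just one possible way to merge the branches):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

k = 10  # sequence length

# Branch 1: LSTM layers over the past window of x_1..x_4
seq_in = layers.Input(shape=(k, 4), name="past_sequence")
h = layers.LSTM(64, return_sequences=True)(seq_in)
h = layers.LSTM(32)(h)  # final hidden state summarising the window

# Branch 2: dense layers over the values known at n+1
known_in = layers.Input(shape=(2,), name="known_future")
g = layers.Dense(16, activation="relu")(known_in)

# Merge the branches; the dense head after the concatenation is where
# interactions between the sequence summary and the known values
# would have to be learned
merged = layers.concatenate([h, g])
z = layers.Dense(32, activation="relu")(merged)
out = layers.Dense(1, name="y_pred")(z)  # regression output for y_(n+1)

model = Model(inputs=[seq_in, known_in], outputs=out)
model.compile(optimizer="adam", loss="mse")
```

Training would then pass both inputs per sample, e.g. model.fit([X_seq, X_known], y, ...).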
I hope this makes sense; I'm happy to rephrase the problem statement if it is confusing. I would appreciate any advice, or links to examples where I could find more information on how to model this type of problem.
Thanks!