How to train an ML model for time series data

I am trying to build a machine learning model in Python, using PyTorch and scikit-learn. My setup is a bit unusual: I have between one and eight input features but many target variables, and the targets are essentially time series. The shape of my target array is 50x169, while my input array ranges from 50x1 to 50x8. I showed three of the target series in the uploaded figure.
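For reference, the arrays I am working with look roughly like this (a minimal sketch with random placeholder data of the same shapes; the real values come from my experiments):

```python
import numpy as np

n_samples = 50      # number of samples / rows
n_features = 8      # between 1 and 8 input features per sample
n_timesteps = 169   # each target is a series of 169 values

# placeholder data with the same shapes as my real arrays
X = np.random.rand(n_samples, n_features)    # shape (50, 8)
y = np.random.rand(n_samples, n_timesteps)   # shape (50, 169)
```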

I used algorithms such as DecisionTreeRegressor and RandomForestRegressor to fit the input variables to the multiple target variables, but the trained models do not extrapolate well. I tried hyperparameter tuning with GridSearchCV, but it did not help. Does anyone know of a model in Python that handles this better? I have seen tutorials on long short-term memory (LSTM) networks, but they usually use historical data to learn the trend and forecast the future, whereas I want to predict the time series from the input feature(s). Thanks in advance for your help and feedback.
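Roughly, this is what I tried with scikit-learn (a minimal sketch; the placeholder X and y above stand in for my real data, and the parameter grid is just an example, not my exact tuning setup):

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# RandomForestRegressor supports multi-output targets directly,
# so all 169 target columns are fitted by the same estimator
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print(search.best_params_)
print(search.score(X_test, y_test))  # R^2 on held-out samples
```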

Topic: pytorch, lstm, scikit-learn, python

Category: Data Science
