How to handle this kind of timeseries

I am using Python and I have a sample dataset of this kind:

columns = ['product_id', 'market_value2015', 'market_value2016',
           'market_value2017', 'market_value2018', 'market_value2019',
           'market_value2020', 'market_value2021', 'retired']

where market_value2015 through market_value2021 are floats and retired is a Boolean.

The objective is to train a neural network that predicts if a product will be retired or not.

Initially, my idea was to use the float variables from 2015 to 2020 to predict retirement in 2021, treating them as a time series, but I got stuck since I've never seen a dataset like this before. I don't know how to make the neural network understand that the float variables carry a notion of time from 2015 to 2020, since they all sit at the same level as ordinary columns. I hope this is clear.

Topic neural-network time-series python

Category Data Science


From a model point of view, there is no "time" concept.

What you are trying to do is use a feature vector $x_t$, for $t$ ranging from 2015 to 2020 (why not 2021 as well?).
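As a sketch of that idea, the wide yearly columns can be stacked into a sequence $x_t$ per product, i.e. an array of shape (n_samples, n_timesteps, n_features) — the layout most sequence models (LSTM, GRU, 1-D CNN) expect. The data here is hypothetical, standing in for the market_value2015–2020 columns:

```python
import numpy as np

# Hypothetical toy data: each row is one product, the 6 columns are
# market_value2015 ... market_value2020 (the retired label would be
# kept separately as the target).
rng = np.random.default_rng(0)
wide = rng.normal(loc=100.0, scale=10.0, size=(4, 6))  # 4 products, 6 years

# Stack the yearly columns into a sequence per product:
# shape (n_samples, n_timesteps, n_features) = (4, 6, 1).
sequences = wide.reshape(4, 6, 1)
print(sequences.shape)  # (4, 6, 1)
```

Column order is what gives the model the time axis, so the yearly columns must be sorted chronologically before reshaping.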

This is not a time series forecasting problem, but rather a time series (binary) classification problem. Each example is a time series with an outcome: retired or not.
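If you would rather keep a plain feed-forward network than switch to a recurrent architecture, one common way to inject the time ordering is to engineer explicit temporal features, e.g. year-over-year differences appended to the raw values. A minimal sketch with hypothetical numbers:

```python
import numpy as np

# Toy market values for 3 products over 2015-2020 (hypothetical numbers).
values = np.array([
    [100., 105., 110., 118., 125., 130.],   # steadily growing
    [100.,  95.,  90.,  80.,  70.,  60.],   # declining
    [100., 100., 101.,  99., 100., 100.],   # roughly flat
])

# Year-over-year differences encode the direction of change explicitly;
# concatenating them with the raw values gives an ordered view of the
# series even to a model that treats its inputs as an unordered vector.
deltas = np.diff(values, axis=1)                      # shape (3, 5)
features = np.concatenate([values, deltas], axis=1)   # shape (3, 11)
print(features.shape)  # (3, 11)
```

The resulting 11-dimensional vectors can be fed to any binary classifier (logistic regression, MLP, ...) with the retired column as the label.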
