Is there a difference between on-line learning, incremental learning and sequential learning?

What I mean is the following: Instead of processing all the training data at once and calculating a model, we process one data point at a time and update the model directly afterwards.

I have seen the terms "on-line (or online) learning" and "incremental learning" for this. Is there a subtle difference? Is one term used more frequently? Or does it depend on the research community?

Edit: The Bishop book (Pattern Recognition and Machine Learning) uses the terms on-line learning and sequential learning as synonyms but does not mention incremental learning.



According to this article, both learning methods (online and incremental) aim to learn (update) a model as the data arrive on the fly, so as to obtain the same model as one learned in a batch setting (i.e. on static data). The difference is that on-line learning updates the model when the training instances arrive sequentially, one by one (1-by-1), whereas incremental learning updates the model when a new batch of data instances arrives. The comparison between on-line learning and incremental learning is listed in Table 1 of that article. It should be noted that every existing on-line learning framework can be used for incremental learning as well, because an on-line learning algorithm can simply process the new batch of data 1-by-1.
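
As a rough illustration of that distinction (my own sketch, not taken from the cited article), the loops below use scikit-learn's `SGDClassifier.partial_fit` to update a model either one instance at a time (on-line) or one batch at a time (incremental); the toy data and the choice of model are purely illustrative:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy labels
classes = np.array([0, 1])               # must be declared up front for partial_fit

# On-line learning: the model sees the training instances strictly one by one.
online_model = SGDClassifier(random_state=0)
for x_i, y_i in zip(X, y):
    online_model.partial_fit(x_i.reshape(1, -1), [y_i], classes=classes)

# Incremental learning: the model is updated whenever a new batch of instances arrives.
incremental_model = SGDClassifier(random_state=0)
for X_batch, y_batch in zip(np.array_split(X, 10), np.array_split(y, 10)):
    incremental_model.partial_fit(X_batch, y_batch, classes=classes)
```

The first loop is the 1-by-1 setting described above; the second processes each arriving block in a single update. As the answer notes, the first loop could also be used to handle each incoming block 1-by-1.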


Although the definitions of the two concepts are fuzzy, there is a slight difference between the online learning and incremental learning approaches.

In the online learning approach, the model is updated to adapt to the new data. It is possible that the model forgets previously learned inferences, which is known as catastrophic interference (or catastrophic forgetting).

In the incremental approach, by contrast, previous inferences are not forgotten even as the model is updated; a sketch of the difference follows below.
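
Here is a minimal sketch of what such forgetting can look like (an illustrative toy example of my own, not from the answer): a linear model is fitted on one "old" task and then repeatedly updated on a conflicting "new" task, after which its accuracy on the old data typically collapses.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
# Two "tasks": old data centred at -2, new data centred at +2,
# labelled by deliberately conflicting rules.
X_old = rng.normal(loc=-2.0, size=(500, 2))
y_old = (X_old[:, 0] > -2.0).astype(int)
X_new = rng.normal(loc=+2.0, size=(500, 2))
y_new = (X_new[:, 0] < 2.0).astype(int)

model = SGDClassifier(random_state=0)
model.partial_fit(X_old, y_old, classes=[0, 1])
acc_before = model.score(X_old, y_old)

# Keep updating only on the new task; the old decision rule gets overwritten.
for _ in range(20):
    model.partial_fit(X_new, y_new)
acc_after = model.score(X_old, y_old)

print(f"accuracy on old data: {acc_before:.2f} -> {acc_after:.2f}")
```

An incremental method in the sense of this answer would add some mechanism (e.g. replaying old data or regularizing the update) so that the earlier decision rule is preserved while the model is updated.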


The following answer is based on the differing opinions of colleagues and professors in the field; I will try to summarize them briefly:

Sequential and online learning are mostly associated with Bayesian updating. Sequential learning is widely used when the data have a temporal order, meaning that $x_1$ always comes first, then $x_2$, then $x_3$, and so on; the dataset has an inherent ordering in that sense. In contrast, incremental learning may receive a whole block of data at one time and another block at a later time, while each block internally may be in random order.
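
As a concrete instance of Bayesian updating in this setting (a Beta-Bernoulli model, chosen here purely for illustration), the posterior can be built up one observation at a time:
$$
p(\theta \mid x_1) \propto p(x_1 \mid \theta)\, p(\theta), \qquad
p(\theta \mid x_{1:t}) \propto p(x_t \mid \theta)\, p(\theta \mid x_{1:t-1}).
$$
With a $\mathrm{Beta}(\alpha, \beta)$ prior on $\theta$ and Bernoulli observations $x_t \in \{0, 1\}$, each single observation updates the posterior to $\mathrm{Beta}(\alpha + x_t,\ \beta + 1 - x_t)$ (the sequential, one-point-at-a-time view), while a block of $n$ observations containing $s$ ones updates it in one step to $\mathrm{Beta}(\alpha + s,\ \beta + n - s)$ (the block-wise, incremental view). For exchangeable data both routes end at the same posterior; only the arrival pattern differs.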

Concerning online learning, people mostly referred to a data stream; hence online learning is always incremental learning, but incremental learning does not have to be online.

These are quite fuzzy definitions, and in my opinion there is no clear-cut definition. I still hope this helps.
