Is it possible to detect drift with real-time predictions?
I have been reading up on detecting data drift and concept drift. I found this library, but it seems all of its methods detect concept drift and take as input whether the prediction was correct or not (i.e. they require ground truth). Is this the correct assumption?
Then I stumbled on Kullback-Leibler divergence and Jensen-Shannon divergence. Can I use these methods to detect data drift in real time? (e.g. a request comes into my model's API and the prediction is made; I then take the features and pass them to a function that calculates the drift.)
One of my concerns is whether I need the full training data to compare against. As I understand it, these algorithms need two samples of the same size to compare, so would I need a dataset the same size as my training data? Even an explanation of what inputs are used to detect data drift vs. concept drift vs. covariate shift would be helpful. A rough sketch of what I have in mind is below.
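This is a minimal sketch of what I'm imagining, assuming SciPy's `jensenshannon` distance, a single feature scaled to [0, 1], and fixed histogram bins built from the training data (the window size and threshold are just placeholders):

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Reference distribution: histogram of one feature from the training data,
# with fixed bin edges so live data can be binned the same way.
BINS = np.linspace(0.0, 1.0, 21)           # assumes the feature is scaled to [0, 1]
train_feature = np.random.rand(10_000)     # placeholder for the real training column
ref_hist, _ = np.histogram(train_feature, bins=BINS)
ref_dist = ref_hist / ref_hist.sum()       # normalize to a probability distribution

window = []                                # rolling buffer of live feature values
WINDOW_SIZE = 500                          # placeholder window size
THRESHOLD = 0.1                            # placeholder drift threshold

def on_prediction_request(feature_value: float) -> None:
    """Called after each prediction; checks drift once the window is full."""
    window.append(feature_value)
    if len(window) < WINDOW_SIZE:
        return
    live_hist, _ = np.histogram(window, bins=BINS)
    live_dist = live_hist / live_hist.sum()
    # Jensen-Shannon distance between the training and live distributions
    js = jensenshannon(ref_dist, live_dist)
    if js > THRESHOLD:
        print(f"Possible data drift: JS distance = {js:.3f}")
    window.clear()                          # start the next window
```

Is comparing binned distributions like this the right way to avoid needing a live sample the same size as the training set, or am I misunderstanding how these divergences are meant to be applied?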
Topic concept-drift data machine-learning
Category Data Science