Detect unusual slope increases

I have a response variable that is generated randomly in a fixed interval [0, 100], one value per second, and I want to detect the event where the newly generated value is significantly greater than the value from the previous second, and send myself an alarm message.

So I calculate the lag-1 difference of the response variable and divide it by the difference in time (the slope), then use bootstrapping to construct a 95% confidence interval for the 90th percentile of the slopes. If a new slope is greater than the upper limit, I define it as unusual.
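For concreteness, here is a minimal NumPy sketch of what I do now (the helper name, the number of resamples, and the defaults are just illustrative):

```python
import numpy as np

def bootstrap_upper_limit(slopes, q=0.90, ci=0.95, n_boot=2000, rng=None):
    """Bootstrap the CI of the q-th quantile of the observed slopes and
    return its upper bound (helper name and defaults are illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    slopes = np.asarray(slopes)
    estimates = np.empty(n_boot)
    for i in range(n_boot):
        resample = rng.choice(slopes, size=slopes.size, replace=True)
        estimates[i] = np.quantile(resample, q)
    return np.quantile(estimates, 0.5 + ci / 2)  # upper end of the 95% CI

# Data arrive once per second, so dt = 1 and the slope is the lag-1 difference.
values = np.random.uniform(0, 100, size=10_000)
slopes = np.diff(values)
upper = bootstrap_upper_limit(slopes)

new_slope = 95.0  # latest one-second change
if new_slope > upper:
    print("alarm: unusual slope", new_slope)
```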

In practice, the quantity of data I have is extremely large, and it updates frequently. So resampling and recalculating the CI would be prohibitively slow, even if it does solve the problem. And sampling seems unnecessary anyway, since I already have a large sample.

Is there a better way to do this? Thanks!

Topic time-series data-mining

Category Data Science


One option is to collect the empirical distribution of lag-1 differences (i.e., how much change happens over one time step?), then set the alarm threshold at a chosen percentile of that empirical distribution.

This is often done with latency monitoring: an unusual latency event is defined as anything beyond the 95th or 99th percentile.
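A minimal sketch of that idea in Python, assuming a fixed window of recent observations and a 99th-percentile cutoff (both the window size and the percentile are illustrative choices):

```python
import numpy as np

WINDOW = 10_000   # how many recent one-second changes to keep (assumption)
PERCENTILE = 99   # alarm cutoff percentile (assumption)

# Stand-in for the recent history of the response variable.
history = np.random.uniform(0, 100, size=WINDOW + 1)
diffs = np.diff(history)                      # empirical lag-1 changes
threshold = np.percentile(diffs, PERCENTILE)  # one cheap pass, no resampling

def is_unusual(prev_value, new_value):
    """True if the one-second increase exceeds the empirical threshold."""
    return (new_value - prev_value) > threshold

print(is_unusual(10.0, 95.0))  # a big jump -> very likely True
```

Since the threshold is a single pass over the window, it is cheap to recompute periodically (say, once a minute) as new data arrives, with no resampling needed.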
