What is the name of this technique involving tracking cumulative errors with a forgiveness parameter?

I'm looking for the name of a technique I've seen used before.

It's most common in time-series anomaly detection. The idea is to keep a running total of consecutive error amounts (generally the difference between an observation and a prediction or baseline) and then react when the cumulative total exceeds a specific tolerance level.

The technique also includes a forgiveness amount that reduces the cumulative error on each iteration, so that many small errors don't eventually stack up and flip the flag.
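To make sure I'm describing it clearly, here's a minimal Python sketch of the mechanism I mean; the function name, parameter names, and default values are just my own illustration:

```python
def detect(errors, tolerance=5.0, forgiveness=0.5):
    """Flag indices where the cumulative error exceeds a tolerance.

    errors: per-step deviations from a prediction or baseline (absolute values).
    forgiveness: fixed amount subtracted each step, so small errors decay away
    instead of stacking up. The running total is floored at zero and reset
    after each alarm.
    """
    cumulative = 0.0
    alarms = []
    for i, e in enumerate(errors):
        # accumulate this step's error, then forgive a fixed amount
        cumulative = max(0.0, cumulative + e - forgiveness)
        if cumulative > tolerance:
            alarms.append(i)
            cumulative = 0.0  # reset after raising the flag
    return alarms
```

With this sketch, a long run of small errors below the forgiveness amount never alarms, while a short burst of large errors does.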

Does anyone know what this method is called, so I can research it further?

Topic: anomaly-detection, definitions

Category: Data Science
