Quantitative measure of the smoothness of learning curves

$\DeclareMathOperator{\loss}{loss}$ $\DeclareMathOperator{\AvgVar}{AvgVar}$

Let's say we have some deep learning task. We have our model and two sets of hyperparameters, $A$ and $B$. We train both systems for 10,000 mini-batches and obtain two learning curves (the losses on these training batches).

Is there any quantitative measure of the smoothness of a learning curve? I have seen articles in which the authors simply overlay two curves to show that one is smoother than the other, but it would obviously be easier to compare them with some measure of smoothness.


Edit: As a first attempt, I simply computed the average variation, analogously to the total variation, i.e. $$\AvgVar_n = \frac{1}{n}\sum_{i=0}^{n-1}|\loss_{i+1} - \loss_i|$$
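For concreteness, here is a minimal sketch of that average-variation measure, assuming the losses are available as a 1-D NumPy array (the `avg_var` name and the synthetic curves are illustrative, not from any library):

```python
import numpy as np

def avg_var(losses) -> float:
    """Mean absolute difference between consecutive losses (average variation)."""
    losses = np.asarray(losses, dtype=float)
    # np.diff gives loss_{i+1} - loss_i; the mean matches the formula above.
    return float(np.mean(np.abs(np.diff(losses))))

# Example: two hypothetical loss curves sharing the same decaying trend,
# one with low noise and one with high noise.
rng = np.random.default_rng(0)
steps = np.arange(10_000)
trend = np.exp(-steps / 3_000)
curve_a = trend + 0.01 * rng.standard_normal(10_000)  # low noise
curve_b = trend + 0.05 * rng.standard_normal(10_000)  # high noise

print(avg_var(curve_a))  # smaller value -> smoother curve
print(avg_var(curve_b))  # larger value  -> rougher curve
```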

Topic: model-evaluations, training, deep-learning

Category: Data Science


Mathematically, smoothness is defined by the existence of continuous derivatives up to some desired order.

The closer a curve comes to having those continuous derivatives, the smoother it is.

In practice, you could cut the learning curve into pieces, apply a polynomial regression to each piece, and compute the error between the polynomial fit and the learning curve to obtain a value for the smoothness (see the sketch below). Of course, the pieces must cover a comparable range of training steps for the smoothness values of different curves to be comparable.
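A rough sketch of that idea, assuming the curve is split into fixed-size windows, a low-order polynomial is fit to each window, and the mean squared residual serves as a roughness score (lower means smoother). The window size and degree here are illustrative choices, not prescribed by any standard:

```python
import numpy as np

def roughness(losses, window: int = 100, degree: int = 2) -> float:
    """Mean squared residual of piecewise polynomial fits to the curve."""
    losses = np.asarray(losses, dtype=float)
    residuals = []
    for start in range(0, len(losses) - window + 1, window):
        piece = losses[start:start + window]
        x = np.arange(window)
        coeffs = np.polyfit(x, piece, degree)  # polynomial regression on this piece
        fit = np.polyval(coeffs, x)
        residuals.append(np.mean((piece - fit) ** 2))
    return float(np.mean(residuals))
```

To compare two curves fairly, call `roughness` with the same `window` and `degree` on curves of the same length; the residuals then measure only the fluctuation around the local trend, not the trend itself.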
