Quantitative measure of the smoothness of learning curves
$\DeclareMathOperator{\loss}{loss}$ $\DeclareMathOperator{\AvgVar}{AvgVar}$
Let's say we have some deep learning task. We have our model and two sets of hyperparameters $A$ and $B$. We train both systems for 10000 mini-batches and obtain two learning curves (the losses on the training batches).
Is there any quantitative measure of the smoothness of a learning curve? I have seen articles where the authors simply overlay the two curves to show that one is smoother than the other, but obviously it would be easier to compare them with some numerical measure of smoothness.
Edit: As a first attempt, I computed the average variation, analogously to the total variation: $$\AvgVar_n = \frac{1}{n}\sum_{i=0}^{n-1}|\loss_{i+1} - \loss_i|$$
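A minimal sketch of this average-variation measure with NumPy (the two example curves are made up for illustration):

```python
import numpy as np

def avg_var(losses):
    """Mean absolute difference between consecutive losses:
    a discrete total variation normalized by the number of steps."""
    losses = np.asarray(losses, dtype=float)
    return np.mean(np.abs(np.diff(losses)))

# Two hypothetical learning curves; a lower AvgVar means a smoother curve.
curve_a = [1.0, 0.8, 0.7, 0.65, 0.6]   # steadily decreasing
curve_b = [1.0, 0.5, 0.9, 0.4, 0.8]    # oscillating
print(avg_var(curve_a))  # 0.1
print(avg_var(curve_b))  # 0.45
```

Note that this measure conflates smoothness with the overall rate of change: a curve that descends quickly but monotonically can score higher than a flat but noisy one, so it may be worth comparing curves only after they reach a similar loss scale.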
Topic: model-evaluations, training, deep-learning
Category: Data Science