Vowpal Wabbit Online Normalization -- Possible to parallelize?
Vowpal Wabbit (VW) uses online normalization as explained here [1].
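To make the question concrete, here is a minimal sketch of what I understand online normalization (the NG algorithm from [1]) to do: keep a running per-feature scale `s_i = max |x_i|` seen so far, rescale old weights when a scale grows, and divide updates by `s_i^2` so learning is invariant to feature units. This is my simplification, not VW's actual code.

```python
import numpy as np

def normalized_sgd(examples, dim, eta=0.1):
    # Simplified sketch of VW-style online normalization (NG algorithm):
    # s_i tracks the max |x_i| seen so far; updates are scaled by 1/s_i^2.
    w = np.zeros(dim)
    s = np.full(dim, 1e-12)   # per-feature max absolute value seen so far
    N = 0.0                   # running sum of normalized squared norms
    for t, (x, y) in enumerate(examples, start=1):
        # When a feature's scale grows, shrink its weight accordingly
        grew = np.abs(x) > s
        w[grew] *= (s[grew] / np.abs(x)[grew]) ** 2
        s = np.maximum(s, np.abs(x))
        N += np.sum((x / s) ** 2)
        g = (w @ x - y) * x                   # squared-loss gradient
        w -= eta * np.sqrt(t / N) * g / s**2  # scale-invariant step
    return w
```

The per-feature state `s` is exactly what makes parallelization non-trivial: each worker observes different data and hence accumulates different scales.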
When running VW with multiple workers, the workers synchronize their models with an AllReduce operation (averaging the weights) at the end of each pass over the data.
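As I understand it, the synchronization step itself is simple; the AllReduce amounts to an elementwise average of the workers' weight vectors, after which every worker continues from the same model (a sketch of the reduction only, not of VW's spanning-tree implementation):

```python
import numpy as np

def allreduce_average(worker_weights):
    # Epoch-end synchronization: elementwise average of every worker's
    # weight vector; all workers resume training from this shared model.
    return np.mean(worker_weights, axis=0)
```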
Is it possible to do this kind of online learning (including the online normalization) with multiple workers in a parameter server setting instead, and is there any code or paper that explores the idea?
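By "parameter server setting" I mean something like the following single-process sketch, where workers asynchronously pull the current weights, compute a local gradient, and push an update back (class and method names here are illustrative, not from VW or any PS library):

```python
import threading
import numpy as np

class ParameterServer:
    # Toy parameter server: one shared weight vector, guarded by a lock.
    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.lock = threading.Lock()

    def pull(self):
        with self.lock:
            return self.w.copy()

    def push(self, delta):
        with self.lock:
            self.w += delta

def worker(ps, shard, eta=0.05):
    for x, y in shard:
        w = ps.pull()              # possibly stale read (Hogwild-style)
        g = (w @ x - y) * x        # squared-loss gradient
        ps.push(-eta * g)

# Demo: two workers fitting y = x @ [1, -2] on disjoint data shards
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = X @ np.array([1.0, -2.0])
ps = ParameterServer(dim=2)
shards = [list(zip(X[i::2], y[i::2])) for i in range(2)]
threads = [threading.Thread(target=worker, args=(ps, s)) for s in shards]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

This plain-SGD version has no per-feature normalization state; the open question for me is how the `s_i` scales would be maintained consistently when updates arrive asynchronously from workers that have each seen different data.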