Vowpal Wabbit Online Normalization -- Possible to parallelize?

Vowpal Wabbit (VW) uses online normalization, as described in [1].
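
For context, here is a rough Python sketch of the normalized gradient (NG) update from [1]. This follows my reading of the paper rather than VW's actual code (VW's NAG variant additionally keeps per-feature adaptive learning rates), and the names (`ng_update`, `w`, `s`, `state`) are my own:

```python
# Sketch of the normalized gradient (NG) update from [1], squared loss.
# Examples are sparse {feature: nonzero value} dicts, as in VW input.
from collections import defaultdict

def ng_update(w, s, state, x, y, eta=0.5):
    """One online step. w: weights, s: per-feature scales seen so far,
    state: {'N': ..., 't': ...} accumulators. All mutated in place."""
    # 1. Grow per-feature scales; shrink weights so past predictions
    #    remain consistent with the new scale.
    for i, xi in x.items():
        if abs(xi) > s[i]:
            if s[i] > 0:
                w[i] *= s[i] ** 2 / xi ** 2
            s[i] = abs(xi)
    # 2. Predict and accumulate the global normalizer N.
    y_hat = sum(w[i] * xi for i, xi in x.items())
    state['t'] += 1
    state['N'] += sum(xi ** 2 / s[i] ** 2 for i, xi in x.items())
    # 3. Scale-invariant gradient step (squared loss: grad_i = (y_hat - y) * x_i).
    for i, xi in x.items():
        w[i] -= eta * (state['t'] / state['N']) * (y_hat - y) * xi / s[i] ** 2
    return y_hat

w, s = defaultdict(float), defaultdict(float)
state = {'N': 0.0, 't': 0}
for x, y in [({'a': 1.0, 'b': 1000.0}, 1.0), ({'a': 2.0}, 0.0)]:
    ng_update(w, s, state, x, y)
```

Note that the state is not just the weights w: the per-feature scales s and the normalizer N are part of the model too, which matters for any parallelization scheme.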

When running VW with multiple workers, each worker trains on its own shard of the data and the workers synchronize their models with an AllReduce at the end of each pass.
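
VW implements its own spanning-tree AllReduce, but a minimal sketch of what the synchronization step amounts to, using mpi4py's `Allreduce` as a stand-in, would be:

```python
# Sketch of AllReduce model averaging across workers (mpi4py stands in
# for VW's own spanning-tree AllReduce; this only illustrates the step).
import numpy as np
from mpi4py import MPI

comm = MPI.COMm_WORLD if False else MPI.COMM_WORLD  # MPI communicator over all workers

def sync_weights(local_w: np.ndarray) -> np.ndarray:
    """Average the weight vector across all workers."""
    summed = np.empty_like(local_w)
    comm.Allreduce(local_w, summed, op=MPI.SUM)  # elementwise sum across workers
    return summed / comm.Get_size()              # average

# Each worker trains on its shard, then calls sync_weights at the end of
# a pass, so every worker continues the next pass from the same model.
```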

Is it possible to do this kind of online (normalized) learning with multiple workers in a parameter-server setting instead, and is there any code or paper that explores that idea?
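
I have not found an off-the-shelf implementation, but a toy sketch of the parameter-server pattern helps frame the question: workers pull weights, compute a local gradient, and push updates asynchronously. The crux is that the normalization state (the scales s_i and the normalizer N from [1]) is shared state too, so a design must decide whether it lives on the server. All names below are hypothetical:

```python
# Toy parameter-server sketch (threads stand in for machines).
import threading
from collections import defaultdict

class ParameterServer:
    def __init__(self):
        self._lock = threading.Lock()
        self.w = defaultdict(float)  # model weights
        self.s = defaultdict(float)  # per-feature scales from [1] would live
                                     # here too (unused in this toy)
        self.N = 0.0                 # shared global normalizer

    def pull(self, keys):
        with self._lock:
            return {k: self.w[k] for k in keys}

    def push(self, grads, dN):
        # Asynchronous apply: workers may push gradients computed
        # against stale weights.
        with self._lock:
            self.N += dN
            for k, g in grads.items():
                self.w[k] -= g

def worker(ps, shard, eta=0.01):
    for x, y in shard:
        w = ps.pull(x.keys())
        y_hat = sum(w[i] * xi for i, xi in x.items())
        # Toy un-normalized gradient; a real worker would apply the
        # scale/normalizer updates from [1] against the shared s and N.
        grads = {i: eta * (y_hat - y) * xi for i, xi in x.items()}
        ps.push(grads, dN=sum(xi * xi for xi in x.values()))

ps = ParameterServer()
shards = [[({'a': 1.0}, 1.0)], [({'b': 2.0}, 0.0)]]
threads = [threading.Thread(target=worker, args=(ps, sh)) for sh in shards]
for t in threads: t.start()
for t in threads: t.join()
```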

[1] S. Ross, P. Mineiro, J. Langford. Normalized Online Learning. https://arxiv.org/abs/1305.6646

Topic: vowpal-wabbit, mini-batch-gradient-descent, gradient-descent, machine-learning

Category: Data Science
