Vowpal Wabbit Online Normalization -- Possible to parallelize?
Vowpal Wabbit (VW) uses online (per-feature) normalization, as explained in [1]. When VW runs with multiple workers, the workers synchronize their models with an AllReduce at the end of each epoch. Is it possible to do this kind of online learning with multiple workers in a parameter server setting instead, and is there any code or paper that explores the idea?

[1] https://arxiv.org/abs/1305.6646
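For concreteness, here is my rough sketch (Python/NumPy, my own reconstruction, not VW's code) of the normalized update described in [1]: each weight is kept consistent with a running max of its feature's magnitude, and the step is scaled per feature by 1/s_i^2 and globally by t/N. The squared loss, the learning rate, and the `merge` method are my own assumptions; `merge` is only meant to illustrate the state an AllReduce or a parameter server would have to combine.

```python
# A rough sketch, not VW's actual code: the squared loss, the learning
# rate, and the merge rule below are my own assumptions for illustration.
import numpy as np


class NormalizedSGD:
    """Per-feature online normalization in the style of [1]."""

    def __init__(self, dim, eta=0.1):
        self.w = np.zeros(dim)  # model weights
        self.s = np.zeros(dim)  # running max of |x_i| per feature
        self.N = 0.0            # accumulated sum of (x_i / s_i)^2
        self.t = 0              # number of examples seen
        self.eta = eta          # base learning rate (assumed)

    def learn(self, x, y):
        self.t += 1
        # When a feature's scale grows, shrink its weight so past
        # updates remain consistent with the new scale.
        grew = np.abs(x) > self.s
        self.w[grew] *= (self.s[grew] / np.abs(x[grew])) ** 2
        self.s[grew] = np.abs(x[grew])
        y_hat = self.w @ x
        seen = self.s > 0
        self.N += np.sum((x[seen] / self.s[seen]) ** 2)
        if self.N > 0:
            g = (y_hat - y) * x  # gradient of squared loss (assumed)
            # Per-feature step scaled by 1/s_i^2, globally by t/N.
            self.w[seen] -= (self.eta * (self.t / self.N)
                             * g[seen] / self.s[seen] ** 2)
        return y_hat

    @staticmethod
    def merge(models):
        # Hypothetical end-of-epoch merge: max of scales, sum of
        # counters, average of weights rescaled to the merged scale.
        s = np.maximum.reduce([m.s for m in models])
        out = NormalizedSGD(len(s), models[0].eta)
        out.s = s
        out.t = sum(m.t for m in models)
        out.N = sum(m.N for m in models)
        safe = np.where(s > 0, s, 1.0)  # avoid division by zero
        out.w = np.mean([m.w * (m.s / safe) ** 2 for m in models], axis=0)
        return out
```

Whether a merge like this (or an asynchronous version of it on a parameter server) preserves the guarantees from [1] is exactly the part I am unsure about.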
Category:
Data Science