Incremental Methods for Additive Convex Cost Optimization
Motivated by machine learning problems over large data sets and by distributed optimization over networks, we consider the problem of minimizing the sum of a large number of convex component functions. We study incremental gradient methods for solving such problems, which process the component functions one at a time. We first consider deterministic cyclic incremental gradient methods, which process the component functions in a fixed cyclic order, and provide new convergence rate results under some assumptions. We then consider a randomized incremental gradient…
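To fix notation, the problem and the incremental update can be written in the standard form below (the symbols $m$, $x_k$, and $\alpha_k$ are assumed here for illustration; they are not taken from the text above):

\[
\min_{x \in \mathbb{R}^n} \; F(x) \;=\; \sum_{i=1}^{m} f_i(x),
\qquad
x_{k+1} \;=\; x_k - \alpha_k \, \nabla f_{i_k}(x_k),
\]

where the cyclic rule takes $i_k = (k \bmod m) + 1$, so each block of $m$ iterations processes every component exactly once, while the randomized variant draws $i_k$ at random from $\{1, \dots, m\}$.

A minimal runnable sketch of the cyclic method, assuming least-squares components $f_i(x) = \tfrac{1}{2}(a_i^\top x - b_i)^2$ and a constant stepsize (both illustrative choices, not the setting of the convergence results mentioned above):

```python
import numpy as np

# Hypothetical instance: least-squares components f_i(x) = 0.5 * (a_i @ x - b_i)**2,
# whose gradients are grad f_i(x) = (a_i @ x - b_i) * a_i.
rng = np.random.default_rng(0)
m, n = 200, 5                    # number of components, problem dimension (illustrative)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

x = np.zeros(n)
alpha = 1e-2                     # constant stepsize; diminishing stepsizes are also common
num_cycles = 100

for _ in range(num_cycles):
    for i in range(m):           # one cycle visits the components in a fixed order
        grad_i = (A[i] @ x - b[i]) * A[i]
        x -= alpha * grad_i      # gradient step using component i alone

print("residual norm:", np.linalg.norm(A @ x - b))
```

Replacing the inner deterministic loop index with a uniformly random draw, e.g. `i = rng.integers(m)` at each step, gives the randomized variant mentioned at the end of the abstract.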