An Analysis of Stochastic Variance Reduced Gradient for Linear Inverse Problems
Authors: Bangti Jin, Zehui Zhou, Jun Zou
Summary: Stochastic variance reduced gradient (SVRG) is a popular variance reduction technique for accelerating stochastic gradient descent (SGD). We provide a first analysis of the method for solving a class of linear inverse problems through the lens of classical regularization theory. We prove that, for a suitable constant step size schedule, the method can achieve an optimal convergence rate in terms of the noise level (under a suitable regularity condition), and that the variance of the SVRG iterate error is smaller than that of SGD. These theoretical findings are corroborated by a set of numerical experiments.
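
As a rough illustration of the algorithm the summary refers to, the following Python sketch implements vanilla SVRG with a constant step size for a discretized linear inverse problem, posed as the least-squares objective min_x (1/2n)||Ax - y||^2 split row-wise into per-sample losses. This is a minimal sketch of the generic SVRG scheme, not the paper's exact setup; the function name and the parameters `eta`, `n_outer`, and `n_inner` are illustrative assumptions.

```python
import numpy as np

def svrg_linear(A, y, eta=1e-3, n_outer=20, n_inner=None, rng=None):
    """Minimal SVRG sketch for min_x (1/(2n)) * ||Ax - y||^2, with the
    objective split into n per-row losses f_i(x) = 0.5 * (a_i^T x - y_i)^2.
    eta is a constant step size; n_outer/n_inner are loop lengths
    (illustrative defaults, not from the paper)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = A.shape
    n_inner = n if n_inner is None else n_inner
    x_tilde = np.zeros(d)                         # snapshot iterate
    for _ in range(n_outer):
        # full gradient at the snapshot, recomputed once per outer loop
        full_grad = A.T @ (A @ x_tilde - y) / n
        x = x_tilde.copy()
        for _ in range(n_inner):
            i = rng.integers(n)                   # sample one data row
            a_i = A[i]
            # variance-reduced gradient: g_i(x) - g_i(x_tilde) + full_grad
            g = a_i * (a_i @ x - y[i]) - a_i * (a_i @ x_tilde - y[i]) + full_grad
            x -= eta * g                          # constant step size update
        x_tilde = x                               # refresh the snapshot
    return x_tilde
```

For noisy data, the iteration count itself plays the role of the regularization parameter, so in practice one would stop the outer loop early (e.g. by a discrepancy-type criterion) rather than run it to convergence.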