World Scientific
Stochastic sub-sampled Newton method with variance reduction

    https://doi.org/10.1142/S0219691319500413 · Cited by: 1 (Source: Crossref)

    Stochastic optimization for large-scale machine learning problems has developed rapidly since stochastic gradient methods with variance reduction techniques were introduced. Several stochastic second-order methods, which approximate curvature information via the Hessian in the stochastic setting, have been proposed as improvements. In this paper, we introduce a Stochastic Sub-Sampled Newton method with Variance Reduction (S2NMVR), which combines the sub-sampled Newton method with the stochastic variance-reduced gradient. For many machine learning problems, the linear-time Hessian-vector product underpins the computational efficiency of S2NMVR. We then develop two variants of S2NMVR that preserve an estimate of the Hessian inverse and reduce the computational cost of Hessian-vector products for nonlinear problems.
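    To illustrate the class of method the abstract describes, the following is a minimal sketch, not the paper's algorithm: an SVRG-style variance-reduced gradient combined with a sub-sampled Newton step, where the Hessian is never formed explicitly and the Newton system is solved by conjugate gradient using linear-time Hessian-vector products. The problem (ℓ2-regularized least squares), all hyperparameters, and the function name `s2nmvr` are illustrative assumptions.

    ```python
    import numpy as np

    def s2nmvr(A, b, w0, lam=1e-3, epochs=5, m=50, batch=32,
               hess_batch=64, cg_iters=10, seed=0):
        """Illustrative sketch (not the paper's exact method): SVRG-style
        variance-reduced gradients plus a sub-sampled Newton step solved
        by CG with linear-time Hessian-vector products, applied to
        l2-regularized least squares: min_w ||Aw - b||^2 / (2n) + lam/2 ||w||^2."""
        rng = np.random.default_rng(seed)
        n, d = A.shape

        def grad(w, idx):
            return A[idx].T @ (A[idx] @ w - b[idx]) / len(idx) + lam * w

        w = w0.copy()
        for _ in range(epochs):
            w_snap = w.copy()
            full_grad = grad(w_snap, np.arange(n))  # SVRG snapshot gradient
            for _ in range(m):
                S = rng.choice(n, batch, replace=False)
                # variance-reduced stochastic gradient
                g = grad(w, S) - grad(w_snap, S) + full_grad
                # sub-sampled Hessian-vector product, computed in O(|H| d)
                # without materializing the d x d Hessian:
                # H v = A_H^T (A_H v) / |H| + lam v
                H = rng.choice(n, hess_batch, replace=False)
                hvp = lambda v: A[H].T @ (A[H] @ v) / hess_batch + lam * v
                # approximately solve (sub-sampled Hessian) p = g by CG
                p = np.zeros(d)
                r = g.copy()
                q = r.copy()
                rs = r @ r
                for _ in range(cg_iters):
                    Hq = hvp(q)
                    alpha = rs / (q @ Hq)
                    p += alpha * q
                    r -= alpha * Hq
                    rs_new = r @ r
                    if rs_new < 1e-12:
                        break
                    q = r + (rs_new / rs) * q
                    rs = rs_new
                w -= p  # Newton-type update with the approximate step
        return w
    ```

    Because each Hessian-vector product costs only O(|H| d), this Hessian-free structure is what makes second-order information affordable at scale; the paper's variants further reduce this cost for nonlinear problems.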

    AMSC: 62H35, 68T01