A stochastic semismooth Newton method for nonsmooth nonconvex optimization

Andre Milzarek, Xiantao Xiao, Shicong Cen, Zaiwen Wen, Michael Ulbrich

Publication: Contribution to journal › Article › peer-review

21 citations (Scopus)

Abstract

In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems whose objective function is the sum of a smooth nonconvex term and a nonsmooth convex term. We assume that only noisy gradient and Hessian information of the smooth part of the objective function is available, via calls to stochastic first- and second-order oracles. The proposed method can be seen as a hybrid approach combining stochastic semismooth Newton steps and stochastic proximal gradient steps. Two inexact growth conditions are incorporated to monitor the convergence and the acceptance of the semismooth Newton steps, and it is shown that the algorithm converges globally to stationary points in expectation and almost surely. We present numerical results and comparisons on ℓ1-regularized logistic regression and nonconvex binary classification that demonstrate the efficiency of the algorithm.
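The stochastic proximal gradient step named in the abstract can be illustrated on the paper's ℓ1-regularized logistic regression test problem. The following is a minimal sketch, not the authors' implementation: the mini-batch sampling scheme, the fixed step size, and all function names are assumptions made here for illustration.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def stochastic_prox_grad_step(w, X, y, lam, step, batch, rng):
    # One stochastic proximal gradient step for
    #   min_w  (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)) + lam * ||w||_1,
    # with the smooth gradient estimated on a random mini-batch
    # (a stand-in for the stochastic first-order oracle).
    idx = rng.choice(X.shape[0], size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    # Mini-batch gradient of the logistic loss.
    z = yb * (Xb @ w)
    grad = -(Xb.T @ (yb / (1.0 + np.exp(z)))) / batch
    # Forward (gradient) step, then the prox of the nonsmooth ℓ1 term.
    return soft_threshold(w - step * grad, step * lam)
```

In the paper's hybrid scheme, steps of this type serve as the globally convergent fallback, while semismooth Newton steps (accepted via the inexact growth conditions) provide fast local convergence; the sketch above shows only the proximal gradient component.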

Original language: English
Pages (from-to): 2916-2948
Number of pages: 33
Journal: SIAM Journal on Optimization
Volume: 29
Issue number: 4
DOIs
Publication status: Published - 2019
