Fast incremental method for smooth nonconvex optimization

Sashank J. Reddi, Suvrit Sra, Barnabás Póczos, Alex Smola

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-reviewed

44 citations (Scopus)

Abstract

We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x Σ_i f_i(x). Specifically, we analyze the SAGA algorithm within an incremental first-order oracle (IFO) framework and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss a special class of nonconvex problems, introduced by Polyak, for which SAGA converges at a linear rate to the global optimum. Finally, we analyze the practically valuable regularized and minibatch variants of SAGA. To our knowledge, this paper presents the first analysis of fast convergence for an incremental aggregated gradient method for nonconvex problems.
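
Since the abstract centers on SAGA's incremental aggregated gradient update, a minimal sketch may help fix ideas. The NumPy code below is an illustration under assumptions, not the algorithm exactly as analyzed in the paper: the function name saga, its signature, the default step size, and the fixed iteration budget are all inventions of this sketch.

```python
import numpy as np

def saga(grad_fi, x0, n, step_size=1e-3, num_iters=10_000, seed=0):
    """Sketch of the SAGA update for min_x (1/n) * sum_i f_i(x).

    grad_fi(i, x) returns the gradient of the i-th smooth
    component f_i at x. All names and defaults are illustrative.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    # Table holding the last gradient evaluated for each component i.
    table = np.stack([grad_fi(i, x) for i in range(n)])
    table_avg = table.mean(axis=0)
    for _ in range(num_iters):
        j = rng.integers(n)
        g_new = grad_fi(j, x)
        # Unbiased, variance-reduced estimate of the full gradient:
        # E[v] = (1/n) * sum_i grad f_i(x).
        v = g_new - table[j] + table_avg
        x -= step_size * v
        # Refresh the stored gradient and its running mean in O(d).
        table_avg += (g_new - table[j]) / n
        table[j] = g_new
    return x
```

A constant step size is used above because the paper's nonconvex analysis works with a fixed step size scaling inversely with the component smoothness constant and a power of n; the numeric default here is only a placeholder to be tuned per problem.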

Original language: English
Title: 2016 IEEE 55th Conference on Decision and Control, CDC 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1971-1977
Number of pages: 7
ISBN (electronic): 9781509018376
DOIs
Publication status: Published - 27 Dec 2016
Externally published: Yes
Event: 55th IEEE Conference on Decision and Control, CDC 2016 - Las Vegas, United States
Duration: 12 Dec 2016 → 14 Dec 2016

Publication series

Name: 2016 IEEE 55th Conference on Decision and Control, CDC 2016

Conference

Conference: 55th IEEE Conference on Decision and Control, CDC 2016
Country/Territory: United States
City: Las Vegas
Period: 12/12/16 → 14/12/16
