Fast incremental method for smooth nonconvex optimization

Sashank J. Reddi, Suvrit Sra, Barnabás Póczos, Alex Smola

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

38 Scopus citations

Abstract

We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x ∑_i f_i(x). Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss a special class of nonconvex problems satisfying Polyak's condition, for which SAGA converges at a linear rate to the global optimum. Finally, we analyze the practically valuable regularized and minibatch variants of SAGA. To our knowledge, this paper presents the first analysis of fast convergence for an incremental aggregated gradient method for nonconvex problems.
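The abstract centers on the SAGA update: keep a table of the most recently evaluated per-component gradients, and correct each stochastic gradient with the table entry and the table's running mean so the estimate stays unbiased with shrinking variance. A minimal sketch of that update, on an illustrative scalar least-squares sum (the function choice, step size, and iteration count are assumptions for demonstration, not from the paper):

```python
import random

def saga(grads, x0, step, iters, seed=0):
    """SAGA-style update for minimizing f(x) = sum_i f_i(x), scalar x.

    grads: list of per-component gradient functions f_i'.
    """
    rng = random.Random(seed)
    n = len(grads)
    x = x0
    table = [g(x0) for g in grads]   # stored past gradients alpha_i
    avg = sum(table) / n             # running mean of the table
    for _ in range(iters):
        j = rng.randrange(n)
        g_new = grads[j](x)
        # Unbiased, variance-reduced gradient estimate:
        # grad_f_j(x) - alpha_j + mean(alpha)
        x -= step * (g_new - table[j] + avg)
        avg += (g_new - table[j]) / n  # keep the mean in sync
        table[j] = g_new
    return x

# Illustrative components: f_i(x) = 0.5 * (x - a_i)^2,
# so the minimizer of the sum is mean(a) = 2.5.
a = [1.0, 2.0, 3.0, 4.0]
grads = [lambda x, ai=ai: x - ai for ai in a]
x_star = saga(grads, x0=0.0, step=0.1, iters=2000)
```

This toy instance is convex, so `x_star` lands near the global minimizer 2.5; the paper's contribution is showing what the same update achieves on nonconvex sums (stationary points, and linear rates under Polyak's condition).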

Original language: English
Title of host publication: 2016 IEEE 55th Conference on Decision and Control, CDC 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1971-1977
Number of pages: 7
ISBN (Electronic): 9781509018376
DOIs
State: Published - 27 Dec 2016
Externally published: Yes
Event: 55th IEEE Conference on Decision and Control, CDC 2016 - Las Vegas, United States
Duration: 12 Dec 2016 - 14 Dec 2016

Publication series

Name: 2016 IEEE 55th Conference on Decision and Control, CDC 2016

Conference

Conference: 55th IEEE Conference on Decision and Control, CDC 2016
Country/Territory: United States
City: Las Vegas
Period: 12/12/16 - 14/12/16
