
Understanding Nesterov’s Acceleration via Proximal Point Method

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

The proximal point method (PPM) is a fundamental method in optimization that is often used as a building block for designing optimization algorithms. In this work, we use the PPM to provide conceptually simple derivations, along with convergence analyses, of different versions of Nesterov’s accelerated gradient method (AGM). The key observation is that AGM is a simple approximation of PPM, which results in an elementary derivation of the update equations and stepsizes of AGM. This view also leads to a transparent and conceptually simple analysis of AGM’s convergence by using the analysis of PPM. The derivations also naturally extend to the strongly convex case. Ultimately, the results presented in this paper are of both didactic and conceptual value; they unify and explain existing variants of AGM while motivating other accelerated methods for practically relevant settings.
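The abstract's key observation can be made concrete with a standard sketch (the symbols $f$ for the objective and $\lambda > 0$ for the stepsize are notational assumptions, not taken from this record): the PPM update minimizes the objective plus a quadratic penalty on movement, and replacing $f$ by its first-order expansion around the current iterate recovers the plain gradient step; the paper's point is that finer approximations of the same update yield AGM.

```latex
% Proximal point method (PPM) update with stepsize \lambda > 0:
\[
x_{k+1} = \operatorname*{arg\,min}_{x}
  \Big\{ f(x) + \tfrac{1}{2\lambda}\,\|x - x_k\|^2 \Big\}.
\]
% Linearizing f around x_k,
%   f(x) \approx f(x_k) + \langle \nabla f(x_k),\, x - x_k \rangle,
% and minimizing the resulting quadratic in x gives the gradient step:
\[
x_{k+1} = x_k - \lambda \nabla f(x_k).
\]
```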

Original language: English
Title of host publication: SIAM Symposium on Simplicity in Algorithms, SOSA 2022
Publisher: Society for Industrial and Applied Mathematics Publications
Pages: 117-130
Number of pages: 14
ISBN (Electronic): 9781713852087
DOIs
State: Published - 2022
Externally published: Yes
Event: 5th SIAM Symposium on Simplicity in Algorithms, SOSA 2022, co-located with SODA 2022 - Virtual, Online
Duration: 10 Jan 2022 to 11 Jan 2022

Publication series

Name: SIAM Symposium on Simplicity in Algorithms, SOSA 2022

Conference

Conference: 5th SIAM Symposium on Simplicity in Algorithms, SOSA 2022, co-located with SODA 2022
City: Virtual, Online
Period: 10/01/22 to 11/01/22
