From Nesterov’s Estimate Sequence to Riemannian Acceleration

Kwangjun Ahn, Suvrit Sra

Research output: Contribution to journal › Conference article › peer-review

34 Scopus citations

Abstract

We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
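
For context, here is a minimal sketch of the classical (Euclidean) estimate-sequence notion that the abstract refers to; this is the standard definition from Nesterov's work, not the paper's Riemannian construction. A pair of sequences \(\{\phi_k\}\) and \(\{\lambda_k\}\), with \(\lambda_k \ge 0\) and \(\lambda_k \to 0\), is an estimate sequence for \(f\) if, for all \(x\) and all \(k \ge 0\),

\[
\phi_k(x) \le (1 - \lambda_k)\, f(x) + \lambda_k\, \phi_0(x).
\]

If the iterates additionally satisfy \(f(x_k) \le \min_x \phi_k(x)\), then

\[
f(x_k) - f(x^*) \le \lambda_k \bigl( \phi_0(x^*) - f(x^*) \bigr),
\]

so the convergence rate is governed by how quickly \(\lambda_k \to 0\); accelerated methods drive \(\lambda_k\) to zero at the optimal rate (e.g. \(O(1/k^2)\) in the smooth convex case).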

Original language: English
Pages (from-to): 84-118
Number of pages: 35
Journal: Proceedings of Machine Learning Research
Volume: 125
State: Published - 2020
Externally published: Yes
Event: 33rd Conference on Learning Theory, COLT 2020 - Virtual, Online, Austria
Duration: 9 Jul 2020 - 12 Jul 2020
