Abstract
We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov’s estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into “metric distortion.” We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
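As context for the acceleration the abstract refers to, here is a minimal sketch of the classical (Euclidean) Nesterov accelerated gradient method that the paper generalizes. This is the standard textbook scheme for L-smooth, μ-strongly convex objectives, not the paper's Riemannian algorithm; the function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, iters=100):
    """Constant-momentum Nesterov acceleration for an L-smooth,
    mu-strongly convex objective (classical Euclidean version;
    the Riemannian extension is the paper's contribution)."""
    kappa = L / mu
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum weight
    x, y = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L          # gradient step at the lookahead point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Illustrative example: minimize f(x) = 0.5 * x^T A x with A = diag(1, 10),
# so L = 10, mu = 1, and the unique minimizer is the origin.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_star = nesterov_agd(grad, np.array([5.0, 5.0]), L=10.0, mu=1.0, iters=200)
```

The accelerated rate depends on the square root of the condition number κ = L/μ, versus κ itself for plain gradient descent; preserving this rate globally on a Riemannian manifold, despite metric distortion, is the difficulty the paper addresses.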
| Original language | English |
| --- | --- |
| Pages (from-to) | 84-118 |
| Number of pages | 35 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 125 |
| State | Published - 2020 |
| Externally published | Yes |
| Event | 33rd Conference on Learning Theory, COLT 2020 - Virtual, Online, Austria. Duration: 9 Jul 2020 → 12 Jul 2020 |