A nonconvex proximal splitting algorithm under Moreau-Yosida regularization

Emanuel Laude, Tao Wu, Daniel Cremers

Publication: Conference contribution › Paper › Peer-reviewed

4 citations (Scopus)

Abstract

We tackle highly nonconvex, nonsmooth composite optimization problems whose objectives comprise a Moreau-Yosida regularized term. Classical nonconvex proximal splitting algorithms, such as nonconvex ADMM, suffer from lack of convergence for such a problem class. To overcome this difficulty, in this work we consider a lifted variant of the Moreau-Yosida regularized model and propose a novel multiblock primal-dual algorithm that intrinsically stabilizes the dual block. We provide a complete convergence analysis of our algorithm and identify respective optimality qualifications under which stationarity of the original model is retrieved at convergence. Numerically, we demonstrate the relevance of Moreau-Yosida regularized models and the efficiency of our algorithm on robust regression as well as joint feature selection and semi-supervised learning.
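The abstract centers on objectives containing a Moreau-Yosida regularized term. As a minimal, self-contained illustration (not the paper's lifted multiblock algorithm), the sketch below numerically evaluates the standard Moreau-Yosida envelope M_λf(x) = min_z f(z) + ‖x − z‖²/(2λ) on a 1-D grid and checks it against the known closed form for f(z) = |z|, the Huber function; the function names and grid resolution are illustrative choices, not from the paper.

```python
import numpy as np

def moreau_envelope(f, x, lam=1.0, grid=None):
    """Numerically evaluate the Moreau-Yosida envelope
    M_lam f(x) = min_z f(z) + (x - z)^2 / (2 * lam)
    by minimizing over a dense grid of candidate points z."""
    if grid is None:
        grid = np.linspace(x - 10.0, x + 10.0, 200001)
    return np.min(f(grid) + (x - grid) ** 2 / (2.0 * lam))

def huber(x, lam=1.0):
    """Closed-form Moreau envelope of f(z) = |z|: the Huber function."""
    return x ** 2 / (2.0 * lam) if abs(x) <= lam else abs(x) - lam / 2.0

# The envelope is a smooth lower approximation of f; for f = |.| it
# matches Huber to within the grid-discretization error.
for x in [-3.0, 0.5, 2.0]:
    assert abs(moreau_envelope(np.abs, x, lam=1.0) - huber(x, lam=1.0)) < 1e-3
```

The envelope is differentiable even when f is not, which is why Moreau-Yosida regularized models appear in the robust-regression experiments the abstract mentions; the paper's contribution is a primal-dual scheme that remains convergent when such terms make classical nonconvex ADMM fail.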

Original language: English
Pages: 491-499
Number of pages: 9
Publication status: Published - 2018
Event: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain
Duration: 9 Apr 2018 - 11 Apr 2018


