Fast Newton-type methods for total variation regularization

Álvaro Barbero, Suvrit Sra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

63 Scopus citations

Abstract

Numerous applications in statistics, signal processing, and machine learning regularize using Total Variation (TV) penalties. We study anisotropic (ℓ1-based) TV and also a related ℓ2-norm variant. For both variants we consider the associated (1D) proximity operators, which lead to challenging optimization problems. We solve these problems by developing Newton-type methods that outperform the state-of-the-art algorithms. More importantly, our 1D-TV algorithms serve as building blocks for solving the harder task of computing 2D (and higher-dimensional) TV proximity. We illustrate the computational benefits of our methods by applying them to several applications: (i) image denoising; (ii) image deconvolution (by plugging our TV solvers into publicly available software); and (iii) four variants of fused-lasso. The results show large speedups; to support our claims, we provide software accompanying this paper.
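For orientation, here is a minimal sketch of the 1D anisotropic TV proximity problem that the abstract refers to, written in standard notation; the symbols y, x, λ, and n are illustrative and do not appear in the record itself:

\[
  \operatorname{prox}_{\lambda \mathrm{TV}}(y)
  = \operatorname*{arg\,min}_{x \in \mathbb{R}^{n}}
  \tfrac{1}{2} \lVert x - y \rVert_{2}^{2}
  + \lambda \sum_{i=1}^{n-1} \lvert x_{i+1} - x_{i} \rvert .
\]

The related ℓ2-norm variant mentioned in the abstract replaces the ℓ1 penalty on successive differences by its ℓ2 counterpart, \(\lambda \bigl(\sum_{i=1}^{n-1} (x_{i+1} - x_{i})^{2}\bigr)^{1/2}\); per the abstract, solvers for these 1D problems then serve as building blocks for 2D and higher-dimensional TV proximity.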

Original language: English
Title of host publication: Proceedings of the 28th International Conference on Machine Learning, ICML 2011
Pages: 313-320
Number of pages: 8
State: Published - 2011
Externally published: Yes
Event: 28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States
Duration: 28 Jun 2011 - 2 Jul 2011

Publication series

Name: Proceedings of the 28th International Conference on Machine Learning, ICML 2011

Conference

Conference: 28th International Conference on Machine Learning, ICML 2011
Country/Territory: United States
City: Bellevue, WA
Period: 28/06/11 - 2/07/11
