Modular proximal optimization for multidimensional total-variation regularization

Álvaro Barbero, Suvrit Sra

Research output: Contribution to journal › Article › peer-review

35 Scopus citations

Abstract

We study total-variation (TV) regularization, a widely used technique for eliciting structured sparsity. In particular, we propose efficient algorithms for computing prox-operators for ℓp-norm TV. The most important among these is ℓ1-norm TV, for whose prox-operator we present a new geometric analysis which unveils a hitherto unknown connection to taut-string methods. This connection turns out to be remarkably useful, as it shows how our geometry-guided implementation results in efficient weighted and unweighted 1D-TV solvers that surpass state-of-the-art methods. Our 1D-TV solvers provide the backbone for building more complex (two- or higher-dimensional) TV solvers within a modular proximal optimization approach. We review the literature for an array of methods exploiting this strategy, and illustrate the benefits of our modular design through an extensive suite of experiments on (i) image denoising, (ii) image deconvolution, (iii) four variants of fused-lasso, and (iv) video denoising. To underscore our claims and permit easy reproducibility, we provide all the reviewed and our new TV solvers in an easy-to-use multi-threaded C++, Matlab and Python library.
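To make the two ingredients of the abstract concrete, the sketch below shows (a) a 1D ℓ1-TV prox-operator and (b) a 2D solver assembled from it in the modular spirit described above. This is an illustrative sketch only: the 1D prox uses a generic projected-gradient method on the dual rather than the paper's direct taut-string solvers, the 2D routine uses a Dykstra-like splitting that alternates the 1D prox over rows and columns, and all function names and iteration counts are hypothetical, not part of the released library.

    import numpy as np

    def tv1_prox(y, lam, n_iter=500):
        # Prox of lam * TV for a 1D signal y:
        #   argmin_x 0.5*||x - y||^2 + lam * sum_i |x[i+1] - x[i]|
        # solved here by projected gradient on the dual problem
        # (a generic method, not the paper's direct taut-string solver).
        y = np.asarray(y, dtype=float)
        u = np.zeros(len(y) - 1)  # one dual variable per consecutive difference
        for _ in range(n_iter):
            # primal estimate x = y - D^T u, where D is the differencing operator
            x = y - np.concatenate(([-u[0]], u[:-1] - u[1:], [u[-1]]))
            # projected-gradient step on the dual: step 0.25 <= 1/||D D^T||,
            # then projection onto the box ||u||_inf <= lam
            u = np.clip(u + 0.25 * np.diff(x), -lam, lam)
        return y - np.concatenate(([-u[0]], u[:-1] - u[1:], [u[-1]]))

    def tv2_prox(Y, lam, n_iter=50):
        # 2D anisotropic TV prox built from the 1D prox via a Dykstra-like
        # splitting: alternate the 1D prox along rows and along columns.
        # This only mirrors the modular strategy at a sketch level.
        X = np.asarray(Y, dtype=float).copy()
        P = np.zeros_like(X)
        Q = np.zeros_like(X)
        for _ in range(n_iter):
            Z = np.apply_along_axis(tv1_prox, 1, X + P, lam)  # rows
            P = X + P - Z
            X = np.apply_along_axis(tv1_prox, 0, Z + Q, lam)  # columns
            Q = Z + Q - X
        return X

For real workloads the library released with the paper provides direct, multi-threaded 1D solvers and several alternative combination schemes; the code above is meant only to convey the structure of the modular approach.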

Original language: English
Journal: Journal of Machine Learning Research
Volume: 19
State: Published - 1 Nov 2018
Externally published: Yes

Keywords

  • Non-smooth optimization
  • Proximal optimization
  • Regularized learning
  • Sparsity
  • Total variation
