Low-rank matrix recovery via iteratively reweighted least squares minimization

Massimo Fornasier, Holger Rauhut, Rachel Ward

Research output: Contribution to journal › Article › peer-review

171 Scopus citations

Abstract

We present and analyze an efficient implementation of an iteratively reweighted least squares algorithm for recovering a matrix from a small number of linear measurements. The algorithm is designed to simultaneously promote a minimal nuclear norm and an approximately low-rank solution. Under the assumption that the linear measurements fulfill a suitable generalization of the null space property known in the context of compressed sensing, the algorithm is guaranteed to iteratively recover any matrix with an error on the order of the best rank-k approximation. In certain relevant cases, for instance for the matrix completion problem, our version of this algorithm can take advantage of the Woodbury matrix identity, which allows us to expedite the solution of the least squares problems required at each iteration. We present numerical experiments which confirm the robustness of the algorithm for the solution of matrix completion problems, and we demonstrate its competitiveness with respect to other techniques proposed recently in the literature.
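To make the abstract's iteration concrete, below is a minimal NumPy sketch of one way an iteratively reweighted least squares scheme for matrix completion can look. The function name, the per-column block elimination, and the epsilon schedule are illustrative assumptions on my part, not the paper's exact algorithm (the paper's implementation additionally exploits the Woodbury matrix identity to speed up the least squares solves). The core idea shown is: form the weight matrix W = (X Xᵀ + ε²I)^(−1/2), then solve a W-weighted least squares problem constrained to match the observed entries, and shrink ε as the iterates become closer to low rank.

```python
import numpy as np

def irls_matrix_completion(M_obs, mask, k=1, n_iter=100, eps_floor=1e-6):
    """Illustrative IRLS sketch for matrix completion.

    M_obs : observed matrix (values at unobserved entries are ignored)
    mask  : boolean array, True where the entry of M_obs is observed
    k     : target rank used to drive the smoothing parameter (assumption)
    """
    d1, d2 = M_obs.shape
    X = np.where(mask, M_obs, 0.0)   # zero-fill the unobserved entries
    eps = np.linalg.norm(X)          # initial smoothing parameter
    for _ in range(n_iter):
        # Weight matrix W = (X X^T + eps^2 I)^(-1/2) via eigendecomposition
        lam, V = np.linalg.eigh(X @ X.T + eps**2 * np.eye(d1))
        W = (V / np.sqrt(lam)) @ V.T
        # Weighted least squares, column by column: minimize x^T W x
        # subject to matching the observed entries of that column.
        for j in range(d2):
            o = mask[:, j]
            u = ~o
            if not u.any() or not o.any():
                continue  # nothing to fill in (or no data) for this column
            # Stationarity in the free block: W_uu x_u + W_uo x_o = 0
            X[u, j] = -np.linalg.solve(W[np.ix_(u, u)],
                                       W[np.ix_(u, o)] @ X[o, j])
        # Shrink the smoothing parameter using the (k+1)-st singular value
        # (a schematic stand-in for the paper's epsilon update rule).
        s = np.linalg.svd(X, compute_uv=False)
        eps = max(min(eps, s[min(k, len(s) - 1)] / d1), eps_floor)
    return X
```

By construction the iterate always matches the data on the observed entries; only the unobserved block of each column is re-solved, which is where a Woodbury-style identity can reduce the per-iteration cost when few entries are missing.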

Original language: English
Pages (from-to): 1614-1640
Number of pages: 27
Journal: SIAM Journal on Optimization
Volume: 21
Issue number: 4
DOIs
State: Published - 2011
Externally published: Yes

Keywords

  • Iteratively reweighted least squares
  • Low-rank matrix recovery
  • Matrix completion

