Abstract
We are concerned with the efficient numerical solution of minimization problems in Hilbert spaces involving sparsity constraints. Such problems arise, e.g., in the context of inverse problems. In this work we analyze an efficient variant of the well-known iterative soft-shrinkage algorithm for large or even infinite-dimensional problems. The algorithm is modified in two ways: instead of prescribing a fixed thresholding parameter, we use a decreasing thresholding strategy, and we employ suitable variants of the adaptive schemes derived by Cohen, Dahmen and DeVore for the approximation of the infinite matrix-vector products. We derive a block multiscale preconditioning technique that allows for local well-conditioning of the underlying matrices and for extending the concept of the restricted isometry property to infinitely labelled matrices. The combination of these ingredients yields a numerical scheme that is guaranteed to converge with exponential rate and that allows for a controlled inflation of the support size of the iterates. We also present numerical experiments confirming the applicability of our approach, which extends concepts from compressed sensing to large-scale simulation.
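The decreasing-threshold iteration described in the abstract can be sketched in finite dimensions as follows. This is a minimal illustration only: it omits the paper's adaptive matrix-vector approximations and multiscale preconditioning, and the schedule parameters (`lam0`, `decay`) and function names are illustrative assumptions, not the authors' actual scheme.

```python
import numpy as np

def soft_threshold(x, lam):
    # Componentwise soft-shrinkage: S_lam(x) = sign(x) * max(|x| - lam, 0)
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista_decreasing(A, y, lam0=1.0, decay=0.8, n_iter=200):
    # Iterative soft-shrinkage with a geometrically decreasing threshold
    # (hypothetical schedule, for illustration of the general idea only).
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # spectral norm: keeps the gradient step stable
    x = np.zeros(A.shape[1])
    lam = lam0
    for _ in range(n_iter):
        # Gradient step on the residual, followed by soft-shrinkage
        x = soft_threshold(x + step * A.T @ (y - A @ x), step * lam)
        lam *= decay   # decreasing thresholding strategy
    return x
```

Compared with a fixed threshold, letting the threshold decay lets early iterations aggressively prune the support while later iterations refine the surviving coefficients.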
| Original language | English |
|---|---|
| Pages (from-to) | 419-446 |
| Number of pages | 28 |
| Journal | Mathematics of Computation |
| Volume | 81 |
| Issue number | 277 |
| DOIs | |
| State | Published - 2011 |
Keywords
- Adaptive algorithms
- Iterative thresholding
- Multilevel preconditioning
- Multiscale methods
- Operator equations
- Restricted isometry property
- Sparse optimization and compressed sensing
- Wavelets