Abstract
This paper is concerned with the development of numerical schemes for the minimization of functionals involving sparsity constraints and nonconvex fidelity terms. Such functionals arise naturally in the context of Tikhonov regularization of nonlinear inverse problems with ℓ1 penalty terms. Our minimization method is based on a generalized conditional gradient scheme. Since these algorithms are known to converge quite slowly in practice, we propose an acceleration based on a decreasing thresholding strategy, whose efficiency relies on certain spectral properties of the problem at hand. We show that, under certain boundedness and contraction conditions, the resulting algorithm converges linearly to a global minimizer and that the iteration is monotone with respect to the Tikhonov functional. We study important classes of operator equations to which our analysis can be applied. Moreover, we introduce a multilevel preconditioning strategy that, in practice, promotes the aforementioned spectral properties for problems in which the nonlinearity is a perturbation of a linear operator.
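To illustrate the type of iteration discussed in the abstract, the following is a minimal sketch (not the authors' exact scheme) of a gradient-step-plus-soft-thresholding method for a Tikhonov functional of the form J(u) = ½‖F(u) − y‖² + α‖u‖₁, combined with a threshold that starts large and decreases geometrically towards α. The forward operator `F`, its Jacobian `dF`, the step-size parameter `L`, and the decay schedule are illustrative assumptions, not the construction analyzed in the paper.

```python
# Minimal sketch (illustrative, not the authors' exact scheme): minimize
#   J(u) = 0.5 * ||F(u) - y||^2 + alpha * ||u||_1
# by a gradient step on the fidelity term followed by soft thresholding,
# with a threshold that decreases towards alpha over the iterations.
import numpy as np

def soft_threshold(v, tau):
    """Componentwise soft-thresholding operator S_tau."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sparse_tikhonov(F, dF, y, u0, alpha, L, n_iter=200, tau0=None, decay=0.9):
    """Iterative thresholding with a decreasing threshold schedule (sketch).

    F     : forward operator, F(u) -> data space
    dF    : Jacobian of F at u (its transpose acts as the adjoint)
    y     : data
    u0    : initial guess
    alpha : regularization parameter of the l1 penalty
    L     : step-size parameter, e.g. a bound on ||dF(u)||^2 (assumed)
    tau0, decay : initial threshold and geometric decay factor (assumed)
    """
    u = u0.copy()
    tau = alpha if tau0 is None else tau0
    for _ in range(n_iter):
        residual = F(u) - y
        grad = dF(u).T @ residual                 # gradient of the fidelity term
        u = soft_threshold(u - grad / L, max(tau, alpha) / L)
        tau = alpha + decay * (tau - alpha)       # threshold decreases towards alpha
    return u
```

Intuitively, a large initial threshold keeps the early iterates sparse, while letting the threshold decay to α ensures that the limiting iteration targets the original functional; this mirrors the decreasing thresholding idea described in the abstract, without reproducing the specific conditions under which the paper proves linear convergence.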
| Original language | English |
|---|---|
| Pages (from-to) | 393-414 |
| Number of pages | 22 |
| Journal | Journal of Inverse and Ill-Posed Problems |
| Volume | 23 |
| Issue number | 4 |
| DOIs | |
| State | Published - 1 Aug 2015 |
Keywords
- (nonlinear) operator equations
- conditional gradient method
- iterative thresholding
- multilevel preconditioning
- nonconvex optimization
- sparse minimization
- wavelets