TY - JOUR
T1 - A machine learning approach to optimal Tikhonov regularization I
T2 - Affine manifolds
AU - De Vito, Ernesto
AU - Fornasier, Massimo
AU - Naumova, Valeriya
N1 - Publisher Copyright:
© 2022 World Scientific Publishing Company.
PY - 2022/3/1
Y1 - 2022/3/1
N2 - Despite a variety of available techniques, such as the discrepancy principle, generalized cross validation, and the balancing principle, the proper choice of the regularization parameter for inverse problems remains one of the relevant challenges in the field. The main difficulty lies in constructing an efficient rule that allows the parameter to be computed from given noisy data without relying on a priori knowledge of the solution or of the noise level, or on manual input. In this paper, we propose a novel method based on a statistical learning theory framework to approximate the high-dimensional function that maps noisy data to the optimal Tikhonov regularization parameter. After an offline phase, in which we observe samples of the noisy data-to-optimal parameter mapping, an estimate of the optimal regularization parameter is computed directly from noisy data. Our assumptions are that the ground truth solutions of the inverse problem are statistically distributed in a concentrated manner on (lower-dimensional) linear subspaces and that the noise is sub-Gaussian. We show that, for our method to be efficient, the number of previously observed samples of the noisy data-to-optimal parameter mapping needs to scale at most linearly with the dimension of the solution subspace. We provide explicit error bounds on the accuracy with which unobserved optimal regularization parameters and ground truth solutions are approximated from noisy data. Even though the results are mostly of a theoretical nature, we present a recipe for the practical implementation of the approach. We conclude by presenting numerical experiments that verify our theoretical results and illustrate the superiority of our method over several state-of-the-art approaches, in terms of accuracy or computational time, for solving inverse problems of various types.
KW - Tikhonov regularization
KW - concentration inequalities
KW - high-dimensional function approximation
KW - parameter choice rule
KW - sub-gaussian vectors
UR - http://www.scopus.com/inward/record.url?scp=85100591240&partnerID=8YFLogxK
U2 - 10.1142/S0219530520500220
DO - 10.1142/S0219530520500220
M3 - Article
AN - SCOPUS:85100591240
SN - 0219-5305
VL - 20
SP - 353
EP - 400
JO - Analysis and Applications
JF - Analysis and Applications
IS - 2
ER -