TY - JOUR
T1 - Certified and fast computations with shallow covariance kernels
AU - Kressner, Daniel
AU - Latz, Jonas
AU - Massei, Stefano
AU - Ullmann, Elisabeth
N1 - Publisher Copyright:
© American Institute of Mathematical Sciences.
PY - 2020/12
Y1 - 2020/12
N2 - Many techniques for data science and uncertainty quantification demand efficient tools to handle Gaussian random fields, which are defined in terms of their mean functions and covariance operators. Recently, parameterized Gaussian random fields have gained increased attention due to their higher degree of flexibility. However, especially if the random field is parameterized through its covariance operator, classical random field discretization techniques fail or become inefficient. In this work we introduce and analyze a new and certified algorithm for the low-rank approximation of a parameterized family of covariance operators, which represents an extension of the adaptive cross approximation method for symmetric positive definite matrices. The algorithm relies on an affine linear expansion of the covariance operator with respect to the parameters, which needs to be computed in a preprocessing step using, e.g., the empirical interpolation method. We discuss and test our new approach for isotropic covariance kernels, such as Matérn kernels. The numerical results demonstrate the advantages of our approach in terms of computational time and confirm that the proposed algorithm provides the basis of a fast sampling procedure for parameter-dependent Gaussian random fields.
AB - Many techniques for data science and uncertainty quantification demand efficient tools to handle Gaussian random fields, which are defined in terms of their mean functions and covariance operators. Recently, parameterized Gaussian random fields have gained increased attention due to their higher degree of flexibility. However, especially if the random field is parameterized through its covariance operator, classical random field discretization techniques fail or become inefficient. In this work we introduce and analyze a new and certified algorithm for the low-rank approximation of a parameterized family of covariance operators, which represents an extension of the adaptive cross approximation method for symmetric positive definite matrices. The algorithm relies on an affine linear expansion of the covariance operator with respect to the parameters, which needs to be computed in a preprocessing step using, e.g., the empirical interpolation method. We discuss and test our new approach for isotropic covariance kernels, such as Matérn kernels. The numerical results demonstrate the advantages of our approach in terms of computational time and confirm that the proposed algorithm provides the basis of a fast sampling procedure for parameter-dependent Gaussian random fields.
KW - Adaptive cross approximation
KW - Gaussian random field
KW - Wasserstein distance
KW - covariance matrix
KW - greedy algorithm
UR - http://www.scopus.com/inward/record.url?scp=85139724422&partnerID=8YFLogxK
U2 - 10.3934/fods.2020022
DO - 10.3934/fods.2020022
M3 - Article
AN - SCOPUS:85139724422
SN - 2639-8001
VL - 2
SP - 487
EP - 512
JO - Foundations of Data Science
JF - Foundations of Data Science
IS - 4
ER -