## Abstract

In this paper we develop fast and memory-efficient numerical methods for learning functions of many variables that admit sparse representations in terms of general bounded orthonormal tensor product bases. Such functions appear in many applications, e.g., various Uncertainty Quantification (UQ) problems involving the solution of parametric PDEs that are approximately sparse in Chebyshev or Legendre product bases (Chkifa et al. in Polynomial approximation via compressed sensing of high-dimensional functions on lower sets. arXiv:1602.05823, 2016; Rauhut and Schwab in Math Comput 86(304):661–700, 2017). We expect our results to provide a starting point for a new line of research on sublinear-time solution techniques for UQ applications of the type above, which will eventually be able to scale to significantly higher-dimensional problems than are currently computationally feasible. More concretely, let B be a finite Bounded Orthonormal Product Basis (BOPB) of cardinality |B| = N. Herein we develop methods that rapidly approximate any function f that is sparse in the BOPB, that is, f: D ⊂ ℝ^D → ℂ of the form f(x) = ∑_{b∈S} c_b · b(x) with S ⊂ B of cardinality |S| = s ≪ N. Our method adapts the CoSaMP algorithm (Needell and Tropp in Appl Comput Harmon Anal 26(3):301–321, 2009) to use additional function samples from f along a randomly constructed grid G ⊂ ℝ^D with universal approximation properties in order to rapidly identify the multi-indices of the most dominant basis functions in S, component by component, during each CoSaMP iteration. It has a runtime of just (s log N)^{O(1)}, uses only (s log N)^{O(1)} function evaluations on the fixed and nonadaptive grid G, and requires no more than (s log N)^{O(1)} bits of memory. We emphasize that nothing about S or any of the coefficients c_b ∈ ℂ is assumed in advance other than that S ⊂ B has |S| ≤ s.
Both S and its related coefficients c_b will be learned from the given function evaluations by the developed method. For s ≪ N, the runtime (s log N)^{O(1)} is less than what is required simply to enumerate the elements of the basis B; thus our method is the first approach applicable in a general BOPB framework that falls into the class referred to as sublinear-time. This, together with the similarly reduced sample and memory requirements, sets our algorithm apart from previous works based on standard compressive sensing algorithms such as basis pursuit, which typically store and utilize full intermediate basis representations of size Ω(N) during the solution process.
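As a concrete toy illustration of what sparsity in a BOPB means (and emphatically not the paper's algorithm), the sketch below builds a function that is s-sparse in a small 2-D Chebyshev tensor-product basis, samples it at random points, and recovers the active coefficients by a dense least-squares fit followed by thresholding. This baseline pays exactly the Ω(N) cost of forming the full basis matrix that the paper's sublinear-time method avoids; all parameter values (`deg`, `m`, the seed) are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
deg = 7                                      # Chebyshev degrees 0..6 per dimension
idx = [(i, j) for i in range(deg) for j in range(deg)]
N = len(idx)                                 # |B| = N = 49 tensor-product basis functions

def cheb(i, x):
    """Chebyshev polynomial T_i(x) on [-1, 1]."""
    return np.cos(i * np.arccos(x))

def basis(k, X):
    """Tensor-product basis function b_k(x) = T_i(x_1) T_j(x_2)."""
    i, j = idx[k]
    return cheb(i, X[:, 0]) * cheb(j, X[:, 1])

# Ground truth: an s-sparse coefficient vector c supported on S.
s = 3
support = rng.choice(N, size=s, replace=False)
c = np.zeros(N)
c[support] = rng.normal(size=s)

# Sample f(x) = sum_{b in S} c_b b(x) at m random points in [-1, 1]^2.
m = 200
X = rng.uniform(-1.0, 1.0, size=(m, 2))
y = sum(c[k] * basis(k, X) for k in support)

# Dense baseline: enumerate all N basis functions -- Omega(N) work and memory,
# which is precisely the cost a sublinear-time method must sidestep.
A = np.column_stack([basis(k, X) for k in range(N)])
c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# Threshold: the s largest coefficients reveal the support S.
recovered = set(np.argsort(np.abs(c_hat))[-s:].tolist())
print(recovered == set(support.tolist()))
```

With m well above N the least-squares system is well conditioned and recovery is exact up to floating-point error; the paper's method instead identifies the dominant multi-indices component by component without ever enumerating, storing, or fitting against all N basis functions.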

| Original language | English |
|---|---|
| Pages (from-to) | 275–329 |
| Number of pages | 55 |
| Journal | Foundations of Computational Mathematics |
| Volume | 21 |
| Issue number | 2 |
| DOIs | |
| State | Published - Apr 2021 |

## Keywords

- Compressive sensing
- Function learning
- High-dimensional function approximation
- Sparse approximation
- Sublinear-time algorithms