Gaussian quadrature for matrix inverse forms with applications

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Scopus citations

Abstract

We present a framework for accelerating a spectrum of machine learning algorithms that require computation of bilinear inverse forms u^T A^{-1} u, where A is a positive definite matrix and u a given vector. Our framework is built on Gauss-type quadrature and easily scales to large, sparse matrices. Further, it allows retrospective computation of lower and upper bounds on u^T A^{-1} u, which in turn accelerates several algorithms. We prove that these bounds tighten iteratively and converge at a linear (geometric) rate. To our knowledge, ours is the first work to demonstrate these key properties of Gauss-type quadrature, which is a classical and deeply studied topic. We illustrate empirical consequences of our results by using quadrature to accelerate machine learning tasks involving determinantal point processes and submodular optimization, and observe tremendous speedups in several instances.
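The abstract centers on estimating the bilinear form u^T A^{-1} u with Gauss-type quadrature derived from the Lanczos process. Since the paper itself is not reproduced on this page, the snippet below is only a minimal sketch of the classical Golub-Meurant Gauss rule, which for a symmetric positive definite A yields a lower bound that tightens with the number of Lanczos steps; it is not the authors' algorithm, and the function name, tolerance, and iteration count are illustrative choices.

```python
import numpy as np

def gauss_quadrature_bilinear(A, u, num_iters=20):
    """Sketch: estimate u^T A^{-1} u for symmetric positive definite A
    via the Gauss quadrature rule induced by the Lanczos process.
    The returned Gauss estimate is a lower bound that improves
    monotonically as num_iters grows (Golub-Meurant framework)."""
    n = len(u)
    beta0 = np.linalg.norm(u)
    q_prev = np.zeros(n)
    q = u / beta0
    alphas, betas = [], []
    beta = 0.0
    for _ in range(num_iters):
        # One Lanczos step: three-term recurrence with matrix A
        w = A @ q - beta * q_prev
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:  # invariant subspace reached; estimate is exact
            break
        q_prev, q = q, w / beta

    # Tridiagonal Jacobi matrix T_k built from the Lanczos coefficients
    k = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    e1 = np.zeros(k)
    e1[0] = 1.0
    # Gauss rule: u^T A^{-1} u is approximated by ||u||^2 * e1^T T_k^{-1} e1
    return beta0 ** 2 * (e1 @ np.linalg.solve(T, e1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((200, 200))
    A = M @ M.T + 200 * np.eye(200)  # well-conditioned SPD test matrix
    u = rng.standard_normal(200)
    approx = gauss_quadrature_bilinear(A, u, num_iters=30)
    exact = u @ np.linalg.solve(A, u)
    print(approx, exact)  # approx approaches exact from below
```

Pairing this Gauss estimate with a Gauss-Radau rule (which requires a bound on the extreme eigenvalue of A) is the standard way to obtain the complementary upper bound discussed in the abstract; that variant is omitted here for brevity.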

Original language: English
Title of host publication: 33rd International Conference on Machine Learning, ICML 2016
Editors: Kilian Q. Weinberger, Maria Florina Balcan
Publisher: International Machine Learning Society (IMLS)
Pages: 2630-2651
Number of pages: 22
ISBN (Electronic): 9781510829008
State: Published - 2016
Externally published: Yes
Event: 33rd International Conference on Machine Learning, ICML 2016 - New York City, United States
Duration: 19 Jun 2016 - 24 Jun 2016

Publication series

Name: 33rd International Conference on Machine Learning, ICML 2016
Volume: 4

Conference

Conference: 33rd International Conference on Machine Learning, ICML 2016
Country/Territory: United States
City: New York City
Period: 19/06/16 - 24/06/16
