Kronecker determinantal point processes

Zelda Mariet, Suvrit Sra

Research output: Contribution to journal › Conference article › Peer-review


Abstract

Determinantal Point Processes (DPPs) are probabilistic models over all subsets of a ground set of N items. They have recently gained prominence in several applications that rely on "diverse" subsets. However, their applicability to large problems is still limited due to the O(N³) complexity of core tasks such as sampling and learning. We enable efficient sampling and learning for DPPs by introducing KronDPP, a DPP model whose kernel matrix decomposes as a tensor product of multiple smaller kernel matrices. This decomposition immediately enables fast exact sampling. But contrary to what one may expect, leveraging the Kronecker product structure for speeding up DPP learning turns out to be more difficult. We overcome this challenge, and derive batch and stochastic optimization algorithms for efficiently learning the parameters of a KronDPP.
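The following is a minimal sketch (not the authors' code) of the structural fact the abstract relies on: if the DPP kernel factors as a Kronecker product L = L1 ⊗ L2, then the eigendecomposition of the full N × N kernel follows from eigendecompositions of the much smaller factors, which is what makes exact sampling cheap. The matrix sizes and random factors below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    """Random symmetric positive semidefinite matrix of size n x n."""
    A = rng.standard_normal((n, n))
    return A @ A.T

N1, N2 = 20, 30                      # ground set size N = N1 * N2 = 600
L1, L2 = random_psd(N1), random_psd(N2)
L = np.kron(L1, L2)                  # Kronecker-structured DPP kernel, N x N

# Eigendecompose only the small factors: O(N1^3 + N2^3) instead of O(N^3).
w1, V1 = np.linalg.eigh(L1)
w2, V2 = np.linalg.eigh(L2)

# Eigenvalues/eigenvectors of L are Kronecker products of the factors'.
w_kron = np.kron(w1, w2)
V_kron = np.kron(V1, V2)

# Sanity check against the direct (expensive) decomposition of L.
assert np.allclose(np.sort(w_kron), np.linalg.eigvalsh(L))
assert np.allclose(V_kron @ np.diag(w_kron) @ V_kron.T, L)
```

Standard spectral DPP samplers only need this eigendecomposition of L, so the Kronecker factorization removes the O(N³) bottleneck for sampling; as the abstract notes, exploiting the same structure for learning is the harder part addressed in the paper.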

Original language: English
Pages (from-to): 2702-2710
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 0
State: Published - 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 – 10 Dec 2016
