Curvature and optimal algorithms for learning and minimizing submodular functions

Rishabh Iyer, Stefanie Jegelka, Jeff Bilmes

Research output: Contribution to journal › Conference article › peer-review

72 Scopus citations

Abstract

We investigate three related and important problems connected to machine learning: approximating a submodular function everywhere, learning a submodular function (in a PAC-like setting [28]), and constrained minimization of submodular functions. We show that the complexity of all three problems depends on the "curvature" of the submodular function, and provide lower and upper bounds that refine and improve previous results [2, 6, 8, 27]. Our proof techniques are fairly generic. We either use a black-box transformation of the function (for approximation and learning), or a transformation of algorithms to use an appropriate surrogate function (for minimization). Curiously, curvature has been known to influence approximations for submodular maximization [3, 29], but its effect on minimization, approximation and learning has hitherto been open. We complete this picture, and also support our theoretical claims by empirical results.
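The "curvature" the abstract refers to is the standard total-curvature quantity for a monotone submodular function f with f(∅) = 0, namely κ_f = 1 − min over j with f({j}) > 0 of [f(V) − f(V \ {j})] / f({j}); κ_f = 0 for modular functions and κ_f = 1 at the other extreme. As a hedged illustration (not code from the paper), the sketch below computes this quantity for a small set-coverage function, a canonical monotone submodular example; the helper names and the example sets are our own:

```python
from itertools import chain

def coverage(sets, S):
    """f(S) = number of ground elements covered by the sets indexed by S.
    Set coverage is a standard monotone submodular function."""
    return len(set(chain.from_iterable(sets[i] for i in S)))

def total_curvature(f, V):
    """Total curvature: kappa = 1 - min_j [f(V) - f(V \\ {j})] / f({j}),
    minimizing over elements j with f({j}) > 0."""
    fV = f(V)
    ratios = [
        (fV - f(V - {j})) / f({j})
        for j in V
        if f({j}) > 0
    ]
    return 1 - min(ratios)

# Hypothetical example: three sets over the universe {a, b, c, d}.
sets = {0: {'a', 'b'}, 1: {'c'}, 2: {'b', 'd'}}
V = set(sets)
kappa = total_curvature(lambda S: coverage(sets, S), V)  # 0.5 here
```

In this example the overlap on element 'b' makes some marginal gains smaller than the corresponding singleton values, giving an intermediate curvature of 0.5; with pairwise-disjoint sets the function is modular and the curvature is 0, which is the regime where the paper's curvature-dependent bounds are tightest.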

Original language: English
Journal: Advances in Neural Information Processing Systems
State: Published - 2013
Externally published: Yes
Event: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States
Duration: 5 Dec 2013 - 10 Dec 2013
