On power-law kernels, corresponding Reproducing Kernel Hilbert Space and applications

Debarghya Ghoshdastidar, Ambedkar Dukkipati

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution > peer-review


Abstract

The role of kernels is central to machine learning. Motivated by the importance of power-law distributions in statistical modeling, in this paper, we propose the notion of power-law kernels to investigate power-laws in learning problems. We propose two power-law kernels by generalizing the Gaussian and Laplacian kernels. This generalization is based on distributions arising from the maximization of a generalized information measure known as nonextensive entropy, which is well studied in statistical mechanics. We prove that the proposed kernels are positive definite, and provide some insights regarding the corresponding Reproducing Kernel Hilbert Space (RKHS). We also study the practical significance of both kernels in classification and regression, and present some simulation results.
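This record does not reproduce the paper's exact kernel definitions, so the following is a minimal, hypothetical sketch of the idea only: power-law analogues of the Gaussian and Laplacian kernels built from the q-exponential function exp_q(x) = [1 + (1 - q)x]_+^{1/(1-q)}, the nonextensive (Tsallis) generalization of the exponential, which recovers exp(x) as q -> 1 and has polynomially decaying tails for q > 1. The function names, the (q, sigma) parameterization, and the specific kernel forms below are illustrative assumptions, not the authors' definitions, and the positive-definiteness conditions proved in the paper are not encoded here.

import numpy as np

def q_exp(x, q):
    # q-exponential: exp_q(x) = max(1 + (1 - q) * x, 0) ** (1 / (1 - q));
    # recovers np.exp(x) in the limit q -> 1 (handled explicitly here).
    if np.isclose(q, 1.0):
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def power_law_gaussian_kernel(X, Y, q=1.5, sigma=1.0):
    # Hypothetical power-law analogue of the Gaussian kernel:
    # k(x, y) = exp_q(-||x - y||^2 / (2 * sigma^2)).
    # For q > 1 the kernel decays polynomially in the squared distance.
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return q_exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2), q)

def power_law_laplacian_kernel(X, Y, q=1.5, sigma=1.0):
    # Hypothetical power-law analogue of the Laplacian kernel:
    # k(x, y) = exp_q(-||x - y||_1 / sigma).
    l1 = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=2)
    return q_exp(-l1 / sigma, q)

# Example: build a Gram matrix for a toy dataset.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = power_law_gaussian_kernel(X, X, q=1.5, sigma=1.0)
print(K.shape)  # (5, 5)

Under these assumptions, the resulting Gram matrix can be passed to any kernel method that accepts precomputed kernels, e.g. scikit-learn's SVC(kernel="precomputed"), which matches the classification and regression use cases mentioned in the abstract.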

Original language: English
Title of host publication: Proceedings of the 27th AAAI Conference on Artificial Intelligence, AAAI 2013
Pages: 365-371
Number of pages: 7
State: Published - 2013
Externally published: Yes
Event: 27th AAAI Conference on Artificial Intelligence, AAAI 2013 - Bellevue, WA, United States
Duration: 14 Jul 2013 - 18 Jul 2013

Publication series

Name: Proceedings of the 27th AAAI Conference on Artificial Intelligence, AAAI 2013

Conference

Conference: 27th AAAI Conference on Artificial Intelligence, AAAI 2013
Country/Territory: United States
City: Bellevue, WA
Period: 14/07/13 - 18/07/13

