Informational divergence approximations to product distributions

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

34 Scopus citations

Abstract

The minimum rate needed to accurately approximate a product distribution under an unnormalized informational divergence is shown to be a mutual information. This result subsumes Wyner's results on common information and Han and Verdú's results on resolvability. The result also extends to cases where the source distribution is unknown but its entropy is known.
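The resolvability phenomenon behind the abstract can be illustrated numerically. The sketch below (a hypothetical toy demonstration, not the paper's construction) approximates the product distribution P^n of a Bernoulli(0.3) source by the uniform mixture over M codewords drawn i.i.d. from P^n, and measures the unnormalized informational divergence D(P̃ || P^n). For rates (1/n) log₂ M above the entropy H(P) ≈ 0.881 bits, the divergence should shrink; the parameter choices (n = 8, the values of M) are illustrative assumptions.

```python
import math
import random
from collections import Counter

def product_prob(seq, p=0.3):
    """Probability of a binary sequence under an i.i.d. Bernoulli(p) source."""
    ones = sum(seq)
    return (p ** ones) * ((1 - p) ** (len(seq) - ones))

def induced_divergence(n, M, p=0.3, seed=0):
    """D(P_tilde || P^n) in bits, where P_tilde is the uniform mixture
    over M codewords drawn i.i.d. from the product distribution P^n."""
    rng = random.Random(seed)
    codebook = [tuple(1 if rng.random() < p else 0 for _ in range(n))
                for _ in range(M)]
    counts = Counter(codebook)  # P_tilde is supported on the codewords only
    div = 0.0
    for seq, count in counts.items():
        q = count / M           # induced probability of this sequence
        div += q * math.log2(q / product_prob(seq, p))
    return div

# Divergence decreases as the rate (1/n) * log2(M) grows past H(P):
for M in (4, 64, 1024):
    print(M, induced_divergence(n=8, M=M))
```

Because P̃ is atomic, the divergence is computed over its support only; it is nonnegative and, as M grows past 2^{nH(P)}, the mixture increasingly resembles the product distribution.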

Original language: English
Title of host publication: 2013 13th Canadian Workshop on Information Theory, CWIT 2013
Pages: 76-81
Number of pages: 6
DOIs
State: Published - 2013
Event: 2013 13th Canadian Workshop on Information Theory, CWIT 2013 - Toronto, ON, Canada
Duration: 18 Jun 2013 – 21 Jun 2013

Publication series

Name: 2013 13th Canadian Workshop on Information Theory, CWIT 2013

Conference

Conference: 2013 13th Canadian Workshop on Information Theory, CWIT 2013
Country/Territory: Canada
City: Toronto, ON
Period: 18/06/13 – 21/06/13
