Invertible Low-Divergence Coding

Patrick Schulte, Rana Ali Amjad, Thomas Wiegart, Gerhard Kramer

Research output: Contribution to journal › Article › peer-review


Abstract

Several applications in communication, control, and learning require approximating target distributions to within small informational divergence. The additional requirement of invertibility usually leads to using encoders that are one-to-one mappings, also known as distribution matchers. However, even the best one-to-one encoders have divergences that grow logarithmically with the block length. To overcome this limitation, an encoder is proposed that has an invertible one-to-many mapping and a low-rate random number generator (RNG). Two algorithms are developed to design the mapping by assigning strings in either most-likely-first or least-likely-first order. Both algorithms give information rates approaching the entropy of the target distribution with exponentially decreasing divergence and with vanishing RNG rate in the block length.
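The abstract describes the one-to-many construction only at a high level. The following is a minimal Python sketch of the general idea, not a reproduction of the paper's algorithms: it assumes an i.i.d. binary target distribution, a small toy block length, and a greedy balancing rule for assigning output strings to input indices in most-likely-first order. All names and parameters (`build_matcher`, `encode`, `decode`, `n`, `k`, `p`) are illustrative assumptions.

```python
# Illustrative sketch of an invertible one-to-many matcher: every length-n
# output string is assigned to exactly one of 2**k input indices, so decoding
# is a deterministic lookup; the RNG only resolves which string within the
# chosen group is emitted. The greedy balancing rule below is an assumption,
# not the paper's construction.

import itertools
import random


def target_prob(s, p):
    """i.i.d. Bernoulli(p) probability of the binary string s."""
    ones = sum(s)
    return (p ** ones) * ((1 - p) ** (len(s) - ones))


def build_matcher(n, k, p):
    """Assign all length-n binary strings to 2**k input indices,
    processing strings in most-likely-first order.

    Each string goes to the index whose accumulated probability is
    currently smallest, so every index ends up with total target
    probability close to 2**-k. Returns (groups, inverse), where
    groups[i] lists (string, prob) pairs for input index i and
    inverse maps each string back to its unique index.
    """
    strings = sorted(itertools.product((0, 1), repeat=n),
                     key=lambda s: -target_prob(s, p))
    groups = [[] for _ in range(2 ** k)]
    mass = [0.0] * (2 ** k)
    inverse = {}
    for s in strings:
        i = min(range(2 ** k), key=lambda j: mass[j])
        pr = target_prob(s, p)
        groups[i].append((s, pr))
        mass[i] += pr
        inverse[s] = i
    return groups, inverse


def encode(i, groups, rng):
    """One-to-many encoding: the RNG picks a string in group i with
    probability proportional to the target probabilities."""
    strings, weights = zip(*groups[i])
    return rng.choices(strings, weights=weights, k=1)[0]


def decode(s, inverse):
    """Invertible by construction: each string lies in exactly one group."""
    return inverse[s]


if __name__ == "__main__":
    n, k, p = 8, 3, 0.3          # assumed toy parameters
    groups, inverse = build_matcher(n, k, p)
    rng = random.Random(0)
    for i in range(2 ** k):
        s = encode(i, groups, rng)
        assert decode(s, inverse) == i
    # Each group's total probability should be close to 2**-k = 1/8.
    print([round(sum(pr for _, pr in g), 3) for g in groups])
```

Note the division of labor the abstract implies: invertibility comes from the disjoint groups, while the randomness needed at the encoder is only the low-rate choice within a group.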

Original language: English
Pages (from-to): 178-192
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 68
Issue number: 1
DOIs
State: Published - 1 Jan 2022

Keywords

  • Divergence
  • Encoding
  • Mutual information
  • Random number generation
