Invertible Low-Divergence Coding

Patrick Schulte, Rana Ali Amjad, Thomas Wiegart, Gerhard Kramer

Publication: Contribution to journal › Article › Peer review

2 citations (Scopus)

Abstract

Several applications in communication, control, and learning require approximating target distributions to within small informational divergence. The additional requirement of invertibility usually leads to using encoders that are one-to-one mappings, also known as distribution matchers. However, even the best one-to-one encoders have divergences that grow logarithmically with the block length. To overcome this limitation, an encoder is proposed that has an invertible one-to-many mapping and a low-rate random number generator (RNG). Two algorithms are developed to design the mapping by assigning strings in either a most-likely first or least-likely first order. Both algorithms give information rates approaching the entropy of the target distribution with exponentially decreasing divergence and with vanishing RNG rate in the block length.
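The abstract's baseline, a one-to-one distribution matcher, can be illustrated with a toy construction: map k uniform input bits to the 2^k most likely length-n strings of an i.i.d. Bernoulli(p) target, and compute the informational divergence between the matcher's output distribution and the target. This is a minimal sketch for intuition only; the function name, parameters, and brute-force enumeration are illustrative assumptions, not the paper's algorithms.

```python
import math
from itertools import product

def matcher_divergence(n, p, k):
    """Informational divergence D(Q || P), in bits, where Q is the output
    of a toy one-to-one matcher (k uniform input bits mapped to the 2**k
    most likely length-n strings, each emitted with probability 2**-k)
    and P is the i.i.d. Bernoulli(p)**n target distribution.

    Illustrative sketch only: brute-force enumeration, so n must be small.
    """
    # Target probability of each length-n binary string.
    probs = sorted(
        (p ** sum(s) * (1 - p) ** (n - sum(s))
         for s in product([0, 1], repeat=n)),
        reverse=True,
    )
    m = 2 ** k
    q = 1.0 / m  # matcher output probability of each selected string
    # D(Q || P) summed over the matcher's support (the m most likely strings).
    return sum(q * math.log2(q / pt) for pt in probs[:m])

# Example: divergence of the toy matcher for a Bernoulli(0.3) target.
for n, k in [(4, 3), (8, 6), (12, 9)]:
    print(n, k, matcher_divergence(n, 0.3, k))
```

The sketch only measures how far a one-to-one mapping stays from the target; the paper's point is that no such mapping can drive this divergence to zero fast, whereas the proposed one-to-many encoder with a low-rate RNG achieves exponentially decreasing divergence.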

Original language: English
Pages (from - to): 178-192
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 68
Issue number: 1
DOIs
Publication status: Published - 1 Jan. 2022
