Abstract
Several applications in communication, control, and learning require approximating target distributions to within small informational divergence. The additional requirement of invertibility usually leads to using encoders that are one-to-one mappings, also known as distribution matchers. However, even the best one-to-one encoders have divergences that grow logarithmically with the block length. To overcome this limitation, an encoder is proposed that has an invertible one-to-many mapping and a low-rate random number generator (RNG). Two algorithms are developed to design the mapping by assigning strings in either a most-likely first or least-likely first order. Both algorithms give information rates approaching the entropy of the target distribution with exponentially decreasing divergence and with vanishing RNG rate in the block length.
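The one-to-many construction described above can be illustrated with a toy sketch. Everything concrete here is an assumption for illustration only: a simple greedy balancing heuristic stands in for the paper's most-likely-first assignment algorithm, the target is an i.i.d. Bernoulli(0.3) distribution, and the block length and message count are arbitrary.

```python
# Toy sketch of an invertible one-to-many distribution matcher with an
# internal RNG. Assumptions (not from the paper): a greedy balancing
# heuristic replaces the paper's assignment algorithms, and the target is
# i.i.d. Bernoulli(0.3) on n-bit strings.
import math
from itertools import product

def target_prob(x, p=0.3):
    """i.i.d. Bernoulli(p) probability of a bit tuple x."""
    ones = sum(x)
    return (p ** ones) * ((1 - p) ** (len(x) - ones))

def build_mapping(n=8, num_messages=16, p=0.3):
    """Assign every n-bit string to exactly one of num_messages bins.

    Strings are taken in most-likely-first order and each is added to
    the currently lightest bin, keeping bin probabilities roughly equal.
    The mapping is invertible: each string identifies its bin uniquely.
    """
    strings = sorted(product((0, 1), repeat=n),
                     key=lambda x: target_prob(x, p), reverse=True)
    bins = [[] for _ in range(num_messages)]
    weights = [0.0] * num_messages
    for x in strings:
        i = min(range(num_messages), key=weights.__getitem__)
        bins[i].append(x)
        weights[i] += target_prob(x, p)
    return bins, weights

def divergence_bits(weights):
    """D(Q||P) in bits for uniform messages, when the RNG picks a string
    inside the chosen bin proportionally to its target probability.
    Then Q(x) = P(x) / (M * P(bin)), and the divergence reduces to
    sum_i (1/M) * log2(1 / (M * P_i))."""
    M = len(weights)
    return sum((1.0 / M) * math.log2(1.0 / (M * w)) for w in weights)

bins, weights = build_mapping()
print(f"divergence: {divergence_bits(weights):.6f} bits")
```

The divergence is nonnegative by Jensen's inequality and shrinks as the bin probabilities approach the uniform value 1/M; the paper's algorithms achieve this with exponentially decreasing divergence in the block length, which this greedy sketch does not claim to match.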
Original language | English |
---|---|
Pages (from - to) | 178-192 |
Number of pages | 15 |
Journal | IEEE Transactions on Information Theory |
Volume | 68 |
Issue number | 1 |
DOIs | |
Publication status | Published - 1 Jan 2022 |