Abstract
Several applications in communication, control, and learning require approximating target distributions to within small informational divergence. The additional requirement of invertibility usually leads to using encoders that are one-to-one mappings, also known as distribution matchers. However, even the best one-to-one encoders have divergences that grow logarithmically with the block length. To overcome this limitation, an encoder is proposed that has an invertible one-to-many mapping and a low-rate random number generator (RNG). Two algorithms are developed to design the mapping by assigning strings in either a most-likely-first or least-likely-first order. Both algorithms achieve information rates approaching the entropy of the target distribution, with divergence that decreases exponentially in the block length and an RNG rate that vanishes as the block length grows.
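To make the motivating claim concrete, the following is a minimal numerical sketch (not the paper's construction): a toy one-to-one distribution matcher that maps k uniform input bits onto the 2^k most likely length-n strings of an i.i.d. Bernoulli(p) target, and the informational divergence this induces. The function name, the Bernoulli target, and the rate choice k ≈ nH(p) are all illustrative assumptions.

```python
import math

def one_to_one_dm_divergence(p: float, n: int, k: int) -> float:
    """Divergence D(P_out || P_target^n), in bits, of a toy one-to-one
    distribution matcher: k uniform input bits are mapped onto the
    2**k most likely length-n strings of an i.i.d. Bernoulli(p) target."""
    # Probabilities of all 2**n strings, grouped by Hamming weight w
    # (a string with w ones has probability p**w * (1-p)**(n-w)),
    # then sorted most-likely first.
    probs = sorted(
        (p ** w * (1 - p) ** (n - w)
         for w in range(n + 1) for _ in range(math.comb(n, w))),
        reverse=True,
    )
    m = 2 ** k  # the encoder output is uniform over the m chosen strings
    return sum((1 / m) * math.log2((1 / m) / q) for q in probs[:m])

if __name__ == "__main__":
    p = 0.8
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # target entropy
    for n in (4, 8, 12, 16):
        k = round(n * h)  # input rate chosen close to the target entropy
        print(f"n={n:2d}  k={k:2d}  "
              f"divergence={one_to_one_dm_divergence(p, n, k):.4f}")
```

At these near-entropy rates the computed divergence stays bounded away from zero, illustrating the penalty that the proposed invertible one-to-many encoder with its low-rate RNG is designed to remove; the paper's most-likely-first and least-likely-first assignment algorithms themselves are not reproduced here.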
Original language | English |
---|---|
Pages (from-to) | 178-192 |
Number of pages | 15 |
Journal | IEEE Transactions on Information Theory |
Volume | 68 |
Issue number | 1 |
State | Published - 1 Jan 2022 |
Keywords
- divergence
- encoding
- mutual information
- random number generation