TY - GEN
T1 - Informational divergence approximations to product distributions
AU - Hou, Jie
AU - Kramer, Gerhard
PY - 2013
Y1 - 2013
N2 - The minimum rate needed to accurately approximate a product distribution based on an unnormalized informational divergence is shown to be a mutual information. This result subsumes results of Wyner on common information and Han-Verdú on resolvability. The result also extends to cases where the source distribution is unknown but the entropy is known.
UR - http://www.scopus.com/inward/record.url?scp=84888869316&partnerID=8YFLogxK
U2 - 10.1109/CWIT.2013.6621596
M3 - Conference contribution
AN - SCOPUS:84888869316
SN - 9781479906345
T3 - 2013 13th Canadian Workshop on Information Theory, CWIT 2013
SP - 76
EP - 81
BT - 2013 13th Canadian Workshop on Information Theory, CWIT 2013
T2 - 2013 13th Canadian Workshop on Information Theory, CWIT 2013
Y2 - 18 June 2013 through 21 June 2013
ER -