TY - JOUR
T1 - Information networks with in-block memory
AU - Kramer, Gerhard
PY - 2014/4
Y1 - 2014/4
AB - A class of channels is introduced for which there is memory inside blocks of a specified length and no memory across the blocks. The multiuser model is called an information network with in-block memory (NiBM). It is shown that block-fading channels, channels with state known causally at the encoder, and relay networks with delays are NiBMs. A cut-set bound is developed for NiBMs that unifies, strengthens, and generalizes existing cut bounds for discrete memoryless networks. The bound gives new finite-letter capacity expressions for several classes of networks including point-to-point channels, and certain multiaccess, broadcast, and relay channels. Cardinality bounds on the random coding alphabets are developed that improve on existing bounds for channels with action-dependent state available causally at the encoder and for relays without delay. Finally, quantize-forward network coding is shown to achieve rates within an additive gap of the new cut-set bound for linear, additive, Gaussian noise channels, symmetric power constraints, and a multicast session.
KW - Capacity
KW - feedback
KW - networks
KW - relay channels
UR - http://www.scopus.com/inward/record.url?scp=84897003540&partnerID=8YFLogxK
U2 - 10.1109/TIT.2014.2303120
DO - 10.1109/TIT.2014.2303120
M3 - Article
AN - SCOPUS:84897003540
SN - 0018-9448
VL - 60
SP - 2105
EP - 2120
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 4
M1 - 6727506
ER -