Concatenated Classic and Neural (CCN) Codes: ConcatenatedAE

Onur Günlü, Rick Fritschek, Rafael F. Schaefer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Small neural networks (NNs) used for error correction have been shown to improve on classic channel codes and to adapt to channel model changes. We extend the code dimension of any such structure by using the same NN under one-hot encoding multiple times, serially concatenated with an outer classic code. We design NNs with the same network parameters, where each Reed-Solomon codeword symbol is the input to a different NN. We illustrate significant improvements in block error probability for an additive Gaussian noise channel as compared to the small neural code alone, as well as robustness to channel model changes.
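The abstract describes a serial concatenation: an outer classic (Reed-Solomon) code protects symbols, and each codeword symbol is one-hot encoded and passed through the same small inner neural code before transmission over a Gaussian channel. The following is a minimal sketch of that pipeline under stated assumptions: a lookup-style BPSK-like mapping stands in for the trained inner NN, and a repetition code with majority voting stands in for Reed-Solomon. All function names and parameters here are hypothetical illustrations, not the authors' ConcatenatedAE implementation.

```python
import random

Q = 4  # symbol alphabet size (length of the one-hot input vector); assumption

def outer_encode(symbols, r=3):
    """Toy outer code: repeat each symbol r times (stand-in for Reed-Solomon)."""
    return [s for s in symbols for _ in range(r)]

def outer_decode(symbols, r=3):
    """Majority vote over each group of r repeated symbols."""
    out = []
    for i in range(0, len(symbols), r):
        group = symbols[i:i + r]
        out.append(max(set(group), key=group.count))
    return out

def one_hot(s):
    """One-hot encode a symbol from the alphabet {0, ..., Q-1}."""
    return [1.0 if i == s else 0.0 for i in range(Q)]

def inner_encode(vec):
    """Stand-in inner encoder: a trained NN would output learned real-valued
    channel symbols; here a fixed BPSK-like mapping is used for illustration."""
    return [2.0 * v - 1.0 for v in vec]

def inner_decode(rx):
    """Stand-in inner decoder: pick the most likely one-hot symbol (argmax)."""
    return max(range(Q), key=lambda i: rx[i])

def channel(tx, sigma=0.2):
    """Additive Gaussian noise channel."""
    return [x + random.gauss(0.0, sigma) for x in tx]

def transmit(message):
    """Full chain: outer encode, then reuse the SAME inner code per symbol."""
    coded = outer_encode(message)
    decoded = [inner_decode(channel(inner_encode(one_hot(s)))) for s in coded]
    return outer_decode(decoded)

random.seed(0)
print(transmit([0, 3, 1, 2]))
```

The key structural point mirrored from the abstract is that one inner code, applied independently to every outer-code symbol, extends the effective code dimension without retraining separate networks per symbol position.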

Original language: English
Title of host publication: 2023 IEEE Wireless Communications and Networking Conference, WCNC 2023 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665491228
DOIs
State: Published - 2023
Externally published: Yes
Event: 2023 IEEE Wireless Communications and Networking Conference, WCNC 2023 - Glasgow, United Kingdom
Duration: 26 Mar 2023 - 29 Mar 2023

Publication series

Name: IEEE Wireless Communications and Networking Conference, WCNC
Volume: 2023-March
ISSN (Print): 1525-3511

Conference

Conference: 2023 IEEE Wireless Communications and Networking Conference, WCNC 2023
Country/Territory: United Kingdom
City: Glasgow
Period: 26/03/23 - 29/03/23

