Robust Generation of Channel Distributions with Diffusion Models

Muah Kim, Rick Fritschek, Rafael F. Schaefer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Training neural encoders requires a differentiable channel model for backpropagation. This requirement can be bypassed by approximating the channel distribution from pilot signals. A common method for this is the use of generative adversarial networks (GANs). In this paper, we introduce diffusion models (DMs) for channel generation and propose an efficient training algorithm. Our DMs provide a solution that achieves near-optimal end-to-end symbol error rates (SERs). Importantly, DMs outperform GANs in high signal-to-noise ratio regions. In this regime, we explore the trade-off between sample quality and sampling speed. We also show that the right noise scheduling can significantly reduce sampling time with only a minor increase in SER.
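The quality/speed trade-off mentioned in the abstract comes from the number of reverse diffusion steps: a finer noise schedule needs more sampling iterations. The following is a minimal, hypothetical sketch (not the authors' algorithm) of DDPM-style ancestral sampling with a linear beta schedule. To keep it self-contained, the "channel output" is taken to be standard Gaussian, for which the optimal noise predictor is known in closed form, so no network training is needed.

```python
import numpy as np

def make_schedule(T, beta_min=1e-4, beta_max=0.02):
    # Linear beta schedule as in DDPM; a smaller T means fewer
    # reverse steps and hence faster sampling.
    betas = np.linspace(beta_min, beta_max, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    return betas, alphas, alpha_bars

def sample(T, n, rng):
    # Reverse (ancestral) DDPM sampling. For a standard-Gaussian target,
    # the optimal noise predictor is analytic:
    #   eps_hat(x_t, t) = sqrt(1 - alpha_bar_t) * x_t
    betas, alphas, alpha_bars = make_schedule(T)
    x = rng.standard_normal(n)  # start from pure noise x_T
    for t in range(T - 1, -1, -1):
        eps_hat = np.sqrt(1.0 - alpha_bars[t]) * x
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(n) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

rng = np.random.default_rng(0)
fast = sample(T=50, n=20000, rng=rng)    # coarse schedule: fast
slow = sample(T=1000, n=20000, rng=rng)  # fine schedule: slow
```

In this analytic toy both schedules recover the target distribution exactly; with a learned noise predictor, as in the paper, the step count becomes a genuine quality/speed trade-off, which is what the proposed noise scheduling exploits.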

Original language: English
Title of host publication: ICC 2024 - IEEE International Conference on Communications
Editors: Matthew Valenti, David Reed, Melissa Torres
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 330-335
Number of pages: 6
ISBN (Electronic): 9781728190549
DOIs
State: Published - 2024
Externally published: Yes
Event: 59th Annual IEEE International Conference on Communications, ICC 2024 - Denver, United States
Duration: 9 Jun 2024 – 13 Jun 2024

Publication series

Name: IEEE International Conference on Communications
ISSN (Print): 1550-3607

Conference

Conference: 59th Annual IEEE International Conference on Communications, ICC 2024
Country/Territory: United States
City: Denver
Period: 9/06/24 – 13/06/24

Keywords

  • Channel generation
  • diffusion model
  • end-to-end learning
  • generative networks
