Modumer: Modulating Transformer for Image Restoration

Yuning Cui, Mingyu Liu, Wenqi Ren, Alois Knoll

Research output: Contribution to journal › Article › peer-review

Abstract

Image restoration aims to recover clean images from degraded counterparts. While Transformer-based approaches have achieved significant advances in this field, they are limited by high complexity and an inability to capture omni-range dependencies, which hinders their overall performance. In this work, we develop Modumer for effective and efficient image restoration by revisiting the Transformer block and the modulation design, which processes the input through a convolutional block and projection layers and fuses the resulting features via elementwise multiplication. Specifically, within each unit of Modumer, we integrate the cascaded modulation design with the downsampled Transformer block to build the attention layers, enabling omni-kernel modulation and mapping inputs into high-dimensional feature spaces. Moreover, we introduce a bioinspired parameter-sharing mechanism into the attention layers, which not only enhances efficiency but also improves performance. In addition, a dual-domain feed-forward network (DFFN) strengthens the representational power of the model. Extensive experimental evaluations demonstrate that the proposed Modumer achieves state-of-the-art performance across ten datasets in five single-degradation image restoration tasks, namely image motion deblurring, deraining, dehazing, desnowing, and low-light enhancement. The model also exhibits strong generalization capabilities in all-in-one image restoration tasks and demonstrates competitive performance in composite-degradation image restoration.
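To illustrate the modulation idea the abstract describes (a convolutional branch and projection layers whose outputs are fused by elementwise multiplication), the following is a minimal PyTorch-style sketch. It is an assumption-based illustration only: the module name, kernel sizes, and layer choices here are hypothetical and do not reproduce the authors' exact Modumer design.

```python
import torch
import torch.nn as nn

class ModulationBlock(nn.Module):
    """Sketch of a modulation layer: a convolutional context branch and a
    projection branch fused via elementwise multiplication. Layer choices
    are illustrative assumptions, not the paper's exact configuration."""

    def __init__(self, channels: int):
        super().__init__()
        # Context branch: pointwise projection followed by a depthwise conv.
        self.context = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, groups=channels),
            nn.GELU(),
        )
        # Value branch: a simple pointwise projection.
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Output projection applied after fusion.
        self.proj_out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fuse the two branches by elementwise multiplication (modulation).
        return self.proj_out(self.context(x) * self.value(x))


if __name__ == "__main__":
    block = ModulationBlock(channels=32)
    feats = torch.randn(1, 32, 64, 64)  # (batch, channels, height, width)
    print(block(feats).shape)           # torch.Size([1, 32, 64, 64])
```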

Keywords

  • All-in-one image restoration
  • composite-degradation image restoration
  • dual-domain learning
  • image restoration
  • modulation design
  • parameter sharing
  • transformer
