EPAFusion: A novel fusion network based on enhancement and progressive aware for infrared-visible images in low-light

Jianhuan Qi, Bo Ni, Qida Yu, Haibin Ni, Xiaoyan Zhou, Jianhua Chang

Research output: Contribution to journal › Article › peer-review

Abstract

Most infrared and visible image fusion algorithms perform well under normal lighting conditions but poorly in low-light environments, because the texture details in visible images are barely visible in the dark. To resolve this issue, we propose EPAFusion, a novel fusion network based on enhancement and progressive aware for infrared-visible images in low-light. EPAFusion brightens low-light scenes while fusing complementary information from the two modalities. Specifically, we design a degraded illumination disentangled network (DIDNet) that eliminates the effect of degraded illumination in the visible image and enhances the feature information of the source image. We then design a progressive complementary aware fusion network (PCAFNet) that integrates the complementary information of the two modalities in a progressive manner to generate fused images with rich texture details and high contrast. Extensive comparative experiments demonstrate that EPAFusion is superior to state-of-the-art fusion algorithms on both subjective and objective evaluation metrics. In particular, low-light enhancement and dual-modal aware fusion contribute more effective information to the fused image.
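The abstract describes a two-stage architecture (enhancement, then progressive fusion) but gives no implementation details. Purely as a rough, hypothetical illustration, the sketch below shows one way such a pipeline could be wired up in PyTorch. All module internals, layer choices, channel widths, and the number of progressive stages are assumptions, not the authors' design; the names DIDNetSketch, PCAFNetSketch, and EPAFusionSketch are invented for this example.

```python
# Hypothetical sketch of the two-stage pipeline the abstract describes.
# DIDNet/PCAFNet internals are NOT specified in the abstract; every
# layer choice, channel width, and stage count below is an assumption.
import torch
import torch.nn as nn


class DIDNetSketch(nn.Module):
    """Stand-in for the degraded illumination disentangled network:
    estimates an illumination map and returns an enhanced visible image."""

    def __init__(self, ch=32):
        super().__init__()
        self.illum = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, vis):
        # Retinex-style assumption: divide out the estimated illumination.
        illum = self.illum(vis).clamp(min=1e-3)
        return (vis / illum).clamp(0, 1)


class PCAFNetSketch(nn.Module):
    """Stand-in for the progressive complementary aware fusion network:
    refines a fused estimate over several stages."""

    def __init__(self, ch=32, stages=3):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, 1, 3, padding=1),
            )
            for _ in range(stages)
        ])

    def forward(self, ir, vis):
        fused = 0.5 * (ir + vis)  # naive initial estimate
        for stage in self.stages:
            # Each stage sees both modalities plus the current estimate
            # and predicts a residual correction (the "progressive" idea).
            fused = fused + stage(torch.cat([ir, vis, fused], dim=1))
        return fused.clamp(0, 1)


class EPAFusionSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.enhance = DIDNetSketch()
        self.fuse = PCAFNetSketch()

    def forward(self, ir, vis):
        vis_enhanced = self.enhance(vis)    # stage 1: low-light enhancement
        return self.fuse(ir, vis_enhanced)  # stage 2: progressive fusion


if __name__ == "__main__":
    ir = torch.rand(1, 1, 128, 128)   # single-channel infrared image
    vis = torch.rand(1, 1, 128, 128)  # single-channel low-light visible image
    print(EPAFusionSketch()(ir, vis).shape)  # torch.Size([1, 1, 128, 128])
```

The residual, stage-wise refinement here is only one plausible reading of "progressive"; the paper may realize it quite differently (e.g., via attention or multi-scale feature aggregation).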

Original language: English
Journal: IEEE Sensors Journal
DOIs
State: Accepted/In press - 2025
Externally published: Yes

Keywords

  • Degraded illumination disentanglement
  • Dual modal aware fusion
  • Image fusion
  • Low-light image enhancement
  • Progressive complementary aware fusion
