TY - JOUR
T1 - EPAFusion
T2 - A novel fusion network based on enhancement and progressive aware for infrared-visible images in low-light
AU - Qi, Jianhuan
AU - Ni, Bo
AU - Yu, Qida
AU - Ni, Haibin
AU - Zhou, Xiaoyan
AU - Chang, Jianhua
N1 - Publisher Copyright:
© 2001-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Most infrared and visible image fusion algorithms demonstrate satisfactory performance under normal lighting conditions but often perform poorly in low-light environments, because the texture details in visible images are barely visible in the dark. To resolve this issue, we propose a novel fusion network based on enhancement and progressive aware for infrared-visible images in low-light (EPAFusion). The EPAFusion network efficiently brightens low-light scenes while fusing complementary information from the two modalities. Specifically, we design a degraded illumination disentangled network (DIDNet) to eliminate the degraded illumination effect in the visible image and enhance the feature information of the source image. We then design a progressive complementary aware fusion network (PCAFNet) to generate fused images with rich texture details and high contrast by integrating the complementary information of the two modalities in a progressive manner. Extensive comparative experiments demonstrate that EPAFusion is superior to state-of-the-art fusion algorithms in terms of both subjective and objective evaluation metrics. In particular, low-light enhancement and dual-modal aware fusion provide more effective information to the fused image.
AB - Most infrared and visible image fusion algorithms demonstrate satisfactory performance under normal lighting conditions but often perform poorly in low-light environments, because the texture details in visible images are barely visible in the dark. To resolve this issue, we propose a novel fusion network based on enhancement and progressive aware for infrared-visible images in low-light (EPAFusion). The EPAFusion network efficiently brightens low-light scenes while fusing complementary information from the two modalities. Specifically, we design a degraded illumination disentangled network (DIDNet) to eliminate the degraded illumination effect in the visible image and enhance the feature information of the source image. We then design a progressive complementary aware fusion network (PCAFNet) to generate fused images with rich texture details and high contrast by integrating the complementary information of the two modalities in a progressive manner. Extensive comparative experiments demonstrate that EPAFusion is superior to state-of-the-art fusion algorithms in terms of both subjective and objective evaluation metrics. In particular, low-light enhancement and dual-modal aware fusion provide more effective information to the fused image.
KW - Degraded illumination divestment
KW - Dual modal aware fusion
KW - Image fusion
KW - Low-light image enhancement
KW - Progressive complementary aware fusion
UR - http://www.scopus.com/inward/record.url?scp=85216077845&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2024.3520499
DO - 10.1109/JSEN.2024.3520499
M3 - Article
AN - SCOPUS:85216077845
SN - 1530-437X
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
ER -