
Infrared and visible image fusion using two-layer generative adversarial network.

Chen, L., Han, J. and Tian, F., 2021. Infrared and visible image fusion using two-layer generative adversarial network. Journal of Intelligent and Fuzzy Systems, 40 (6), 11897-11913.

Full text available as:

PDF (6MB): Infrared and Visible Image Fusion 2020.11.17.pdf - Accepted Version
Available under License: Creative Commons Attribution Non-commercial.

Official URL: https://content.iospress.com/articles/journal-of-i...

DOI: 10.3233/JIFS-210041

Abstract

Infrared (IR) images can distinguish targets from their backgrounds based on differences in thermal radiation, whereas visible images provide texture details with high spatial resolution. The fusion of IR and visible images has many advantages and can be applied to tasks such as target detection and recognition. This paper proposes a two-layer generative adversarial network (GAN) to fuse these two types of images. In the first layer, the network generates fused images using two GANs: one uses the IR image as input and the visible image as ground truth, and the other uses the visible image as input and the IR image as ground truth. In the second layer, the network takes one of the two fused images generated in the first layer as input and the other as ground truth for a third GAN, which generates the final fused image. We evaluate our method on the TNO and INO data sets, comparing it against ten other methods using eight objective evaluation metrics. The results demonstrate that our method outperforms state-of-the-art approaches in preserving both texture details and thermal information.
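The two-layer pipeline described above can be sketched as follows. This is a minimal illustrative skeleton, not the authors' implementation: the function names are hypothetical, and a trained GAN generator is replaced by a toy blending function so the data flow through the two layers is runnable.

```python
import numpy as np

def toy_generator(source, target):
    # Stand-in for a trained GAN generator. A real generator would be a
    # neural network trained adversarially to map `source` toward the
    # characteristics of `target` (its ground truth); here we simply
    # blend the two arrays so the pipeline structure can be executed.
    return 0.5 * source + 0.5 * target

def two_layer_fusion(ir, vis):
    # Layer 1: two GANs, each taking one modality as input and the
    # other modality as ground truth.
    fused_a = toy_generator(ir, vis)   # IR as input, visible as ground truth
    fused_b = toy_generator(vis, ir)   # visible as input, IR as ground truth
    # Layer 2: one first-layer output is the input and the other is the
    # ground truth for a third GAN, yielding the final fused image.
    return toy_generator(fused_a, fused_b)

# Toy single-channel images standing in for a registered IR/visible pair.
ir = np.random.rand(64, 64)
vis = np.random.rand(64, 64)
fused = two_layer_fusion(ir, vis)
print(fused.shape)  # (64, 64)
```

In this sketch both layers share one blending stand-in; in the paper's architecture each stage is a separately trained adversarial network with its own discriminator.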

Item Type:Article
ISSN:1064-1246
Uncontrolled Keywords:Infrared and visible images; Image fusion; Generative adversarial network; Deep learning
Group:Faculty of Science & Technology
ID Code:35907
Deposited On:18 Aug 2021 13:52
Last Modified:18 Aug 2021 13:52
