
Research Article
A Visible and Passive Millimeter Wave Image Fusion Algorithm Based on Pulse-Coupled Neural Network in Tetrolet Domain for Early Risk Warning

Yuanjiang Li,1,2 WeiYang Ye,1 Jian Fei Chen,3 Miao Gong,1 Yousai Zhang,1 and Feng Li1

1School of Electronics and Information, Jiangsu University of Science and Technology, Zhenjiang 212003, China
2State Key Laboratory of Millimeter Waves, Southeast University, Nanjing 210096, China
3School of Optoelectronic Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210042, China

Correspondence should be addressed to Yousai Zhang; [email protected]

Received 6 October 2017; Revised 16 February 2018; Accepted 5 March 2018; Published 8 April 2018

Academic Editor: Erik Cuevas

Copyright © 2018 Yuanjiang Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

An algorithm based on pulse-coupled neural network (PCNN) constructed in the Tetrolet transform domain is proposed for the fusion of visible and passive millimeter wave images in order to effectively identify concealed targets. The Tetrolet transform is applied to build the framework of the multiscale decomposition due to its high degree of sparsity. Meanwhile, a Laplacian pyramid is used to decompose the low-pass band of the Tetrolet transform to improve the approximation performance. In addition, the maximum criterion based on the regional average gradient is applied to fuse the top layers, along with selecting the maximum absolute values for the other layers. Furthermore, an improved PCNN model is employed to enhance the contour features of the hidden targets and obtain the fusion results of the high-pass band based on the firing time. Finally, the inverse Tetrolet transform is exploited to obtain the fused results. Some objective evaluation indexes, such as information entropy, mutual information, and Q^{AB/F}, are adopted for evaluating the quality of the fused images. The experimental results show that the proposed algorithm is superior to other image fusion algorithms.

1. Introduction

Both the active mode and the passive mode are used to detect concealed objects. The active detection mode usually relies on the strong penetrability of special rays, and this (i.e., the irreversible radiation) easily damages the tested material and human health. On the contrary, the passive detection mode plays an important role in the field of threat precaution due to its safety. It depends on the difference in spectral radiation between the targets of interest and their surroundings to recognize concealed objects. For example, a metal gun hidden on the abdomen of a man is shown in Figure 1, which illustrates a passive imaging scene. When the man passes through a passive millimeter wave (PMMW) system, the gun reflects the brightness temperature of the cold air at millimeter wavelengths. Meanwhile, some completely different pixels are generated to describe the brightness-temperature information in the PMMW images, which leads to an obvious contrast between the gun and the human body. The PMMW imaging produces interpretable imagery without irradiating the targets. It has the capability to penetrate low-visibility conditions and some obstacles such as textile materials [1]. Therefore, the concealed objects under clothing can reasonably be identified by a PMMW imaging system. The target signature formed in the PMMW images differs from the surroundings, which enables automated target detection [2].

The radiometer array captures radiated energy restricted by the antenna aperture, so every pixel of a PMMW image actually reflects a weighted average of the regional radiation in the millimeter wave (MMW) band. Low-resolution images are usually obtained due to the diffraction limit and the low signal level. Meanwhile, the sensitivity of the sensor and the environmental radiation are the key factors affecting

Hindawi, Mathematical Problems in Engineering, Volume 2018, Article ID 4205308, 11 pages, https://doi.org/10.1155/2018/4205308

2 Mathematical Problems in Engineering

Figure 1: The schematic diagram of the PMMW imaging (MMW imaging sensor array, imaging object, PMMW images).

feature expression of the concealed objects. As a result, the imaging quality is insufficient to support follow-up tasks such as target recognition and localization. There is an inevitable limitation in detecting concealed targets based on any single type of sensor or method. Additionally, the visible image has good readability and rich scene details but cannot reveal the concealed objects. The infrared sensor is used to obtain thermal radiation information of the hidden targets, but it is inferior to the PMMW sensor when detecting metal objects [2]. Multisource information fusion has maintained strong vitality for obtaining multidimensional information about the targets of interest under complex viewing conditions. It is effective for improving the validity and accuracy of recognizing concealed objects with a comprehensive description of a scene. The fused results integrate complementary and redundant information from the source images and provide a more sufficient description of the targets than any single source image [3]. Song et al. [4] proposed a novel region-based algorithm based on the Expectation-Maximization (EM) algorithm and the Normalized Cut (Ncut) algorithm: a region growing algorithm is used to extract the potential target regions, and a statistical model with a Gaussian mixture distortion is applied to produce the fusion image. Xiong et al. [5] proposed a novel algorithm based on clustering and the nonsubsampled contourlet transform (NSCT); the fusion image is obtained by taking the inverse NSCT of the fusion coefficients. These fusion algorithms are meaningful for obtaining the concealed information.
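Since every pixel of a PMMW image reflects a weighted average of the regional radiation restricted by the antenna aperture, the low-resolution, noisy character of PMMW imagery can be mimicked by a toy degradation model. The sketch below is not from the paper: the Gaussian antenna pattern, kernel width, and noise level are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(size=9, sigma=2.0):
    """1-D normalized Gaussian, standing in for an antenna pattern cut."""
    ax = np.arange(size) - size // 2
    k = np.exp(-ax**2 / (2 * sigma**2))
    return k / k.sum()

def simulate_pmmw(scene, size=9, sigma=2.0, noise_std=0.02, seed=0):
    """Blur a radiometric scene with a separable Gaussian 'antenna pattern'
    and add sensor noise -- a toy stand-in for aperture-limited imaging."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(scene, pad, mode="reflect")
    # separable convolution: rows first, then columns
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)
    rng = np.random.default_rng(seed)
    return blurred + rng.normal(0.0, noise_std, blurred.shape)
```

A point source in the scene spreads into a Gaussian spot, which is the "weighted average of regional radiation" described above.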

The fused images contain much more comprehensive and accurate information than a single image. This is widely exploited in the fields of military, medical science, remote sensing, and machine vision [9]. In particular, multiscale transforms are usually applied to achieve a sparse representation of the source images, and the final images are obtained through fusion in accordance with certain rules. There are several types of multiscale transform, such as the discrete wavelet transform (DWT) [10], the Curvelet transform (CT) [11], the NSCT, and the Tetrolet transform (TT) [12, 13]. The DWT is suitable for dealing with singular signals but is limited in describing linear signals, and the CT is suitable for approximating closed curves. The NSCT not only inherits the anisotropy of the CT but also adds multidirectionality and translation invariance. The TT executes a sparse decomposition of the source images due to its excellent capability for multiscale geometry analysis. Krommweh first proved that the TT is better than the DWT, CT, and NSCT at describing geometric

structure characteristics [13]. Shi et al. presented a hybrid method for image approximation using the TT and wavelets [14]; the core of the algorithm is the further sparse representation of the low-pass band in the TT domain. After that, some scholars began to explore the possibility of introducing the TT into multisource image fusion. For example, Huang et al. proposed different rules for the low- and high-pass coefficients: the local region gradient information was applied to obtain the low-pass fusion coefficients, and the larger region edge information measurement factor is used to select the better coefficients for fusion [15]. Shen et al. proposed an improved algorithm based on the TT for fusing infrared and visible images [16]; an optimization algorithm named compressive sampling matching pursuit (CoSaMP) is used to determine the fusion coefficients. The PMMW images contain relatively little information due to the detection principle, and the CoSaMP algorithm usually causes a certain loss of useful information. Yan et al. introduced a regional gradient into the fusion process in the TT domain [17]; the fused results are better than those of algorithms based on the wavelet transform and principal component analysis (PCA). However, the low-frequency coefficients of the TT contain a small amount of detail, such as edge and corner features; if these details are neglected, the fused results lose many of the targets' contours. Zhang et al. proposed a Laplacian pyramid for decomposing the low-frequency portion of the TT and proved that the Laplacian pyramid is conducive to improving the capability of describing details [18]. The results show that this algorithm performs well when fusing multichannel satellite cloud images: if the source images have similar structural characteristics, the method preserves image edges and curvature geometric structure well. However, due to the characteristic differences between the visible and PMMW images, the contour features of the concealed objects can be submerged in the background easily.

The pulse-coupled neural network (PCNN) is known as a third-generation neural network, developed by Eckhorn et al. in 1990. It was founded on experimental observations of synchronous pulse bursts in the cat and monkey visual cortex [19]. Although the PCNN achieves excellent results, PCNN-based fusion methods are complex and inefficient when dealing with different source images. Wang et al. illustrated that the number of channels and parameters of the PCNN limits its application [20]. Many researchers have improved the original PCNN model to make it more appropriate


Figure 2: The five kinds of tetrominoes.

for image fusion. For example, Deng and Ma proposed an improved PCNN model and set the initial parameters based on the maximum gray level of the normalized source images [21]. Chenhui and Jianying [22] decomposed the visible and PMMW images in the multiband wavelet domain; the fusion rule for the low-frequency coefficients is based on local variance, and the high-frequency coefficients are fused by an adaptive PCNN. Xiong et al. [23] adopted the CT for obtaining the coefficients at different scales. The potential target regions of the PMMW image are conducive to determining the fusion coefficients, which takes advantage of the particular ability of the PMMW image in presenting metal targets. So the fusion coefficients are determined by the features of the PMMW image based on a region growing algorithm, and an improved PCNN is selected for the fine scale, which enhances the fusion performance by integrating the important information of the visible and PMMW images. However, the result of region growing is restricted by three major factors, namely, the initial growth point, the growth criterion, and the terminating condition, which directly affect the success rate of potential target extraction.

In this work, we adopt a generic framework for the fusion of the source images instead of extracting the target region of the PMMW image. Both the TT and the improved PCNN are applied to fuse the visible and PMMW images with different rules. Meanwhile, the PCNN is used to enhance the clarity and contrast of the hidden targets. The rest of this paper is organized as follows. The principles of the TT and the PCNN are illustrated in Section 2. The proposed fusion method is described in Section 3. The results and analysis of experiments are shown in Section 4. Finally, Section 5 concludes the work.

2. The Theory of the TT and PCNN

2.1. The Theory of the TT. The TT possesses a smaller support domain and avoids the Gibbs phenomenon at the edges of images. The five basic structures (tetrominoes) of the TT are shown in Figure 2.

Suppose a source image is expressed as a^0 = (a[i, j])_{i,j=0}^{N−1}, N = 2^J (J ∈ ℕ). The decomposition process of the TT is as follows.

(I) Primary Decomposition. The low-pass image a^{r−1} is divided into blocks Q_{i,j}, i, j = 0, ..., N/4^r − 1.

(II) Tetrominoes Selection. For each admissible tetromino covering c of a block, the low-pass coefficients are defined as

a^{r,(c)} = (a^{r,(c)}[s])_{s=0}^{3},  a^{r,(c)}[s] = Σ_{(m,n) ∈ I_s^{(c)}} ε[0, L(m, n)] a^{r−1}[m, n],   (1)

and then the three high-pass coefficients for l = 1, 2, 3 are given by

w_l^{r,(c)} = (w_l^{r,(c)}[s])_{s=0}^{3},  w_l^{r,(c)}[s] = Σ_{(m,n) ∈ I_s^{(c)}} ε[l, L(m, n)] a^{r−1}[m, n].   (2)

Thus the optimal covering c* is

c* = argmin_c Σ_{l=1}^{3} ‖w_l^{r,(c)}‖_1 = argmin_c Σ_{l=1}^{3} Σ_{s=0}^{3} |w_l^{r,(c)}[s]|.   (3)

The optimal Tetrolet decomposition in the first phase is [a^{r,(c*)}, w_1^{r,(c*)}, w_2^{r,(c*)}, w_3^{r,(c*)}].

(III) Rearranging Coefficients. The low-frequency coefficients of each block are rearranged into 2 × 2 blocks. Then steps (I) and (II) are repeated for sparse representation.

(IV) Image Reconstruction. The fused image is reconstructed based on the low-pass and high-pass coefficients and the corresponding coverings.

The flow chart of the TT is shown in Figure 3.
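The covering selection of (1)–(3) can be sketched numerically. In the toy example below, ε is taken as a Haar-type 4 × 4 analysis matrix (one common choice), and only two of the admissible coverings of a 4 × 4 block are enumerated, rather than the full Tetrolet dictionary:

```python
import numpy as np

# Haar-type analysis matrix: row 0 gives the low-pass value of a tetromino
# (4 pixels); rows 1-3 give its three high-pass values.
EPS = 0.5 * np.array([[ 1,  1,  1,  1],
                      [ 1, -1,  1, -1],
                      [ 1,  1, -1, -1],
                      [ 1, -1, -1,  1]], dtype=float)

# Two example coverings of a 4x4 block, each a list of four tetrominoes.
# (The full Tetrolet dictionary has many more coverings; these are illustrative.)
SQUARE = [[(0,0),(0,1),(1,0),(1,1)], [(0,2),(0,3),(1,2),(1,3)],
          [(2,0),(2,1),(3,0),(3,1)], [(2,2),(2,3),(3,2),(3,3)]]
BARS   = [[(0,0),(0,1),(0,2),(0,3)], [(1,0),(1,1),(1,2),(1,3)],
          [(2,0),(2,1),(2,2),(2,3)], [(3,0),(3,1),(3,2),(3,3)]]

def analyze(block, covering):
    """Eqs. (1)-(2): one low-pass and three high-pass values per tetromino."""
    coeffs = np.array([EPS @ [block[m, n] for (m, n) in tet] for tet in covering])
    return coeffs[:, 0], coeffs[:, 1:]          # shapes (4,), (4, 3)

def best_covering(block, coverings):
    """Eq. (3): choose the covering with the sparsest high-pass part (min l1)."""
    costs = [np.abs(analyze(block, c)[1]).sum() for c in coverings]
    return int(np.argmin(costs))
```

For a block of horizontal stripes the bar covering yields all-zero high-pass coefficients, so it wins the argmin; this is the sparsity that the TT exploits.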

2.2. The Theory of the PCNN. The neuron model of the PCNN is described as follows [20]:

F_ij(n) = exp(−α_F) F_ij(n−1) + V_F Σ w_ijkl Y_ij(n−1) + S_ij,
L_ij(n) = exp(−α_L) L_ij(n−1) + V_L Σ M_ijkl Y_ij(n−1),
U_ij(n) = F_ij(n) (1 + β L_ij(n)),
Y_ij(n) = 1 if U_ij(n) > T_ij(n); Y_ij(n) = 0 otherwise,
T_ij(n) = exp(−α_T) T_ij(n−1) + V_T Σ Y_ij(n),   (4)

where S_ij and F_ij(n) denote the external input stimulus and the feedback of neuron N(i, j), respectively; U_ij(n) and T_ij(n) represent the internal activity of the neuron and the dynamic threshold, respectively; and L_ij(n) is the linking item. Y_ij(n) ∈ {0, 1} denotes the pulse output of N(i, j). M_ijkl and w_ijkl denote the relationships between the current neuron and the surrounding neurons. β is the linking strength or linking coefficient; α_F, α_L, and α_T are the attenuation time constants; and V_F, V_L, and V_T denote the inherent voltage potentials of F_ij(n), L_ij(n), and T_ij(n), respectively.

Figure 3: The diagram of the TT. (The diagram shows a 64 × 64 image successively split into 32 × 32, 16 × 16, and 8 × 8 low-pass and high-pass parts; each 4 × 4 block is rearranged into a 2 × 2 low-pass block and 12 × 1 high-pass coefficients.)

The complexity limits the application of the PCNN in the field of image fusion. Most of the parameters are difficult to set because of changes in the source images, so they are commonly adjusted through extensive experiments and experience. The PCNN relies on the synchronous pulse distribution phenomenon to give rise to pixel changes. Because the mathematical coupling characteristic of the PCNN itself can overwrite the biological characteristics, an improved PCNN and a parameter-setting basis are used to eliminate the coupling characteristic [21]. We adopt this optimized model for fusing the high-pass coefficients. Thereby, the improved model is given by

F_ij(n) = S_ij,
U_ij(n) = F_ij(n) [D + D Σ M_ijkl Y_ij(n−1)],
Y_ij(n) = ε[U_ij(n) − T_ij(n)],
T_ij(n) = exp(−α_T) T_ij(n−1) + V_T Σ Y_ij(n−1),   (5)

where D is the normalization parameter establishing the weak coupling connection and ε[·] is the unit step function. The spatial frequency (SF) is fit for motivating the PCNN directly [6]. It reflects the gradient features of images in the transform domain and is considered an effective external input stimulus of the PCNN. Let ψ_ijkl represent the coefficient located at (i, j) in the kth subband at the lth decomposition level. These parameters are

S_ij = Σ_{(i,j) ∈ [3,3]} [(ψ_ijkl − ψ_{i−1,j,k,l})² + (ψ_ijkl − ψ_{i,j−1,k,l})²],   (6)

M = [ 0.5/D   1/D   0.5/D
      1/D     1     1/D
      0.5/D   1/D   0.5/D ],   (7)

where S_ij denotes the spatial frequency of the high-pass domain and S_max is the maximum gray level of the normalized source images. Let D = 25 S_max, V_T = S_max, and α_T = 0.0001. In addition, if the cross entropy is bigger than the previous one during the iterative process, the cyclic process of the PCNN is terminated.
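With these settings, the improved model (5)–(7) becomes a feed-forward iteration driven by the spatial frequency. The sketch below follows (5)–(7) but fixes details the extracted text leaves open: the initial threshold, edge handling, a fixed iteration count instead of the cross-entropy stopping test, and our reading of the elided symbol in the setting as D = 25 S_max.

```python
import numpy as np

def spatial_frequency(psi):
    """Eq. (6): 3x3 neighborhood sum of squared first-order differences."""
    dx = np.zeros_like(psi); dy = np.zeros_like(psi)
    dx[1:, :] = (psi[1:, :] - psi[:-1, :]) ** 2    # vertical differences
    dy[:, 1:] = (psi[:, 1:] - psi[:, :-1]) ** 2    # horizontal differences
    g = dx + dy
    p = np.pad(g, 1, mode="edge")                  # edge padding: our assumption
    h, w = g.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def improved_pcnn(psi, n_iter=20, alpha_T=1e-4):
    """Improved PCNN of Eq. (5) with kernel (7); returns the firing-time map."""
    S = spatial_frequency(psi)
    S_max = S.max() if S.max() > 0 else 1.0
    D = 25 * S_max                                 # reading 'D = 25 * S_max'
    V_T = S_max
    M = np.array([[0.5 / D, 1.0 / D, 0.5 / D],     # Eq. (7)
                  [1.0 / D, 1.0,     1.0 / D],
                  [0.5 / D, 1.0 / D, 0.5 / D]])
    T = np.ones_like(S) * S_max                    # initial threshold: assumption
    Y = np.zeros_like(S); fire = np.zeros_like(S)
    for _ in range(n_iter):
        P = np.zeros_like(S)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                P += M[di + 1, dj + 1] * np.roll(np.roll(Y, di, 0), dj, 1)
        U = S * (D + D * P)                        # Eq. (5), with F_ij = S_ij
        Y = (U - T > 0).astype(float)              # unit step epsilon[U - T]
        T = np.exp(-alpha_T) * T + V_T * Y
        fire += Y
    return fire
```

Pixels with high spatial frequency (edges and contours) fire early and repeatedly, so the firing map highlights exactly the contour features the fusion rule wants to keep.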

3. The TT-PCNN

No transformation achieves a complete approximation of all image details, due to the inherent defects of multiscale transforms, yet these details usually contain important features of the targets. We use the Laplacian pyramid to decompose the low-pass band of the source images in the TT domain. The remaining details of the concealed objects usually exist in the top layer of the Laplacian pyramid. Meanwhile, a rule based on the regional average gradient is applied to fuse and enhance the details of objects that are sensitive to human vision. In addition, the detailed features of the hidden targets in the PMMW images, limited by various factors such as the imaging style and electronic noise, are important for subsequent recognition. We therefore feed the coefficients of the high-pass band into the PCNN in advance: the enhancement operator of the PCNN has the capability of enhancing the details of the hidden targets, which is beneficial to subsequent target recognition. Additionally, we fuse the high-pass coefficients of the visible and PMMW images based on the SF. The TT-PCNN and its fusion rules are shown in Figure 4.

Step 1. Decompose the source images into the low-pass and high-pass subbands via the TT. The resulting coefficients are expressed as the high-pass coefficients T_HPA and T_HPB and the low-pass coefficients T_LPA and T_LPB.

Step 2. T_LPA and T_LPB are decomposed by the Laplacian pyramid into L_PA and L_PB, respectively. The fusion rule based on the regional gradient is used for fusing the top layers of L_PA and L_PB. Suppose that LPA_top(i, j) is the value of L_PA located

Figure 4: The flow chart of the TT-PCNN. The visible image A and the PMMW image B are decomposed by the Tetrolet transform into low-pass coefficients T_LPA, T_LPB and high-pass coefficients T_HPA, T_HPB. The low-pass coefficients are decomposed by the Laplacian pyramid into L_PA and L_PB; the top layer is fused based on the regional average gradient and the other layers based on the maximum absolute value, followed by fusion and the inverse pyramid transform. The high-pass coefficients are enhanced by the improved PCNN and fused based on the spatial frequency and firing time. The inverse Tetrolet transform then outputs the fusion result.

at the top layer. The regional space centered at LPA_top(i, j) is 3 × 3, and LPF_top(i, j) is the fused result of LPA_top(i, j) and LPB_top(i, j); LPB_top(i, j) and G_B(i, j) have definitions similar to those of LPA_top(i, j) and G_A(i, j). The regional average gradient G_A(i, j) is expressed as

G_A(i, j) = (1/4) Σ_{i=1}^{2} Σ_{j=1}^{2} √(ΔI_i² + ΔI_j²),   (8)

where ΔI_i and ΔI_j are the first-order differences of LPA_top(i, j) in the two directions. So the fusion rule of the top layer is described as

LPF_top(i, j) = { LPA_top(i, j), G_A(i, j) ≥ G_B(i, j); LPB_top(i, j), G_A(i, j) < G_B(i, j) }.   (9)

In addition, the rule of choosing the largest absolute value is designed to fuse the values of the other layers of L_PA and L_PB.
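The top-layer rule of (8)-(9) amounts to a per-pixel selection by regional average gradient. A minimal sketch (edge padding at the image borders is our assumption):

```python
import numpy as np

def regional_average_gradient(top, i, j):
    """Eq. (8) over the 3x3 region centered at (i, j)."""
    p = np.pad(top, 1, mode="edge")
    r = p[i:i + 3, j:j + 3]                   # 3x3 neighborhood of (i, j)
    di = r[1:, :-1] - r[:-1, :-1]             # 2x2 grid of vertical differences
    dj = r[:-1, 1:] - r[:-1, :-1]             # 2x2 grid of horizontal differences
    return np.sqrt(di ** 2 + dj ** 2).mean()  # mean over 4 = the (1/4) sum

def fuse_top_layer(top_a, top_b):
    """Eq. (9): per pixel, keep the coefficient with the larger regional
    average gradient (ties go to image A, matching the '>=' in (9))."""
    fused = np.empty_like(top_a)
    for i in range(top_a.shape[0]):
        for j in range(top_a.shape[1]):
            ga = regional_average_gradient(top_a, i, j)
            gb = regional_average_gradient(top_b, i, j)
            fused[i, j] = top_a[i, j] if ga >= gb else top_b[i, j]
    return fused
```

Near an edge in one source the gradient of that source dominates, so the fused top layer inherits that source's coefficient there.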

Step 3. Apply the inverse Laplacian pyramid and obtain the fusion result T_LPF.

Step 4. The enhancement of the target areas is based on the improved PCNN. Suppose that T_HPB(i, j) represents the coefficients of T_HPB and let F_ij(n) = T_HPB(i, j). Meanwhile, the other parameters keep the same settings as in (7).

Step 5. This is the final fusion of the high-pass coefficients. The SF is obtained from (6) in a sliding 3 × 3 window and is the input of the improved PCNN. The fusion rule is designed as


T_HPF(i, j) = { T_HPFA(i, j), Fire_A(i,j)(n) ≥ Fire_B(i,j)(n); T_HPFB(i, j), Fire_A(i,j)(n) < Fire_B(i,j)(n) },   (10)

where Fire_(i,j)(n) denotes the firing time of each coefficient, which is given by

Fire_(i,j)(n) = Fire_(i,j)(n−1) + Y_ij(n).   (11)

Step 6. Use the selected coefficients to reconstruct the fused image via the inverse TT.
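Once the improved PCNN has been run on the spatial frequency of each high-pass band, rules (10)-(11) reduce to a per-coefficient selection by accumulated firing time. A minimal sketch (ties are resolved in favor of image A, matching the '≥' in (10)):

```python
import numpy as np

def accumulate_firing(pulses):
    """Eq. (11): Fire(n) = Fire(n-1) + Y(n), summed over all iterations.
    `pulses` is an iterable of binary pulse maps Y(n)."""
    return sum(pulses)

def fuse_high_pass(hp_a, hp_b, fire_a, fire_b):
    """Eq. (10): at each position, keep the high-pass coefficient of the
    source whose PCNN neuron has fired more often."""
    return np.where(fire_a >= fire_b, hp_a, hp_b)
```

Because firing time tracks spatial frequency, this rule transfers the sharper contour of the two sources into the fused high-pass band.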

4. Experimental Results and Performance Analysis

4.1. Evaluation Criteria. The existing metrics are classified into three categories: statistics-based, information-based, and human-visual-system-based classes. Selecting metrics with smaller mutual correlation is beneficial to the objectivity of the evaluation [24]. The statistics-based metrics are easily influenced by the pseudoedges of targets, so we evaluate the fusion performance based on information-based and human-visual-system-based metrics. The information-based evaluation indexes mainly contain information entropy (IE) and mutual information (MI) [25]. Moreover, Q^{AB/F} is a representative model in the evaluation system based on human vision, since it has strong correlation with other human-visual-system-based metrics [26]. These formulas are shown as follows.

IE:

H(X) = Σ_{i=1}^{n} P(x_i) I(x_i) = −Σ_{i=1}^{n} P(x_i) log_b P(x_i).   (12)

MI:

MI = Σ_{f,a} p_FA(f, a) log [p_FA(f, a)/(p_F(f) p_A(a))] + Σ_{f,b} p_FB(f, b) log [p_FB(f, b)/(p_F(f) p_B(b))].   (13)

Q^{AB/F}:

Q^{AB/F} = [Σ_{n=1}^{N} Σ_{m=1}^{M} (Q^{AF}(n, m) w^A(n, m) + Q^{BF}(n, m) w^B(n, m))] / [Σ_{i=1}^{N} Σ_{j=1}^{M} (w^A(i, j) + w^B(i, j))],   (14)

where 119875(119909119894) is the probability mass function of the inputimages119901119865119883(119886 119887)119901119860(119886) and119901119861(119886) is obtained by simple nor-malization of the joint and marginal histograms of the inputimages 119876119860119865(119899119898) and 119876119861119865(119899119898) are weighted by the coeffi-cients of the edge preservation values119908119860(119899119898) and119908119861(119899119898)reflect the perceptual importance of the corresponding edgeelements IE reflects the amount of average information inthe fused images MI reflects detailed information which

is obtained from source images whereas the metric 119876119860119861119865computes and measures the amount of edge informationtransferred from source images into the fused results Inaddition a larger value of these metrics means a better fusionresults
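The information-based metrics (12)-(13) can be computed directly from gray-level histograms. A sketch (256 gray levels and base-2 logarithms are our assumptions; Q^{AB/F} is omitted because its edge-preservation terms require the full gradient model of [26]):

```python
import numpy as np

def entropy(img, bins=256):
    """IE, Eq. (12): Shannon entropy of the gray-level histogram."""
    counts, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = counts / counts.sum()
    p = p[p > 0]                                   # drop empty bins (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())

def mutual_information(f, a, bins=256):
    """One term of MI, Eq. (13): I(F; A) from the joint gray-level histogram."""
    joint, _, _ = np.histogram2d(f.ravel(), a.ravel(),
                                 bins=bins, range=[[0, bins]] * 2)
    p_fa = joint / joint.sum()
    p_f = p_fa.sum(axis=1, keepdims=True)          # marginal of the fused image
    p_a = p_fa.sum(axis=0, keepdims=True)          # marginal of the source image
    nz = p_fa > 0
    return float((p_fa[nz] * np.log2(p_fa[nz] / (p_f @ p_a)[nz])).sum())

def fusion_mi(fused, src_a, src_b, bins=256):
    """MI = I(F; A) + I(F; B), as in Eq. (13)."""
    return mutual_information(fused, src_a, bins) + mutual_information(fused, src_b, bins)
```

For identical images, I(F; A) equals the entropy of the image, which is a handy sanity check on the implementation.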

The source images, provided by ThermoTrex Corporation, are shown in Figure 5. There are three soldiers with a gun and a grenade, displayed in Figure 5(a). Due to the limitation of penetrability, the information on the targets under clothing is not included in the visible image, but it contains rich environmental details about the imaging scene. In contrast, Figure 5(b) is the PMMW image. The bright part of the PMMW image reflects the location and shape information of the concealed objects. The outline of the gun and the grenade is detected by the MMW owing to its penetrability, while the contours of the three soldiers are heavily blurred; it is difficult to recognize the lawn in the PMMW image. In the subsequent sections, we use different wavelets and fusion rules to acquire the results in order to prove the effectiveness of the proposed algorithm.

4.2. The First Group of the Fused Results. The first group of the fused results is performed on the PMMW and visible images. Figure 6 illustrates the source images and the fusion results obtained by different wavelets. The fusion results achieved by the DWT, CT, NSCT, TT, and TT-PCNN are displayed in Figures 6(a)–6(e). The fusion rule adopted for these wavelets is the same as the description of the TT [15]. As can be seen from Figures 6(a)–6(e), the five methods successfully fuse the PMMW and visible images, and all the fused images contain the concealed-object information and background information. However, it can be found that the fused result obtained by the DWT has many artifacts due to the lack of shift-invariance; the contour of the gun is a little blurred, caused by the pseudo-Gibbs phenomena. The CT, NSCT, and TT achieve a better performance than the DWT method. The CT has superior performance in depicting edge details, so the concealed gun has complete structural features for recognition; however, if the background characteristics of the source images differ significantly, the CT usually decreases the image contrast. Due to the shift-invariance of the NSCT, the pseudo-Gibbs phenomenon is eliminated successfully, but, limited by the fusion rules, the concealed targets have low contrast, which seriously impacts risk identification. Since the TT has a superior capacity to describe smooth regions and local details, its fused result achieves a better effect than the above methods. The proposed method provides the best visual effects: almost all the useful information of the concealed objects is transferred to the fused image, and fewer artifacts are introduced during the fusion process. Table 1 shows the evaluation results of the five methods. The IE of the fused images obtained by the DWT and the TT is bigger than that of the TT-PCNN due to the introduction of invalid information. The MI obtained by the TT-PCNN is the maximum, which illustrates that the fused image extracts more information from the original images. Furthermore, Q^{AB/F} of the TT-PCNN is the maximum, which indicates that the proposed algorithm preserves the detailed

Mathematical Problems in Engineering 7

(a) The visible image (b) The PMMW image

Figure 5: The source images.

(a) DWT (b) CT

(c) NSCT (d) TT

(e) TT-PCNN

Figure 6: The fused results obtained by different wavelets and the TT-PCNN.


Table 1: The comparison of the fused results.

Method type    IE        MI        $Q^{AB/F}$
DWT            7.1282    7.7468    0.0191
CT             6.3092    8.7047    0.0078
NSCT           5.9518    9.0469    0.0239
TT             6.9637    7.9843    0.0255
TT-PCNN        6.7853    11.5794   0.0377

(a) CT-PCNN (b) NSCT-PCNN

(c) NSCT (d) TT-PCNN

Figure 7: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN.

information and extracts more edge information from the source images effectively. The objective evaluation is consistent with the visual observation.

4.3. The Second Group of the Fused Results. The fusion results of the NSCT-PCNN, CT-PCNN, NSCT, and TT-PCNN are displayed in Figures 7(a)–7(d). As can be seen from Figures 7(a)–7(d), all of the methods successfully fuse the PMMW and visible images, and all the fused images still contain the concealed-target information. However, the fused image obtained by the CT-PCNN still has low contrast due to the background differences between the source images, which is a common problem of the CT-based methods, while the NSCT-PCNN and NSCT achieve a better performance than the CT-PCNN. The pseudo-Gibbs phenomenon is eliminated owing to the shift-invariance of the NSCT. It is proven that the PCNN is conducive to enhancing the details of interesting targets, so the PCNN is beneficial to the fusion of visible and PMMW images. But the concealed objects and background still have low contrast; in particular, the information of the grenade is difficult to discriminate. The TT-PCNN provides better visual effects: the detailed information of the gun and grenade is preserved well. Table 2 shows the evaluation results of the four methods. The IE of the fused image achieved by the TT-PCNN is the second largest, which means that the fused result contains a lot of information inherited from the source images. The MI and $Q^{AB/F}$ of the fused image obtained by the TT-PCNN gain the largest values, which demonstrates that the proposed algorithm extracts abundant image information from the source images and achieves high contrast.

4.4. The Third Group of the Fused Results. As shown in Figure 8, the source images and fused results are displayed. Figures 8(a) and 8(b) are the visible image and the PMMW


(a) The visible image (b) The PMMW image

(c) CT-PCNN (d) NSCT-PCNN

(e) NSCT (f) TT-PCNN

Figure 8: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN.


Table 2: The comparison of the fused results.

Method type      IE        MI        $Q^{AB/F}$
NSCT-PCNN [6]    6.8243    8.4638    0.0361
CT-PCNN [7]      6.5108    8.2892    0.0088
NSCT [8]         6.7329    8.4112    0.0299
TT-PCNN          6.7853    11.5794   0.0377

Table 3: The comparison of the fused results.

Method type      IE        MI        $Q^{AB/F}$
NSCT-PCNN [6]    7.2078    3.7030    0.5382
CT-PCNN [7]      7.6778    5.6082    0.4273
NSCT [8]         7.7781    4.7641    0.5802
TT-PCNN          7.8731    4.7673    0.5811

image. A single 94-GHz radiometer on a scanning 24-inch dish antenna is used to detect the MMW energy of concealed weapons [27]. As can be seen from Figures 8(c)–8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast; however, these two methods still enhance useless information, such as the radiated information of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN receives the maximum values compared with the other algorithms, which proves that the fused result of the proposed method contains abundant target information and preserves more object features well.

5. Conclusion

In this paper, an improved PCNN for the fusion of the PMMW and visible image is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contour of the concealed targets. Then a Laplacian pyramid is introduced for the decomposition of the low-pass band after the TT. Next, the SF is applied to motivate the improved PCNN neurons. The flexible multiresolution of the TT is thus associated with the global coupling and pulse synchronization characteristics of the PCNN. Finally, several groups of experiments are conducted to evaluate the fusion performance. The results show that the proposed algorithm has superior performance in fusing visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund of Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves under Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of Concealed Objects in Passive Millimeter-Wave Images Based on the Gaussian Mixture Model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400–421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of the Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113–121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum, ICCP and HSIC 2013, pp. 319–323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A Novel Image Fusion Algorithm for Visible and PMMW Images based on Clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering, ICCAE 2016, pp. 1–5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508–1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849–853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205–207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74–84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik - International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811–819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation Curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32–42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229–2237, 2009.

[13] J. Krommweh, "Tetrolet transform: A new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364–374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949–4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation, YAC 2017, pp. 854–859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Guang Pu Xue Yu Guang Pu Fen Xi/Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506–1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Guangdianzi Jiguang/Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629–1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138–8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965–3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of Image Fusion Based on Pulse-Coupled Neural Network," Archives of Computational Methods in Engineering: State-of-the-Art Reviews, vol. 23, no. 4, pp. 659–671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Tien Tzu Hsueh Pao/Acta Electronica Sinica, vol. 40, no. 5, pp. 955–964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28–32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing, ICSP 2012, vol. 2, pp. 903–907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Zidonghua Xuebao/Acta Automatica Sinica, vol. 40, no. 2, pp. 306–315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," IEEE Electronics Letters, vol. 38, no. 7, pp. 313–315, 2002.

[26] V. Petrović and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion, FUSION 2000, pp. WeC314–WeC319, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39–50, 2003.




Figure 1: The schematic diagram of the PMMW imaging (MMW imaging sensor array, imaging object, and PMMW images).

feature expression about the concealed objects. As a result, the imaging quality is insufficient for supporting follow-up tasks such as target recognition and localization. There is an inevitable limitation in detecting the concealed targets with any single type of sensor or method. Additionally, the visible image has good readability and rich scene details but cannot reveal the concealed objects. The infrared sensor is used to obtain thermal radiation information of the hidden targets but is inferior to the PMMW sensor when detecting metal objects [2]. Multisource information fusion has maintained strong vitality for obtaining multidimensional information about the targets of interest under complex viewing conditions. It is effective for improving the validity and accuracy of recognizing the concealed objects through a comprehensive description of the scene. The fused results integrate complementary and redundant information from the source images and provide a more sufficient description of the targets than any single source image [3]. Song et al. [4] proposed a region-based algorithm built on the Expectation-Maximization (EM) and Normalized Cut (Ncut) algorithms: a region-growing algorithm extracts the potential target regions, and a statistical model with a Gaussian mixture distortion produces the fused image. Xiong et al. [5] proposed an algorithm based on clustering and the nonsubsampled contourlet transform (NSCT); the fused image is obtained by taking the inverse NSCT of the fused coefficients. These fusion algorithms are more meaningful for obtaining the concealed information.

The fused images contain much more comprehensive and accurate information than a single image, which is widely exploited in the fields of the military, medical science, remote sensing, and machine vision [9]. In particular, multiscale transforms are usually applied to achieve a sparse representation of the source images, and the final images are obtained through fusion in accordance with certain rules. There are several types of multiscale transform, such as the discrete wavelet transform (DWT) [10], the Curvelet transform (CT) [11], the NSCT, and the Tetrolet transform (TT) [12, 13]. The DWT is suitable for dealing with singular signals but is limited in describing linear signals, while the CT is suitable for approximating closed curves. The NSCT not only inherits the anisotropy of the CT but also adds multidirectionality and translation invariance. The TT executes a sparse decomposition of the source images owing to its excellent capability of multiscale geometric analysis. Krommweh first showed that the TT is better than the DWT, CT, and NSCT at describing geometric structure characteristics [13]. Shi et al. presented a hybrid method for image approximation using the TT and wavelets [14]; the core of the algorithm is a further sparse representation of the low-pass band in the TT domain. After that, some scholars began to explore the possibility of introducing the TT into multisource image fusion. For example, Huang et al. proposed different rules for the low- and high-pass coefficients: local region gradient information was applied to obtain the low-pass fusion coefficients, and a larger region edge information measurement factor was used to select the better coefficients for fusion [15]. Shen et al. proposed an improved algorithm based on the TT for fusing infrared and visible images [16], in which an optimization algorithm named compressive sampling matching pursuit (CoSaMP) determines the fusion coefficients. The PMMW images contain relatively little information due to their detection principle, and the CoSaMP algorithm usually causes a certain loss of useful information. Yan et al. introduced a regional gradient into the fusion process in the TT domain [17]; the fused results are better than those of algorithms based on the wavelet transform and principal component analysis (PCA). However, the low-frequency coefficients of the TT contain a small amount of detail, such as edge and corner features; if these details are neglected, the fused results lose many of the targets' contours. Zhang et al. proposed a Laplacian pyramid for decomposing the low-frequency portion of the TT and proved that the Laplacian pyramid is conducive to improving the capability of describing details [18]. Their results show that the algorithm performs well when fusing multichannel satellite cloud images: if the source images have similar structural characteristics, it preserves image edges and curvature geometric structure well. However, due to the characteristic difference between the visible and PMMW images, the contour features of the concealed objects can easily be submerged in the background.

The pulse-coupled neural network (PCNN) is known as the third-generation neural network, developed by Eckhorn et al. in 1990. It was founded on experimental observations of synchronous pulse bursts in the cat and monkey visual cortex [19]. Although the PCNN achieves excellent results, PCNN-based fusion methods are complex and inefficient when dealing with different source images. Wang et al. illustrated that the number of channels and parameters of the PCNN limits its application [20]. Many researchers have improved the original PCNN model to make it more appropriate


Figure 2: The five kinds of tetrominoes.

for image fusion. For example, Deng and Ma proposed an improved PCNN model and set the initial parameters based on the maximum gray level of the normalized source images [21]. Chenhui and Jianying [22] decomposed the visible and PMMW images in the multiband wavelet domain; their fusion rule for the low-frequency coefficients is based on local variance, and the high-frequency coefficients are fused with an adaptive PCNN. Xiong et al. [23] adopted the CT for obtaining the coefficients at different scales. The potential target regions of the PMMW image are conducive to determining the fusion coefficients, which takes advantage of the particular ability of the PMMW image in presenting metal targets. So the fusion coefficients are determined by the features of the PMMW image based on the region-growing algorithm, and the improved PCNN is selected for the fine scale, which enhances the fusion performance for integrating the important information of the visible and PMMW images. However, the result of region growing is restricted by three major factors, namely, an initial growth point, a growth criterion, and a terminating condition, which directly affect the success rate of the potential target extraction.

In this work, we adopt a generic framework for the fusion of the source images instead of extracting the target region of the PMMW image. Both the TT and the improved PCNN are applied for fusing the visible and PMMW images with different rules. Meanwhile, the PCNN is used to enhance the clarity and contrast of the hidden targets. The rest of this paper is organized as follows. The principles of the TT and the PCNN are illustrated in Section 2. The proposed fusion method is described in Section 3. The results and analysis of the experiments are shown in Section 4. Finally, Section 5 concludes the work.

2. The Theory of the TT and PCNN

2.1. The Theory of the TT. The TT possesses a smaller support domain and avoids the Gibbs phenomenon at the edges of images. The five basic structures (tetrominoes) of the TT are shown in Figure 2.

Suppose a source image is expressed as $a^0 = (a[i,j])_{i,j=0}^{N-1}$, $N = 2^J$ ($J \in \mathbb{N}$). The decomposition process of the TT is as follows.

(I) Primary Decomposition. The low-pass image $a^{r-1}$ is divided into blocks $Q_{i,j}$, $i, j = 0, \ldots, N/4^r - 1$.

(II) Tetromino Selection. The low-pass coefficients are defined as

$$a^{r,(c)} = \left(a^{r,(c)}[s]\right)_{s=0}^{3}, \qquad a^{r,(c)}[s] = \sum_{(m,n)\in I_s^{(c)}} \epsilon[0, L(m,n)]\, a^{r-1}[m,n], \qquad (1)$$

and then the three high-pass coefficients for $l = 1, 2, 3$ are given by

$$w_l^{r,(c)} = \left(w_l^{r,(c)}[s]\right)_{s=0}^{3}, \qquad w_l^{r,(c)}[s] = \sum_{(m,n)\in I_s^{(c)}} \epsilon[l, L(m,n)]\, a^{r-1}[m,n]. \qquad (2)$$

Thus the optimal covering $c^*$ is

$$c^* = \arg\min_{c} \sum_{l=1}^{3} \left\| w_l^{r,(c)} \right\|_1 = \arg\min_{c} \sum_{l=1}^{3} \sum_{s=0}^{3} \left| w_l^{r,(c)}[s] \right|. \qquad (3)$$

The optimal Tetrolet decomposition in the first phase is $[a^{r,(c^*)}, w_1^{r,(c^*)}, w_2^{r,(c^*)}, w_3^{r,(c^*)}]$.

(III) Rearranging Coefficients. The low-frequency coefficients of each block are rearranged into $2 \times 2$ blocks. Then steps (I) and (II) are repeated for sparse representation.

(IV) Image Reconstruction. The fused image is reconstructed based on the low-pass and high-pass coefficients and the corresponding coverings.
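The covering selection in (3) can be sketched in Python as follows. This is a minimal illustration, not the paper's implementation: the candidate coverings and the toy coefficient arrays are hypothetical, and the enumeration of admissible tetromino coverings of a 4×4 block is assumed to happen elsewhere.

```python
import numpy as np

def choose_covering(highpass_by_covering):
    """Select the optimal tetromino covering c* per Eq. (3): the covering
    whose three high-pass coefficient vectors w_l^{r,(c)} (l = 1, 2, 3)
    have the smallest total l1 norm, i.e. the sparsest detail
    representation of the 4x4 block.

    highpass_by_covering: dict mapping a covering id to a (3, 4) array
    of high-pass coefficients w_l^{r,(c)}[s].
    """
    costs = {c: np.abs(w).sum() for c, w in highpass_by_covering.items()}
    return min(costs, key=costs.get)

# Toy example with two hypothetical coverings of one block; covering 1
# concentrates less energy in the high-pass part, so it is selected.
w_c0 = np.array([[4., 0., 1., 0.], [2., 0., 0., 0.], [1., 0., 0., 0.]])
w_c1 = np.array([[0., 0., 1., 0.], [0., 0., 0., 0.], [1., 0., 0., 0.]])
best = choose_covering({0: w_c0, 1: w_c1})
```

Because the l1 norm rewards sparsity, the chosen covering adapts the Haar-type basis to the local geometry of each block, which is exactly what gives the TT its high sparse degree.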

The flow chart of the TT is shown in Figure 3.

2.2. The Theory of the PCNN. The neuron model of the PCNN is described as follows [20]:

$$\begin{aligned}
F_{ij}(n) &= \exp(-\alpha_F)\, F_{ij}(n-1) + V_F \sum w_{ijkl} Y_{ij}(n-1) + S_{ij}, \\
L_{ij}(n) &= \exp(-\alpha_L)\, L_{ij}(n-1) + V_L \sum M_{ijkl} Y_{ij}(n-1), \\
U_{ij}(n) &= F_{ij}(n)\left(1 + \beta L_{ij}(n)\right), \\
Y_{ij}(n) &= \begin{cases} 1, & \text{if } U_{ij}(n) > T_{ij}(n) \\ 0, & \text{otherwise,} \end{cases} \\
T_{ij}(n) &= \exp(-\alpha_T)\, T_{ij}(n-1) + V_T \sum Y_{ij}(n),
\end{aligned} \qquad (4)$$

where $S_{ij}$ and $F_{ij}(n)$ denote the external input stimulus and the feedback of neuron $N(i,j)$, respectively; $U_{ij}(n)$ and $T_{ij}(n)$ represent the internal activity of the neuron and the dynamic threshold, respectively; and $L_{ij}(n)$ is the linking item. $Y_{ij}(n) \in \{0, 1\}$ denotes the pulse output of $N(i,j)$. $M_{ijkl}$ and $w_{ijkl}$ denote the connection weights between the current neuron and the surrounding neurons. $\beta$ is the linking strength (linking coefficient); $\alpha_F$, $\alpha_L$, and $\alpha_T$ are the attenuation time constants; and $V_F$, $V_L$, and $V_T$ denote the inherent voltage potentials of $F_{ij}(n)$, $L_{ij}(n)$, and $T_{ij}(n)$, respectively.
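To make the dynamics of (4) concrete, here is a minimal Python sketch of the iteration loop on a normalized stimulus image. The 3×3 linking kernel and all parameter values are illustrative assumptions, not settings from the paper; a neuron whose internal activity exceeds its threshold fires, which sharply raises the threshold and silences it for a while.

```python
import numpy as np

def neighbor_sum(Y, kernel):
    """Weighted sum of neighbouring pulses (3x3 kernel, zero-padded)."""
    P = np.pad(Y, 1)
    H, W = Y.shape
    out = np.zeros_like(Y, dtype=float)
    for di in range(3):
        for dj in range(3):
            out += kernel[di, dj] * P[di:di + H, dj:dj + W]
    return out

def pcnn(S, n_iter=10, beta=0.2, aF=0.1, aL=0.3, aT=0.2,
         VF=0.5, VL=0.2, VT=20.0):
    """Classical PCNN of Eq. (4). S is the normalized stimulus image.
    Returns the accumulated firing map (how often each neuron fired).
    Parameter values here are illustrative assumptions."""
    k = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
    F = np.zeros_like(S); L = np.zeros_like(S)
    Y = np.zeros_like(S); T = np.ones_like(S)
    fire = np.zeros_like(S)
    for _ in range(n_iter):
        F = np.exp(-aF) * F + VF * neighbor_sum(Y, k) + S   # feedback input
        L = np.exp(-aL) * L + VL * neighbor_sum(Y, k)       # linking input
        U = F * (1.0 + beta * L)                            # internal activity
        Y = (U > T).astype(float)                           # pulse output
        T = np.exp(-aT) * T + VT * Y                        # dynamic threshold
        fire += Y
    return fire

# A bright isolated neuron fires once its feedback accumulates past T.
fire = pcnn(np.array([[0.9]]), n_iter=5)
```

The coupling terms are what produce the synchronous pulse bursts: a firing neuron raises the feedback and linking inputs of its neighbours, pulling similar-intensity regions into firing together.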



Figure 3: The diagram of the TT.

The complexity limits the application of the PCNN in the field of image fusion. Most of the parameters are difficult to set because of the variability of the source images; they are commonly adjusted through extensive experiments and experience. The PCNN relies on the synchronous pulse distribution phenomenon to give rise to pixel changes. Because the mathematical coupling characteristic of the PCNN itself can overwrite its biological characteristics, an improved PCNN and a parameter-setting basis are used to eliminate the coupling characteristic [21]. We adopt this optimized model for fusing the high-pass coefficients. The improved model is given by

$$\begin{aligned}
F_{ij}(n) &= S_{ij}, \\
U_{ij}(n) &= F_{ij}(n)\left[D + D \sum M_{ijkl} Y_{ij}(n-1)\right], \\
Y_{ij}(n) &= \epsilon\left[U_{ij}(n) - T_{ij}(n)\right], \\
T_{ij}(n) &= \exp(-\alpha_T)\, T_{ij}(n-1) + V_T \sum Y_{ij}(n-1),
\end{aligned} \qquad (5)$$

where $D$ is the normalization parameter that establishes the weak coupling connection. The spatial frequency (SF) is fit for motivating the PCNN directly [6]. It reflects the gradient features of images in the transform domain, which is considered an effective external input stimulus of the PCNN. Let $\psi_{i,j,k,l}$ represent the coefficient located at $(i,j)$ in the $k$th subband at the $l$th decomposition level. The parameters are

$$S_{ij} = \sum_{(i,j)\in[3,3]} \left[\left(\psi_{i,j,k,l} - \psi_{i-1,j,k,l}\right)^2 + \left(\psi_{i,j,k,l} - \psi_{i,j-1,k,l}\right)^2\right], \qquad (6)$$

$$M = \begin{bmatrix} 0.5/D & 1/D & 0.5/D \\ 1/D & 1 & 1/D \\ 0.5/D & 1/D & 0.5/D \end{bmatrix}, \qquad (7)$$

where $S_{ij}$ denotes the spatial frequency of the high-pass domain and $S_{\max}$ is the maximum gray level of the normalized source images. Let $D = 25 S_{\max}$, $V_T = S_{\max}$, and $\alpha_T = 0.0001$.

In addition, if the cross entropy becomes bigger than the previous one during the iterative process, the cyclic process of the PCNN is terminated.
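A Python sketch of (5)–(7), using the stated settings $D = 25 S_{\max}$, $V_T = S_{\max}$, and $\alpha_T = 10^{-4}$. The zero-padded window handling and the fixed iteration count are implementation assumptions; the paper instead stops the iteration via the cross-entropy test described above.

```python
import numpy as np

def window_sum(g):
    """Accumulate g over the 3x3 neighbourhood of each pixel (zero-padded)."""
    H, W = g.shape
    P = np.pad(g, 1)
    out = np.zeros_like(g)
    for di in range(3):
        for dj in range(3):
            out += P[di:di + H, dj:dj + W]
    return out

def sf_map(psi):
    """Eq. (6): spatial frequency of a subband psi, as the sum of squared
    row/column first differences in a sliding 3x3 window."""
    dr = np.zeros_like(psi); dc = np.zeros_like(psi)
    dr[1:, :] = (psi[1:, :] - psi[:-1, :]) ** 2
    dc[:, 1:] = (psi[:, 1:] - psi[:, :-1]) ** 2
    return window_sum(dr + dc)

def improved_pcnn(S, n_iter=10, aT=1e-4):
    """Improved model of Eq. (5) with the kernel M of Eq. (7).
    Returns the accumulated firing times of Eq. (11)."""
    Smax = float(S.max())
    D, VT = 25.0 * Smax, Smax
    M = np.array([[0.5 / D, 1.0 / D, 0.5 / D],
                  [1.0 / D, 1.0,     1.0 / D],
                  [0.5 / D, 1.0 / D, 0.5 / D]])
    H, W = S.shape
    Y = np.zeros_like(S); T = np.ones_like(S); fire = np.zeros_like(S)
    for _ in range(n_iter):
        P = np.pad(Y, 1)
        coupling = np.zeros_like(S)
        for di in range(3):
            for dj in range(3):
                coupling += M[di, dj] * P[di:di + H, dj:dj + W]
        U = S * (D + D * coupling)        # U = F [D + D sum(M Y(n-1))]
        T = np.exp(-aT) * T + VT * Y      # threshold driven by Y(n-1)
        Y = (U >= T).astype(float)        # Y = step(U - T)
        fire += Y                         # Eq. (11): firing-time accumulation
    return fire

# Neurons stimulated by strong coefficients fire; zero stimulus never fires.
fm = improved_pcnn(np.array([[0.0, 1.0], [0.0, 1.0]]))
```

Note how the feedback and linking branches of (4) collapse into a single multiplicative term scaled by $D$, which is what makes the improved model simpler and removes most of the hand-tuned attenuation constants.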

3. The TT-PCNN

No transformation achieves a complete approximation of all image details, owing to the inherent defects of multiscale transforms, yet these details usually contain important features of the targets. We use the Laplacian pyramid to decompose the low-pass band of the source images in the TT domain; the remaining details of the concealed objects usually exist in the top layer of the Laplacian pyramid. Meanwhile, a rule based on the regional average gradient is applied to fuse and enhance the details of the objects, which are sensitive to human vision. In addition, the detailed features of the hidden targets in the PMMW images, limited by various factors such as the imaging style and electronic noise, are important to subsequent recognition. We therefore feed the coefficients of the high-pass band to the PCNN in advance: the enhancement operator of the PCNN is capable of enhancing the details of the hidden targets, which benefits subsequent target recognition. Additionally, we fuse the high-pass coefficients of the visible and PMMW images based on the SF. The TT-PCNN and fusion rules are shown in Figure 4.

Step 1. Decompose the source images into the low-pass and high-pass subbands via the TT. The resulting coefficients are expressed as the high-pass coefficients $T_{HPA}$ and $T_{HPB}$ and the low-pass coefficients $T_{LPA}$ and $T_{LPB}$.

Step 2. $T_{LPA}$ and $T_{LPB}$ are decomposed by the Laplacian pyramid, yielding $L_{PA}$ and $L_{PB}$, respectively. The fusion rule based on the regional gradient is used for fusing the top layers of $L_{PA}$ and $L_{PB}$. Suppose that $\mathrm{LPA}_{\mathrm{top}}(i,j)$ is the value of $L_{PA}$ located



Figure 4: The flow chart of the TT-PCNN.

at the top layer. The regional space centred at $\mathrm{LPA}_{\mathrm{top}}(i,j)$ is $3 \times 3$, and $\mathrm{LPF}_{\mathrm{top}}(i,j)$ is the fused result of $\mathrm{LPA}_{\mathrm{top}}(i,j)$ and $\mathrm{LPB}_{\mathrm{top}}(i,j)$; $\mathrm{LPB}_{\mathrm{top}}(i,j)$ and $G_B(i,j)$ are defined analogously to $\mathrm{LPA}_{\mathrm{top}}(i,j)$ and $G_A(i,j)$. The regional average gradient $G_A(i,j)$ is expressed as

$$G_A(i,j) = \frac{1}{4} \sum_{i=1}^{2} \sum_{j=1}^{2} \sqrt{\Delta I_i^2 + \Delta I_j^2}, \qquad (8)$$

where $\Delta I_i$ and $\Delta I_j$ are the first-order differences of $\mathrm{LPA}_{\mathrm{top}}(i,j)$ in the two directions. The fusion rule for the top layer is described as

$$\mathrm{LPF}_{\mathrm{top}}(i,j) = \begin{cases} \mathrm{LPA}_{\mathrm{top}}(i,j), & G_A(i,j) \ge G_B(i,j) \\ \mathrm{LPB}_{\mathrm{top}}(i,j), & G_A(i,j) < G_B(i,j). \end{cases} \qquad (9)$$

In addition, the rule of choosing the largest absolute value is used to fuse the values of the other layers of $L_{PA}$ and $L_{PB}$.
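The top-layer rule of (8) and (9) can be sketched as follows. The zero-padded 3×3 window and the averaging constant are implementation assumptions (the text describes a 3×3 region, while (8) writes the normalization as 1/4 over a 2×2 sum); the selection logic itself follows (9) exactly.

```python
import numpy as np

def regional_avg_gradient(top):
    """Eq. (8): average of sqrt(dI_i^2 + dI_j^2) over the local region
    around each pixel (3x3 window, zero-padded; normalization constant
    is an implementation assumption)."""
    gi = np.zeros_like(top); gj = np.zeros_like(top)
    gi[1:, :] = top[1:, :] - top[:-1, :]   # vertical first difference
    gj[:, 1:] = top[:, 1:] - top[:, :-1]   # horizontal first difference
    g = np.sqrt(gi ** 2 + gj ** 2)
    H, W = top.shape
    P = np.pad(g, 1)
    out = np.zeros_like(top)
    for di in range(3):
        for dj in range(3):
            out += P[di:di + H, dj:dj + W]
    return out / 9.0

def fuse_top_layer(LPA_top, LPB_top):
    """Eq. (9): keep, pixel by pixel, the top-layer coefficient whose
    regional average gradient is larger (ties go to image A)."""
    GA = regional_avg_gradient(LPA_top)
    GB = regional_avg_gradient(LPB_top)
    return np.where(GA >= GB, LPA_top, LPB_top)

# A flat layer carries no gradient, so the textured layer wins everywhere.
A = np.zeros((4, 4))
B = np.arange(16, dtype=float).reshape(4, 4)
F = fuse_top_layer(A, B)
```

Selecting by regional gradient favours whichever source carries more local edge energy at each position, which is why this rule preserves the contour details that survive in the top layer of the Laplacian pyramid.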

Step 3. Apply the inverse Laplacian pyramid to obtain the fused low-pass result $T_{LPF}$.

Step 4. The enhancement of the target area is based on the improved PCNN. Suppose that $T_{HPB}(i,j)$ represents the coefficients of $T_{HPB}$, and let $F_{ij}(n) = T_{HPB}(i,j)$; the other parameters keep the same settings as in (7).

Step 5. The final fusion of the high-pass coefficients: the SF is obtained from (6) in a sliding $3 \times 3$ window and serves as the input of the improved PCNN. The fusion rule is designed as


$$T_{HPF}(i,j) = \begin{cases} T_{HPA}(i,j), & \mathrm{Fire}_{A(i,j)}(n) \ge \mathrm{Fire}_{B(i,j)}(n) \\ T_{HPB}(i,j), & \mathrm{Fire}_{A(i,j)}(n) < \mathrm{Fire}_{B(i,j)}(n), \end{cases} \qquad (10)$$

where $\mathrm{Fire}_{(i,j)}(n)$ denotes the firing time of each coefficient, which is given by

$$\mathrm{Fire}_{(i,j)}(n) = \mathrm{Fire}_{(i,j)}(n-1) + Y_{ij}(n). \qquad (11)$$

Step 6. Use the selected coefficients to reconstruct the fused image via the inverse TT.
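Given the two accumulated firing maps of (11), the coefficient selection of (10) reduces to a pixel-wise comparison. A minimal sketch with hypothetical values:

```python
import numpy as np

def fuse_highpass(T_HPA, T_HPB, fire_A, fire_B):
    """Eq. (10): at each position keep the high-pass coefficient of the
    source whose neuron has accumulated the larger firing time (Eq. (11));
    ties go to source A."""
    return np.where(fire_A >= fire_B, T_HPA, T_HPB)

# Toy illustration (all values hypothetical):
A = np.array([[1., 2.], [3., 4.]])       # high-pass coefficients, source A
B = np.array([[5., 6.], [7., 8.]])       # high-pass coefficients, source B
fa = np.array([[3., 1.], [2., 2.]])      # firing times for A
fb = np.array([[1., 2.], [2., 5.]])      # firing times for B
fused = fuse_highpass(A, B, fa, fb)      # -> [[1., 6.], [3., 8.]]
```

A coefficient that fires more often under the SF stimulus carries more salient gradient information, so the rule transfers the sharper detail from whichever source provides it.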

4. Experimental Results and Performance Analysis

4.1. Evaluation Criteria. Existing metrics fall into three categories: statistics-based, information-based, and human-visual-system-based classes. Selecting metrics with smaller mutual correlation is beneficial to the objectivity of the evaluation [24]. The statistics-based metrics are easily influenced by the pseudoedges of targets, so we evaluate the fusion performance with information-based and human-visual-system-based metrics. The information-based evaluation indexes mainly comprise the information entropy (IE) and mutual information (MI) [25]. Moreover, $Q^{AB/F}$ is a representative model in the human-vision-based evaluation system, since it has strong correlation with other human-visual-system-based metrics [26]. The formulas are as follows.

IE:

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i) \quad (12)$$

MI:

$$\mathrm{MI} = \sum_{f,a} p_{FA}(f,a) \log \frac{p_{FA}(f,a)}{p_F(f)\, p_A(a)} + \sum_{f,b} p_{FB}(f,b) \log \frac{p_{FB}(f,b)}{p_F(f)\, p_B(b)} \quad (13)$$

$Q^{AB/F}$:

$$Q^{AB/F} = \frac{\sum_{n=1}^{N} \sum_{m=1}^{M} \left( Q^{AF}(n,m)\, w^A(n,m) + Q^{BF}(n,m)\, w^B(n,m) \right)}{\sum_{i=1}^{N} \sum_{j=1}^{M} \left( w^A(i,j) + w^B(i,j) \right)} \quad (14)$$

where 119875(119909119894) is the probability mass function of the inputimages119901119865119883(119886 119887)119901119860(119886) and119901119861(119886) is obtained by simple nor-malization of the joint and marginal histograms of the inputimages 119876119860119865(119899119898) and 119876119861119865(119899119898) are weighted by the coeffi-cients of the edge preservation values119908119860(119899119898) and119908119861(119899119898)reflect the perceptual importance of the corresponding edgeelements IE reflects the amount of average information inthe fused images MI reflects detailed information which

is obtained from source images whereas the metric 119876119860119861119865computes and measures the amount of edge informationtransferred from source images into the fused results Inaddition a larger value of these metrics means a better fusionresults
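The two information-based metrics above can be computed directly from histograms. A minimal sketch of IE (eq. (12)) and of one term of MI (eq. (13)) for 8-bit gray-level images follows; the bin count and value range are assumptions:

```python
import numpy as np

def entropy(img, bins=256):
    """Information entropy, eq. (12): H = -sum p(x) log2 p(x)."""
    p, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = p / p.sum()
    p = p[p > 0]                          # ignore empty bins (0 log 0 := 0)
    return -np.sum(p * np.log2(p))

def mutual_information(src, fused, bins=256):
    """One term of eq. (13): MI between a source image and the fused image,
    from the normalized joint and marginal histograms."""
    joint, _, _ = np.histogram2d(src.ravel(), fused.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of the source image
    py = pxy.sum(axis=0, keepdims=True)   # marginal of the fused image
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))
```

The full MI of eq. (13) is then `mutual_information(a, f) + mutual_information(b, f)` for source images `a`, `b` and fused result `f`.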

The source images, derived from ThermoTrex Corporation, are shown in Figure 5. There are three soldiers with a gun and a grenade displayed in Figure 5(a). Due to its limited penetrability, the visible image does not include the information of the targets under clothing, but it contains rich environmental details about the imaging scene. In contrast, Figure 5(b) is the PMMW image. The bright part of the PMMW image reflects the location and shape information of the concealed objects. The outlines of the gun and grenade are detected by the MMW owing to its penetrability, while the contours of the three soldiers are heavily blurred; it is difficult to recognize the lawn in the PMMW image. In the subsequent sections, we apply different wavelets and fusion rules to prove the effectiveness of the proposed algorithm.

4.2. The First Group of the Fused Results. The first group of the fused results is performed on the PMMW and visible image. Figure 6 illustrates the fusion results obtained by different wavelets. The fusion results achieved by the DWT, CT, NSCT, TT, and TT-PCNN are displayed in Figures 6(a)-6(e). The fusion rule adopted by these wavelets is the same as the description of the TT [15]. As can be seen from Figures 6(a)-6(e), the five methods successfully fuse the PMMW and visible image, and all the fused images contain the concealed-object information and background information. However, the fused result obtained by the DWT has many artifacts due to the lack of shift-invariance; the contour of the gun is slightly blurred by the pseudo-Gibbs phenomena. The CT, NSCT, and TT achieve a better performance than the DWT method. The CT has superior performance in depicting edge details, so the concealed gun retains complete structural features for recognition; however, when the background characteristics of the source images differ significantly, the CT usually decreases the image contrast. Due to the shift-invariance of the NSCT, the pseudo-Gibbs phenomenon is eliminated successfully, but, limited by the fusion rules, the concealed targets have low contrast, which seriously impacts risk identification. Since the TT has a superior capacity to describe smooth regions and local details, its fused result achieves a better effect than the above methods. The proposed method provides the best visual effects: almost all the useful information of the concealed objects is transferred to the fused image, and fewer artifacts are introduced during the fusion process. Table 1 shows the evaluation results of the five methods. The IE of the fused images obtained by the DWT and the TT is bigger than that of the TT-PCNN due to the introduction of invalid information. The MI obtained by the TT-PCNN is the maximum, which illustrates that the fused image extracts more information from the original images. Furthermore, $Q^{AB/F}$ of the TT-PCNN is the maximum, which indicates that the proposed algorithm preserves the detailed


Figure 5: The source images. (a) The visible image; (b) the PMMW image.

Figure 6: The fused results obtained by different wavelets and the TT-PCNN. (a) DWT; (b) CT; (c) NSCT; (d) TT; (e) TT-PCNN.


Table 1: The comparison of the fused results.

Method     IE      MI       $Q^{AB/F}$
DWT        7.1282  7.7468   0.0191
CT         6.3092  8.7047   0.0078
NSCT       5.9518  9.0469   0.0239
TT         6.9637  7.9843   0.0255
TT-PCNN    6.7853  11.5794  0.0377

Figure 7: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN. (a) CT-PCNN; (b) NSCT-PCNN; (c) NSCT; (d) TT-PCNN.

information and extracts more edge information from the source images effectively. The objective evaluation meets the visual observation.

4.3. The Second Group of the Fused Results. The fusion results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN are displayed in Figures 7(a)-7(d). As can be seen from Figures 7(a)-7(d), all of the methods successfully fuse the PMMW and visible images, and all the fused images still contain the concealed-target information. However, the fused image obtained by the CT-PCNN still has low contrast due to the background differences between the source images, which is a common problem of the CT-based methods, while the NSCT-PCNN and NSCT achieve a better performance than the CT-PCNN. The pseudo-Gibbs phenomenon is eliminated owing to the shift-invariance of the NSCT. It is proven that the PCNN is conducive to enhancing the details of interesting targets, so the PCNN is beneficial to the fusion of visible and PMMW images. But the concealed objects and background have low contrast; especially, the information of the grenade is difficult to discriminate. The TT-PCNN provides better visual effects, and the detailed information of the gun and grenade is preserved well. Table 2 shows the evaluation results of the four methods. The IE of the fused image achieved by the TT-PCNN is the second maximum, which means that the fused result contains a lot of information inherited from the source images. The MI and $Q^{AB/F}$ of the fused image obtained by the TT-PCNN gain the largest values. This demonstrates that the proposed algorithm extracts abundant image information from the source images and achieves high contrast.

4.4. The Third Group of the Fused Results. The source images and fused results are displayed in Figure 8. Figures 8(a) and 8(b) are the visible image and PMMW


Figure 8: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN. (a) The visible image; (b) the PMMW image; (c) CT-PCNN; (d) NSCT-PCNN; (e) NSCT; (f) TT-PCNN.


Table 2: The comparison of the fused results.

Method          IE      MI       $Q^{AB/F}$
NSCT-PCNN [6]   6.8243  8.4638   0.0361
CT-PCNN [7]     6.5108  8.2892   0.0088
NSCT [8]        6.7329  8.4112   0.0299
TT-PCNN         6.7853  11.5794  0.0377

Table 3: The comparison of the fused results.

Method          IE      MI      $Q^{AB/F}$
NSCT-PCNN [6]   7.2078  3.7030  0.5382
CT-PCNN [7]     7.6778  5.6082  0.4273
NSCT [8]        7.7781  4.7641  0.5802
TT-PCNN         7.8731  4.7673  0.5811

image. A single 94-GHz radiometer on a scanning 24-inch dish antenna is used to detect the MMW energy of concealed weapons [27]. As can be seen from Figures 8(c)-8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast; however, these two methods still enhance useless information, such as the radiated information of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN receives the maximum IE and $Q^{AB/F}$ compared with the other algorithms, which proves that the fused result of the proposed method contains abundant target information and preserves more object features well.

5. Conclusion

In this paper, an improved PCNN for the fusion of the PMMW and visible image is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contour of concealed targets. Then, a Laplacian pyramid is introduced for the decomposition of the low-pass band after the TT. Next, the SF is applied to motivate the improved PCNN neurons; the flexible multiresolution of the TT is thus associated with the global coupling and pulse synchronization characteristics of the PCNN. Finally, three groups of experiments are conducted for evaluating the fusion performance. The results show that the proposed algorithm has superior performance in fusing the visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund in Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves under Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of Concealed Objects in Passive Millimeter-Wave Images Based on the Gaussian Mixture Model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400-421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of the Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113-121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum, ICCP and HSIC 2013, pp. 319-323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A Novel Image Fusion Algorithm for Visible and PMMW Images based on Clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering, ICCAE 2016, pp. 1-5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508-1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849-853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205-207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74-84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik - International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811-819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation Curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32-42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229-2237, 2009.

[13] J. Krommweh, "Tetrolet transform: A new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364-374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949-4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation, YAC 2017, pp. 854-859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506-1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629-1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138-8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965-3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of Image Fusion Based on Pulse-Coupled Neural Network," Archives of Computational Methods in Engineering, vol. 23, no. 4, pp. 659-671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Acta Electronica Sinica, vol. 40, no. 5, pp. 955-964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28-32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing, ICSP 2012, vol. 2, pp. 903-907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Acta Automatica Sinica, vol. 40, no. 2, pp. 306-315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," Electronics Letters, vol. 38, no. 7, pp. 313-315, 2002.

[26] V. Petrovic and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion, FUSION 2000, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39-50, 2003.




Figure 2: The five kinds of tetrominoes.

for image fusion. For example, Deng and Ma proposed an improved PCNN model and implemented the initial parameters based on the maximum gray value of the normalized source images [21]. Chenhui and Jianying [22] decomposed the visible and PMMW images in the multiband wavelet domain; the fusion rule for the low-frequency coefficients is based on local variance, and the high-frequency coefficients are fused by the adaptive PCNN. Xiong et al. [23] adopted the CT for obtaining the coefficients at different scales. The potential target regions of the PMMW image are conducive to determining the fusion coefficients, which takes advantage of the particular ability of the PMMW image in presenting metal targets. So the fusion coefficients are determined by the features of the PMMW image based on the region-growing algorithm, and the improved PCNN is selected for the fine scale, which enhances the fusion performance for integrating the important information of the visible and PMMW images. However, the result of the region growing is restricted by three major factors, namely, an initial growth point, a growth criterion, and a terminating condition, which directly affect the success rate of the potential target extraction.

In this work, we adopt a generic framework for the fusion of the source images instead of extracting the target region of the PMMW image. Both the TT and the improved PCNN are applied for fusing the visible and PMMW images with different rules. Meanwhile, the PCNN is used to enhance the clarity and contrast of the hidden targets. The rest of this paper is organized as follows. The principles of the TT and the PCNN are illustrated in Section 2. The proposed fusion method is described in Section 3. The results and analysis of experiments are shown in Section 4. Finally, Section 5 concludes the work.

2. The Theory of the TT and PCNN

2.1. The Theory of the TT. The TT possesses a smaller support domain and avoids the Gibbs phenomenon at the edges of images. Five basic structures of the TT are shown in Figure 2.

Suppose a source image is expressed as $a^0 = (a[i,j])_{i,j=0}^{N-1}$, $N = 2^J$ ($J \in \mathbb{N}$). The decomposition process of the TT is as follows.

(I) Primary Decomposition. The low-pass image $a^{r-1}$ is divided into several blocks $Q_{i,j}$, $i,j = 0, \ldots, N/4^r - 1$.

(II) Tetrominoes Selection. The low-pass coefficients are defined as

$$a^{r,(c)} = \left( a^{r,(c)}[s] \right)_{s=0}^{3}, \quad a^{r,(c)}[s] = \sum_{(m,n) \in I_s^{(c)}} \epsilon[0, L(m,n)]\, a^{r-1}[m,n], \quad (1)$$

and then the three high-pass coefficients for $l = 1, 2, 3$ are given by

$$w_l^{r,(c)} = \left( w_l^{r,(c)}[s] \right)_{s=0}^{3}, \quad w_l^{r,(c)}[s] = \sum_{(m,n) \in I_s^{(c)}} \epsilon[l, L(m,n)]\, a^{r-1}[m,n]. \quad (2)$$

Thus the optimal covering $c^*$ is

$$c^* = \arg\min_c \sum_{l=1}^{3} \left\| w_l^{r,(c)} \right\|_1 = \arg\min_c \sum_{l=1}^{3} \sum_{s=0}^{3} \left| w_l^{r,(c)}[s] \right|. \quad (3)$$

An optimal Tetrolet decomposition in the first phase is $\left[ a^{r,(c^*)}, w_1^{r,(c^*)}, w_2^{r,(c^*)}, w_3^{r,(c^*)} \right]$.

(III) Rearranging Coefficients. The low-frequency coefficients of each block are rearranged into $2 \times 2$ blocks. Then steps (I) and (II) are repeated for sparse representation.

(IV) Image Reconstruction. The fused image is reconstructed based on the low-pass and high-pass coefficients and the corresponding coverings.

The flow chart of the TT is shown in Figure 3.
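To make steps (I)-(III) concrete, the sketch below transforms one $4 \times 4$ block with a 4-point Haar-type matrix (playing the role of the $\epsilon[l, L(m,n)]$ values) and selects, per eq. (3), the covering minimising the $l_1$ norm of the high-pass part. The two candidate coverings and the ordering $L(m,n)$ inside each tetromino are our own illustrative assumptions; the actual transform searches all 117 admissible coverings:

```python
import numpy as np

# 4-point Haar-type transform matrix: row 0 yields the low-pass coefficient
# epsilon[0, .], rows 1-3 the high-pass coefficients epsilon[l, .], l = 1..3.
EPS = 0.5 * np.array([[1, 1, 1, 1],
                      [1, 1, -1, -1],
                      [1, -1, 1, -1],
                      [1, -1, -1, 1]], dtype=float)

# Two illustrative coverings of a 4x4 block by four tetrominoes.
SQUARES = [[(0, 0), (0, 1), (1, 0), (1, 1)], [(0, 2), (0, 3), (1, 2), (1, 3)],
           [(2, 0), (2, 1), (3, 0), (3, 1)], [(2, 2), (2, 3), (3, 2), (3, 3)]]
ROWS = [[(r, 0), (r, 1), (r, 2), (r, 3)] for r in range(4)]
COVERINGS = [SQUARES, ROWS]

def transform_block(block, covering):
    """Eqs. (1)-(2): low-pass a[s] and high-pass w_l[s] for each tetromino s."""
    coeffs = np.array([[sum(EPS[l, k] * block[m, n]
                            for k, (m, n) in enumerate(tet))
                        for tet in covering]
                       for l in range(4)])
    return coeffs[0], coeffs[1:]          # a^{r,(c)}, (w_1, w_2, w_3)

def best_covering(block):
    """Eq. (3): index of the covering minimising the l1 norm of the high-pass part."""
    costs = [np.abs(transform_block(block, c)[1]).sum() for c in COVERINGS]
    return int(np.argmin(costs))
```

On a block that is constant along rows, the row covering concentrates all energy in the low-pass part and wins the selection, which is exactly the adaptivity the TT exploits for sparse representation.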

2.2. The Theory of the PCNN. The neuron model of the PCNN is described as follows [20]:

$$\begin{aligned}
F_{ij}(n) &= \exp(-\alpha_F)\, F_{ij}(n-1) + V_F \sum_{k,l} w_{ijkl}\, Y_{kl}(n-1) + S_{ij}, \\
L_{ij}(n) &= \exp(-\alpha_L)\, L_{ij}(n-1) + V_L \sum_{k,l} M_{ijkl}\, Y_{kl}(n-1), \\
U_{ij}(n) &= F_{ij}(n) \left( 1 + \beta L_{ij}(n) \right), \\
Y_{ij}(n) &= \begin{cases} 1, & U_{ij}(n) > T_{ij}(n) \\ 0, & \text{otherwise,} \end{cases} \\
T_{ij}(n) &= \exp(-\alpha_T)\, T_{ij}(n-1) + V_T\, Y_{ij}(n),
\end{aligned} \quad (4)$$

where $S_{ij}$ and $F_{ij}(n)$ denote the external input stimulus and the feedback of neuron $N(i,j)$, respectively; $U_{ij}(n)$ and $T_{ij}(n)$ represent the internal activity of the neuron and the dynamic threshold, respectively; and $L_{ij}(n)$ is the linking item. $Y_{ij}(n) \in \{0, 1\}$ denotes the pulse output of $N(i,j)$. $M_{ijkl}$ and $w_{ijkl}$ denote the relationships between the current neuron and the surrounding neurons. $\beta$ is the linking strength or linking coefficient; $\alpha_F$, $\alpha_L$, and $\alpha_T$ are the attenuation time constants; and $V_F$, $V_L$, and $V_T$ denote the inherent voltage potentials of $F_{ij}(n)$, $L_{ij}(n)$, and $T_{ij}(n)$, respectively.
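The five coupled equations of (4) amount to one per-neuron update. A scalar sketch follows; the weighted neighbour sums of $Y_{kl}(n-1)$ are passed in precomputed, and every parameter value here is an illustrative assumption, not a setting from the paper:

```python
import math

def pcnn_neuron_step(S, F, L, neigh_w, neigh_m, T,
                     alpha_F=0.1, alpha_L=0.3, alpha_T=0.2,
                     V_F=0.5, V_L=0.2, V_T=20.0, beta=0.1):
    """One update of a single PCNN neuron, following eq. (4).
    neigh_w / neigh_m are the precomputed weighted sums of neighbouring
    outputs Y_kl(n-1) under w_ijkl and M_ijkl, respectively."""
    F = math.exp(-alpha_F) * F + V_F * neigh_w + S   # feedback input
    L = math.exp(-alpha_L) * L + V_L * neigh_m       # linking input
    U = F * (1.0 + beta * L)                         # internal activity
    Y = 1.0 if U > T else 0.0                        # pulse output
    T = math.exp(-alpha_T) * T + V_T * Y             # dynamic threshold
    return F, L, U, Y, T
```

Iterating this with a constant stimulus shows the characteristic behaviour: the neuron fires, the threshold jumps by $V_T$, then decays until the neuron fires again.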


Figure 3: The diagram of the TT.

Its complexity limits the application of the PCNN in the field of image fusion. Most of the parameters are difficult to set because the source images change, so they are commonly adjusted through extensive experiments and experience. The PCNN relies on the sync-pulse distribution phenomenon to give rise to pixel change. Because the mathematical coupled characteristic of the PCNN itself overwrites its biological characteristics, the improved PCNN and its parameter-setting basis are used to eliminate the coupled characteristic [21]. We adopt this optimized model for fusing the high-pass coefficients. Thereby, the improved model is given by

$$\begin{aligned}
F_{ij}(n) &= S_{ij}, \\
U_{ij}(n) &= F_{ij}(n) \left[ D + D \sum_{k,l} M_{ijkl}\, Y_{kl}(n-1) \right], \\
Y_{ij}(n) &= \varepsilon \left[ U_{ij}(n) - T_{ij}(n) \right], \\
T_{ij}(n) &= \exp(-\alpha_T)\, T_{ij}(n-1) + V_T\, Y_{ij}(n-1),
\end{aligned} \quad (5)$$

where $D$ is the normalization parameter establishing the weak coupling connection and $\varepsilon[\cdot]$ is the unit step function. The spatial frequency (SF) is fit for motivating the PCNN directly [6]; it reflects the gradient features of images in the transform domain and is considered an effective external input stimulus of the PCNN. Let $\psi_{ij}^{k,l}$ represent the coefficient located at $(i,j)$ in the $k$th subband at the $l$th decomposition level. These parameters are

$$S_{ij} = \sum_{(i,j) \in [3,3]} \left[ \left( \psi_{ij}^{k,l} - \psi_{i-1,j}^{k,l} \right)^2 + \left( \psi_{ij}^{k,l} - \psi_{i,j-1}^{k,l} \right)^2 \right] \quad (6)$$

$$M = \begin{bmatrix} \dfrac{0.5}{D} & \dfrac{1}{D} & \dfrac{0.5}{D} \\[4pt] \dfrac{1}{D} & 1 & \dfrac{1}{D} \\[4pt] \dfrac{0.5}{D} & \dfrac{1}{D} & \dfrac{0.5}{D} \end{bmatrix} \quad (7)$$

where $S_{ij}$ denotes the spatial frequency of the high-pass domain and $S_{\max}$ is the maximum gray value of the normalized source images. Let $D = 2.5\, S_{\max}$, $V_T = S_{\max}$, and $\alpha_T = 0.0001$. In addition, if the cross entropy is bigger than the previous one during the iterative process, the cyclic process of the PCNN is terminated.

3. The TT-PCNN

No transformation achieves a complete approximation of all image details, due to the inherent defects of multiscale transforms, and these details usually contain important features of the targets. We therefore use the Laplacian pyramid to decompose the low-pass band of the source images in the TT domain; the remaining details of the concealed objects usually exist in the top layer of the Laplacian pyramid. Meanwhile, a rule based on the average gradient is applied to fuse and enhance the details of objects that are sensitive to human vision. In addition, the detailed features of the hidden targets in the PMMW images, though limited by various factors such as the imaging style and electronic noise, are important to the subsequent recognition. We adopt the coefficients of the high-pass band as the input of the PCNN in advance; the enhancement operator of the PCNN is capable of enhancing the details of the hidden targets, which is beneficial to the subsequent target recognition. Additionally, we fuse the high-pass coefficients of the visible and PMMW images based on the SF. The TT-PCNN framework and fusion rules are shown in Figure 4.

Step 1. Decompose the source images into the low-pass and high-pass subbands via the TT. The resulting coefficients are expressed as the high-pass coefficients $T_{HPA}$ and $T_{HPB}$ and the low-pass coefficients $T_{LPA}$ and $T_{LPB}$.

Step 2. $T_{LPA}$ and $T_{LPB}$ are decomposed by the Laplacian pyramid, named $L_{PA}$ and $L_{PB}$, respectively. The fusion rule based on the regional gradient is used for fusing the top layers of $L_{PA}$ and $L_{PB}$. Suppose that $\mathrm{LPA}_{\mathrm{top}}(i,j)$ is the value of $L_{PA}$ located


Figure 4: The flow chart of the TT-PCNN.

at the top layer. The regional space centered at $\mathrm{LPA}_{\mathrm{top}}(i,j)$ is $3 \times 3$. $\mathrm{LPF}_{\mathrm{top}}(i,j)$ is the fused result of $\mathrm{LPA}_{\mathrm{top}}(i,j)$ and $\mathrm{LPB}_{\mathrm{top}}(i,j)$; $\mathrm{LPB}_{\mathrm{top}}(i,j)$ and $G_B(i,j)$ have definitions similar to $\mathrm{LPA}_{\mathrm{top}}(i,j)$ and $G_A(i,j)$. The regional average gradient $G_A(i,j)$ is expressed as

$$G_A(i,j) = \frac{1}{4} \sum_{i=1}^{2} \sum_{j=1}^{2} \sqrt{\Delta I_i^2 + \Delta I_j^2} \quad (8)$$

where $\Delta I_i$ and $\Delta I_j$ are the first-order differences of $\mathrm{LPA}_{\mathrm{top}}(i,j)$ in different directions. So the fusion rule of the top layer is described as

$$\mathrm{LPF}_{\mathrm{top}}(i,j) = \begin{cases} \mathrm{LPA}_{\mathrm{top}}(i,j), & G_A(i,j) \ge G_B(i,j) \\ \mathrm{LPB}_{\mathrm{top}}(i,j), & G_A(i,j) < G_B(i,j) \end{cases} \quad (9)$$

In addition the rule of choosing the highest absolute value isdesigned to fuse the value of other layers of 119871PA and 119871PB

Step 3 Inverse Laplacian pyramid and obtain the fusionresult 119879LPFStep 4 The enhancement of targets area is based on theimproved PCNN Suppose that 119879HPB(119894 119895) represents thecoefficients of 119879HPB and let 119865119894119895(119899) = 119879HPB(119894 119895) Meanwhilethe other parameters remain the same settings as (7)

Step 5 This is the final fusion of the high-pass coefficientsThe SF is obtained from (6) in slipping windows 3 times 3which is the input of the improved PCNN The fusion ruleis designed as

6 Mathematical Problems in Engineering

119879HPF (119894 119895)

= 119879HPFA (119894 119895) Fire119860(119894119895) (119899) ge Fire119861(119894119895) (119899)119879HPFB (119894 119895) Fire119860(119894119895) (119899) lt Fire119861(119894119895) (119899)

(10)

where Fire(119894119895)(119899) denotes the firing time of each coefficientwhich is given by

Fire(119894119895) (119899) = Fire(119894119895) (119899 minus 1) + 119884119894119895 (119899) (11)

Step 6 Use the selected coefficients to reconstruct the fusedimage via the inverse TT

4 Experimental Results andPerformance Analysis

41 Evaluation Criteria The existing metrics are classifiedinto three categories statistics-based information-based andhuman-visual-system based classes The selected metricswith smaller correlation are beneficial to the objectivity ofthe evaluation [24] The statistics-based metrics are easilyinfluenced by the pseudoedges of targets so we evaluatethe fusion performance based on information-based andhuman-visual-system based metrics The information-basedevaluation indexes mainly contain information entropy (IE)and mutual information (MI) [25] Moreover 119876119860119861119865 is arepresentative model in the evaluation system based onhuman vision since it has strong correlation with otherhuman-visual-system based metrics [26] These formulas areshown as follows

IE

119867(119883) = 119899sum119894=1

119875 (119909119894) 119868 (119909119894) =119899sum119894=1

119875 (119909119894) log119887119875 (119909119894) (12)

MI:
$$\mathrm{MI} = \sum_{f,a} p_{FA}(f,a) \log \frac{p_{FA}(f,a)}{p_F(f)\, p_A(a)} + \sum_{f,b} p_{FB}(f,b) \log \frac{p_{FB}(f,b)}{p_F(f)\, p_B(b)}. \quad (13)$$

$Q^{AB/F}$:
$$Q^{AB/F} = \frac{\sum_{n=1}^{N} \sum_{m=1}^{M} \left[ Q^{AF}(n,m)\, w^A(n,m) + Q^{BF}(n,m)\, w^B(n,m) \right]}{\sum_{i=1}^{N} \sum_{j=1}^{M} \left( w^A(i,j) + w^B(i,j) \right)}, \quad (14)$$

where $P(x_i)$ is the probability mass function of the input images; $p_{FX}(a,b)$, $p_A(a)$, and $p_B(b)$ are obtained by simple normalization of the joint and marginal histograms of the input images; and $Q^{AF}(n,m)$ and $Q^{BF}(n,m)$ are the edge preservation values, weighted by the coefficients $w^A(n,m)$ and $w^B(n,m)$, which reflect the perceptual importance of the corresponding edge elements. IE reflects the amount of average information in the fused images. MI reflects the detailed information obtained from the source images, whereas the metric $Q^{AB/F}$ measures the amount of edge information transferred from the source images into the fused results. In addition, a larger value of these metrics means a better fusion result.
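As a concrete reading of (12) and (13), the two information-based metrics can be computed from gray-level histograms. This is a sketch assuming 8-bit images and base-2 logarithms; $Q^{AB/F}$ is omitted because it additionally requires Sobel edge strength and orientation maps.

```python
import numpy as np

def information_entropy(img, bins=256):
    """IE (12): H = -sum p(x) log2 p(x) over the gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                                  # 0 log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

def mutual_information(src, fused, bins=256):
    """One term of MI (13), from the normalized joint histogram."""
    joint, _, _ = np.histogram2d(src.ravel(), fused.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)           # marginal of the source
    py = pxy.sum(axis=0, keepdims=True)           # marginal of the fused image
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def mi_metric(a, b, fused):
    """MI (13): information the fused image shares with both sources."""
    return mutual_information(a, fused) + mutual_information(b, fused)
```

For identical images, `mutual_information` reduces to the entropy of the image, which gives a quick sanity check on the implementation.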

The source images, provided by ThermoTrex Corporation, are shown in Figure 5. Three soldiers with a gun and a grenade are displayed in Figure 5(a). Due to the limited penetrability of visible light, information about the targets under clothing is not included in the visible image, but it contains rich environmental details about the imaging scene. In contrast, Figure 5(b) is the PMMW image. The bright part of the PMMW image reflects the location and shape of the concealed objects. The outlines of the gun and the grenade are detected owing to the penetrability of the MMW, while the contours of the three soldiers are heavily blurred, and it is difficult to recognize the lawn in the PMMW image. In the following sections, we acquire results with different wavelets and fusion rules in order to prove the effectiveness of the proposed algorithm.

4.2. The First Group of the Fused Results. The first group of the fused results is performed on the PMMW and visible image. Figure 6 illustrates the fusion results obtained by different wavelets. The fusion results achieved by the DWT, CT, NSCT, TT, and TT-PCNN are displayed in Figures 6(a)–6(e). The fusion rule adopted by these wavelets is the same as the description of the TT [15]. As can be seen from Figures 6(a)–6(e), the five methods successfully fuse the PMMW and visible image, and all the fused images contain the concealed-object information and the background information. However, the fused result obtained by the DWT has many artifacts due to the lack of shift-invariance; the contour of the gun is slightly blurred by the pseudo-Gibbs phenomena. The CT, NSCT, and TT achieve a better performance than the DWT method. The CT has superior performance in depicting edge details, so the concealed gun has complete structural features for recognition; however, if the background characteristics of the source images differ significantly, the CT usually decreases the image contrast. Due to the shift-invariance of the NSCT, the pseudo-Gibbs phenomenon is eliminated successfully, but, limited by the fusion rules, the concealed targets have low contrast, which seriously impacts risk identification. Since the TT has a superior capacity to describe smooth regions and local details, its fused result achieves a better effect than the above methods. The proposed method provides the best visual effects: almost all the useful information of the concealed objects is transferred to the fused image, and fewer artifacts are introduced during the fusion process. Table 1 shows the evaluation results of the five methods. The IE of the fused images obtained by the DWT and the TT is bigger than that of the TT-PCNN due to the introduction of invalid information. The MI obtained by the TT-PCNN is the maximum, which illustrates that the fused image extracts more information from the original images. Furthermore, $Q^{AB/F}$ of the TT-PCNN is the maximum, which indicates that the proposed algorithm preserves the detailed


(a) The visible image (b) The PMMW image

Figure 5: The source images.

(a) DWT (b) CT

(c) NSCT (d) TT

(e) TT-PCNN

Figure 6: The fused results obtained by different wavelets and the TT-PCNN.


Table 1: The comparison of the fused results.

Method type | IE     | MI      | $Q^{AB/F}$
DWT         | 7.1282 | 7.7468  | 0.0191
CT          | 6.3092 | 8.7047  | 0.0078
NSCT        | 5.9518 | 9.0469  | 0.0239
TT          | 6.9637 | 7.9843  | 0.0255
TT-PCNN     | 6.7853 | 11.5794 | 0.0377

(a) CT-PCNN (b) NSCT-PCNN

(c) NSCT (d) TT-PCNN

Figure 7: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN.

information and extracts more edge information from the source images effectively. The objective evaluation agrees with the visual observation.

4.3. The Second Group of the Fused Results. The fusion results of the NSCT-PCNN, CT-PCNN, NSCT, and TT-PCNN are displayed in Figures 7(a)–7(d). As can be seen from Figures 7(a)–7(d), all of the methods successfully fuse the PMMW and visible images, and all the fused images still contain the concealed-target information. However, the fused image obtained by the CT-PCNN still has low contrast due to the background differences between the source images, which is a common problem of CT-based methods, while the NSCT-PCNN and NSCT achieve a better performance than the CT-PCNN. The pseudo-Gibbs phenomenon is eliminated owing to the shift-invariance of the NSCT. It is proven that the PCNN is conducive to enhancing the details of interesting targets, so the PCNN is beneficial to the fusion of visible and PMMW images. But the concealed objects and the background have low contrast; especially the information of the grenade is difficult to discriminate. The TT-PCNN provides better visual effects: the detailed information of the gun and the grenade is preserved well. Table 2 shows the evaluation results of the four methods. The IE of the fused image achieved by the TT-PCNN is the second maximum, which means that the fused result inherits a lot of information from the source images. The MI and $Q^{AB/F}$ of the fused image obtained by the TT-PCNN gain the largest values, which demonstrates that the proposed algorithm extracts abundant image information from the source images and achieves high contrast.

4.4. The Third Group of the Fused Results. The source images and fused results are shown in Figure 8. Figures 8(a) and 8(b) are the visible image and the PMMW


(a) The visible image (b) The PMMW image

(c) CT-PCNN (d) NSCT-PCNN

(e) NSCT (f) TT-PCNN

Figure 8: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN.


Table 2: The comparison of the fused results.

Method type   | IE     | MI      | $Q^{AB/F}$
NSCT-PCNN [6] | 6.8243 | 8.4638  | 0.0361
CT-PCNN [7]   | 6.5108 | 8.2892  | 0.0088
NSCT [8]      | 6.7329 | 8.4112  | 0.0299
TT-PCNN       | 6.7853 | 11.5794 | 0.0377

Table 3: The comparison of the fused results.

Method type   | IE     | MI     | $Q^{AB/F}$
NSCT-PCNN [6] | 7.2078 | 3.7030 | 0.5382
CT-PCNN [7]   | 7.6778 | 5.6082 | 0.4273
NSCT [8]      | 7.7781 | 4.7641 | 0.5802
TT-PCNN       | 7.8731 | 4.7673 | 0.5811

image. A single 94-GHz radiometer on a scanning 24-inch dish antenna is used to detect the MMW energy of concealed weapons [27]. As can be seen from Figures 8(c)–8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast; however, these two methods still enhance useless information, such as the radiated information of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN receives the maximum IE and $Q^{AB/F}$ compared with the other algorithms, which proves that the fused result of the proposed method contains abundant target information and preserves more object features well.

5. Conclusion

In this paper, an improved PCNN for the fusion of the PMMW and visible image is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contour of the concealed targets. Then a Laplacian pyramid is introduced for the decomposition of the low-pass band after the TT. Next, the SF is applied to motivate the improved PCNN neurons; the flexible multiresolution of the TT is thus associated with the global coupling and pulse synchronization characteristics of the PCNN. Finally, three groups of experiments are conducted to evaluate the fusion performance. The results show that the proposed algorithm has superior performance in fusing the visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund in Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet, and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of Concealed Objects in Passive Millimeter-Wave Images Based on the Gaussian Mixture Model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400–421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of the Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113–121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum, ICCP and HSIC 2013, pp. 319–323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A Novel Image Fusion Algorithm for Visible and PMMW Images based on Clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering, ICCAE 2016, pp. 1–5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508–1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849–853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205–207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74–84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik - International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811–819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation Curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32–42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229–2237, 2009.

[13] J. Krommweh, "Tetrolet transform: A new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364–374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949–4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation, YAC 2017, pp. 854–859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Guang Pu Xue Yu Guang Pu Fen Xi/Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506–1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Guangdianzi Jiguang/Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629–1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138–8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965–3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of Image Fusion Based on Pulse-Coupled Neural Network," Archives of Computational Methods in Engineering: State-of-the-Art Reviews, vol. 23, no. 4, pp. 659–671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Tien Tzu Hsueh Pao/Acta Electronica Sinica, vol. 40, no. 5, pp. 955–964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28–32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing, ICSP 2012, vol. 2, pp. 903–907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Zidonghua Xuebao/Acta Automatica Sinica, vol. 40, no. 2, pp. 306–315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," IEEE Electronics Letters, vol. 38, no. 7, pp. 313–315, 2002.

[26] V. Petrović and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion, FUSION 2000, pp. WeC3/14–WeC3/19, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39–50, 2003.




Step 3 Inverse Laplacian pyramid and obtain the fusionresult 119879LPFStep 4 The enhancement of targets area is based on theimproved PCNN Suppose that 119879HPB(119894 119895) represents thecoefficients of 119879HPB and let 119865119894119895(119899) = 119879HPB(119894 119895) Meanwhilethe other parameters remain the same settings as (7)

Step 5 This is the final fusion of the high-pass coefficientsThe SF is obtained from (6) in slipping windows 3 times 3which is the input of the improved PCNN The fusion ruleis designed as

6 Mathematical Problems in Engineering

119879HPF (119894 119895)

= 119879HPFA (119894 119895) Fire119860(119894119895) (119899) ge Fire119861(119894119895) (119899)119879HPFB (119894 119895) Fire119860(119894119895) (119899) lt Fire119861(119894119895) (119899)

(10)

where Fire(119894119895)(119899) denotes the firing time of each coefficientwhich is given by

Fire(119894119895) (119899) = Fire(119894119895) (119899 minus 1) + 119884119894119895 (119899) (11)

Step 6 Use the selected coefficients to reconstruct the fusedimage via the inverse TT

4 Experimental Results andPerformance Analysis

41 Evaluation Criteria The existing metrics are classifiedinto three categories statistics-based information-based andhuman-visual-system based classes The selected metricswith smaller correlation are beneficial to the objectivity ofthe evaluation [24] The statistics-based metrics are easilyinfluenced by the pseudoedges of targets so we evaluatethe fusion performance based on information-based andhuman-visual-system based metrics The information-basedevaluation indexes mainly contain information entropy (IE)and mutual information (MI) [25] Moreover 119876119860119861119865 is arepresentative model in the evaluation system based onhuman vision since it has strong correlation with otherhuman-visual-system based metrics [26] These formulas areshown as follows

IE

119867(119883) = 119899sum119894=1

119875 (119909119894) 119868 (119909119894) =119899sum119894=1

119875 (119909119894) log119887119875 (119909119894) (12)

MI

MI = sum119891119886

119901119865119860 (119891 119886) log 119901119865119860 (119891 119886)119901119865 (119891) 119901119860 (119886)

+sum119891119887

119901119865119861 (119891 119887) log 119901119865119861 (119891 119887)119901119865 (119891) 119901119861 (119887)

(13)

119876119860119861119865119876119860119861119865

= sum119873119899=1sum119872119898=1 119876119860119865 (119899119898)119908119860 (119899119898) + 119876119861119865 (119899119898)119908119861 (119899119898)sum119873119894=1sum119872119895=1 (119908119860 (119894 119895) + 119908119861 (119894 119895)) (14)

where 119875(119909119894) is the probability mass function of the inputimages119901119865119883(119886 119887)119901119860(119886) and119901119861(119886) is obtained by simple nor-malization of the joint and marginal histograms of the inputimages 119876119860119865(119899119898) and 119876119861119865(119899119898) are weighted by the coeffi-cients of the edge preservation values119908119860(119899119898) and119908119861(119899119898)reflect the perceptual importance of the corresponding edgeelements IE reflects the amount of average information inthe fused images MI reflects detailed information which

is obtained from source images whereas the metric 119876119860119861119865computes and measures the amount of edge informationtransferred from source images into the fused results Inaddition a larger value of these metrics means a better fusionresults

The source images derived from ThermoTrex Corpora-tion are shown in Figure 5 There are three soldiers with gunand grenade displayed in Figure 5(a) Due to the limitationof penetrability the information of targets under clothingis not included in the visible image But it contains richenvironmental details about imaging scene In contrastFigure 5(b) is the PMMW image The bright part of thePMMW image reflects the location and shape informationof the concealed objects The outline of the gun and grenadeis detected by the MMW owing to its penetrability and thecontour of three soldiers is heavily blurred It is difficult torecognize lawn from the PMMW image We use differentwavelets and fusion rules for acquiring the results in thesubsequent section in order to prove the effectiveness of theproposed algorithm

42The First Group of the Fused Results Thefirst group of thefused results is performed on the PMMW and visible imageFigure 6 illustrates the source images and fusion resultsobtained by different wavelet The fusion results achieved bythe DWT CT NSCT TT and TT-PCNN are displayed inFigures 6(a)ndash6(e) The fusion rule adopted by these waveletsis the same as the description of the TT [15] As can beseen from Figures 6(a)ndash6(e) the five methods successfullyfuse the PMMW and visible image and all the fused imagescontain the concealed objects information and backgroundinformation However it can be found that the fused resultobtained by the DWT has many artifacts due to the lack ofshift-invariance The contour of the gun is a little blurredcaused by the pseudo-Gibbs phenomenaThe CT NSCT andTT achieve a better performance than the DWT methodThe CT has superior performance of depicting the edgedetails So the concealed gun has complete structural featuresfor recognition If the background characteristics of sourceimages have significant differences the CT usually leads tothe decrease of image contrast Due to the shift-invariantof NSCT the pseudo-Gibbs phenomenon is eliminated suc-cessfully Limited by the fusion rules the concealed targetshave low contrast which produces serious impact on riskidentification Since the TT has superior capacity to describesmooth region and local details the fused result achievesbetter effect than the above methods The proposed methodprovides best visual effects Almost all the useful informationof concealed objects is transferred to the fused image andfewer artifacts are introduced during the fusion processTable 1 shows the evaluation results of the five methodsThe IE of the fused image obtained by the DWT and theTT is bigger than the TT-PCNN due to the introductionof invalid information The MI obtained by the TT-PCNNacquires the maximum It illustrates the fact that the fusedimage extracts more information from the original imagesFurthermore 
119876119860119861119865 of the TT-PCNN is maximum whichindicates that the proposed algorithm preserves the detailed

Mathematical Problems in Engineering 7

(a) The visible image (b) The PMMW image

Figure 5 The source images

(a) DWT (b) CT

(c) NSCT (d) TT

(e) TT-PCNN

Figure 6 The fused results obtained by different wavelets and the TT-PCNN

8 Mathematical Problems in Engineering

Table 1 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

DWT 71282 77468 00191CT 63092 87047 00078NSCT 59518 90469 00239TT 69637 79843 00255TT-PCNN 67853 115794 00377

(a) CT-PCNN (b) NSCT-PCNN

(c) NSCT (d) TT-PCNN

Figure 7 The fused results of the CT-PCNN NSCT-PCNN NSCT and TT-PCNN

information and extracts more edge information from sourceimages effectively The objective evaluation meets the visualobservation

43 The Second Group of the Fused Results The fusionresults of the NSCT-PCNN CT-PCNN NSCT and TT-PCNN are displayed in Figures 7(a)ndash7(d) As can be seenfrom Figures 7(a)ndash7(d) all of the methods successfully fusethe PMMW and visible images All the fused images stillcontain concealed targets information However the fusedimage obtained by the CT-PCNN still has low contrast dueto the background differences between source images whichis a common problem of the CT based methods While theNSCT-PCNN and NSCT achieve a better performance thanthe CT-PCNNThepseudo-Gibbs phenomenon is eliminatedowing to the shift-invariant of NSCT It is proven that thePCNN is conducive to enhance the details of interesting

targets So the PCNN is beneficial to the fusion of visible andPMMW images But the concealed objects and backgroundhave low contrast Especially the information of the grenadeis difficult to discrimination The TT-PCNN provides bettervisual effects The detailed information of gun and grenadeis preserved well Table 2 shows the evaluation results of thefour methods The IE of the fused image achieved by the TT-PCNN is the second maximum This means that the fusedresult contains a lot of information inherited from sourceimages MI and 119876119860119861119865 of the fused image obtained by theTT-PCNN gain the largest value This demonstrates that theproposed algorithm extracts abundant image informationfrom source images and achieves high contrast

4.4. The Third Group of the Fused Results. As shown in Figure 8, the source images and fused results are displayed. Figures 8(a) and 8(b) are the visible image and the PMMW


Figure 8: The source images and the fused results: (a) the visible image; (b) the PMMW image; (c) CT-PCNN; (d) NSCT-PCNN; (e) NSCT; (f) TT-PCNN.


Table 2: The comparison of the fused results.

Method type        IE       MI        Q^(AB/F)
NSCT-PCNN [6]      6.8243    8.4638   0.0361
CT-PCNN [7]        6.5108    8.2892   0.0088
NSCT [8]           6.7329    8.4112   0.0299
TT-PCNN            6.7853   11.5794   0.0377

Table 3: The comparison of the fused results.

Method type        IE       MI        Q^(AB/F)
NSCT-PCNN [6]      7.2078   3.7030    0.5382
CT-PCNN [7]        7.6778   5.6082    0.4273
NSCT [8]           7.7781   4.7641    0.5802
TT-PCNN            7.8731   4.7673    0.5811

image. A single 94-GHz radiometer on a scanning 24-inch dish antenna is used to detect the MMW energy of concealed weapons [27]. As can be seen from Figures 8(c)–8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast. However, these two methods still enhance useless information, such as the radiated information of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN obtains the maximum IE and Q^(AB/F) among the compared algorithms, which proves that its fused result contains abundant target information and preserves the object features well.

5. Conclusion

In this paper, an improved PCNN for the fusion of PMMW and visible images is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contours of concealed targets. Then a Laplacian pyramid is introduced for the decomposition of the low-pass band after the TT. Next, the spatial frequency (SF) is applied to motivate the improved PCNN neurons, so that the flexible multiresolution of the TT is combined with the global coupling and pulse-synchronization characteristics of the PCNN. Finally, three groups of experiments are conducted to evaluate the fusion performance. The results show that the proposed algorithm has superior performance in fusing visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.
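To make the SF-motivated firing-time selection concrete, the toy sketch below computes the spatial frequency of a small window and counts the firings of a single PCNN-style neuron with a decaying threshold. The parameter values (beta, alpha, vt, the iteration count) and the single-neuron self-linking simplification are illustrative assumptions; the paper's improved PCNN couples neighboring neurons, so this is only a sketch of the principle that a stronger stimulus fires more often.

```python
import math

def spatial_frequency(win):
    """SF of a window: sqrt(row-frequency^2 + column-frequency^2)."""
    rows, cols = len(win), len(win[0])
    n = rows * cols
    rf = sum((win[i][j] - win[i][j - 1]) ** 2
             for i in range(rows) for j in range(1, cols)) / n
    cf = sum((win[i][j] - win[i - 1][j]) ** 2
             for i in range(1, rows) for j in range(cols)) / n
    return math.sqrt(rf + cf)

def fire_times(stimulus, n_iter=30, beta=0.2, alpha=0.1, vt=5.0):
    """Count firings of a single PCNN-style neuron driven by a constant stimulus."""
    theta, link, fires = 1.0, 0.0, 0
    for _ in range(n_iter):
        u = stimulus * (1 + beta * link)           # internal activity
        y = 1 if u > theta else 0                  # fire when activity exceeds threshold
        fires += y
        theta = math.exp(-alpha) * theta + vt * y  # exponential decay, raised after a fire
        link = y                                   # toy linking: neuron's own last output
    return fires

def select(coef_a, coef_b, sf_a, sf_b):
    """Firing-time fusion rule: keep the coefficient whose neuron fired more often."""
    return coef_a if fire_times(sf_a) >= fire_times(sf_b) else coef_b
```

A coefficient whose window has higher spatial frequency produces more firings within the fixed iteration budget, so `select` prefers the sharper, more detailed source at that position.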

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund of Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves under Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of Concealed Objects in Passive Millimeter-Wave Images Based on the Gaussian Mixture Model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400–421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of the Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113–121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum, ICCP and HSIC 2013, pp. 319–323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A Novel Image Fusion Algorithm for Visible and PMMW Images based on Clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering, ICCAE 2016, pp. 1–5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508–1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849–853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205–207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74–84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik – International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811–819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation Curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32–42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229–2237, 2009.

[13] J. Krommweh, "Tetrolet transform: A new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364–374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949–4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation, YAC 2017, pp. 854–859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Guang Pu Xue Yu Guang Pu Fen Xi/Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506–1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Guangdianzi Jiguang/Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629–1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138–8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965–3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of Image Fusion Based on Pulse-Coupled Neural Network," Archives of Computational Methods in Engineering: State-of-the-Art Reviews, vol. 23, no. 4, pp. 659–671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Tien Tzu Hsueh Pao/Acta Electronica Sinica, vol. 40, no. 5, pp. 955–964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28–32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing, ICSP 2012, vol. 2, pp. 903–907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Zidonghua Xuebao/Acta Automatica Sinica, vol. 40, no. 2, pp. 306–315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," IEEE Electronics Letters, vol. 38, no. 7, pp. 313–315, 2002.

[26] V. Petrović and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion, FUSION 2000, pp. WeC3.14–WeC3.19, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39–50, 2003.


Page 5: A Visible and Passive Millimeter Wave Image Fusion Algorithm …downloads.hindawi.com/journals/mpe/2018/4205308.pdf · 2018-11-11 · LPA and LPB. Step 2. LPA and LPB aredecomposedbytheLaplacian

Mathematical Problems in Engineering 5

Visible imageA PMMW image B

Tetrolet transform

Top layer image Other layer images

Fusion based on regionalaverage gradient

Fusion based on maximumabsolute value

Fusion and inverse pyramidtransform

Improved PCNN

Spatial frequency

Fusion based on firingtime

Inverse Tetrolet transform

Enhancement based onimproved PCNN

Output fusion result

Improved PCNN

Spatial frequency

Low-pass coefficientsT0 T0

High-pass coefficientsT(0

High-pass coefficientsT(0

Laplacian pyramid

Fusion result T0 Fusion result T(0

decomposition L0A and L0B

Figure 4 The flow chart of the TT-PCNN

at top layer The regional space at the center of LPAtop(119894 119895)is 3 times 3 LPFtop(119894 119895) is the fused result of LPAtop(119894 119895)and LPBtop(119894 119895) LPBtop(119894 119895) and 119866119861(119894 119895) have the similardefinition as LPAtop(119894 119895) and 119866119860(119894 119895) The regional averagegradient 119866119860(119894 119895) is expressed as

119866119860 (119894 119895) = 142sum119894=1

2sum119895=1

radicΔ1198681198942 + Δ1198681198952 (8)

where Δ119868119894 and Δ119868119895 are the first-order difference of LPAtop(119894 119895)in different directions So the fusion rule of top layer isdescribed as

LPFtop (119894 119895) = LPAtop (119894 119895) 119866119860 (119894 119895) ge 119866119861 (119894 119895)LPBtop (119894 119895) 119866119860 (119894 119895) lt 119866119861 (119894 119895)

(9)

In addition the rule of choosing the highest absolute value isdesigned to fuse the value of other layers of 119871PA and 119871PB

Step 3 Inverse Laplacian pyramid and obtain the fusionresult 119879LPFStep 4 The enhancement of targets area is based on theimproved PCNN Suppose that 119879HPB(119894 119895) represents thecoefficients of 119879HPB and let 119865119894119895(119899) = 119879HPB(119894 119895) Meanwhilethe other parameters remain the same settings as (7)

Step 5 This is the final fusion of the high-pass coefficientsThe SF is obtained from (6) in slipping windows 3 times 3which is the input of the improved PCNN The fusion ruleis designed as

6 Mathematical Problems in Engineering

119879HPF (119894 119895)

= 119879HPFA (119894 119895) Fire119860(119894119895) (119899) ge Fire119861(119894119895) (119899)119879HPFB (119894 119895) Fire119860(119894119895) (119899) lt Fire119861(119894119895) (119899)

(10)

where Fire(119894119895)(119899) denotes the firing time of each coefficientwhich is given by

Fire(119894119895) (119899) = Fire(119894119895) (119899 minus 1) + 119884119894119895 (119899) (11)

Step 6 Use the selected coefficients to reconstruct the fusedimage via the inverse TT

4 Experimental Results andPerformance Analysis

41 Evaluation Criteria The existing metrics are classifiedinto three categories statistics-based information-based andhuman-visual-system based classes The selected metricswith smaller correlation are beneficial to the objectivity ofthe evaluation [24] The statistics-based metrics are easilyinfluenced by the pseudoedges of targets so we evaluatethe fusion performance based on information-based andhuman-visual-system based metrics The information-basedevaluation indexes mainly contain information entropy (IE)and mutual information (MI) [25] Moreover 119876119860119861119865 is arepresentative model in the evaluation system based onhuman vision since it has strong correlation with otherhuman-visual-system based metrics [26] These formulas areshown as follows

IE

119867(119883) = 119899sum119894=1

119875 (119909119894) 119868 (119909119894) =119899sum119894=1

119875 (119909119894) log119887119875 (119909119894) (12)

MI

MI = sum119891119886

119901119865119860 (119891 119886) log 119901119865119860 (119891 119886)119901119865 (119891) 119901119860 (119886)

+sum119891119887

119901119865119861 (119891 119887) log 119901119865119861 (119891 119887)119901119865 (119891) 119901119861 (119887)

(13)

119876119860119861119865119876119860119861119865

= sum119873119899=1sum119872119898=1 119876119860119865 (119899119898)119908119860 (119899119898) + 119876119861119865 (119899119898)119908119861 (119899119898)sum119873119894=1sum119872119895=1 (119908119860 (119894 119895) + 119908119861 (119894 119895)) (14)

where 119875(119909119894) is the probability mass function of the inputimages119901119865119883(119886 119887)119901119860(119886) and119901119861(119886) is obtained by simple nor-malization of the joint and marginal histograms of the inputimages 119876119860119865(119899119898) and 119876119861119865(119899119898) are weighted by the coeffi-cients of the edge preservation values119908119860(119899119898) and119908119861(119899119898)reflect the perceptual importance of the corresponding edgeelements IE reflects the amount of average information inthe fused images MI reflects detailed information which

is obtained from source images whereas the metric 119876119860119861119865computes and measures the amount of edge informationtransferred from source images into the fused results Inaddition a larger value of these metrics means a better fusionresults

The source images derived from ThermoTrex Corpora-tion are shown in Figure 5 There are three soldiers with gunand grenade displayed in Figure 5(a) Due to the limitationof penetrability the information of targets under clothingis not included in the visible image But it contains richenvironmental details about imaging scene In contrastFigure 5(b) is the PMMW image The bright part of thePMMW image reflects the location and shape informationof the concealed objects The outline of the gun and grenadeis detected by the MMW owing to its penetrability and thecontour of three soldiers is heavily blurred It is difficult torecognize lawn from the PMMW image We use differentwavelets and fusion rules for acquiring the results in thesubsequent section in order to prove the effectiveness of theproposed algorithm

42The First Group of the Fused Results Thefirst group of thefused results is performed on the PMMW and visible imageFigure 6 illustrates the source images and fusion resultsobtained by different wavelet The fusion results achieved bythe DWT CT NSCT TT and TT-PCNN are displayed inFigures 6(a)ndash6(e) The fusion rule adopted by these waveletsis the same as the description of the TT [15] As can beseen from Figures 6(a)ndash6(e) the five methods successfullyfuse the PMMW and visible image and all the fused imagescontain the concealed objects information and backgroundinformation However it can be found that the fused resultobtained by the DWT has many artifacts due to the lack ofshift-invariance The contour of the gun is a little blurredcaused by the pseudo-Gibbs phenomenaThe CT NSCT andTT achieve a better performance than the DWT methodThe CT has superior performance of depicting the edgedetails So the concealed gun has complete structural featuresfor recognition If the background characteristics of sourceimages have significant differences the CT usually leads tothe decrease of image contrast Due to the shift-invariantof NSCT the pseudo-Gibbs phenomenon is eliminated suc-cessfully Limited by the fusion rules the concealed targetshave low contrast which produces serious impact on riskidentification Since the TT has superior capacity to describesmooth region and local details the fused result achievesbetter effect than the above methods The proposed methodprovides best visual effects Almost all the useful informationof concealed objects is transferred to the fused image andfewer artifacts are introduced during the fusion processTable 1 shows the evaluation results of the five methodsThe IE of the fused image obtained by the DWT and theTT is bigger than the TT-PCNN due to the introductionof invalid information The MI obtained by the TT-PCNNacquires the maximum It illustrates the fact that the fusedimage extracts more information from the original imagesFurthermore 
119876119860119861119865 of the TT-PCNN is maximum whichindicates that the proposed algorithm preserves the detailed

Mathematical Problems in Engineering 7

(a) The visible image (b) The PMMW image

Figure 5 The source images

(a) DWT (b) CT

(c) NSCT (d) TT

(e) TT-PCNN

Figure 6 The fused results obtained by different wavelets and the TT-PCNN

8 Mathematical Problems in Engineering

Table 1 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

DWT 71282 77468 00191CT 63092 87047 00078NSCT 59518 90469 00239TT 69637 79843 00255TT-PCNN 67853 115794 00377

(a) CT-PCNN (b) NSCT-PCNN

(c) NSCT (d) TT-PCNN

Figure 7 The fused results of the CT-PCNN NSCT-PCNN NSCT and TT-PCNN

information and extracts more edge information from sourceimages effectively The objective evaluation meets the visualobservation

43 The Second Group of the Fused Results The fusionresults of the NSCT-PCNN CT-PCNN NSCT and TT-PCNN are displayed in Figures 7(a)ndash7(d) As can be seenfrom Figures 7(a)ndash7(d) all of the methods successfully fusethe PMMW and visible images All the fused images stillcontain concealed targets information However the fusedimage obtained by the CT-PCNN still has low contrast dueto the background differences between source images whichis a common problem of the CT based methods While theNSCT-PCNN and NSCT achieve a better performance thanthe CT-PCNNThepseudo-Gibbs phenomenon is eliminatedowing to the shift-invariant of NSCT It is proven that thePCNN is conducive to enhance the details of interesting

targets So the PCNN is beneficial to the fusion of visible andPMMW images But the concealed objects and backgroundhave low contrast Especially the information of the grenadeis difficult to discrimination The TT-PCNN provides bettervisual effects The detailed information of gun and grenadeis preserved well Table 2 shows the evaluation results of thefour methods The IE of the fused image achieved by the TT-PCNN is the second maximum This means that the fusedresult contains a lot of information inherited from sourceimages MI and 119876119860119861119865 of the fused image obtained by theTT-PCNN gain the largest value This demonstrates that theproposed algorithm extracts abundant image informationfrom source images and achieves high contrast

44 The Third Group of the Fused Results As shown inFigure 8 the source images and fused results are displayedwell Figures 8(a) and 8(b) are the visible image and PMMW

Mathematical Problems in Engineering 9

(a) The visible image (b) The PMMW image

(c) CT-PCNN (d) NSCT-PCNN

(e) NSCT (f) TT-PCNN

Figure 8 The fused results of the CT-PCNN NSCT-PCNN NSCT and TT-PCNN

10 Mathematical Problems in Engineering

Table 2 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

NSCT-PCNN [6] 68243 84638 00361CT-PCNN [7] 65108 82892 00088NSCT [8] 67329 84112 00299TT-PCNN 67853 115794 00377

Table 3 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

NSCT-PCNN [6] 72078 37030 05382CT-PCNN [7] 76778 56082 04273NSCT [8] 77781 47641 05802TT-PCNN 78731 47673 05811

image A single 94-Ghz radiometer on a scanning 24 in dishantenna is used to detect the MMW energy of concealedweapons [27] As can be seen from Figures 8(c)ndash8(f) all ofthe methods successfully synthesize the targets informationand the background information But the contrast of thefused image based on the CT-PCNN is relatively low TheNSCT and NSCT-PCNN methods improve the fusion effectand achieve high contrast However these two methods stillenhanced useless information such as the radiated infor-mation of the dress zipper The TT-PCNN synthesizes thePMMW and visible images highlights the information ofconcealed weapons and suppresses the invalid informationThe objective evaluation of the fused results is listed inTable 3 The TT-PCNN receives the maximum comparedwith other algorithms It proves that the fused result of theproposed method contains abundant target information andpreserves more object features well

5 Conclusion

In this paper an improved PCNN for the fusion of thePMMWand visible image is proposed in the Tetrolet domainThe improved PCNNmodel ismore simple and adaptive withfewer parameters We firstly adopted the improved PCNN tostrengthen the high-pass coefficients of the PMMW image inorder to enhance the contour of concealed targets And thena Laplacian pyramid is introduced for the decomposition oflow-pass band after the TTNext the SF is applied tomotivatethe improved PCNN neurons The flexible multiresolutionof the TT is associated with global coupling and pulsesynchronization characteristics of the PCNN Finally thefour groups of experiments are conducted for evaluatingthe fusion performance The results show that the proposedalgorithm has superior performance of fusing the visibleand PMMW images The fused results have high contrastremarkable target information and rich information ofbackground The proposed method is suitable for fusing theinfrared and visible image which is superior to the other

fusion algorithms in terms of visual quality and quantitativeevaluation

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this article

Acknowledgments

This work is supported by the Postdoctoral fund inJiangsu province under Grant no 1302027C the NaturalScience Foundation of Jiangsu Province under Grant no15KJB510008 and the State Key Laboratory of MillimeterWaves Project no K201714The support of Image ProcessingLab of Jiang Su University of Science and Technology isacknowledged Thanks are due to Dr Qu for publishingrelated program on the Internet andDr Larry DrMerit andDr Philip who collected a large number of PMMW images

References

[1] WYuXChen andLWu ldquoSegmentation ofConcealedObjectsin Passive Millimeter-Wave Images Based on the GaussianMixture Modelrdquo Journal of Infrared Millimeter and TerahertzWaves vol 36 no 4 pp 400ndash421 2015

[2] S Dill M Peichl and H Su ldquoStudy of passiveMMWpersonnelimaging with respect to suspicious and common concealedobjects for security applicationsrdquo in Proceedings of the Millime-tre Wave and Terahertz Sensors and Technology UK September2008

[3] W W Kong Y J Lei Y Lei and S Lu ldquoImage fusion techniquebased on non-subsampled contourlet transform and adaptiveunit-fast-linking pulse-coupled neural networkrdquo IET ImageProcessing vol 5 no 2 pp 113ndash121 2011

[4] X Song L Li and J Yang ldquoImage fusion algorithm for visibleand PMMW images based on em and Ncutrdquo in Proceedingsof the 2013 Joint Conference of International Conference onComputational Problem-Solving and International High Speed

Mathematical Problems in Engineering 11

Intelligent Communication Forum ICCP and HSIC 2013 pp319ndash323 China October 2013

[5] J Xiong W Xie J Yang Y Fu K Hu and Z Zhong ldquoANovel Image Fusion Algorithm for Visible and PMMW Imagesbased on Clustering and NSCTrdquo in Proceedings of the 20168th International Conference on Computer and AutomationEngineering ICCAE 2016 pp 1ndash5 Australia March 2016

[6] X B Qu J W Yan H Z Xiao and Z Zhu ldquoImage fusionalgorithm based on spatial frequency-motivated pulse cou-pled neural networks in nonsubsampled contourlet transformdomainrdquo Acta Automatica Sinica vol 34 no 12 pp 1508ndash15142008

[7] J Zhao and S Qu ldquoA better algorithm for fusion of infrared andvisible image based on curvelet transform and adaptive pulsecoupled neural networks (PCNN)rdquo Journal of NorthwesternPolytechnical University vol 29 no 6 pp 849ndash853 2011

[8] S-L Zhou T Zhang D-J Kuai J Zheng and Z-Y ZhouldquoNonsubsampled contourlet image fusion algorithm based ondirectional regionrdquo Laser amp Infrared vol 43 no 2 pp 205ndash2072013

[9] S Li B Yang and J Hu ldquoPerformance comparison of differentmulti-resolution transforms for image fusionrdquo InformationFusion vol 12 no 2 pp 74ndash84 2011

[10] Q Guo and S Liu ldquoPerformance analysis of multi-spectral andpanchromatic image fusion techniques based on two waveletdiscrete approachesrdquo Optik - International Journal for Light andElectron Optics vol 122 no 9 pp 811ndash819 2011

[11] W Wang F Chang T Ji and G Zhang ldquoFusion of multi-focus images based on the 2-generation Curvelet transformrdquoInternational Journal of Digital Content Technology and itsApplications vol 5 no 1 pp 32ndash42 2011

[12] X Chang L C Jiao and J H Jia ldquoMultisensor image adaptivefusion based on nonsubsampled contourletrdquo Chinese Journal ofComputers vol 32 no 11 pp 2229ndash2237 2009

[13] J Krommweh ldquoTetrolet transform A new adaptive Haarwavelet algorithm for sparse image representationrdquo Journal ofVisual Communication and Image Representation vol 21 no 4pp 364ndash374 2010

[14] C Shi J Zhang H Chen and Y Zhang ldquoA novel hybridmethod for remote sensing image approximation using thetetrolet transformrdquo IEEE Journal of Selected Topics in AppliedEarth Observations and Remote Sensing vol 7 no 12 pp 4949ndash4959 2014

[15] Y Huang D Zhang B Yuan and J Kang ldquoFusion of visibleand infrared image based on stationary tetrolet transformrdquoin Proceedings of the 32nd Youth Academic Annual Conferenceof Chinese Association of Automation YAC 2017 pp 854ndash859China May 2017

[16] Y Shen J-W Dang X Feng Y-P Wang and Y Hou ldquoInfraredand visible images fusion based on Tetrolet transformrdquo GuangPu Xue Yu Guang Pu Fen XiSpectroscopy and Spectral Analysisvol 33 no 6 pp 1506ndash1511 2013

[17] X Yan H-L Qin S-Q Liu T-W Yang Z-J Yang and L-ZXue ldquoImage fusion based on Tetrolet transformrdquo GuangdianziJiguangJournal of Optoelectronics Laser vol 24 no 8 pp 1629ndash1633 2013

[18] C-J Zhang Y Chen C Duanmu and H-J Feng ldquoMulti-channel satellite cloud image fusion in the tetrolet transformdomainrdquo International Journal of Remote Sensing vol 35 no24 pp 8138ndash8168 2014

[19] M M Subashini and S K Sahoo ldquoPulse coupled neuralnetworks and its applicationsrdquoExpert SystemswithApplicationsvol 41 no 8 pp 3965ndash3974 2014

[20] Z Wang S Wang Y Zhu and Y Ma ldquoReview of Image FusionBased on Pulse-Coupled Neural Networkrdquo Archives of Compu-tational Methods in Engineering State-of-the-Art Reviews vol23 no 4 pp 659ndash671 2016

[21] X-Y Deng and Y-D Ma ldquoPCNN model automatic parame-ters determination and its modified modelrdquo Tien Tzu HsuehPaoActa Electronica Sinica vol 40 no 5 pp 955ndash964 2012

[22] L Chenhui and N Jianying ldquoFusion algorithm for visibleand PMMW image based on multi-band wavelet and adaptivePCNNrdquo Video Engineering vol 40 no 10 pp 28ndash32 2016

[23] J Xiong R Tan L Li and J Yang ldquoImage fusion algorithm forvisible and PMMW images based on Curvelet and improvedPCNNrdquo in Proceedings of the 2012 11th International Conferenceon Signal Processing ICSP 2012 vol 2 pp 903ndash907 ChinaOctober 2012

[24] X-L Zhang X-F Li and J Li ldquoValidation and correlationanalysis of metrics for evaluating performance of image fusionrdquoZidonghua XuebaoActa Automatica Sinica vol 40 no 2 pp306ndash315 2014

[25] G H Qu D L Zhang and P F Yan ldquoInformation measure forperformance of image fusionrdquo IEEE Electronics Letters vol 38no 7 pp 313ndash315 2002

[26] V Petrovi and C Xydeas ldquoOn the effects of sensor noise inpixel-level image fusion performancerdquo in Proceedings of the 3rdInternational Conference on Information Fusion FUSION 2000pp WeC314ndashWeC319 France July 2000

[27] L Yujiri M Shoucri and P Moffa ldquoPassive millimeter waveimagingrdquo IEEE Microwave Magazine vol 4 no 3 pp 39ndash502003



6 Mathematical Problems in Engineering

$$T_{\mathrm{HPF}}(i,j) = \begin{cases} T_{\mathrm{HPF}}^{A}(i,j), & \mathrm{Fire}_{A(i,j)}(n) \geq \mathrm{Fire}_{B(i,j)}(n), \\ T_{\mathrm{HPF}}^{B}(i,j), & \mathrm{Fire}_{A(i,j)}(n) < \mathrm{Fire}_{B(i,j)}(n), \end{cases} \tag{10}$$

where $\mathrm{Fire}_{(i,j)}(n)$ denotes the firing time of each coefficient, which is given by

$$\mathrm{Fire}_{(i,j)}(n) = \mathrm{Fire}_{(i,j)}(n-1) + Y_{ij}(n). \tag{11}$$

Step 6. Use the selected coefficients to reconstruct the fused image via the inverse TT.
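The selection rule of (10)–(11) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Tetrolet high-pass coefficients and the binary PCNN output maps $Y$ are assumed to be already computed, and the function names are ours.

```python
import numpy as np

def accumulate_fires(pcnn_outputs):
    """Eq. (11): the firing time is the running sum of the binary
    PCNN outputs Y_ij(n) over the n iterations."""
    fires = np.zeros_like(pcnn_outputs[0], dtype=int)
    for y in pcnn_outputs:
        fires += y
    return fires

def fuse_highpass(hp_a, hp_b, fires_a, fires_b):
    """Eq. (10): take each high-pass coefficient from the source
    whose PCNN neuron fired at least as often as the other's."""
    return np.where(fires_a >= fires_b, hp_a, hp_b)
```

For example, a neuron that fires in 2 of 3 iterations beats one that fires once, so the corresponding coefficient is copied from the first source.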

4. Experimental Results and Performance Analysis

4.1. Evaluation Criteria. The existing metrics fall into three categories: statistics-based, information-based, and human-visual-system-based classes. Selecting metrics with low mutual correlation improves the objectivity of the evaluation [24]. Since the statistics-based metrics are easily influenced by the pseudoedges of targets, we evaluate the fusion performance with information-based and human-visual-system-based metrics. The information-based evaluation indexes mainly comprise information entropy (IE) and mutual information (MI) [25]. Moreover, $Q^{AB/F}$ is a representative model in the human-vision-based evaluation system, since it correlates strongly with the other human-visual-system-based metrics [26]. These formulas are as follows.

IE:

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i). \tag{12}$$

MI:

$$\mathrm{MI} = \sum_{f,a} p_{FA}(f,a) \log \frac{p_{FA}(f,a)}{p_F(f)\, p_A(a)} + \sum_{f,b} p_{FB}(f,b) \log \frac{p_{FB}(f,b)}{p_F(f)\, p_B(b)}. \tag{13}$$

$Q^{AB/F}$:

$$Q^{AB/F} = \frac{\sum_{n=1}^{N} \sum_{m=1}^{M} \left( Q^{AF}(n,m)\, w^{A}(n,m) + Q^{BF}(n,m)\, w^{B}(n,m) \right)}{\sum_{i=1}^{N} \sum_{j=1}^{M} \left( w^{A}(i,j) + w^{B}(i,j) \right)}, \tag{14}$$

where $P(x_i)$ is the probability mass function of the input image; $p_{FA}(f,a)$, $p_{FB}(f,b)$, $p_F(f)$, $p_A(a)$, and $p_B(b)$ are obtained by simple normalization of the joint and marginal histograms of the fused and input images; $Q^{AF}(n,m)$ and $Q^{BF}(n,m)$ are the edge preservation values; and $w^{A}(n,m)$ and $w^{B}(n,m)$ reflect the perceptual importance of the corresponding edge elements. IE reflects the amount of average information in the fused image, and MI reflects the detailed information inherited from the source images, whereas $Q^{AB/F}$ measures the amount of edge information transferred from the source images into the fused result. For all three metrics, a larger value indicates a better fusion result.
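The three metrics can be sketched in a few lines. This is a minimal illustration under stated assumptions: 8-bit greyscale images held as integer NumPy arrays, base-2 logarithms, and function names of our own choosing. For Eq. (14), the edge preservation maps $Q^{AF}$, $Q^{BF}$ and the weights $w^A$, $w^B$ come from the gradient-based edge model of [26] and are treated here as precomputed inputs.

```python
import numpy as np

def information_entropy(img, bins=256):
    """Eq. (12): H(X) = -sum P(x_i) log2 P(x_i) over the grey-level
    histogram of the image."""
    counts, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = counts / counts.sum()
    p = p[p > 0]                       # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def mutual_information(fused, src, bins=256):
    """One term of Eq. (13): MI between the fused image and one source,
    from the normalized joint and marginal grey-level histograms."""
    joint, _, _ = np.histogram2d(fused.ravel(), src.ravel(),
                                 bins=bins, range=[[0, bins], [0, bins]])
    p_fs = joint / joint.sum()
    p_f = p_fs.sum(axis=1, keepdims=True)   # marginal of the fused image
    p_s = p_fs.sum(axis=0, keepdims=True)   # marginal of the source
    nz = p_fs > 0
    return float((p_fs[nz] * np.log2(p_fs[nz] / (p_f @ p_s)[nz])).sum())

def qabf(q_af, q_bf, w_a, w_b):
    """Eq. (14): weighted average of precomputed edge preservation maps."""
    num = (q_af * w_a + q_bf * w_b).sum()
    return float(num / (w_a + w_b).sum())
```

Eq. (13) is then `mutual_information(F, A) + mutual_information(F, B)`; a constant image has entropy 0, and MI of an image with itself equals its entropy.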

The source images, provided by ThermoTrex Corporation, are shown in Figure 5. Three soldiers carrying a gun and a grenade are displayed in Figure 5(a). Owing to the limited penetrability of visible light, the targets under the clothing do not appear in the visible image, although it contains rich environmental details of the imaging scene. In contrast, Figure 5(b) is the PMMW image. The bright part of the PMMW image reveals the location and shape of the concealed objects: the outlines of the gun and grenade are detected by the MMW radiation owing to its penetrability, while the contours of the three soldiers are heavily blurred, and the lawn is difficult to recognize. In the subsequent sections, we apply different wavelets and fusion rules to demonstrate the effectiveness of the proposed algorithm.

4.2. The First Group of the Fused Results. The first group of fused results is obtained from the PMMW and visible images. Figure 6 illustrates the fusion results produced by different wavelets: the DWT, CT, NSCT, TT, and TT-PCNN results are displayed in Figures 6(a)–6(e). The fusion rule adopted for these wavelets is the same as that described for the TT [15]. As Figures 6(a)–6(e) show, all five methods successfully fuse the PMMW and visible images, and every fused image contains both the concealed-object information and the background information. However, the DWT result exhibits many artifacts because the transform lacks shift invariance, and the contour of the gun is slightly blurred by the pseudo-Gibbs phenomena. The CT, NSCT, and TT achieve better performance than the DWT. The CT depicts edge details well, so the concealed gun retains complete structural features for recognition; however, when the background characteristics of the source images differ significantly, the CT usually reduces the image contrast. Owing to the shift invariance of the NSCT, the pseudo-Gibbs phenomenon is eliminated successfully; but, limited by the fusion rules, the concealed targets have low contrast, which seriously impairs risk identification. Since the TT has a superior capacity to describe smooth regions and local details, its fused result is better than those of the above methods. The proposed method provides the best visual effect: almost all the useful information of the concealed objects is transferred to the fused image, and fewer artifacts are introduced during the fusion process. Table 1 shows the evaluation results of the five methods. The IE of the fused images obtained by the DWT and the TT is larger than that of the TT-PCNN because of the introduction of invalid information. The MI obtained by the TT-PCNN is the maximum, which shows that its fused image extracts the most information from the original images. Furthermore, $Q^{AB/F}$ of the TT-PCNN is also the maximum, which indicates that the proposed algorithm preserves the detailed


Figure 5: The source images. (a) The visible image; (b) the PMMW image.

Figure 6: The fused results obtained by different wavelets and the TT-PCNN. (a) DWT; (b) CT; (c) NSCT; (d) TT; (e) TT-PCNN.


Table 1: The comparison of the fused results.

Method      IE       MI        Q^{AB/F}
DWT         7.1282   7.7468    0.0191
CT          6.3092   8.7047    0.0078
NSCT        5.9518   9.0469    0.0239
TT          6.9637   7.9843    0.0255
TT-PCNN     6.7853   11.5794   0.0377

Figure 7: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN. (a) CT-PCNN; (b) NSCT-PCNN; (c) NSCT; (d) TT-PCNN.

information and extracts more edge information from the source images effectively. The objective evaluation agrees with the visual observation.

4.3. The Second Group of the Fused Results. The fusion results of the NSCT-PCNN, CT-PCNN, NSCT, and TT-PCNN are displayed in Figures 7(a)–7(d). As can be seen, all of the methods successfully fuse the PMMW and visible images, and all the fused images contain the concealed-target information. However, the fused image obtained by the CT-PCNN still has low contrast because of the background differences between the source images, which is a common problem of CT-based methods. The NSCT-PCNN and NSCT achieve better performance than the CT-PCNN: the pseudo-Gibbs phenomenon is eliminated owing to the shift invariance of the NSCT. This demonstrates that the PCNN helps to enhance the details of the interesting targets and is therefore beneficial to the fusion of visible and PMMW images. Nevertheless, the concealed objects and the background still have low contrast; in particular, the grenade is difficult to discriminate. The TT-PCNN provides better visual effects, and the detailed information of the gun and grenade is preserved well. Table 2 shows the evaluation results of the four methods. The IE of the fused image achieved by the TT-PCNN is the second largest, which means that the fused result inherits abundant information from the source images. The MI and $Q^{AB/F}$ of the TT-PCNN fused image are the largest, which demonstrates that the proposed algorithm extracts abundant image information from the source images and achieves high contrast.

4.4. The Third Group of the Fused Results. The source images and fused results are shown in Figure 8. Figures 8(a) and 8(b) are the visible image and the PMMW


Figure 8: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN. (a) The visible image; (b) the PMMW image; (c) CT-PCNN; (d) NSCT-PCNN; (e) NSCT; (f) TT-PCNN.


Table 2: The comparison of the fused results.

Method           IE       MI        Q^{AB/F}
NSCT-PCNN [6]    6.8243   8.4638    0.0361
CT-PCNN [7]      6.5108   8.2892    0.0088
NSCT [8]         6.7329   8.4112    0.0299
TT-PCNN          6.7853   11.5794   0.0377

Table 3: The comparison of the fused results.

Method           IE       MI       Q^{AB/F}
NSCT-PCNN [6]    7.2078   3.7030   0.5382
CT-PCNN [7]      7.6778   5.6082   0.4273
NSCT [8]         7.7781   4.7641   0.5802
TT-PCNN          7.8731   4.7673   0.5811

image. A single 94-GHz radiometer with a scanning 24-inch dish antenna is used to detect the MMW energy of the concealed weapons [27]. As can be seen from Figures 8(c)–8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast; however, these two methods still enhance useless information, such as the radiation of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN attains the maximum IE and $Q^{AB/F}$ among the compared algorithms, which proves that the fused result of the proposed method contains abundant target information and preserves the object features well.

5. Conclusion

In this paper, an improved PCNN for the fusion of the PMMW and visible images is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contours of the concealed targets. Then a Laplacian pyramid is introduced to decompose the low-pass band after the TT. Next, the spatial frequency (SF) is applied to motivate the improved PCNN neurons, so that the flexible multiresolution of the TT is combined with the global coupling and pulse-synchronization characteristics of the PCNN. Finally, four groups of experiments are conducted to evaluate the fusion performance. The results show that the proposed algorithm has superior performance in fusing visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.
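The Laplacian pyramid step above can be sketched generically. The paper's exact smoothing filters are not given in this excerpt, so this minimal sketch (with function names of our own) substitutes 2x2 block averaging for downsampling and pixel replication for upsampling; each pyramid layer stores the detail lost by one down/up round trip, and reconstruction is exact for this filter pair.

```python
import numpy as np

def down2(x):
    """Halve resolution by 2x2 block averaging (a stand-in for the
    smoothing-and-decimation step)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up2(x):
    """Double resolution by pixel replication."""
    return np.kron(x, np.ones((2, 2)))

def laplacian_pyramid(img, levels):
    """Build the detail layers plus the coarsest approximation."""
    layers, cur = [], img.astype(float)
    for _ in range(levels):
        small = down2(cur)
        layers.append(cur - up2(small))   # detail lost at this level
        cur = small
    layers.append(cur)                    # low-pass residual
    return layers

def reconstruct(layers):
    """Invert the pyramid: upsample and add back each detail layer."""
    cur = layers[-1]
    for detail in reversed(layers[:-1]):
        cur = up2(cur) + detail
    return cur
```

Because each detail layer is defined as `cur - up2(down2(cur))`, adding it back after upsampling recovers the input exactly, which is why the low-pass band can be decomposed and fused without approximation loss in this step.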

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund in Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves under Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of concealed objects in passive millimeter-wave images based on the Gaussian mixture model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400–421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of the Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113–121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum (ICCP and HSIC 2013), pp. 319–323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A novel image fusion algorithm for visible and PMMW images based on clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering (ICCAE 2016), pp. 1–5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508–1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849–853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205–207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74–84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik - International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811–819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation Curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32–42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229–2237, 2009.

[13] J. Krommweh, "Tetrolet transform: a new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364–374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949–4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation (YAC 2017), pp. 854–859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506–1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629–1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138–8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965–3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of image fusion based on pulse-coupled neural network," Archives of Computational Methods in Engineering, vol. 23, no. 4, pp. 659–671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Acta Electronica Sinica, vol. 40, no. 5, pp. 955–964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28–32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing (ICSP 2012), vol. 2, pp. 903–907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Acta Automatica Sinica, vol. 40, no. 2, pp. 306–315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," Electronics Letters, vol. 38, no. 7, pp. 313–315, 2002.

[26] V. Petrović and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion (FUSION 2000), pp. WeC3/14–WeC3/19, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39–50, 2003.

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 7: A Visible and Passive Millimeter Wave Image Fusion Algorithm …downloads.hindawi.com/journals/mpe/2018/4205308.pdf · 2018-11-11 · LPA and LPB. Step 2. LPA and LPB aredecomposedbytheLaplacian

Mathematical Problems in Engineering 7

(a) The visible image (b) The PMMW image

Figure 5 The source images

(a) DWT (b) CT

(c) NSCT (d) TT

(e) TT-PCNN

Figure 6 The fused results obtained by different wavelets and the TT-PCNN

8 Mathematical Problems in Engineering

Table 1 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

DWT 71282 77468 00191CT 63092 87047 00078NSCT 59518 90469 00239TT 69637 79843 00255TT-PCNN 67853 115794 00377

(a) CT-PCNN (b) NSCT-PCNN

(c) NSCT (d) TT-PCNN

Figure 7 The fused results of the CT-PCNN NSCT-PCNN NSCT and TT-PCNN

information and extracts more edge information from sourceimages effectively The objective evaluation meets the visualobservation

43 The Second Group of the Fused Results The fusionresults of the NSCT-PCNN CT-PCNN NSCT and TT-PCNN are displayed in Figures 7(a)ndash7(d) As can be seenfrom Figures 7(a)ndash7(d) all of the methods successfully fusethe PMMW and visible images All the fused images stillcontain concealed targets information However the fusedimage obtained by the CT-PCNN still has low contrast dueto the background differences between source images whichis a common problem of the CT based methods While theNSCT-PCNN and NSCT achieve a better performance thanthe CT-PCNNThepseudo-Gibbs phenomenon is eliminatedowing to the shift-invariant of NSCT It is proven that thePCNN is conducive to enhance the details of interesting

targets So the PCNN is beneficial to the fusion of visible andPMMW images But the concealed objects and backgroundhave low contrast Especially the information of the grenadeis difficult to discrimination The TT-PCNN provides bettervisual effects The detailed information of gun and grenadeis preserved well Table 2 shows the evaluation results of thefour methods The IE of the fused image achieved by the TT-PCNN is the second maximum This means that the fusedresult contains a lot of information inherited from sourceimages MI and 119876119860119861119865 of the fused image obtained by theTT-PCNN gain the largest value This demonstrates that theproposed algorithm extracts abundant image informationfrom source images and achieves high contrast

44 The Third Group of the Fused Results As shown inFigure 8 the source images and fused results are displayedwell Figures 8(a) and 8(b) are the visible image and PMMW

Mathematical Problems in Engineering 9

(a) The visible image (b) The PMMW image

(c) CT-PCNN (d) NSCT-PCNN

(e) NSCT (f) TT-PCNN

Figure 8 The fused results of the CT-PCNN NSCT-PCNN NSCT and TT-PCNN

10 Mathematical Problems in Engineering

Table 2 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

NSCT-PCNN [6] 68243 84638 00361CT-PCNN [7] 65108 82892 00088NSCT [8] 67329 84112 00299TT-PCNN 67853 115794 00377

Table 3 The comparison of the fused results

Method type Evaluation standardsIE MI 119876119860119861119865

NSCT-PCNN [6] 72078 37030 05382CT-PCNN [7] 76778 56082 04273NSCT [8] 77781 47641 05802TT-PCNN 78731 47673 05811

image A single 94-Ghz radiometer on a scanning 24 in dishantenna is used to detect the MMW energy of concealedweapons [27] As can be seen from Figures 8(c)ndash8(f) all ofthe methods successfully synthesize the targets informationand the background information But the contrast of thefused image based on the CT-PCNN is relatively low TheNSCT and NSCT-PCNN methods improve the fusion effectand achieve high contrast However these two methods stillenhanced useless information such as the radiated infor-mation of the dress zipper The TT-PCNN synthesizes thePMMW and visible images highlights the information ofconcealed weapons and suppresses the invalid informationThe objective evaluation of the fused results is listed inTable 3 The TT-PCNN receives the maximum comparedwith other algorithms It proves that the fused result of theproposed method contains abundant target information andpreserves more object features well

5 Conclusion

In this paper an improved PCNN for the fusion of thePMMWand visible image is proposed in the Tetrolet domainThe improved PCNNmodel ismore simple and adaptive withfewer parameters We firstly adopted the improved PCNN tostrengthen the high-pass coefficients of the PMMW image inorder to enhance the contour of concealed targets And thena Laplacian pyramid is introduced for the decomposition oflow-pass band after the TTNext the SF is applied tomotivatethe improved PCNN neurons The flexible multiresolutionof the TT is associated with global coupling and pulsesynchronization characteristics of the PCNN Finally thefour groups of experiments are conducted for evaluatingthe fusion performance The results show that the proposedalgorithm has superior performance of fusing the visibleand PMMW images The fused results have high contrastremarkable target information and rich information ofbackground The proposed method is suitable for fusing theinfrared and visible image which is superior to the other

fusion algorithms in terms of visual quality and quantitativeevaluation

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this article

Acknowledgments

This work is supported by the Postdoctoral fund inJiangsu province under Grant no 1302027C the NaturalScience Foundation of Jiangsu Province under Grant no15KJB510008 and the State Key Laboratory of MillimeterWaves Project no K201714The support of Image ProcessingLab of Jiang Su University of Science and Technology isacknowledged Thanks are due to Dr Qu for publishingrelated program on the Internet andDr Larry DrMerit andDr Philip who collected a large number of PMMW images

References

[1] WYuXChen andLWu ldquoSegmentation ofConcealedObjectsin Passive Millimeter-Wave Images Based on the GaussianMixture Modelrdquo Journal of Infrared Millimeter and TerahertzWaves vol 36 no 4 pp 400ndash421 2015

[2] S Dill M Peichl and H Su ldquoStudy of passiveMMWpersonnelimaging with respect to suspicious and common concealedobjects for security applicationsrdquo in Proceedings of the Millime-tre Wave and Terahertz Sensors and Technology UK September2008

[3] W W Kong Y J Lei Y Lei and S Lu ldquoImage fusion techniquebased on non-subsampled contourlet transform and adaptiveunit-fast-linking pulse-coupled neural networkrdquo IET ImageProcessing vol 5 no 2 pp 113ndash121 2011

[4] X Song L Li and J Yang ldquoImage fusion algorithm for visibleand PMMW images based on em and Ncutrdquo in Proceedingsof the 2013 Joint Conference of International Conference onComputational Problem-Solving and International High Speed

Mathematical Problems in Engineering 11

Intelligent Communication Forum ICCP and HSIC 2013 pp319ndash323 China October 2013

[5] J Xiong W Xie J Yang Y Fu K Hu and Z Zhong ldquoANovel Image Fusion Algorithm for Visible and PMMW Imagesbased on Clustering and NSCTrdquo in Proceedings of the 20168th International Conference on Computer and AutomationEngineering ICCAE 2016 pp 1ndash5 Australia March 2016

[6] X B Qu J W Yan H Z Xiao and Z Zhu ldquoImage fusionalgorithm based on spatial frequency-motivated pulse cou-pled neural networks in nonsubsampled contourlet transformdomainrdquo Acta Automatica Sinica vol 34 no 12 pp 1508ndash15142008

[7] J Zhao and S Qu ldquoA better algorithm for fusion of infrared andvisible image based on curvelet transform and adaptive pulsecoupled neural networks (PCNN)rdquo Journal of NorthwesternPolytechnical University vol 29 no 6 pp 849ndash853 2011

[8] S-L Zhou T Zhang D-J Kuai J Zheng and Z-Y ZhouldquoNonsubsampled contourlet image fusion algorithm based ondirectional regionrdquo Laser amp Infrared vol 43 no 2 pp 205ndash2072013

[9] S Li B Yang and J Hu ldquoPerformance comparison of differentmulti-resolution transforms for image fusionrdquo InformationFusion vol 12 no 2 pp 74ndash84 2011

[10] Q Guo and S Liu ldquoPerformance analysis of multi-spectral andpanchromatic image fusion techniques based on two waveletdiscrete approachesrdquo Optik - International Journal for Light andElectron Optics vol 122 no 9 pp 811ndash819 2011

[11] W Wang F Chang T Ji and G Zhang ldquoFusion of multi-focus images based on the 2-generation Curvelet transformrdquoInternational Journal of Digital Content Technology and itsApplications vol 5 no 1 pp 32ndash42 2011

[12] X Chang L C Jiao and J H Jia ldquoMultisensor image adaptivefusion based on nonsubsampled contourletrdquo Chinese Journal ofComputers vol 32 no 11 pp 2229ndash2237 2009

[13] J Krommweh ldquoTetrolet transform A new adaptive Haarwavelet algorithm for sparse image representationrdquo Journal ofVisual Communication and Image Representation vol 21 no 4pp 364ndash374 2010

[14] C Shi J Zhang H Chen and Y Zhang ldquoA novel hybridmethod for remote sensing image approximation using thetetrolet transformrdquo IEEE Journal of Selected Topics in AppliedEarth Observations and Remote Sensing vol 7 no 12 pp 4949ndash4959 2014

[15] Y Huang D Zhang B Yuan and J Kang ldquoFusion of visibleand infrared image based on stationary tetrolet transformrdquoin Proceedings of the 32nd Youth Academic Annual Conferenceof Chinese Association of Automation YAC 2017 pp 854ndash859China May 2017

[16] Y Shen J-W Dang X Feng Y-P Wang and Y Hou ldquoInfraredand visible images fusion based on Tetrolet transformrdquo GuangPu Xue Yu Guang Pu Fen XiSpectroscopy and Spectral Analysisvol 33 no 6 pp 1506ndash1511 2013

[17] X Yan H-L Qin S-Q Liu T-W Yang Z-J Yang and L-ZXue ldquoImage fusion based on Tetrolet transformrdquo GuangdianziJiguangJournal of Optoelectronics Laser vol 24 no 8 pp 1629ndash1633 2013

[18] C-J Zhang Y Chen C Duanmu and H-J Feng ldquoMulti-channel satellite cloud image fusion in the tetrolet transformdomainrdquo International Journal of Remote Sensing vol 35 no24 pp 8138ndash8168 2014

[19] M M Subashini and S K Sahoo ldquoPulse coupled neuralnetworks and its applicationsrdquoExpert SystemswithApplicationsvol 41 no 8 pp 3965ndash3974 2014

[20] Z Wang S Wang Y Zhu and Y Ma ldquoReview of Image FusionBased on Pulse-Coupled Neural Networkrdquo Archives of Compu-tational Methods in Engineering State-of-the-Art Reviews vol23 no 4 pp 659ndash671 2016

[21] X-Y Deng and Y-D Ma ldquoPCNN model automatic parame-ters determination and its modified modelrdquo Tien Tzu HsuehPaoActa Electronica Sinica vol 40 no 5 pp 955ndash964 2012

[22] L Chenhui and N Jianying ldquoFusion algorithm for visibleand PMMW image based on multi-band wavelet and adaptivePCNNrdquo Video Engineering vol 40 no 10 pp 28ndash32 2016

[23] J Xiong R Tan L Li and J Yang ldquoImage fusion algorithm forvisible and PMMW images based on Curvelet and improvedPCNNrdquo in Proceedings of the 2012 11th International Conferenceon Signal Processing ICSP 2012 vol 2 pp 903ndash907 ChinaOctober 2012

[24] X-L Zhang X-F Li and J Li ldquoValidation and correlationanalysis of metrics for evaluating performance of image fusionrdquoZidonghua XuebaoActa Automatica Sinica vol 40 no 2 pp306ndash315 2014

[25] G H Qu D L Zhang and P F Yan ldquoInformation measure forperformance of image fusionrdquo IEEE Electronics Letters vol 38no 7 pp 313ndash315 2002

[26] V Petrovi and C Xydeas ldquoOn the effects of sensor noise inpixel-level image fusion performancerdquo in Proceedings of the 3rdInternational Conference on Information Fusion FUSION 2000pp WeC314ndashWeC319 France July 2000

[27] L Yujiri M Shoucri and P Moffa ldquoPassive millimeter waveimagingrdquo IEEE Microwave Magazine vol 4 no 3 pp 39ndash502003


Mathematical Problems in Engineering

Table 1: The comparison of the fused results.

Method type | IE     | MI      | Q^{AB/F}
DWT         | 7.1282 | 7.7468  | 0.0191
CT          | 6.3092 | 8.7047  | 0.0078
NSCT        | 5.9518 | 9.0469  | 0.0239
TT          | 6.9637 | 7.9843  | 0.0255
TT-PCNN     | 6.7853 | 11.5794 | 0.0377

Figure 7: The fused results of (a) the CT-PCNN, (b) the NSCT-PCNN, (c) the NSCT, and (d) the TT-PCNN.

information and extracts more edge information from the source images effectively. The objective evaluation is consistent with the visual observation.

4.3. The Second Group of the Fused Results. The fusion results of the NSCT-PCNN, CT-PCNN, NSCT, and TT-PCNN are displayed in Figures 7(a)–7(d). As can be seen from Figures 7(a)–7(d), all of the methods successfully fuse the PMMW and visible images, and all of the fused images still contain the concealed-target information. However, the fused image obtained by the CT-PCNN has low contrast because of the background differences between the source images, which is a common problem of CT-based methods. The NSCT-PCNN and the NSCT achieve better performance than the CT-PCNN: the pseudo-Gibbs phenomenon is eliminated owing to the shift invariance of the NSCT. This confirms that the PCNN enhances the details of the interesting targets, so the PCNN is beneficial to the fusion of visible and PMMW images. Nevertheless, the concealed objects and the background still have low contrast; in particular, the information of the grenade is difficult to discriminate. The TT-PCNN provides better visual effects, and the detailed information of the gun and the grenade is preserved well. Table 2 shows the evaluation results of the four methods. The IE of the fused image achieved by the TT-PCNN is the second largest, which means that the fused result inherits a large amount of information from the source images. The MI and Q^{AB/F} of the fused image obtained by the TT-PCNN are the largest, which demonstrates that the proposed algorithm extracts abundant image information from the source images and achieves high contrast.
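The IE and MI indexes quoted above are standard information-theoretic measures: IE is the Shannon entropy of the fused image's gray-level histogram, and MI quantifies how much information the fused image shares with a source image. A minimal NumPy sketch (function names are ours, for illustration only):

```python
import numpy as np

def entropy(img):
    """Shannon entropy (IE) of an 8-bit grayscale image, in bits."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return -np.sum(p * np.log2(p))

def mutual_information(a, f):
    """MI between a source image a and the fused image f via the joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), f.ravel(),
                                 bins=256, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of f
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# For fusion evaluation, the reported MI is the sum over both sources,
# I(A; F) + I(B; F), following [25].
```

An 8-bit image therefore has IE at most 8 bits, which matches the magnitudes of the IE column in Tables 1–3.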

4.4. The Third Group of the Fused Results. The source images and the fused results of the third group are shown in Figure 8. Figures 8(a) and 8(b) are the visible image and the PMMW


Figure 8: The fused results of the CT-PCNN, NSCT-PCNN, NSCT, and TT-PCNN: (a) the visible image; (b) the PMMW image; (c) CT-PCNN; (d) NSCT-PCNN; (e) NSCT; (f) TT-PCNN.


Table 2: The comparison of the fused results.

Method type   | IE     | MI      | Q^{AB/F}
NSCT-PCNN [6] | 6.8243 | 8.4638  | 0.0361
CT-PCNN [7]   | 6.5108 | 8.2892  | 0.0088
NSCT [8]      | 6.7329 | 8.4112  | 0.0299
TT-PCNN       | 6.7853 | 11.5794 | 0.0377

Table 3: The comparison of the fused results.

Method type   | IE     | MI     | Q^{AB/F}
NSCT-PCNN [6] | 7.2078 | 3.7030 | 0.5382
CT-PCNN [7]   | 7.6778 | 5.6082 | 0.4273
NSCT [8]      | 7.7781 | 4.7641 | 0.5802
TT-PCNN       | 7.8731 | 4.7673 | 0.5811

image, respectively. A single 94-GHz radiometer on a scanning 24-inch dish antenna is used to detect the MMW energy of concealed weapons [27]. As can be seen from Figures 8(c)–8(f), all of the methods successfully synthesize the target information and the background information, but the contrast of the fused image based on the CT-PCNN is relatively low. The NSCT and NSCT-PCNN methods improve the fusion effect and achieve high contrast; however, these two methods also enhance useless information, such as the radiated information of the dress zipper. The TT-PCNN synthesizes the PMMW and visible images, highlights the information of the concealed weapons, and suppresses the invalid information. The objective evaluation of the fused results is listed in Table 3. The TT-PCNN attains the largest IE and Q^{AB/F} among the compared algorithms, which indicates that its fused result contains abundant target information and preserves the object features well.
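The Q^{AB/F} index [26] measures how much edge information is transferred from the source images into the fused image. The published metric uses Sobel gradient strength and orientation with sigmoidal preservation functions; the following is a deliberately simplified sketch of the same idea (central differences instead of Sobel, strength only, no orientation term), so its values are schematic rather than comparable with the tables:

```python
import numpy as np

def grad_strength(img):
    """Edge strength from central differences (a stand-in for the Sobel
    operator of the published metric); interior pixels only."""
    img = img.astype(float)
    gx = (img[1:-1, 2:] - img[1:-1, :-2]) / 2.0
    gy = (img[2:, 1:-1] - img[:-2, 1:-1]) / 2.0
    return np.hypot(gx, gy)

def q_abf_simplified(a, b, f, eps=1e-12):
    """Schematic edge-preservation score in [0, 1]: the fraction of each
    source's edge strength surviving in the fused image, weighted by that
    source's edge strength."""
    ga, gb, gf = grad_strength(a), grad_strength(b), grad_strength(f)
    qaf = np.minimum(ga, gf) / (np.maximum(ga, gf) + eps)  # preservation A -> F
    qbf = np.minimum(gb, gf) / (np.maximum(gb, gf) + eps)  # preservation B -> F
    return float(np.sum(qaf * ga + qbf * gb) / (np.sum(ga + gb) + eps))
```

A perfect fusion of identical inputs scores approximately 1, while a fused image with no edges scores 0, which is the qualitative behavior the Q^{AB/F} columns above rely on.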

5. Conclusion

In this paper, an improved PCNN for the fusion of PMMW and visible images is proposed in the Tetrolet domain. The improved PCNN model is simpler and more adaptive, with fewer parameters. We first adopt the improved PCNN to strengthen the high-pass coefficients of the PMMW image in order to enhance the contour of the concealed targets. Then, a Laplacian pyramid is introduced to decompose the low-pass band after the Tetrolet transform (TT). Next, the spatial frequency (SF) is applied to motivate the improved PCNN neurons, so the flexible multiresolution of the TT is associated with the global-coupling and pulse-synchronization characteristics of the PCNN. Finally, four groups of experiments are conducted to evaluate the fusion performance. The results show that the proposed algorithm has superior performance in fusing visible and PMMW images: the fused results have high contrast, remarkable target information, and rich background information. The proposed method is also suitable for fusing infrared and visible images and is superior to the other fusion algorithms in terms of visual quality and quantitative evaluation.
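The SF used to motivate the PCNN neurons is the standard spatial-frequency measure, combining horizontal and vertical first-difference activity. A minimal sketch (the RMS-over-differences normalization is the common variant and an assumption about the authors' exact implementation):

```python
import numpy as np

def spatial_frequency(block):
    """Spatial frequency SF = sqrt(RF^2 + CF^2) of an image block, where
    RF/CF are the RMS of horizontal/vertical first differences."""
    b = block.astype(float)
    rf2 = np.mean(np.diff(b, axis=1) ** 2)  # row frequency (horizontal activity)
    cf2 = np.mean(np.diff(b, axis=0) ** 2)  # column frequency (vertical activity)
    return np.sqrt(rf2 + cf2)
```

Blocks with more texture yield larger SF, so feeding SF to the PCNN makes detail-rich regions fire earlier, which is the linking behavior the fusion rule exploits.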

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Acknowledgments

This work is supported by the Postdoctoral Fund of Jiangsu Province under Grant no. 1302027C, the Natural Science Foundation of Jiangsu Province under Grant no. 15KJB510008, and the State Key Laboratory of Millimeter Waves under Project no. K201714. The support of the Image Processing Lab of Jiangsu University of Science and Technology is acknowledged. Thanks are due to Dr. Qu for publishing the related program on the Internet and to Dr. Larry, Dr. Merit, and Dr. Philip, who collected a large number of PMMW images.

References

[1] W. Yu, X. Chen, and L. Wu, "Segmentation of concealed objects in passive millimeter-wave images based on the Gaussian mixture model," Journal of Infrared, Millimeter, and Terahertz Waves, vol. 36, no. 4, pp. 400–421, 2015.

[2] S. Dill, M. Peichl, and H. Su, "Study of passive MMW personnel imaging with respect to suspicious and common concealed objects for security applications," in Proceedings of Millimetre Wave and Terahertz Sensors and Technology, UK, September 2008.

[3] W. W. Kong, Y. J. Lei, Y. Lei, and S. Lu, "Image fusion technique based on non-subsampled contourlet transform and adaptive unit-fast-linking pulse-coupled neural network," IET Image Processing, vol. 5, no. 2, pp. 113–121, 2011.

[4] X. Song, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on EM and Ncut," in Proceedings of the 2013 Joint Conference of International Conference on Computational Problem-Solving and International High Speed Intelligent Communication Forum (ICCP and HSIC 2013), pp. 319–323, China, October 2013.

[5] J. Xiong, W. Xie, J. Yang, Y. Fu, K. Hu, and Z. Zhong, "A novel image fusion algorithm for visible and PMMW images based on clustering and NSCT," in Proceedings of the 2016 8th International Conference on Computer and Automation Engineering (ICCAE 2016), pp. 1–5, Australia, March 2016.

[6] X. B. Qu, J. W. Yan, H. Z. Xiao, and Z. Zhu, "Image fusion algorithm based on spatial frequency-motivated pulse coupled neural networks in nonsubsampled contourlet transform domain," Acta Automatica Sinica, vol. 34, no. 12, pp. 1508–1514, 2008.

[7] J. Zhao and S. Qu, "A better algorithm for fusion of infrared and visible image based on curvelet transform and adaptive pulse coupled neural networks (PCNN)," Journal of Northwestern Polytechnical University, vol. 29, no. 6, pp. 849–853, 2011.

[8] S.-L. Zhou, T. Zhang, D.-J. Kuai, J. Zheng, and Z.-Y. Zhou, "Nonsubsampled contourlet image fusion algorithm based on directional region," Laser & Infrared, vol. 43, no. 2, pp. 205–207, 2013.

[9] S. Li, B. Yang, and J. Hu, "Performance comparison of different multi-resolution transforms for image fusion," Information Fusion, vol. 12, no. 2, pp. 74–84, 2011.

[10] Q. Guo and S. Liu, "Performance analysis of multi-spectral and panchromatic image fusion techniques based on two wavelet discrete approaches," Optik - International Journal for Light and Electron Optics, vol. 122, no. 9, pp. 811–819, 2011.

[11] W. Wang, F. Chang, T. Ji, and G. Zhang, "Fusion of multi-focus images based on the 2-generation curvelet transform," International Journal of Digital Content Technology and its Applications, vol. 5, no. 1, pp. 32–42, 2011.

[12] X. Chang, L. C. Jiao, and J. H. Jia, "Multisensor image adaptive fusion based on nonsubsampled contourlet," Chinese Journal of Computers, vol. 32, no. 11, pp. 2229–2237, 2009.

[13] J. Krommweh, "Tetrolet transform: a new adaptive Haar wavelet algorithm for sparse image representation," Journal of Visual Communication and Image Representation, vol. 21, no. 4, pp. 364–374, 2010.

[14] C. Shi, J. Zhang, H. Chen, and Y. Zhang, "A novel hybrid method for remote sensing image approximation using the tetrolet transform," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 7, no. 12, pp. 4949–4959, 2014.

[15] Y. Huang, D. Zhang, B. Yuan, and J. Kang, "Fusion of visible and infrared image based on stationary tetrolet transform," in Proceedings of the 32nd Youth Academic Annual Conference of Chinese Association of Automation (YAC 2017), pp. 854–859, China, May 2017.

[16] Y. Shen, J.-W. Dang, X. Feng, Y.-P. Wang, and Y. Hou, "Infrared and visible images fusion based on Tetrolet transform," Spectroscopy and Spectral Analysis, vol. 33, no. 6, pp. 1506–1511, 2013.

[17] X. Yan, H.-L. Qin, S.-Q. Liu, T.-W. Yang, Z.-J. Yang, and L.-Z. Xue, "Image fusion based on Tetrolet transform," Journal of Optoelectronics Laser, vol. 24, no. 8, pp. 1629–1633, 2013.

[18] C.-J. Zhang, Y. Chen, C. Duanmu, and H.-J. Feng, "Multi-channel satellite cloud image fusion in the tetrolet transform domain," International Journal of Remote Sensing, vol. 35, no. 24, pp. 8138–8168, 2014.

[19] M. M. Subashini and S. K. Sahoo, "Pulse coupled neural networks and its applications," Expert Systems with Applications, vol. 41, no. 8, pp. 3965–3974, 2014.

[20] Z. Wang, S. Wang, Y. Zhu, and Y. Ma, "Review of image fusion based on pulse-coupled neural network," Archives of Computational Methods in Engineering, vol. 23, no. 4, pp. 659–671, 2016.

[21] X.-Y. Deng and Y.-D. Ma, "PCNN model automatic parameters determination and its modified model," Acta Electronica Sinica, vol. 40, no. 5, pp. 955–964, 2012.

[22] L. Chenhui and N. Jianying, "Fusion algorithm for visible and PMMW image based on multi-band wavelet and adaptive PCNN," Video Engineering, vol. 40, no. 10, pp. 28–32, 2016.

[23] J. Xiong, R. Tan, L. Li, and J. Yang, "Image fusion algorithm for visible and PMMW images based on Curvelet and improved PCNN," in Proceedings of the 2012 11th International Conference on Signal Processing (ICSP 2012), vol. 2, pp. 903–907, China, October 2012.

[24] X.-L. Zhang, X.-F. Li, and J. Li, "Validation and correlation analysis of metrics for evaluating performance of image fusion," Acta Automatica Sinica, vol. 40, no. 2, pp. 306–315, 2014.

[25] G. H. Qu, D. L. Zhang, and P. F. Yan, "Information measure for performance of image fusion," Electronics Letters, vol. 38, no. 7, pp. 313–315, 2002.

[26] V. Petrović and C. Xydeas, "On the effects of sensor noise in pixel-level image fusion performance," in Proceedings of the 3rd International Conference on Information Fusion (FUSION 2000), pp. WeC314–WeC319, France, July 2000.

[27] L. Yujiri, M. Shoucri, and P. Moffa, "Passive millimeter wave imaging," IEEE Microwave Magazine, vol. 4, no. 3, pp. 39–50, 2003.


[27] L Yujiri M Shoucri and P Moffa ldquoPassive millimeter waveimagingrdquo IEEE Microwave Magazine vol 4 no 3 pp 39ndash502003


Mathematical Problems in Engineering

