
Research Article
The Nonsubsampled Contourlet Transform Based Statistical Medical Image Fusion Using Generalized Gaussian Density

Guocheng Yang,1,2,3 Meiling Li,4 Leiting Chen,1,3 and Jie Yu1

1 School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
2 Department of Biomedical Engineering, Sichuan Medical University, Zhongshan Road, Luzhou, Sichuan 646000, China
3 Provincial Key Laboratory of Digital Media, Chengdu 611731, China
4 School of Life Science and Technology, University of Electronic Science and Technology of China, Chengdu 610054, China

Correspondence should be addressed to Guocheng Yang; gchyangli@163.com

Received 24 April 2015; Accepted 10 September 2015

Academic Editor: Laszlo G. Nyul

Copyright © 2015 Guocheng Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We propose a novel medical image fusion scheme based on the statistical dependencies between coefficients in the nonsubsampled contourlet transform (NSCT) domain, in which the probability density function of the NSCT coefficients is concisely fitted using the generalized Gaussian density (GGD) and the similarity between two subbands is accurately computed by the Jensen-Shannon divergence of two GGDs. To preserve more useful information from the source images, new fusion rules are developed to combine the subbands of the different frequencies. That is, the low frequency subbands are fused by utilizing two activity measures based on the regional standard deviation and the Shannon entropy, and the high frequency subbands are merged via weight maps determined by the saliency values of pixels. The experimental results demonstrate that the proposed method significantly outperforms conventional NSCT based medical image fusion approaches in both visual perception and evaluation indices.

1. Introduction

Multimodal medical image fusion (MIF) is the process of extracting complementary information from different source images and integrating it into a single resultant image. The integration of multimodality medical images can provide more comprehensive pathological information for doctors, which greatly helps their diagnosis and treatment. For example, the fusion of computed tomography (CT) and magnetic resonance imaging (MRI) may simultaneously provide dense structures like bones and pathological soft tissue information. The combination of single-photon emission computed tomography (SPECT) and MRI not only displays anatomical information but also provides functional and metabolic information. Additionally, combining CT and positron emission tomography (PET) images can concurrently visualize anatomical and physiological characteristics of the human body, and the result is used to view tumor activity in oncology and discern tumor boundaries in organ diagnosis. Therefore, MIF techniques can effectively support medical diagnosis and healthcare.

Nowadays, multiresolution decomposition (MSD) based MIF has been recognized as an effective approach, since it can extract more abundant information from source images of different modalities. This technique has undergone fast development and extensive application in the past decades. For example, Qu et al. [1] utilized the wavelet transform to fuse medical images, Ali et al. performed the combination of CT and MRI by the curvelet transform in [2], and Yang et al. proposed a fusion algorithm for multimodal medical images based on the contourlet transform (CT) [3]. Li and Wang employed the nonsubsampled contourlet transform (NSCT) for the combination of MRI and SPECT in [4]. Compared with other multiscale decompositions, the NSCT proposed by da Cunha et al. [5] is a more prominent tool, and it has been successfully used in image denoising [6] and image enhancement [7]. Because of its properties of multiscale and multidirection analysis and full shift-invariance, when it is used for image decomposition it can capture higher dimensional singularities such as edges and contours that cannot be effectively represented by wavelets, and it avoids the pseudo-Gibbs phenomena present in the contourlet transform. Specifically, when it

Hindawi Publishing Corporation, Computational and Mathematical Methods in Medicine, Volume 2015, Article ID 262819, 13 pages. http://dx.doi.org/10.1155/2015/262819


Figure 1: The schematic diagram of the proposed medical image fusion method. Input images A and B are decomposed by the NSCT into a low frequency subband and high frequency subbands; the statistics of the coefficients are used to update the high frequency coefficients; the fused low and high frequency subbands are then recombined by the inverse NSCT to produce the fused image.

is used for image fusion, the impacts of misregistration on the fused results can also be reduced effectively [8], and the correspondence between different subbands is easily found. Therefore, the NSCT is well suited for medical image fusion. Although NSCT based medical image fusion methods have achieved good results [9–13], most existing fusion methods neglect the dependencies between subband coefficients at the interscale and intrascale levels. However, dependencies between decomposition coefficients commonly exist. What is more, the coefficients show a non-Gaussian distribution and a heavy-tailed phenomenon. Thus, making full use of the statistical dependencies between subband coefficients can effectively improve fusion performance.

In this paper, we present a novel NSCT based statistical multimodal MIF scheme, which utilizes the generalized Gaussian density (GGD) to fit the marginal distributions of the high frequency coefficients and quantifies the similarity between two subbands by the symmetric Jensen-Shannon divergence (JSD) [14, 15] of two GGDs. Combining the relationships between subband coefficients, the high frequency coefficients are updated and finally fused. The general framework is shown in Figure 1. The main contributions of the proposed method are summarized as follows.

(1) This study proposes a novel MIF method which explores the dependencies between subband coefficients in the NSCT domain.

(2) A GGD and JSD based statistical model is developed to nicely fit the marginal distributions of the NSCT subband coefficients.

(3) New fusion rules are developed to fuse the low frequency and high frequency coefficients, respectively.

The rest of this paper is organized as follows. In Section 2, related studies are reviewed. In Section 3, we first give a brief introduction to the NSCT and then analyze the characteristics of the subband coefficients in the NSCT domain and compute their dependencies. The novel fusion rules developed to fuse the low frequency subbands and high frequency subbands are presented in Section 4. Section 5 provides experimental results and discussion. Conclusions are drawn in the last section.

2. Related Research

A plethora of MIF methods based on the NSCT assume that the coefficients of the decomposition subbands are statistically independent, namely, that there are no dependencies between subband coefficients across scales or within a scale. Thus, this kind of method usually results in the loss of some information from the source images. However, for an image decomposed using the NSCT, dependencies really do exist between subbands in different levels and between different orientations at the same scale. Several well-known statistical models based on multiresolution analysis have been proposed to characterize the dependencies of subband coefficients across scales. For example, statistical models integrating the Hidden Markov Tree (HMT) with the discrete wavelet transform (DWT) or the contourlet transform have been applied to image denoising [16–18]. Moreover, the model combining the HMT and the DWT has been successfully applied to image segmentation [19].


Figure 2: An illustration of the NSCT applied to an MRI image. NSP stages perform the multiscale splitting into low and high frequency subbands, and NSDFB stages perform the directional decomposition of the high frequency subbands; the decomposition levels shown correspond to the first and second levels, and the numbers of directions of the NSDFB are set to [2, 3], respectively.

As two recent examples, Wang et al. proposed two statistical models in the shift invariant shearlet transform domain: one combines the HMT [20] and the other utilizes the GGD [21]. Although the statistical model based on the HMT has successful applications, it contains some defects, such as low fitting precision, a high dependency on the convergence of the function, and the lack of flexibility of the quad-tree structure itself.

In this paper, we present a novel statistical model to measure the dependencies of subband coefficients in the NSCT domain. The advantage of the model is that one parent node may have any number of child leaves, instead of being limited to one to four as in the HMT model. Our work seemingly shares some themes with the literature [21, 22], where the probability density function (PDF) of each decomposition subband is modeled with the GGD and the similarity between subbands is computed by the Kullback-Leibler distance (KLD) of two GGDs. However, our statistical model focuses on the statistics of the NSCT coefficients, and we evaluate the similarity of subbands across scales by the JSD rather than the KLD. In addition, different fusion rules are developed to combine the low frequency and high frequency components, respectively.

3. The Proposed Algorithm

3.1. Overview of NSCT. The NSCT, a shift-invariant version of the contourlet, is an overcomplete transform with flexible multiscale, multidirectional expansion for images [5]. The decomposition process of the NSCT is divided into two phases, that is, the nonsubsampled pyramids (NSP) and the nonsubsampled directional filter bank (NSDFB). The former performs multiscale decomposition and the latter provides direction decomposition. The NSP divides an image into a low frequency subband and a high frequency subband in each level. Given that the decomposition level is k, the NSP will generate k + 1 subband images, which consist of one low frequency image and k high frequency images. The subsequent NSDFB decomposes the high frequency subbands from the NSP in each level. For a specific subband, let the number of decomposition directions be l; then 2^l directional subbands are obtained, whose sizes are all the same as the source image. After the low frequency component is decomposed iteratively in the same way, an image is finally decomposed into one low frequency subimage and a series of high frequency directional subband images (sum_{j=1}^{k} 2^{l_j} in total), wherein l_j denotes the number of decomposition directions at the jth scale. Figure 2 shows an intuitive example of the NSCT. The diagram only enumerates the first two decomposition levels, and the numbers of NSDFB directions are set to [2, 3] from coarser to finer scale.
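As a quick check of the subband bookkeeping above, the transform yields one low frequency subimage plus 2^{l_j} directional subbands per scale, all the same size as the input. A minimal sketch (the function name is ours, not part of any NSCT toolbox):

```python
def nsct_subband_count(direction_levels):
    """Total NSCT subbands: one low frequency subimage plus 2**l_j
    directional high frequency subbands at each of the k scales."""
    return 1 + sum(2 ** l for l in direction_levels)
```

For the two-level example of Figure 2 with directions [2, 3], this gives 1 + 4 + 8 = 13 subbands.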

3.2. Characteristics of the NSCT Subband Coefficients. Figure 3 plots the conditional distributions of the NSCT coefficients, which characterize the correlations between subband coefficients of the MRI image in Figure 2, wherein Figures 3(a) and 3(b) are the probability distributions between two subband coefficients at different scales and Figures 3(c) and 3(d) are the probability distributions between two subband coefficients with different directions at the same scale. Mathematically, the conditional distributions can be described as P(X | PX = px) and P(X | CX = cx), where px and cx denote the coefficients of parents and cousins. As shown in Figure 3, the relationships between subband coefficients demonstrate nonlinear and interlaced aliasing on the whole, which illustrates that interdependencies exist between subband coefficients in the NSCT domain. At the same time, there is approximate independence or only slight correlation between subband coefficients with different directions at the same scale (cousin-cousin), while there is stronger correlation between subband coefficients at different scales (parents-children). Thus, the relationships of the NSCT coefficients mainly exist between parents and children.

Figure 4 corresponds to the histograms of the four subimages in Figure 3. Obviously, all the characteristic diagrams have similar features, with a very sharp peak at the zero amplitude


Figure 3: The conditional distributions between subband coefficients of the MRI image in Figure 2: (a) and (b) are the distribution characteristics between subband coefficients at different scales; (c) and (d) are the distribution characteristics of subband coefficients with different directions at the same scale.

and extended tails on both sides of the peak, which indicates that the NSCT coefficients are sparse and the majority of coefficients are close to zero. Further, the kurtosis of each map is measured as 20.74, 8.32, 10.44, and 20.55 (corresponding to (a), (b), (c), and (d) of the first row of Figure 4, in order). Clearly, these values are much larger than the kurtosis of the Gaussian distribution (which is about 3.0). What is more, through a large number of experiments, the coefficient characteristics (sparsity and the heavy-tailed phenomenon) have been found to be similar for other NSCT subbands. So the NSCT coefficients are sparse and highly non-Gaussian.

How to quantify the dependencies between NSCT coefficients by a statistical model is a subject worthy of study. Inspired by earlier statistical models of the MSD coefficients of images [23–26], in which the PDFs of coefficients across scales and within a scale are nicely fitted by the GGD function, we fit the distribution characteristics in the same way and calculate the dependencies of the NSCT coefficients. Figure 4 provides four PDFs of the NSCT coefficients together with the curves of the fitted GGDs (the second-row curves). It can be seen that these fitted curves are very close to the actual case. Therefore, the statistical model can be applied to describe the spatial distribution characteristics of the NSCT coefficients.

3.3. Statistics of the NSCT Coefficients. The GGD model has been extensively applied to describe the marginal density of subband coefficients due to its flexible parametric form, which adapts to a large family of distributions from super-Gaussian to sub-Gaussian. Accordingly, the approximation of the marginal density for a particular NSCT coefficient may


Figure 4: Histograms of the four distribution maps in Figure 3 (the first row) and the curves fitted with GGDs (the second row). The fitted GGD parameters are (a) α = 0.7474, β = 0.3720; (b) α = 1.1213, β = 0.4926; (c) α = 0.9625, β = 0.4120; and (d) α = 0.8259, β = 0.3035; the corresponding kurtosis values are 20.7422, 8.3149, 10.4347, and 20.5509.

Figure 5: The fusion results of different fusion methods for the first set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

be achieved by varying the two parameters of the GGD, which is defined as

$$P(x; \alpha, \beta) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}\, e^{-(|x|/\alpha)^{\beta}}, \tag{1}$$

where Γ(·) is the Gamma function, α is the scale parameter (the width of the PDF peak), and β is the shape parameter, which tunes the decay rate of the density function. Normally, the parameters α and β are computed by the maximum likelihood (ML) estimator, which has been shown to be the desired estimator [27]. For each subband, the log-likelihood function of the sample x = (x_1, x_2, ..., x_n) is defined as

$$L(x; \alpha, \beta) = \log \prod_{i=1}^{n} p(x_i; \alpha, \beta). \tag{2}$$


In this case, α and β are the parameters that need to be estimated. We can obtain the unique root from the likelihood equations below; here Ψ(·) denotes the digamma function:

$$\frac{\partial L(x; \alpha, \beta)}{\partial \alpha} = -\frac{N}{\alpha} + \sum_{i=1}^{N} \frac{\beta\, |x_i|^{\beta}\, \alpha^{-\beta}}{\alpha} = 0, \tag{3}$$

$$\frac{\partial L(x; \alpha, \beta)}{\partial \beta} = \frac{N}{\beta} + \frac{N\,\Psi(1/\beta)}{\beta^{2}} - \sum_{i=1}^{N} \left(\frac{|x_i|}{\alpha}\right)^{\beta} \log\frac{|x_i|}{\alpha} = 0. \tag{4}$$

Let β be fixed with β > 0; then (3) has a unique solution, which is real and positive:

$$\hat{\alpha} = \left(\frac{\beta}{N}\sum_{i=1}^{N} |x_i|^{\beta}\right)^{1/\beta}. \tag{5}$$

Combining (4) and (5), the shape parameter β can be obtained by solving the following transcendental equation:

$$1 + \frac{\Psi(1/\beta)}{\beta} - \frac{\sum_{i=1}^{N} |x_i|^{\beta} \log|x_i|}{\sum_{i=1}^{N} |x_i|^{\beta}} + \frac{\log\left((\beta/N)\sum_{i=1}^{N} |x_i|^{\beta}\right)}{\beta} = 0. \tag{6}$$

In (6), β can be determined effectively using the Newton-Raphson iterative procedure [26, 28]; the algorithm is described in detail in [22]. Therefore, with only two parameters, we can accurately characterize the marginal distribution of the NSCT coefficients.
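The two-step estimate of (5)-(6) can be sketched as follows. This is a minimal sketch, not the authors' code: we use a bracketed root search (SciPy's `brentq`) in place of Newton-Raphson for robustness, and the function name and bracket limits are our own choices.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import psi  # the digamma function

def fit_ggd(x, beta_lo=0.05, beta_hi=5.0):
    """ML fit of a zero-mean GGD: solve the transcendental equation (6)
    for the shape beta, then recover the scale alpha from equation (5)."""
    x = np.abs(np.asarray(x, dtype=float))
    x = x[x > 0]                              # log|x_i| is undefined at zero
    n = x.size

    def lhs(beta):                            # left-hand side of equation (6)
        xb = x ** beta
        return (1.0 + psi(1.0 / beta) / beta
                - np.sum(xb * np.log(x)) / np.sum(xb)
                + np.log((beta / n) * np.sum(xb)) / beta)

    beta = brentq(lhs, beta_lo, beta_hi)      # root of (6)
    alpha = ((beta / n) * np.sum(x ** beta)) ** (1.0 / beta)  # equation (5)
    return alpha, beta
```

As a sanity check, Gaussian samples are a GGD with β = 2 and α = √2·σ, so the fit should recover a shape close to 2.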

3.4. The Dependency of Different NSCT Subbands. The KLD is a common and justified way of measuring the distance between two distributions. D_KL(P ‖ Q) describes the deficiency of using one distribution q to represent the true distribution p, and it is generally used for comparing two related distributions. The KLD between two distributions P and Q, whose PDFs are denoted by p(x; θ1) and p(x; θ2), respectively, is defined as

$$D_{\mathrm{KL}}(P \,\|\, Q) = D\left(p(x; \theta_1) \,\|\, p(x; \theta_2)\right) = \int p(x; \theta_1) \log\frac{p(x; \theta_1)}{p(x; \theta_2)}\, dx, \tag{7}$$

where θ1 and θ2 are sets of estimated parameters. Given two GGD distributions of NSCT subbands, the similarity between the two GGDs can be expressed in terms of the parameters α and β. Substituting (1) into (7), after some manipulations the KLD between the two PDFs can be expressed as

$$D_{\mathrm{KL}}(P \,\|\, Q) = D_{\mathrm{KL}}\left(p(\cdot; \alpha_1, \beta_1) \,\|\, p(\cdot; \alpha_2, \beta_2)\right) = \log\left(\frac{\beta_1 \alpha_2\, \Gamma(1/\beta_2)}{\beta_2 \alpha_1\, \Gamma(1/\beta_1)}\right) + \left(\frac{\alpha_1}{\alpha_2}\right)^{\beta_2} \frac{\Gamma\left((\beta_2 + 1)/\beta_1\right)}{\Gamma(1/\beta_1)} - \frac{1}{\beta_1}. \tag{8}$$

However, there are some deficiencies with the KLD which make it less ideal. First, the KLD is asymmetric; that is, D_KL(P ‖ Q) differs from D_KL(Q ‖ P). Second, if q(x) = 0 and p(x) ≠ 0 for some x, then D_KL(P ‖ Q) is undefined. Third, the KLD does not offer any nice upper bound [14]. On the other hand, the JSD has the properties of nonnegativity, finiteness, symmetry, and boundedness [15, 29]. So we use the symmetric JSD to measure the similarity between two NSCT subbands in this study. The JSD between GGDs is derived from the KLD; mathematically, it is defined as

$$D_{\mathrm{JS}}(P \,\|\, Q) = D_{\mathrm{JS}}(Q \,\|\, P) = \frac{1}{2}\left(D_{\mathrm{KL}}(P \,\|\, M) + D_{\mathrm{KL}}(Q \,\|\, M)\right), \qquad M = \frac{P + Q}{2}. \tag{9}$$

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is the averaging method; however, this method easily leads to a low contrast and blurred result. To extract more useful information from the source images, for the low frequency coefficients we employ a fusion rule based on two activity level measurements, namely the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients around a neighborhood; the entropy, on the other hand, indicates how much information an image contains. Thus, combining the two can extract more of the complementary information present in the source images. The process is as follows.

(1) Computing the regional standard deviation D_λ(x, y):

$$D_{\lambda}(x, y) = \sqrt{\sum_{m \in M,\, n \in N} \omega(m, n) \times \left[C_{\lambda}(x + m, y + n) - S_{\lambda}(x, y)\right]^{2}}. \tag{10}$$

(2) Calculating the normalized Shannon entropy:

$$E_{\lambda}(x, y) = \frac{1}{|R|}\sum_{i,j}\left(C_{0}^{\lambda}(i, j)\right)^{2} \log\left(C_{0}^{\lambda}(i, j)\right)^{2}. \tag{11}$$


(3) Computing the weights (δ_λ, ξ_λ) of the standard deviation D_λ(x, y) and the information entropy E_λ(x, y), respectively:

$$\delta_{\lambda} = \frac{\left|D_{\lambda}(x, y)\right|^{\alpha}}{\left|D_{A}(x, y)\right|^{\alpha} + \left|D_{B}(x, y)\right|^{\alpha}}, \qquad \xi_{\lambda} = \frac{E_{\lambda}(x, y)}{E_{A}(x, y) + E_{B}(x, y)}, \tag{12}$$

where the parameter α is a constant that tunes the sharpness of the fused image; it is set to 1.2 in our experiment.

Let C_0^λ(x, y) denote the low frequency subband coefficient at location (x, y), where λ indexes the input images A and B. Finally, the fused low frequency coefficients are obtained by

$$C_{0}^{F}(x, y) = \sum_{\lambda = A, B}\left[\delta_{\lambda}\, C_{0}^{\lambda}(x, y) + \xi_{\lambda}\, C_{0}^{\lambda}(x, y)\right]. \tag{13}$$
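The steps above can be sketched as follows. This is our own illustration, with two labeled assumptions: a uniform window stands in for the kernel ω of (10), and the two weight families of (12) are averaged so that each pixel's weights sum to one before applying (13).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_low(cA, cB, win=3, alpha=1.2, eps=1e-12):
    """Low frequency fusion rule of Section 4.1 (sketch)."""
    def reg_std(c):                      # regional standard deviation, equation (10)
        mu = uniform_filter(c, size=win)
        var = uniform_filter(c * c, size=win) - mu * mu
        return np.sqrt(np.maximum(var, 0.0))

    def reg_entropy(c):                  # Shannon-entropy activity, equation (11)
        s = c * c + eps
        return np.abs(uniform_filter(s * np.log(s), size=win))

    dA, dB = reg_std(cA) ** alpha, reg_std(cB) ** alpha
    eA, eB = reg_entropy(cA), reg_entropy(cB)
    delta_A = dA / (dA + dB + eps)       # equation (12)
    xi_A = eA / (eA + eB + eps)
    wA = 0.5 * (delta_A + xi_A)          # averaged so wA + wB = 1 (our normalization)
    return wA * cA + (1.0 - wA) * cB     # equation (13), normalized
```

With this rule a textured subband dominates a flat one, while identical inputs pass through unchanged.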

4.2. Fusion of the High Frequency Coefficients. High frequency subbands correspond to the detailed information in regions such as edges, lines, and corners. Because different imaging modalities contain redundant and complementary information about each other, the purpose of the selection rule is mainly to capture the salient information of the source images as much as possible. The maximum selection rule is not suitable for medical image fusion, because it works well only on the premise that a single original image provides the good pixel at each corresponding location; vast complementary information is lost when it is used for MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps, which are determined by saliency maps. Based on the fact that dependencies exist between the NSCT coefficients, the high frequency coefficients are first updated by utilizing the relationships between NSCT subbands and are then combined by using the weight maps. The process is described as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency jsd_{lθ_h} between coefficients with different directions at the same scale l as

$$jsd_{l\theta_h}(x, y) = \sum_{j=1,\, j \neq i}^{K} D_{\mathrm{JS}}\left(C_{l\theta_i}(x, y) \,\|\, C_{l\theta_j}(x, y)\right), \tag{14}$$

where K is the total number of subbands at the lth scale.

Then we calculate the vertical dependency jsd_{lθ_v} between a specified subband's (for instance, subband i) parents and children:

$$jsd_{l\theta_v}(x, y) = \sum_{j=1}^{K}\left[D_{\mathrm{JS}}\left(C_{l\theta_i}(x, y) \,\|\, C_{l-1,\theta_j}(x, y)\right) + D_{\mathrm{JS}}\left(C_{l\theta_i}(x, y) \,\|\, C_{l+1,\theta_j}(x, y)\right)\right]. \tag{15}$$

Further, the horizontal and vertical dependency components are normalized, respectively:

$$\overline{jsd}_{l\theta_h}(x, y) = \frac{jsd_{l\theta_h}(x, y)}{jsd_{l\theta_h}(x, y) + jsd_{l\theta_v}(x, y)}, \qquad \overline{jsd}_{l\theta_v}(x, y) = \frac{jsd_{l\theta_v}(x, y)}{jsd_{l\theta_h}(x, y) + jsd_{l\theta_v}(x, y)}. \tag{16}$$

Finally, the high frequency NSCT coefficients are revised as

$$\widetilde{C}_{l\theta}(x, y) = C_{l\theta}(x, y)\sqrt{1 + \overline{jsd}_{l\theta_h}(x, y)^{2} + \overline{jsd}_{l\theta_v}(x, y)^{2}}. \tag{17}$$
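For a single location, the update of (16)-(17) reduces to a small scaling, sketched here (the helper name is ours):

```python
import math

def update_coeff(c, jsd_h, jsd_v):
    """Normalize the two dependency measures (equation (16)) and boost
    the coefficient by the root-sum-square factor (equation (17))."""
    total = jsd_h + jsd_v
    nh, nv = jsd_h / total, jsd_v / total
    return c * math.sqrt(1.0 + nh ** 2 + nv ** 2)
```

Since nh + nv = 1, the scale factor always lies between √1.5 (balanced dependencies) and √2 (one-sided), so the update strengthens, but never shrinks, a coefficient.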

(2) Construction of Weight Maps. Weight maps are derived from the saliency maps, which describe each pixel by its level of salient information. We apply a Gaussian filter to each high pass subband, which tends to assign high weight values to important elements such as edges and corners. A saliency map is constructed from the local average of the absolute value of the filter response:

$$S_{l\theta}(x, y) = \left|C_{l\theta}(x, y)\right| * g_{r_g, \theta_g}(x, y), \tag{18}$$

where119892(sdot) is a Gaussian low pass filter whose size is (2119903119892+1)times(2119903119892 + 1) and the parameters 119903119892 and 120579119892 are set to 5 Next theweight maps are determined by comparison of the saliencymaps (119878119899

119897120579(119909 119910) 119899 isin [119860 119861])

$$W^{n}_{l,\theta}(x,y) = \begin{cases} 1, & \text{if } S^{n}_{l,\theta}(x,y) = \max\left(S^{A}_{l,\theta}(x,y),\, S^{B}_{l,\theta}(x,y)\right), \\ 0, & \text{otherwise}. \end{cases} \quad (19)$$

Finally, the fused subband coefficients $C^{F}_{l,\theta}(x,y)$ can be obtained by the weighted summation:

$$C^{F}_{l,\theta}(x,y) = W^{A}_{l,\theta}(x,y)\, C^{A}_{l,\theta}(x,y) + W^{B}_{l,\theta}(x,y)\, C^{B}_{l,\theta}(x,y). \quad (20)$$
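The saliency-guided rule of Eqs. (18)–(20) can be sketched as follows. This is a minimal illustration assuming a separable Gaussian window with $r_g = \sigma_g = 5$ (so inputs must be at least $2r_g+1$ samples wide for the "same"-mode convolution used here); the function names are ours, not the authors'.

```python
import numpy as np

def gauss_kernel(r, sigma):
    """Normalized 1D Gaussian kernel of length 2r+1."""
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def saliency(C, r=5, sigma=5.0):
    """Eq. (18): local average of |C| under a (2r+1)x(2r+1) Gaussian,
    applied separably along columns and rows."""
    k = gauss_kernel(r, sigma)
    A = np.abs(C)
    A = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, A)
    A = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, A)
    return A

def fuse_highpass(CA, CB):
    """Eqs. (19)-(20): winner-take-all weight maps, then weighted sum."""
    SA, SB = saliency(CA), saliency(CB)
    WA = SA >= SB                 # binary weight map of Eq. (19)
    return np.where(WA, CA, CB)   # Eq. (20), with W_B = 1 - W_A
```

Because the weight maps are binary and complementary, the weighted sum in Eq. (20) reduces to a per-pixel selection of the more salient coefficient.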

5. Experimental Results and Discussion

Five different data sets of human brain images are used, and the source images consist of two different modalities, including CT/MRI, MRI/PET, and MRI/SPECT images. All the images have the size of 256 × 256 pixels and have been registered by some kind of registration method, as in [30]. To verify the effectiveness and applicability of the proposed fusion scheme, the results produced by the proposed method are respectively compared with the results of other state-of-the-art schemes, such as the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the Intensity, Hue, and Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and

8 Computational and Mathematical Methods in Medicine

Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted as NSCT-1, and the NSCT based MIF method in the scheme [36] is denoted as NSCT-2. The fusion method [37] based on neighborhood characteristic and regionalization in the NSCT domain and the NSCT based MIF method [10] in the $l\alpha\beta$ color space are denoted as NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of the literature [38] and the MIF method [39] utilizing the features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the NSCT and NSST methods, we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by the average of the corresponding coefficients, and the high frequency coefficients are fused by using the absolute maximum. For all MSD methods, the original images are all decomposed into 4 levels with the numbers of directions [2, 2, 3, 3]. Additionally, a quantitative comparison based on five image fusion quality metrics is also employed to demonstrate the fusion performance of the different methods.
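The average-maximum scheme used for the plain NSCT and NSST baselines is simple enough to sketch directly (a minimal illustration; the function name is ours, not from the paper):

```python
import numpy as np

def average_maximum(lowA, lowB, highA, highB):
    """Baseline average-maximum rule: average the low frequency
    subbands, keep the larger-magnitude high frequency coefficient."""
    fused_low = 0.5 * (lowA + lowB)
    fused_high = np.where(np.abs(highA) >= np.abs(highB), highA, highB)
    return fused_low, fused_high
```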

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms, in which the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) show improvement to various degrees and produce a better visual effect on the bone structures; however, the details of the soft tissue regions from these methods still remain unsharp. By contrast, the proposed method can well preserve the detailed features of the original images without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen from Figures 6(c), 6(d), 6(e), and 6(f), their results have low contrast and lose a lot of details. What is worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). Accordingly, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast; the abundant information of the source images can be successfully transferred to the fused image. Both tests imply that the proposed method is suitable for fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected with Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET image shows that regional cerebral metabolism is abnormal and hypometabolism occurs heavily in the anterior temporal and posterior parietal regions; meanwhile, the right hemisphere is slightly more affected than the left. Here, the proposed method is compared with seven other fusion schemes. Obviously, the results of Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results of Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so some parts are unidentifiable. Additionally, for contrast and color fidelity, the results from Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g), 7(h), and 7(i) show better fusion performance than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Through the comparison of these fused results, it is found that the proposed method can well extract the structural and functional information from the source images and fuse them with much less information distortion. As illustrated in the regions highlighted by the red arrows and ellipses in Figure 7, the proposed method well preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 8 is a fusion experiment with MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). From Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot well inherit the SPECT image's functional information and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) gain better results in terms of contrast and color fidelity. However, for the spatial details, the fused result obtained by


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both the original images but also hardly causes the problem of color distortion. Through the above examples, it can be seen that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), $Q^{AB/F}$ [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher value of EN shows that the fused image has more information content, and a higher value of SF indicates that the final image is clearer. The index $Q^{AB/F}$ measures the amount of information transferred from the source images to the fused image, and the UID is used for the description of uniform intensity and color distribution; a higher UID means better color information.
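Two of these metrics have simple closed forms that are easy to sketch. The following is a common textbook formulation of EN (Shannon entropy of the gray-level histogram) and SF (row/column gradient energy); the cited papers may differ in normalization details.

```python
import numpy as np

def entropy(img):
    """EN: Shannon entropy of the gray-level histogram (8-bit image)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                  # drop empty bins before taking logs
    return -np.sum(p * np.log2(p))

def spatial_frequency(img):
    """SF: root of combined row-frequency and column-frequency energy."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean((img[:, 1:] - img[:, :-1]) ** 2))  # row diffs
    cf = np.sqrt(np.mean((img[1:, :] - img[:-1, :]) ** 2))  # column diffs
    return np.sqrt(rf**2 + cf**2)
```

A constant image gives EN = 0 and SF = 0; sharper, more detailed images push both upward.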

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most of the values rank first and only a few rank second). This shows that the objective results based on these quality metrics also coincide with the subjective visual perception. Particularly, for the MI values of all tests, the proposed scheme always gets the largest value. This confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, among all fusion methods, the consumption time of the PCA based algorithm is the lowest, the reason being that it does not involve multiscale decomposition. Additionally, the DWT, GP, GF, and CT based methods also have low time consumption (all less than 2 s); however, their performance is poorer. Relatively, the three methods based on NSCT are slower, which is a common problem of NSCT based algorithms. Due to using the complex neural network and the mechanism of NSCT, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar times (17.13 s and 15.29 s). Actually, the decomposition and inverse construction cost almost 9/10 of the total time. Without exception, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than the previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

          Index      DWT     GP      PCA     GF      CT      NSCT-1  NSCT-2  Proposed
Figure 5  MI         1.274   1.279   1.365   1.279   1.324   1.450   1.490   1.891
          EN         5.008   4.843   4.050   5.107   5.183   4.978   5.371   5.675
          SF         3.322   3.108   2.052   2.983   2.767   2.352   3.341   3.486
          Q^{AB/F}   0.509   0.577   0.369   0.592   0.592   0.568   0.593   0.596
Figure 6  MI         3.077   2.521   3.663   2.729   3.831   3.916   3.949   3.962
          EN         5.125   4.851   4.340   5.578   6.300   6.414   6.473   6.469
          SF         2.279   1.984   1.904   2.029   2.987   3.015   3.027   3.141
          Q^{AB/F}   0.618   0.694   0.665   0.682   0.682   0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

          Index      DWT     PCA     IHS     GP      CT      NSST    NSST-1  Proposed
Figure 7  MI         2.278   2.841   2.656   2.280   2.108   2.174   2.513   2.853
          EN         4.573   3.707   3.125   4.561   4.242   4.148   5.080   4.768
          SF         3.021   3.114   1.603   3.012   3.130   3.115   3.221   3.373
          Q^{AB/F}   0.525   0.498   0.483   0.526   0.551   0.584   0.617   0.607
          UID        0.752   0.737   0.725   0.753   0.789   0.797   0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

          Index      DWT     GP      IHS     NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
Figure 8  MI         2.167   2.028   2.255   2.399   2.688   2.376   2.520   2.695
          EN         4.424   3.973   4.105   4.870   4.507   3.533   4.499   4.531
          SF         3.189   2.957   3.506   3.528   4.017   3.479   3.978   4.120
          Q^{AB/F}   0.487   0.481   0.512   0.499   0.556   0.507   0.536   0.576
          UID        0.802   0.812   0.711   0.724   0.817   0.765   0.813   0.826

          Index      DWT     GP      IHS     ST      CT      NSST-4  NSCT    Proposed
Figure 9  MI         2.485   2.113   2.872   2.480   2.532   3.076   2.784   3.083
          EN         4.277   4.050   4.202   4.561   4.426   4.594   4.202   4.668
          SF         3.124   2.948   3.126   3.547   4.069   3.945   4.087   4.176
          Q^{AB/F}   0.518   0.506   0.502   0.537   0.536   0.538   0.519   0.540
          UID        0.793   0.679   0.714   0.783   0.799   0.806   0.801   0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT     GP      PCA     GF      CT      NSCT-1  NSCT-2  Proposed
Running time (s)  0.021   0.019   0.003   0.075   1.976   33.79   15.29   17.13

GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine them well. Note that the fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process. A fast image multiscale transform tool is the subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, which is an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I469–I472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, vol. 15, pp. 262–265, IEEE, St. Louis, Mo, USA, September–October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.


[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.




Figure 1: The schematic diagram of the proposed medical image fusion method.

is used for image fusion, the impacts of misregistration on the fused results can also be reduced effectively [8], and the correspondence between different subbands is easily found. Therefore, NSCT is more suitable for medical image fusion. Although medical image fusion methods based on NSCT have achieved good results [9–13], most existing fusion methods neglect the interscale and intrascale dependencies between subband coefficients. However, the dependencies between decomposition coefficients commonly exist. What is more, the coefficients show a non-Gaussian distribution and exhibit the heavy tailed phenomenon. Thus, making full use of the statistical dependencies between subband coefficients will effectively improve fusion performance.

In this paper, we present a novel NSCT based statistical multimodal MIF scheme, which utilizes generalized Gaussian density (GGD) to fit the marginal distributions of the high frequency coefficients and quantifies the similarity between two subbands by the symmetric Jensen-Shannon divergence (JSD) [14, 15] of two GGDs. Combining the relationships between subband coefficients, the high frequency coefficients are updated and finally fused. The general framework is shown in Figure 1. The main contributions of the proposed method are summarized as follows.

(1) This study proposes a novel MIF method, which explores the dependencies between subband coefficients in the NSCT domain.

(2) A GGD and JSD based statistical model is developed to nicely fit the marginal distributions of the NSCT subband coefficients.

(3) New fusion rules are developed to fuse the low frequency and high frequency coefficients, respectively.

The rest of this paper is organized as follows. In Section 2, related studies are reviewed. In Section 3, we first give a brief introduction of NSCT and then analyze the characteristics of the subband coefficients in the NSCT domain and compute their dependencies. The novel fusion rules developed to fuse the low frequency subbands and high frequency subbands are presented in Section 4. Section 5 provides experimental results and discussion. Conclusions are drawn in the last section.

2. Related Research

A plethora of MIF methods based on NSCT assume that the coefficients of the decomposition subbands are statistically independent; namely, there are no dependencies between subband coefficients across scales and within scale. Thus, this kind of method usually results in the loss of some information of the source images. However, for an image decomposed using NSCT, there really exist dependencies between subbands in different levels and between different orientations at the same scale. Several famous statistical models based on multiresolution analysis have been proposed to characterize the dependencies of subband coefficients across scales. For example, the statistical models integrating the Hidden Markov Tree (HMT) with the discrete wavelet transform (DWT) or the contourlet transform have been applied to image denoising [16–18]. Moreover, the model combining HMT and DWT has been successfully applied to image segmentation [19].


Figure 2: An illustration of the NSCT; the decomposition levels shown are the first and second levels, and the numbers of directions of the NSDFB are set to [2, 3], respectively.

As two recent examples, Wang et al. proposed two statistical models in the shift invariant shearlet transform domain: one combines the HMT [20], and the other utilizes GGD [21]. Although the statistical model based on HMT has successful applications, it contains some defects, such as the low fitting precision, the high dependency on the convergence of the function, the lack of flexibility of the quad-tree structure itself, and so on.

In this paper, we present a novel statistical model to measure the dependencies of subband coefficients in the NSCT domain. The advantage of the model is that one parent node may have any number of child leaves instead of being limited to one to four as in the HMT model. Our work seemingly shares some themes with the literature [21, 22], where the probability density function (PDF) of each decomposition subband is modeled with the GGD and the similarity between subbands is computed by the Kullback-Leibler distance (KLD) of two GGDs. However, our statistical model focuses on the statistics of the NSCT coefficients, and we evaluate the similarity of subbands across scales by the JSD rather than the KLD. In addition, different fusion rules are respectively developed to combine the low frequency and high frequency components.

3. The Proposed Algorithm

3.1. Overview of NSCT. NSCT, as a shift invariant version of the contourlet, is an overcomplete transform with flexible multiscale, multidirectional expansion for images [5]. The decomposition process of the NSCT is divided into two phases, that is, the nonsubsampled pyramid (NSP) and the nonsubsampled directional filter bank (NSDFB). The former performs multiscale decomposition, and the latter provides directional decomposition. The NSP divides the image into a low frequency subband and a high frequency subband in each level. Given that the decomposition level is $k$, the NSP will generate $k+1$ subband images, which consist of one low frequency image and $k$ high frequency images. The subsequent NSDFB decomposes the high frequency subbands from the NSP in each level. As for a specific subband, let the number of decomposition directions be $l$; then $2^{l}$ directional subbands are obtained, whose sizes are all the same as the source image. After the low frequency component is decomposed iteratively in the same way, an image is finally decomposed into one low frequency subimage and a series of high frequency directional subband images ($\sum_{j=1}^{k} 2^{l_j}$ in total), wherein $l_j$ denotes the number of decomposition directions at the $j$th scale. Figure 2 shows an intuitive example of the NSCT. The diagram only enumerates the first two decomposition levels, and the numbers of NSDFB directions are set to [2, 3] from coarser to finer scale.
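The subband count above can be checked with a one-liner; this is just arithmetic for the Figure 2 configuration ($k = 2$ levels, direction parameters $l_j = [2, 3]$), not library code.

```python
# Counting NSCT subbands for the Figure 2 configuration: with k = 2
# pyramid levels and NSDFB direction parameters l_j = [2, 3], the
# transform yields sum_j 2^(l_j) directional subbands plus one
# low frequency subimage.
l = [2, 3]
n_high = sum(2**lj for lj in l)   # 2^2 + 2^3 = 12 high frequency subbands
n_total = n_high + 1              # plus the low frequency subimage
print(n_high, n_total)            # 12 13
```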

3.2. Characteristics of the NSCT Subband Coefficients. Figure 3 plots the conditional distributions of the NSCT coefficients, which characterize the correlations between subband coefficients of the MRI image in Figure 2, wherein Figures 3(a) and 3(b) are the probability distributions between two subband coefficients at different scales and Figures 3(c) and 3(d) are the probability distributions between two subband coefficients with different directions at the same scale. Mathematically, the conditional distributions can be described as $P(X \mid PX = px)$ and $P(X \mid CX = cx)$, where $px$ and $cx$ denote the coefficients of the parents and cousins. As shown in Figure 3, the relationships between subband coefficients demonstrate nonlinear and interlaced aliasing on the whole, which illustrates that there exist interdependencies between subband coefficients in the NSCT domain. Simultaneously, there is approximate independence or only slight correlation between subband coefficients with different directions at the same scale (cousin-cousin), while there is stronger correlation between subband coefficients at different scales (parent-child). Thus, the relationships of the NSCT coefficients mainly exist between parents and children.

Figure 4 corresponds to the histograms of the four subimages in Figure 3. Obviously, all the characteristic diagrams have similar features, with a very sharp peak at zero amplitude

4 Computational and Mathematical Methods in Medicine

Figure 3: The conditional distributions between subband coefficients of the MRI image in Figure 2. (a) and (b) are the distribution characteristics between subband coefficients at different scales; (c) and (d) are the distribution characteristics of subband coefficients with different directions at the same scale.

and extended tails on both sides of the peak, which indicates that the NSCT coefficients are sparse and the majority of coefficients are close to zero. Further, the kurtosis of each map is measured as 20.74, 8.32, 10.44, and 20.55 (corresponding to (a), (b), (c), and (d) of the first row of Figure 4, in order). Clearly, these values are much larger than the kurtosis of the Gaussian distribution (which is about 3.0). What is more, a large number of experiments show that these coefficient characteristics (sparsity and heavy tails) are similar for the other NSCT subbands. So the NSCT coefficients are sparse and highly non-Gaussian.

How to quantify the dependencies between NSCT coefficients with a statistical model is a subject worthy of study. Inspired by earlier statistical models of the MSD coefficients of images [23–26], in which the PDFs of coefficients across scales and within a scale are nicely fitted by the GGD function, we fit the distribution characteristics in the same way and calculate the dependencies of the NSCT coefficients. The second row of Figure 4 provides four PDFs of the NSCT coefficients together with the curves of the fitted GGDs (shown as the purple curves). It can be seen that these fitted curves are very close to the actual distributions. Therefore, this statistical model can be applied to describe the spatial distribution characteristics of the NSCT coefficients.

3.3. Statistics of the NSCT Coefficients. The GGD model has been extensively applied to describe the marginal density of subband coefficients due to its flexible parametric form, which adapts to a large family of distributions from super-Gaussian to sub-Gaussian. Accordingly, the approximation of the marginal density for a particular NSCT coefficient may


[Figure 4, first row, per-panel statistics: (a) kurtosis = 20.7422, fitted GGD $\alpha = 0.7474$, $\beta = 0.3720$; (b) kurtosis = 8.3149, $\alpha = 1.1213$, $\beta = 0.4926$; (c) kurtosis = 10.4347, $\alpha = 0.9625$, $\beta = 0.4120$; (d) kurtosis = 20.5509, $\alpha = 0.8259$, $\beta = 0.3035$.]

Figure 4: Histograms of the four distribution maps in Figure 3 (the first row) and the curves fitted with GGDs (the second row).

Figure 5: The fusion results of different fusion methods for the first set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

be achieved by varying the two parameters of the GGD, which is defined as

$$p(x; \alpha, \beta) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}\, e^{-(|x|/\alpha)^{\beta}}, \qquad (1)$$

where $\Gamma(\cdot)$ is the Gamma function, $\alpha$ is the scale parameter (the width of the PDF peak), and $\beta$ is the shape parameter, which tunes the decay rate of the density function. Normally, the parameters $\alpha$ and $\beta$ are computed by the maximum likelihood (ML) estimator, which has been shown to be the desired estimator [27]. For each subband, the likelihood function of the sample $x = (x_1, x_2, \ldots, x_N)$ is defined as

$$L(x; \alpha, \beta) = \log \prod_{i=1}^{N} p(x_i; \alpha, \beta). \qquad (2)$$
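As a concrete check of (1), the density can be implemented directly; the following sketch (our illustration, not the authors' code) verifies that it integrates to one:

```python
import math

def ggd_pdf(x, alpha, beta):
    """Generalized Gaussian density of (1):
    beta / (2*alpha*Gamma(1/beta)) * exp(-(|x|/alpha)**beta)."""
    coeff = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coeff * math.exp(-((abs(x) / alpha) ** beta))

# beta = 2 gives a Gaussian and beta = 1 a Laplacian; small beta means
# a sharper peak and heavier tails, as observed for NSCT coefficients.
# Sanity check: the density integrates to ~1 (trapezoidal rule).
xs = [i * 0.01 for i in range(-2000, 2001)]
ys = [ggd_pdf(x, 0.75, 1.0) for x in xs]
area = sum(0.01 * (a + b) / 2.0 for a, b in zip(ys, ys[1:]))
print(abs(area - 1.0) < 0.01)  # -> True
```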


In this case, $\alpha$ and $\beta$ are the parameters that need to be estimated. We can obtain the unique root from the likelihood equations below, where $\Psi(\cdot)$ denotes the digamma function:

$$\frac{\partial L(x; \alpha, \beta)}{\partial \alpha} = -\frac{N}{\alpha} + \sum_{i=1}^{N} \frac{\beta\, |x_i|^{\beta} \alpha^{-\beta}}{\alpha} = 0, \qquad (3)$$

$$\frac{\partial L(x; \alpha, \beta)}{\partial \beta} = \frac{N}{\beta} + \frac{N\,\Psi(1/\beta)}{\beta^{2}} - \sum_{i=1}^{N} \left(\frac{|x_i|}{\alpha}\right)^{\beta} \log\left(\frac{|x_i|}{\alpha}\right) = 0. \qquad (4)$$

Let $\beta$ be fixed with $\beta > 0$; then (3) has a unique solution, which is real and positive:

$$\hat{\alpha} = \left(\frac{\beta}{N} \sum_{i=1}^{N} |x_i|^{\beta}\right)^{1/\beta}. \qquad (5)$$

Combining (4) and (5), the shape parameter $\hat\beta$ can be solved from the following transcendental equation:

$$1 + \frac{\Psi(1/\hat\beta)}{\hat\beta} - \frac{\sum_{i=1}^{N} |x_i|^{\hat\beta} \log |x_i|}{\sum_{i=1}^{N} |x_i|^{\hat\beta}} + \frac{\log\left((\hat\beta/N) \sum_{i=1}^{N} |x_i|^{\hat\beta}\right)}{\hat\beta} = 0. \qquad (6)$$

In (6), $\hat\beta$ can be determined efficiently using the Newton-Raphson iterative procedure [26, 28]; the algorithm is described in detail in [22]. Therefore, with only two parameters, we can accurately characterize the marginal distribution of the NSCT coefficients.
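The paper solves (6) with Newton-Raphson; as a lighter, self-contained sketch (our code, using the moment-matching estimator in the spirit of Sharifi and Leon-Garcia [25] with bisection instead of Newton's method):

```python
import math
import random

def ggd_fit_moments(samples):
    """Fit GGD (alpha, beta) by moment matching: the ratio
    (E|x|)^2 / E[x^2] = Gamma(2/b)^2 / (Gamma(1/b) * Gamma(3/b))
    is increasing in b, so b is recovered by bisection; alpha then
    follows from E|x| = alpha * Gamma(2/b) / Gamma(1/b)."""
    m1 = sum(abs(x) for x in samples) / len(samples)
    m2 = sum(x * x for x in samples) / len(samples)
    target = m1 * m1 / m2

    def ratio(b):
        return math.gamma(2.0 / b) ** 2 / (math.gamma(1.0 / b) * math.gamma(3.0 / b))

    lo, hi = 0.1, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    alpha = m1 * math.gamma(1.0 / beta) / math.gamma(2.0 / beta)
    return alpha, beta

# Sanity check on synthetic Gaussian data, for which beta should be near 2.
random.seed(0)
alpha, beta = ggd_fit_moments([random.gauss(0.0, 1.0) for _ in range(20000)])
print(1.7 < beta < 2.3)  # -> True
```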

3.4. The Dependency of Different NSCT Subbands. The KLD is a common and well-justified way of measuring the distance between two distributions: $D_{\mathrm{KL}}(P \| Q)$ describes the deficiency of using one distribution $q$ to represent the true distribution $p$, and it is generally used for comparing two related distributions. The KLD between two distributions $P$ and $Q$, whose PDFs are denoted as $p(x; \theta_1)$ and $p(x; \theta_2)$, respectively, is defined as

$$D_{\mathrm{KL}}(P \| Q) = D(p(x; \theta_1) \| p(x; \theta_2)) = \int p(x; \theta_1) \log \frac{p(x; \theta_1)}{p(x; \theta_2)}\, dx, \qquad (7)$$

where $\theta_1$ and $\theta_2$ are sets of estimated parameters. Given two GGD distributions of NSCT subbands, the similarity between the two GGDs can be defined by the parameters $\alpha$ and $\beta$. Substituting (1) into (7), after some manipulation the KLD between the two PDFs can be expressed as

$$D_{\mathrm{KL}}(P \| Q) = D_{\mathrm{KL}}(p(\cdot; \alpha_1, \beta_1) \| p(\cdot; \alpha_2, \beta_2)) = \log\left(\frac{\beta_1 \alpha_2 \Gamma(1/\beta_2)}{\beta_2 \alpha_1 \Gamma(1/\beta_1)}\right) + \left(\frac{\alpha_1}{\alpha_2}\right)^{\beta_2} \frac{\Gamma\left((\beta_2 + 1)/\beta_1\right)}{\Gamma(1/\beta_1)} - \frac{1}{\beta_1}. \qquad (8)$$
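The closed form in (8) is straightforward to evaluate; a small sketch of ours, which also exhibits the asymmetry of the KLD:

```python
import math

def ggd_kld(a1, b1, a2, b2):
    """Closed-form KLD (8) between GGDs p(.; a1, b1) and p(.; a2, b2)."""
    term1 = math.log((b1 * a2 * math.gamma(1.0 / b2))
                     / (b2 * a1 * math.gamma(1.0 / b1)))
    term2 = (a1 / a2) ** b2 * math.gamma((b2 + 1.0) / b1) / math.gamma(1.0 / b1)
    return term1 + term2 - 1.0 / b1

# The KLD of a density from itself vanishes, and the KLD is not
# symmetric in its arguments.
same = ggd_kld(0.7474, 0.3720, 0.7474, 0.3720)  # parameters from Figure 4(a)
print(abs(same) < 1e-9, ggd_kld(1, 1, 2, 2) == ggd_kld(2, 2, 1, 1))  # -> True False
```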

However, the KLD has some deficiencies which make it less than ideal. First, the KLD is asymmetric; that is, $D_{\mathrm{KL}}(P \| Q)$ differs from $D_{\mathrm{KL}}(Q \| P)$. Second, if $q(x) = 0$ and $p(x) \neq 0$ for some $x$, then $D_{\mathrm{KL}}(P \| Q)$ is undefined. Third, the KLD does not offer any useful upper bound [14]. On the other hand, the JSD is nonnegative, finite, symmetric, and bounded [15, 29]. So we use the symmetric JSD to measure the similarity between two NSCT subbands in this study. The JSD is derived from the KLD; mathematically, it is defined as

$$D_{\mathrm{JS}}(P \| Q) = D_{\mathrm{JS}}(Q \| P) = \frac{1}{2}\left(D_{\mathrm{KL}}(P \| M) + D_{\mathrm{KL}}(Q \| M)\right), \qquad M = \frac{P + Q}{2}. \qquad (9)$$
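No closed form is known for the JSD of two GGDs (the mixture $M$ is not itself a GGD), so in a quick sketch of ours the two KLD integrals in (9) are evaluated on a grid; the check confirms the symmetry and the $\log 2$ bound:

```python
import math

def ggd_pdf(x, alpha, beta):
    return beta / (2.0 * alpha * math.gamma(1.0 / beta)) \
        * math.exp(-((abs(x) / alpha) ** beta))

def ggd_jsd(a1, b1, a2, b2, lim=30.0, step=0.005):
    """JSD (9) between two GGDs, integrated numerically against M = (P+Q)/2."""
    total = 0.0
    n = int(2 * lim / step)
    for i in range(n + 1):
        x = -lim + i * step
        p, q = ggd_pdf(x, a1, b1), ggd_pdf(x, a2, b2)
        m = 0.5 * (p + q)
        if p > 0.0:
            total += 0.5 * step * p * math.log(p / m)
        if q > 0.0:
            total += 0.5 * step * q * math.log(q / m)
    return total

# Symmetric, nonnegative, and bounded by log 2 -- unlike the KLD.
d = ggd_jsd(1.0, 2.0, 2.0, 1.0)
print(abs(d - ggd_jsd(2.0, 1.0, 1.0, 2.0)) < 1e-9, 0.0 <= d <= math.log(2.0))
# -> True True
```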

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is averaging; however, this easily leads to a low contrast, blurred result. To extract more useful information from the source images, for the low frequency coefficients we employ a fusion rule based on two activity level measurements: the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients in a neighborhood, while the entropy indicates how much information an image contains. Combining the two can therefore extract more of the complementary information present in the source images. The process is as follows.

(1) Computing the regional standard deviation $D_\lambda(x, y)$:

$$D_\lambda(x, y) = \sqrt{\sum_{m \in M,\, n \in N} \omega(m, n) \left[C_\lambda(x + m, y + n) - S_\lambda(x, y)\right]^{2}}. \qquad (10)$$

(2) Calculating the normalized Shannon entropy:

$$E_\lambda(x, y) = \frac{1}{|R|} \sum_{i,j} \left(C_0^{\lambda}(i, j)\right)^{2} \log \left(C_0^{\lambda}(i, j)\right)^{2}. \qquad (11)$$


(3) Computing the weights $(\delta_\lambda, \xi_\lambda)$ of the standard deviation $D_\lambda(x, y)$ and the information entropy $E_\lambda(x, y)$, respectively:

$$\delta_\lambda = \frac{\left|D_\lambda(x, y)\right|^{\alpha}}{\left|D_A(x, y)\right|^{\alpha} + \left|D_B(x, y)\right|^{\alpha}}, \qquad \xi_\lambda = \frac{E_\lambda(x, y)}{E_A(x, y) + E_B(x, y)}, \qquad (12)$$

where the parameter $\alpha$ is a constant that tunes the sharpness of the fused image; it is set to 1.2 in our experiments.

Let $C_0^{\lambda}(x, y)$ denote the low frequency subband coefficient at location $(x, y)$, where $\lambda \in \{A, B\}$ indexes the input images. Finally, the fused low frequency subband is obtained by

$$C_0^{F}(x, y) = \sum_{\lambda = A, B} \left[\delta_\lambda C_0^{\lambda}(x, y) + \xi_\lambda C_0^{\lambda}(x, y)\right]. \qquad (13)$$
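Steps (1)-(3) can be sketched as follows (our simplified Python illustration: a uniform 3x3 window for omega, clamped borders, and the per-source weights averaged so the two shares sum to one; the paper's own implementation may differ):

```python
import math

def region_std(img, x, y, r=1):
    """Regional standard deviation (10) with uniform window weights."""
    h, w = len(img), len(img[0])
    vals = [img[min(max(x + m, 0), h - 1)][min(max(y + n, 0), w - 1)]
            for m in range(-r, r + 1) for n in range(-r, r + 1)]
    mean = sum(vals) / len(vals)
    return math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))

def entropy_measure(img):
    """Energy-weighted entropy in the spirit of (11); assumes the
    subband is not identically zero."""
    n = sum(len(row) for row in img)
    return sum(c * c * math.log(c * c) for row in img for c in row if c) / n

def fuse_low(low_a, low_b, alpha=1.2):
    """(12)-(13): blend the two low frequency subbands with standard
    deviation and entropy weights."""
    ea, eb = entropy_measure(low_a), entropy_measure(low_b)
    xi_a = ea / (ea + eb)
    fused = []
    for x in range(len(low_a)):
        row = []
        for y in range(len(low_a[0])):
            da, db = region_std(low_a, x, y), region_std(low_b, x, y)
            delta_a = da ** alpha / (da ** alpha + db ** alpha)
            w_a = 0.5 * (delta_a + xi_a)      # share of source A
            row.append(w_a * low_a[x][y] + (1.0 - w_a) * low_b[x][y])
        fused.append(row)
    return fused

# Two identical subbands must fuse to themselves.
band = [[1.0, 2.0], [3.0, 4.0]]
out = fuse_low(band, band)
print(all(abs(out[i][j] - band[i][j]) < 1e-9 for i in range(2) for j in range(2)))
# -> True
```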

4.2. Fusion of the High Frequency Coefficients. High frequency subbands correspond to detailed information such as edges, lines, and corners. Because different imaging modalities contain information that is both redundant with and complementary to each other, the purpose of the selection rule is mainly to capture as much salient information from the source images as possible. The maximum selection rule is not suitable for medical image fusion because it works well only on the premise that a single original image provides a good pixel at each corresponding location; vast amounts of complementary information are lost when it is used for MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps, which are determined by saliency maps. Since dependencies exist between the NSCT coefficients, the high frequency coefficients are first updated by utilizing the relationships between NSCT subbands and are then combined using the weight maps. The process is described as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency $jsd_{l,\theta}^{h}$ between coefficients with different directions at the same scale $l$:

$$jsd_{l,\theta}^{h}(x, y) = \sum_{j=1,\, j \neq i}^{K} D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l,\theta_j}(x, y)\right), \qquad (14)$$

where $K$ is the total number of subbands at the $l$th scale. Then we calculate the vertical dependency $jsd_{l,\theta}^{v}$ between the specified subband's (for instance, subband $i$'s) parents and children:

$$jsd_{l,\theta}^{v}(x, y) = \sum_{j=1}^{K} \left[D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l-1,\theta_j}(x, y)\right) + D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l+1,\theta_j}(x, y)\right)\right]. \qquad (15)$$

Further, the horizontal and vertical dependency components are normalized, respectively:

$$\overline{jsd}_{l,\theta}^{h}(x, y) = \frac{jsd_{l,\theta}^{h}(x, y)}{jsd_{l,\theta}^{h}(x, y) + jsd_{l,\theta}^{v}(x, y)}, \qquad \overline{jsd}_{l,\theta}^{v}(x, y) = \frac{jsd_{l,\theta}^{v}(x, y)}{jsd_{l,\theta}^{h}(x, y) + jsd_{l,\theta}^{v}(x, y)}. \qquad (16)$$

Finally, the high frequency NSCT coefficients are revised as

$$\widetilde{C}_{l,\theta}(x, y) = C_{l,\theta}(x, y) \sqrt{1 + \overline{jsd}_{l,\theta}^{h}(x, y)^{2} + \overline{jsd}_{l,\theta}^{v}(x, y)^{2}}. \qquad (17)$$
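The normalization and update in (16)-(17) reduce to a per-coefficient gain; a minimal sketch of ours:

```python
import math

def update_coefficient(c, jsd_h, jsd_v):
    """(16)-(17): normalize the horizontal and vertical dependencies,
    then boost the coefficient by sqrt(1 + h^2 + v^2), so coefficients
    with stronger inter-subband dependencies are emphasized."""
    total = jsd_h + jsd_v
    if total == 0.0:            # no measured dependency: leave unchanged
        return c
    h, v = jsd_h / total, jsd_v / total
    return c * math.sqrt(1.0 + h * h + v * v)

# The gain ranges from sqrt(1.5) (balanced dependencies) up to sqrt(2)
# (all dependency concentrated in one direction).
print(round(update_coefficient(1.0, 0.3, 0.3), 4))  # -> 1.2247
```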

(2) Construction of Weight Maps. Weight maps are derived from saliency maps, which describe the level of salient information at each pixel. We apply a Gaussian filter to each high pass subband, which tends to assign a high weight to important elements such as edges and corners. A saliency map is constructed from the local average of the absolute value of the filter response:

$$S_{l,\theta}(x, y) = \left|C_{l,\theta}(x, y)\right| * g_{r_g, \sigma_g}(x, y), \qquad (18)$$

where $g(\cdot)$ is a Gaussian low pass filter of size $(2r_g + 1) \times (2r_g + 1)$, and the parameters $r_g$ and $\sigma_g$ are both set to 5. Next, the weight maps are determined by comparing the saliency maps $S_{l,\theta}^{n}(x, y)$, $n \in \{A, B\}$:

$$W_{l,\theta}^{n}(x, y) = \begin{cases} 1, & \text{if } S_{l,\theta}^{n}(x, y) = \max\left(S_{l,\theta}^{A}(x, y), S_{l,\theta}^{B}(x, y)\right), \\ 0, & \text{otherwise.} \end{cases} \qquad (19)$$

Finally, the fused subband coefficients $C_{l,\theta}^{F}(x, y)$ are obtained by the weighted summation

$$C_{l,\theta}^{F}(x, y) = W_{l,\theta}^{A}(x, y)\, C_{l,\theta}^{A}(x, y) + W_{l,\theta}^{B}(x, y)\, C_{l,\theta}^{B}(x, y). \qquad (20)$$
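Steps (18)-(20) can be sketched as follows (our illustration; the clamped border handling and the small kernel are our simplifications, not the paper's setting of $r_g = \sigma_g = 5$):

```python
import math

def gaussian_kernel(r, sigma):
    """Normalized (2r+1)x(2r+1) Gaussian low pass filter g of (18)."""
    k = [[math.exp(-(i * i + j * j) / (2.0 * sigma * sigma))
          for j in range(-r, r + 1)] for i in range(-r, r + 1)]
    s = sum(map(sum, k))
    return [[v / s for v in row] for row in k]

def saliency(band, kernel):
    """Saliency map (18): Gaussian-weighted local average of |coefficients|."""
    h, w, r = len(band), len(band[0]), len(kernel) // 2
    out = [[0.0] * w for _ in range(h)]
    for x in range(h):
        for y in range(w):
            acc = 0.0
            for i in range(-r, r + 1):
                for j in range(-r, r + 1):
                    xi = min(max(x + i, 0), h - 1)   # clamp at borders
                    yj = min(max(y + j, 0), w - 1)
                    acc += kernel[i + r][j + r] * abs(band[xi][yj])
            out[x][y] = acc
    return out

def fuse_high(band_a, band_b, r=1, sigma=1.0):
    """(19)-(20): per pixel, the binary weight maps keep the coefficient
    of the subband with the larger saliency (ties go to A)."""
    k = gaussian_kernel(r, sigma)
    sa, sb = saliency(band_a, k), saliency(band_b, k)
    return [[band_a[x][y] if sa[x][y] >= sb[x][y] else band_b[x][y]
             for y in range(len(band_a[0]))] for x in range(len(band_a))]

# Where only A has detail, the fused subband reproduces A exactly.
a = [[0.0, 5.0, 0.0], [0.0, 5.0, 0.0], [0.0, 5.0, 0.0]]
b = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
print(fuse_high(a, b) == a)  # -> True
```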

5. Experimental Results and Discussion

Five different data sets of human brain images are used, and the source images consist of two different modalities: CT/MRI, MRI/PET, and MRI/SPECT images. All the images have a size of 256 × 256 pixels and have been registered by a registration method such as [30]. To verify the effectiveness and applicability of the proposed fusion scheme, the results produced by the proposed method are compared with those of other state-of-the-art schemes, such as the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], Intensity-Hue-Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and


Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on a pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted as NSCT-1, and the NSCT based MIF method of [36] is denoted as NSCT-2. The fusion method [37] based on neighborhood characteristics and regionalization in the NSCT domain and the NSCT based MIF method [10] in $l\alpha\beta$ color space are denoted as NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using a GGD model [21] is termed NSST-1; the NSST based fusion scheme of [38] and the MIF method [39] utilizing features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using an HMT model is termed NSST-4. For the NSCT and NSST methods we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by taking the absolute maximum. For all MSD methods, the original images are decomposed into 4 levels with the numbers of directions {2, 2, 3, 3}. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms: the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) improve to various degrees and produce a better visual effect on bone structures; however, the details of the soft tissue regions from these methods remain unsharp. By contrast, the proposed method preserves the detailed features of the original images well without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen from Figures 6(c), 6(d), 6(e), and 6(f), these results have low contrast and lose many details. What is worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). In contrast, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast; the abundant information of the source images is successfully transferred to the fused image. Both tests imply that the proposed method is suitable for the fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected by Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET shows that regional cerebral metabolism is abnormal, with heavy hypometabolism in the anterior temporal and posterior parietal regions, the right hemisphere being slightly more affected than the left. Here, the proposed method is compared with seven other fusion schemes. Obviously, the results in Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results in Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so some parts are unidentifiable. Additionally, in terms of contrast and color fidelity, the results in Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g), 7(h), and 7(i) show better fusion performance than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Comparing these fused results, it is found that the proposed method can well extract the structural and functional information from the source images and fuse them with much less information distortion. As illustrated in the regions highlighted by red arrows and ellipses in Figure 7, the proposed method preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 8 is a fusion experiment on MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). As seen in Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot inherit the SPECT image's functional information well and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) give better results in terms of contrast and color fidelity. However, for the spatial details, the fused result obtained by


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme of [20], an MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both original images but also hardly causes any color distortion. These examples show that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), $Q^{AB/F}$ [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher EN value shows that the fused image has more information content, and a higher SF value indicates that the final image is clearer. The index $Q^{AB/F}$ measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
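Two of these metrics are simple enough to state directly; a sketch of ours of EN and SF for grayscale images (the exact formulations of [41, 42] may differ in normalization):

```python
import math

def entropy_metric(img, levels=256):
    """EN: Shannon entropy (bits) of the gray level histogram."""
    n = sum(len(row) for row in img)
    hist = [0] * levels
    for row in img:
        for v in row:
            hist[v] += 1
    return sum(-(c / n) * math.log2(c / n) for c in hist if c)

def spatial_frequency(img):
    """SF = sqrt(RF^2 + CF^2) from row and column first differences."""
    h, w = len(img), len(img[0])
    rf = sum((img[i][j] - img[i][j - 1]) ** 2 for i in range(h) for j in range(1, w))
    cf = sum((img[i][j] - img[i - 1][j]) ** 2 for i in range(1, h) for j in range(w))
    return math.sqrt(rf / (h * w) + cf / (h * w))

# A flat image carries no information and no detail; a half-and-half
# two-level image has exactly one bit of entropy.
flat = [[7] * 4 for _ in range(4)]
print(entropy_metric(flat), spatial_frequency(flat))  # -> 0.0 0.0
print(entropy_metric([[0, 0, 255, 255]]))             # -> 1.0
```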

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that the proposed method has a stable performance for all the indices (most values rank first and only a few rank second). The objective results based on these quality metrics thus coincide with the subjective visual perception. In particular, the proposed scheme obtains the largest MI value in all tests. This confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, among all the fusion methods the PCA based algorithm consumes the least time, because it does not involve multiscale decomposition. The DWT, GP, and GF based methods also have low time consumption (less than 0.08 s), and the CT based method takes about 2 s; however, their performance is poor. The three methods based on NSCT are comparatively slower, which is a common problem of NSCT based algorithms. Due to using a complex neural network together with the NSCT machinery, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar amounts of time (17.13 s and 15.29 s); in fact, the decomposition and inverse reconstruction account for almost nine-tenths of the total time. Without exception, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

            Index     DWT     GP      PCA     GF      CT      NSCT-1   NSCT-2   Proposed
Figure 5    MI        1.274   1.279   1.365   1.279   1.324   1.450    1.490    1.891
            EN        5.008   4.843   4.050   5.107   5.183   4.978    5.371    5.675
            SF        3.322   3.108   2.052   2.983   2.767   2.352    3.341    3.486
            Q^AB/F    0.509   0.577   0.369   0.592   0.592   0.568    0.593    0.596
Figure 6    MI        3.077   2.521   3.663   2.729   3.831   3.916    3.949    3.962
            EN        5.125   4.851   4.340   5.578   6.300   6.414    6.473    6.469
            SF        2.279   1.984   1.904   2.029   2.987   3.015    3.027    3.141
            Q^AB/F    0.618   0.694   0.665   0.682   0.682   0.694    0.699    0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

            Index     DWT     PCA     IHS     GP      CT      NSST     NSST-1   Proposed
Figure 7    MI        2.278   2.841   2.656   2.280   2.108   2.174    2.513    2.853
            EN        4.573   3.707   3.125   4.561   4.242   4.148    5.080    4.768
            SF        3.021   3.114   1.603   3.012   3.130   3.115    3.221    3.373
            Q^AB/F    0.525   0.498   0.483   0.526   0.551   0.584    0.617    0.607
            UID       0.752   0.737   0.725   0.753   0.789   0.797    0.836    0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

            Index     DWT     GP      IHS     NSST-2  NSST-3  NSCT-3   NSCT-4   Proposed
Figure 8    MI        2.167   2.028   2.255   2.399   2.688   2.376    2.520    2.695
            EN        4.424   3.973   4.105   4.870   4.507   3.533    4.499    4.531
            SF        3.189   2.957   3.506   3.528   4.017   3.479    3.978    4.120
            Q^AB/F    0.487   0.481   0.512   0.499   0.556   0.507    0.536    0.576
            UID       0.802   0.812   0.711   0.724   0.817   0.765    0.813    0.826

            Index     DWT     GP      IHS     ST      CT      NSST-4   NSCT     Proposed
Figure 9    MI        2.485   2.113   2.872   2.480   2.532   3.076    2.784    3.083
            EN        4.277   4.050   4.202   4.561   4.426   4.594    4.202    4.668
            SF        3.124   2.948   3.126   3.547   4.069   3.945    4.087    4.176
            Q^AB/F    0.518   0.506   0.502   0.537   0.536   0.538    0.519    0.540
            UID       0.793   0.679   0.714   0.783   0.799   0.806    0.801    0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods            DWT     GP      PCA     GF      CT      NSCT-1   NSCT-2   Proposed
Running time (s)   0.021   0.019   0.003   0.075   1.976   33.79    15.29    17.13

GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process; fast multiscale image transform tools are a subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I-469–I-472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, vol. 15, pp. 262–265, IEEE, St. Louis, Mo, USA, September–October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1999.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S M Kay ldquoFundamentals of statistical signal processing esti-mation theoryrdquo Technometrics vol 37 no 4 pp 465ndash466 1995

[29] M Chen and H Janicke ldquoAn information-theoretic frameworkfor visualizationrdquo IEEE Transactions on Visualization and Com-puter Graphics vol 16 no 6 pp 1206ndash1215 2010

[30] F Sroubek and J Flusser ldquoRegistration and fusion of blurredimagesrdquo in Image Analysis and Recognition vol 3211 of LectureNotes in Computer Science pp 122ndash129 Springer Berlin Ger-many 2004

[31] V S Petrovic andC S Xydeas ldquoGradient-basedmultiresolutionimage fusionrdquo IEEE Transactions on Image Processing vol 13no 2 pp 228ndash237 2004

[32] O Rockinger and T Fechner ldquoPixel-level image fusion the caseof image sequencesrdquo in Signal Processing Sensor Fusion andTarget Recognition VII vol 3374 of Proceedings of SPIE pp 378ndash388 Orlando Fla USA April 1998

[33] T-M Tu S-C Su H-C Shyu and P S Huang ldquoA new look atIHS-like image fusion methodsrdquo Information Fusion vol 2 no3 pp 177ndash186 2001

[34] S Li X Kang and J Hu ldquoImage fusion with guided filteringrdquoIEEE Transactions on Image Processing vol 22 no 7 pp 2864ndash2875 2013

Computational and Mathematical Methods in Medicine 13

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.



[Figure: the MRI source image passes through two NSP stages, each followed by an NSDFB, yielding the low frequency subband Y0 and the high frequency directional subbands Y1 and Y2 at the first and second levels.]

Figure 2: An illustration of the NSCT. The decomposition levels shown are the first and second, and the numbers of NSDFB direction levels are set to [2, 3], respectively.

As two recent examples, Wang et al. proposed two statistical models in the shift-invariant shearlet transform domain: one combines the HMT [20] and the other utilizes the GGD [21]. Although the HMT-based statistical model has been applied successfully, it has some defects, such as low fitting precision, a high dependency on the convergence of the training procedure, and the inflexibility of the quad-tree structure itself.

In this paper, we present a novel statistical model to measure the dependencies of subband coefficients in the NSCT domain. The advantage of this model is that a parent node may have any number of children, rather than being limited to four as in the HMT model. Our work shares some themes with [21, 22], where the probability density function (PDF) of each decomposition subband is modeled with the GGD and the similarity between subbands is computed by the Kullback-Leibler distance (KLD) of two GGDs. However, our statistical model focuses on the statistics of the NSCT coefficients, and we evaluate the similarity of subbands across scales by the JSD rather than the KLD. In addition, different fusion rules are developed for the low frequency and high frequency components, respectively.

3. The Proposed Algorithm

3.1. Overview of NSCT. NSCT, as a shift-invariant version of the contourlet, is an overcomplete transform with a flexible multiscale, multidirectional expansion for images [5]. The decomposition process of the NSCT is divided into two phases, namely, the nonsubsampled pyramid (NSP) and the nonsubsampled directional filter bank (NSDFB). The former performs multiscale decomposition and the latter provides directional decomposition. The NSP divides the image into a low frequency subband and a high frequency subband at each level. Given that the decomposition level is k, the NSP generates k + 1 subband images, which consist of one low frequency image and k high frequency images. The subsequent NSDFB decomposes the high frequency subband from the NSP at each level. For a specific subband, let the number of direction levels be l; then 2^l directional subbands are obtained, whose sizes are all the same as the source image. After the low frequency component is decomposed iteratively in the same way, an image is finally decomposed into one low frequency subimage and a series of high frequency directional subband images (\sum_{j=1}^{k} 2^{l_j} in total), wherein l_j denotes the number of direction levels at scale j. Figure 2 shows an intuitive example of the NSCT. The diagram enumerates only the first two decomposition levels, and the numbers of NSDFB direction levels are set to [2, 3] from coarser to finer scale.
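As a quick sanity check of the counting above, the total number of subbands for a given set of direction levels can be computed directly. The helper below is hypothetical, not from the paper:

```python
def nsct_subband_count(direction_levels):
    """One low frequency subband plus 2**l_j directional subbands
    at each of the len(direction_levels) scales."""
    return 1 + sum(2 ** l for l in direction_levels)

print(nsct_subband_count([2, 3]))  # the two-level example of Figure 2
```

With the experimental setting used later in the paper, four levels with direction levels {2, 2, 3, 3}, the transform produces 1 + 4 + 4 + 8 + 8 = 25 subbands.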

3.2. Characteristics of the NSCT Subband Coefficients. Figure 3 plots the conditional distributions of the NSCT coefficients, which characterize the correlations between subband coefficients of the MRI image in Figure 2: Figures 3(a) and 3(b) show the probability distribution between two subband coefficients at different scales, while Figures 3(c) and 3(d) show the probability distribution between two subband coefficients with different directions at the same scale. Mathematically, the conditional distributions can be described as P(X | PX = px) and P(X | CX = cx), where px and cx denote the coefficients of parents and cousins, respectively. As shown in Figure 3, the relationships between subband coefficients are on the whole nonlinear and interlaced, which illustrates that interdependencies exist between subband coefficients in the NSCT domain. At the same time, subband coefficients with different directions at the same scale (cousin-cousin) are approximately independent or only slightly correlated, whereas subband coefficients at different scales (parent-child) are more strongly correlated. Thus, the dependencies among the NSCT coefficients mainly exist between parents and children.

Figure 3: The conditional distributions between subband coefficients of the MRI image in Figure 2. (a) and (b) are the distribution characteristics between subband coefficients at different scales; (c) and (d) are the distribution characteristics of subband coefficients with different directions at the same scale.

Figure 4 shows the histograms of the four subimages in Figure 3. All the diagrams share similar features: a very sharp peak at zero amplitude and extended tails on both sides of the peak, which indicates that the NSCT coefficients are sparse and that the majority of coefficients are close to zero. Further, the kurtosis of each map is measured as 20.74, 8.32, 10.44, and 20.55 (corresponding to (a), (b), (c), and (d) of the first row of Figure 4, in order). Clearly, these values are much larger than the kurtosis of a Gaussian distribution (about 3.0). Moreover, in a large number of experiments, similar coefficient characteristics (sparsity and heavy tails) are observed for the other NSCT subbands. The NSCT coefficients are therefore sparse and highly non-Gaussian.

How to quantify the dependencies between NSCT coefficients with a statistical model is a subject worthy of study. Inspired by earlier statistical models of the MSD coefficients of images [23–26], in which the PDFs of coefficients across scales and within a scale are nicely fitted by the GGD function, we fit the distribution characteristics in the same way and calculate the dependencies of the NSCT coefficients. Figure 4 (second row) provides the four PDFs of the NSCT coefficients together with the curves of the fitted GGDs. These fitted curves are very close to the actual distributions; therefore, the statistical model can be applied to describe the spatial distribution characteristics of the NSCT coefficients.
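The kurtosis argument above is easy to reproduce numerically. The sketch below uses the plain (non-excess) kurtosis, which equals 3.0 for a Gaussian, with synthetic samples standing in for NSCT subband coefficients:

```python
import numpy as np

def kurtosis(x):
    """Plain (non-excess) kurtosis: E[(x-mu)^4] / E[(x-mu)^2]^2."""
    x = np.asarray(x, dtype=float).ravel()
    c = x - x.mean()
    return np.mean(c ** 4) / np.mean(c ** 2) ** 2

rng = np.random.default_rng(0)
gauss = kurtosis(rng.normal(size=200_000))     # close to 3.0
laplace = kurtosis(rng.laplace(size=200_000))  # close to 6.0: heavier tails
```

Values well above 3, like those measured for the NSCT subbands, indicate a sharply peaked, heavy-tailed (leptokurtic) distribution.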

3.3. Statistics of the NSCT Coefficients. The GGD model has been extensively applied to describe the marginal density of subband coefficients due to its flexible parametric form, which adapts to a large family of distributions from super-Gaussian to sub-Gaussian. Accordingly, the approximation of the marginal density for a particular NSCT subband may

Figure 4: Histograms of the four distribution maps in Figure 3 (the first row) and the curves fitted with GGDs (the second row). The measured kurtosis values are 20.7422, 8.3149, 10.4347, and 20.5509, and the fitted GGD parameters (α, β) are (0.7474, 0.3720), (1.1213, 0.4926), (0.9625, 0.4120), and (0.8259, 0.3035) for panels (a)–(d), respectively.

Figure 5: The fusion results of different fusion methods for the first set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

be achieved by varying the two parameters of the GGD, which is defined as

P(x; \alpha, \beta) = \frac{\beta}{2\alpha \Gamma(1/\beta)} e^{-(|x|/\alpha)^{\beta}},  (1)

where Γ(·) is the Gamma function, α is the scale parameter (the width of the PDF peak), and β is the shape parameter, which tunes the decay rate of the density function. Normally, the parameters α and β are computed by the maximum likelihood (ML) estimator, which has been shown to be the desired estimator here [27]. For each subband, the log-likelihood of the sample x = (x_1, x_2, \ldots, x_N) is defined as

L(x; \alpha, \beta) = \log \prod_{i=1}^{N} p(x_i; \alpha, \beta).  (2)

In this case, α and β are the parameters to be estimated. The unique root can be obtained from the likelihood equations below, where Ψ(·) denotes the digamma function:

\frac{\partial L(x; \alpha, \beta)}{\partial \alpha} = -\frac{N}{\alpha} + \frac{\beta \sum_{i=1}^{N} |x_i|^{\beta}}{\alpha^{\beta+1}} = 0,  (3)

\frac{\partial L(x; \alpha, \beta)}{\partial \beta} = \frac{N}{\beta} + \frac{N \Psi(1/\beta)}{\beta^{2}} - \sum_{i=1}^{N} \left( \frac{|x_i|}{\alpha} \right)^{\beta} \log \left( \frac{|x_i|}{\alpha} \right) = 0.  (4)

With β fixed and β > 0, (3) has a unique solution, which is real and positive:

\hat{\alpha} = \left( \frac{\beta}{N} \sum_{i=1}^{N} |x_i|^{\beta} \right)^{1/\beta}.  (5)

Substituting (5) into (4), the shape parameter \hat{\beta} is the solution of the following transcendental equation:

1 + \frac{\Psi(1/\hat{\beta})}{\hat{\beta}} - \frac{\sum_{i=1}^{N} |x_i|^{\hat{\beta}} \log |x_i|}{\sum_{i=1}^{N} |x_i|^{\hat{\beta}}} + \frac{\log \left( (\hat{\beta}/N) \sum_{i=1}^{N} |x_i|^{\hat{\beta}} \right)}{\hat{\beta}} = 0.  (6)

The root of (6) can be found effectively with the Newton-Raphson iterative procedure [26, 28]; the algorithm is described in detail in [22]. Therefore, with only two parameters, we can accurately characterize the marginal distribution of the NSCT coefficients.
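A minimal sketch of this ML fit: it solves (6) for the shape parameter with a bracketing root finder (Brent's method standing in for the Newton-Raphson procedure the paper cites) and then evaluates (5) for the scale. The bracket endpoints are assumptions:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import psi  # digamma function

def fit_ggd(samples, b_lo=0.1, b_hi=5.0):
    """Estimate GGD (alpha, beta) by solving equation (6) for beta,
    then equation (5) for alpha. Zero samples are dropped because
    equation (6) involves log|x_i|."""
    x = np.abs(np.asarray(samples, dtype=float).ravel())
    x = x[x > 0]
    n = x.size

    def g(beta):  # left-hand side of the transcendental equation (6)
        xb = x ** beta
        s = xb.sum()
        return (1.0 + psi(1.0 / beta) / beta
                - (xb * np.log(x)).sum() / s
                + np.log(beta * s / n) / beta)

    beta = brentq(g, b_lo, b_hi)                            # equation (6)
    alpha = (beta / n * (x ** beta).sum()) ** (1.0 / beta)  # equation (5)
    return alpha, beta
```

For Gaussian samples of unit variance (a GGD with β = 2 and α = √2), the fit should recover approximately those values.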

3.4. The Dependency of Different NSCT Subbands. The KLD is a common and well-justified way of measuring the distance between two distributions: D_{KL}(P \| Q) describes the deficiency of using one distribution q to represent the true distribution p, and it is generally used for comparing two related distributions. The KLD between two distributions P and Q, whose PDFs are denoted p(x; \theta_1) and p(x; \theta_2), respectively, is defined as

D_{KL}(P \| Q) = D(p(x; \theta_1) \| p(x; \theta_2)) = \int p(x; \theta_1) \log \frac{p(x; \theta_1)}{p(x; \theta_2)} \, dx,  (7)

where θ_1 and θ_2 are the sets of estimated parameters. Given two GGDs fitted to NSCT subbands, the similarity between them is fully determined by the parameters α and β. Substituting (1) into (7), after some manipulations the KLD between two GGD PDFs can be expressed as

D_{KL}(P \| Q) = D_{KL}(p(\cdot; \alpha_1, \beta_1) \| p(\cdot; \alpha_2, \beta_2)) = \log \left( \frac{\beta_1 \alpha_2 \Gamma(1/\beta_2)}{\beta_2 \alpha_1 \Gamma(1/\beta_1)} \right) + \left( \frac{\alpha_1}{\alpha_2} \right)^{\beta_2} \frac{\Gamma((\beta_2 + 1)/\beta_1)}{\Gamma(1/\beta_1)} - \frac{1}{\beta_1}.  (8)

However, the KLD has some deficiencies that make it less than ideal. First, it is asymmetric; that is, D_{KL}(P \| Q) differs from D_{KL}(Q \| P). Second, if q(x) = 0 and p(x) ≠ 0 for some x, then D_{KL}(P \| Q) is undefined. Third, the KLD does not offer any useful upper bound [14]. The JSD, on the other hand, is nonnegative, finite, symmetric, and bounded [15, 29], so we use the symmetric JSD to measure the similarity between two NSCT subbands in this study. The JSD is derived from the KLD; mathematically, it is defined as

D_{JS}(P \| Q) = D_{JS}(Q \| P) = \frac{1}{2} \left( D_{KL}(P \| M) + D_{KL}(Q \| M) \right), \quad M = \frac{P + Q}{2}.  (9)

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is averaging; however, averaging easily leads to a low contrast and blurred result. To extract more useful information from the source images, we fuse the low frequency coefficients with a rule based on two activity level measurements, namely, the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients within a neighborhood, while the entropy indicates how much information an image contains; combining the two can therefore extract more of the complementary information present in the source images. The process is as follows.

(1) Compute the regional standard deviation D_{\lambda}(x, y):

D_{\lambda}(x, y) = \sqrt{\sum_{m \in M, n \in N} \omega(m, n) [C_{\lambda}(x + m, y + n) - S_{\lambda}(x, y)]^2},  (10)

where ω(m, n) is the weighting window over the M × N neighborhood and S_{\lambda}(x, y) is the corresponding local mean.

(2) Calculate the normalized Shannon entropy over a local window R of |R| coefficients:

E_{\lambda}(x, y) = \frac{1}{|R|} \sum_{i, j} (C_0^{\lambda}(i, j))^2 \log (C_0^{\lambda}(i, j))^2.  (11)

(3) Compute the weights (δ_λ, ξ_λ) from the standard deviation D_{\lambda}(x, y) and the entropy E_{\lambda}(x, y), respectively:

\delta_{\lambda} = \frac{|D_{\lambda}(x, y)|^{\alpha}}{|D_A(x, y)|^{\alpha} + |D_B(x, y)|^{\alpha}}, \quad \xi_{\lambda} = \frac{E_{\lambda}(x, y)}{E_A(x, y) + E_B(x, y)},  (12)

where the parameter α is a constant that tunes the sharpness of the fused image by adjusting its value; it is set to 1.2 in our experiments.

Let C_0^{\lambda}(x, y) denote the low frequency subband coefficient at location (x, y), where λ ∈ {A, B} indexes the input images. Finally, the fused low frequency coefficients are obtained by

C_0^{F}(x, y) = \sum_{\lambda = A, B} [\delta_{\lambda} C_0^{\lambda}(x, y) + \xi_{\lambda} C_0^{\lambda}(x, y)].  (13)
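A sketch of the low frequency rule (10)-(13) under stated assumptions: a uniform 3 × 3 window stands in for ω(m, n), and the energy-entropy measure of (11) is applied over the same local window. This is an illustration, not the authors' exact implementation:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_low(cA, cB, alpha=1.2, win=3, eps=1e-12):
    """Fuse two low frequency subbands following equations (10)-(13)."""
    def region_std(c):  # equation (10), with a uniform window
        mu = uniform_filter(c, win)
        return np.sqrt(np.maximum(uniform_filter(c * c, win) - mu * mu, 0.0))

    def region_ent(c):  # local energy-entropy activity, after equation (11)
        e = c * c + eps
        return np.abs(uniform_filter(e * np.log(e), win))

    dA, dB = region_std(cA) ** alpha, region_std(cB) ** alpha
    eA, eB = region_ent(cA), region_ent(cB)
    delA = dA / (dA + dB + eps)   # equation (12), deviation-based weight
    xiA = eA / (eA + eB + eps)    # equation (12), entropy-based weight
    delB, xiB = 1.0 - delA, 1.0 - xiA
    # equation (13): each source contributes with both of its weights
    return (delA + xiA) * cA + (delB + xiB) * cB
```

As printed, (13) applies both weights to each source, so the total weight per pixel sums to 2; a practical implementation might halve the result to preserve the intensity range.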

4.2. Fusion of the High Frequency Coefficients. High frequency subbands correspond to detailed information such as edges, lines, and corners. Because different imaging modalities contain information that is partly redundant and partly complementary, the purpose of the selection rule is to capture as much salient information of the source images as possible. The maximum selection rule is not suitable for medical image fusion, because it works well only on the premise that a single original image provides the best pixel at each location; a vast amount of complementary information is lost when it is used for MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps, which are determined by saliency maps. Based on the fact that dependencies exist between the NSCT coefficients, the high frequency coefficients are first updated by utilizing the relationships between NSCT subbands and then combined by using the weight maps. The process is described as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency jsd_{l,\theta,h} between coefficients with different directions at the same scale l:

jsd_{l,\theta,h}(x, y) = \sum_{j=1, j \neq i}^{K} D_{JS}(C_{l,\theta_i}(x, y), C_{l,\theta_j}(x, y)),  (14)

where K is the total number of subbands at the l-th scale.

Then we calculate the vertical dependency jsd_{l,\theta,v} between the specified subband (for instance, subband i) and its parents and children:

jsd_{l,\theta,v}(x, y) = \sum_{j=1}^{K} \left[ D_{JS}(C_{l,\theta_i}(x, y), C_{l-1,\theta_j}(x, y)) + D_{JS}(C_{l,\theta_i}(x, y), C_{l+1,\theta_j}(x, y)) \right].  (15)

Further, the horizontal and vertical dependency components are normalized, respectively:

\overline{jsd}_{l,\theta,h}(x, y) = \frac{jsd_{l,\theta,h}(x, y)}{jsd_{l,\theta,h}(x, y) + jsd_{l,\theta,v}(x, y)}, \quad \overline{jsd}_{l,\theta,v}(x, y) = \frac{jsd_{l,\theta,v}(x, y)}{jsd_{l,\theta,h}(x, y) + jsd_{l,\theta,v}(x, y)}.  (16)

Finally, the high frequency NSCT coefficients are updated as

C_{l,\theta}(x, y) \leftarrow C_{l,\theta}(x, y) \sqrt{1 + \overline{jsd}_{l,\theta,h}(x, y)^2 + \overline{jsd}_{l,\theta,v}(x, y)^2}.  (17)

(2) Construction of Weight Maps. Weight maps are derived from the saliency maps, which describe each pixel by its level of saliency. We apply a Gaussian filter to each high pass subband, which tends to assign high weight to important elements such as edges and corners. A saliency map is constructed as the local average of the absolute value of the filter response:

S_{l,\theta}(x, y) = |C_{l,\theta}(x, y)| * g_{r_g, \theta_g}(x, y),  (18)

where g(·) is a Gaussian low pass filter of size (2r_g + 1) × (2r_g + 1), and the parameters r_g and θ_g are both set to 5. Next, the weight maps are determined by comparing the saliency maps (S^n_{l,\theta}(x, y), n ∈ {A, B}):

W^n_{l,\theta}(x, y) = 1 if S^n_{l,\theta}(x, y) = \max(S^A_{l,\theta}(x, y), S^B_{l,\theta}(x, y)), and 0 otherwise.  (19)

Finally, the fused subband coefficients C^F_{l,\theta}(x, y) are obtained by the weighted summation

C^F_{l,\theta}(x, y) = W^A_{l,\theta}(x, y) C^A_{l,\theta}(x, y) + W^B_{l,\theta}(x, y) C^B_{l,\theta}(x, y).  (20)
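The weight-map construction (18)-(20) can be sketched as follows. SciPy's gaussian_filter, parameterized by a standard deviation sigma (an assumption) rather than the paper's radius r_g, stands in for the Gaussian kernel:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_high(cA, cB, sigma=2.0):
    """Fuse two (already updated) high frequency subbands:
    saliency by Gaussian-smoothed magnitude (18), binary weight
    maps by comparison (19), weighted summation (20)."""
    sA = gaussian_filter(np.abs(cA), sigma)  # saliency map of source A
    sB = gaussian_filter(np.abs(cB), sigma)  # saliency map of source B
    wA = (sA >= sB).astype(cA.dtype)         # equation (19)
    return wA * cA + (1.0 - wA) * cB         # equation (20)
```

At pixels where source A has the stronger local detail response, its coefficient is kept; elsewhere, source B's coefficient is kept.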

5. Experimental Results and Discussion

Five different data sets of human brain images are used, and the source images comprise two different modality pairs, namely, CT/MRI, MRI/PET, and MRI/SPECT images. All the images have a size of 256 × 256 pixels and have been registered using a registration method such as [30]. To verify the effectiveness and applicability of the proposed fusion scheme, the results produced by the proposed method are compared with those of other state-of-the-art schemes, namely, the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the intensity-hue-saturation (IHS) color model [33], guided filtering (GF) [34], the contourlet transform (CT) [3], the NSCT, the shearlet transform (ST) [35], and


Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on a pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted NSCT-1, and the NSCT based MIF method of [36] is denoted NSCT-2. The fusion method [37] based on neighborhood characteristics and regionalization in the NSCT domain and the NSCT based MIF method [10] in the lαβ color space are denoted NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of [38] and the MIF method [39] utilizing features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the plain NSCT and NSST methods, we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by the absolute maximum. For all MSD methods, the original images are decomposed into 4 levels with the numbers of directions {2, 2, 3, 3}. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms, in which the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) show improvements to various degrees and produce better visual effects on the bone structures; however, the details of the soft tissue regions in these methods remain unsharp. By contrast, the proposed method preserves the detailed features of the original images well without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen in Figures 6(c), 6(d), 6(e), and 6(f), these results have low contrast and lose many details. Worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). In contrast, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast; the abundant information of the source images is successfully transferred to the fused image. Both tests imply that the proposed method is suitable for the fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected by Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened, more prominently in the parietal lobes; the corresponding PET image shows abnormal regional cerebral metabolism, with pronounced hypometabolism in the anterior temporal and posterior parietal regions, the right hemisphere being slightly more affected than the left. Here, the proposed method is compared with seven other fusion schemes. Obviously, the results in Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results in Figures 7(c) and 7(f) improve the fusion performance to some extent, they are unsaturated in brightness, so that some parts are unidentifiable. Additionally, regarding contrast and color fidelity, the results in Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g) 7(h) and 7(i) have better fusion performance than thesemethods mentioned above but the structural information oftheMRI image is not successfully transferred to fused imagesThrough the comparison of these fused results it is foundthat the proposed method can well extract the structuraland functional information from the source images and fusethem with much less information distortion As illustratedin these regions highlighted by red arrows and ellipses inFigure 7 the proposed method well preserves complemen-tary information of different modal medical images andachieves the best visual effect in terms of contrast clarity andcolor fidelity

Figure 8 shows a fusion experiment on MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). In Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot well inherit the SPECT image's functional information and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) give better results in terms of contrast and color fidelity. However, regarding the spatial details, the fused result obtained by



Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both the original images but also hardly causes the problem of color distortion. From the above examples, it can be seen that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), Q^{AB/F} [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher value of EN shows that the fused image has more information content, and a higher value of SF indicates that the final image is clearer. The index Q^{AB/F} measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
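As a concrete illustration of two of these indices (our sketch following the standard definitions, not the authors' evaluation code), EN can be computed as the Shannon entropy of the gray-level histogram, and SF from the row and column first differences:

```python
import math

def entropy_metric(img, levels=256):
    # EN: Shannon entropy (in bits) of the gray-level histogram
    hist = [0] * levels
    for row in img:
        for v in row:
            hist[v] += 1
    n = float(sum(hist))
    return -sum((c / n) * math.log2(c / n) for c in hist if c > 0)

def spatial_frequency(img):
    # SF: sqrt(RF^2 + CF^2) from the mean squared horizontal/vertical differences
    h, w = len(img), len(img[0])
    rf = sum((img[i][j] - img[i][j - 1]) ** 2 for i in range(h) for j in range(1, w))
    cf = sum((img[i][j] - img[i - 1][j]) ** 2 for i in range(1, h) for j in range(w))
    return math.sqrt(rf / (h * w) + cf / (h * w))

flat = [[7] * 8 for _ in range(8)]                                  # constant image
check = [[(i + j) % 2 * 255 for j in range(8)] for i in range(8)]   # checkerboard
```

A constant image yields EN = 0 and SF = 0, while the checkerboard scores higher on both, matching the intuition that higher EN and SF indicate a more informative, sharper image.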

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most values rank first and only a few rank second). This shows that the objective results based on these quality metrics coincide with the subjective visual perception. In particular, for the MI values of all the tests, the proposed scheme always obtains the largest value, which confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, among all the fusion methods, the running time of the PCA based algorithm is the lowest, because it does not involve multiscale decomposition. Additionally, the DWT, GP, GF, and CT based methods also have low time consumption (less than 0.08 s); however, their performance is poor. The three methods based on NSCT are relatively slower, which is a common problem of NSCT based algorithms. Due to using a complex neural network on top of the NSCT machinery, NSCT-1 needs the most time (about 3.4 s). The proposed method and NSCT-2 consume similar amounts of time (1.713 s and 1.529 s, respectively). Actually, the decomposition and inverse reconstruction cost almost 9/10 of the total time. Without exception, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than the previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

            Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Figure 5    MI        1.274  1.279  1.365  1.279  1.324  1.450   1.490   1.891
            EN        5.008  4.843  4.050  5.107  5.183  4.978   5.371   5.675
            SF        3.322  3.108  2.052  2.983  2.767  2.352   3.341   3.486
            Q^{AB/F}  0.509  0.577  0.369  0.592  0.592  0.568   0.593   0.596
Figure 6    MI        3.077  2.521  3.663  2.729  3.831  3.916   3.949   3.962
            EN        5.125  4.851  4.340  5.578  6.300  6.414   6.473   6.469
            SF        2.279  1.984  1.904  2.029  2.987  3.015   3.027   3.141
            Q^{AB/F}  0.618  0.694  0.665  0.682  0.682  0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

            Index     DWT    PCA    IHS    GP     CT     NSST   NSST-1  Proposed
Figure 7    MI        2.278  2.841  2.656  2.280  2.108  2.174  2.513   2.853
            EN        4.573  3.707  3.125  4.561  4.242  4.148  5.080   4.768
            SF        3.021  3.114  1.603  3.012  3.130  3.115  3.221   3.373
            Q^{AB/F}  0.525  0.498  0.483  0.526  0.551  0.584  0.617   0.607
            UID       0.752  0.737  0.725  0.753  0.789  0.797  0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

            Index     DWT    GP     IHS    NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
Figure 8    MI        2.167  2.028  2.255  2.399   2.688   2.376   2.520   2.695
            EN        4.424  3.973  4.105  4.870   4.507   3.533   4.499   4.531
            SF        3.189  2.957  3.506  3.528   4.017   3.479   3.978   4.120
            Q^{AB/F}  0.487  0.481  0.512  0.499   0.556   0.507   0.536   0.576
            UID       0.802  0.812  0.711  0.724   0.817   0.765   0.813   0.826

            Index     DWT    GP     IHS    ST      CT      NSST-4  NSCT    Proposed
Figure 9    MI        2.485  2.113  2.872  2.480   2.532   3.076   2.784   3.083
            EN        4.277  4.050  4.202  4.561   4.426   4.594   4.202   4.668
            SF        3.124  2.948  3.126  3.547   4.069   3.945   4.087   4.176
            Q^{AB/F}  0.518  0.506  0.502  0.537   0.536   0.538   0.519   0.540
            UID       0.793  0.679  0.714  0.783   0.799   0.806   0.801   0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Running time (s)  0.021  0.019  0.003  0.075  1.976  3.379   1.529   1.713

the GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that the fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process. A fast multiscale image transform tool is the subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from Major Special Project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.
[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.
[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.
[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.
[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.
[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.
[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I469–I472, September 2005.
[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.
[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.
[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.
[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.
[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.
[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.
[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.
[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.
[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.
[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.
[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, pp. 262–265, IEEE, St. Louis, Mo, USA, September-October 2003.
[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.
[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.
[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.
[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.
[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.
[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.
[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.
[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1999.
[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.
[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.
[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.
[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.
[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.
[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.
[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.
[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.
[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.
[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.
[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.
[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.
[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.
[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.
[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.
[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.
[43] Y. Liu, M. Liang, Y. Zhou, et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.




Figure 3: The conditional distributions between subband coefficients of the MRI image in Figure 2. (a) and (b) are the distribution characteristics between subband coefficients at different scales; (c) and (d) are the distribution characteristics of subband coefficients with different directions at the same scale.

and the extended tails on both sides of the peak, which indicates that the NSCT coefficients are sparse and the majority of the coefficients are close to zero. Further, the kurtosis of each map is measured as 20.74, 8.32, 10.44, and 20.55, respectively (corresponding to (a), (b), (c), and (d) of the first row of Figure 4, in order). Clearly, these values are much larger than the kurtosis of the Gaussian distribution (which is 3.0). What is more, through a large number of experiments, the coefficient characteristics (the sparse and heavy-tailed phenomenon) are found to be similar for the other NSCT subbands. So the NSCT coefficients are sparse and highly non-Gaussian.

How to quantify the dependencies between NSCT coefficients with a statistical model is a subject worthy of study. Inspired by the earlier statistical models of the MSD coefficients of an image [23–26], in which the PDFs of the coefficients across scales and within a scale are nicely fitted by the GGD function, we fit the distribution characteristics in the same way and calculate the dependencies of the NSCT coefficients. Figure 4 provides four PDFs of the NSCT coefficients together with the curves of the fitted GGDs. It can be seen that these fitted curves are very close to the actual case. Therefore, the statistical model can be applied to describe the spatial distribution characteristics of the NSCT coefficients.

3.3. Statistics of the NSCT Coefficients. The GGD model has been extensively applied to describe the marginal density of subband coefficients due to its flexible parametric form, which adapts to a large family of distributions from super-Gaussian to sub-Gaussian. Accordingly, the approximation of the marginal density for a particular NSCT coefficient may


Figure 4: Histograms of the four distribution maps in Figure 3 (the first row) and the curves fitted with GGDs (the second row). The fitted GGD parameters and kurtosis values are (a) α = 0.7474, β = 0.3720, kurtosis = 20.7422; (b) α = 1.1213, β = 0.4926, kurtosis = 8.3149; (c) α = 0.9625, β = 0.4120, kurtosis = 10.4347; (d) α = 0.8259, β = 0.3035, kurtosis = 20.5509.


Figure 5: The fusion results of different fusion methods for the first set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

be achieved by varying the two parameters of the GGD, which is defined as
\[
P(x;\alpha,\beta)=\frac{\beta}{2\alpha\Gamma(1/\beta)}\,e^{-(|x|/\alpha)^{\beta}},
\tag{1}
\]

where Γ(·) is the Gamma function, α is the scale parameter (the width of the PDF peak), and β is the shape parameter, which tunes the decay rate of the density function. Normally, the parameters α and β are computed by the maximum likelihood (ML) estimator, which has been shown to be the desired estimator [27]. For each subband, the likelihood function of the sample x = (x₁, x₂, …, xₙ) is defined as
\[
L(x;\alpha,\beta)=\log\prod_{i=1}^{n}p(x_{i};\alpha,\beta).
\tag{2}
\]


In this case, α and β are the parameters that need to be estimated. We can obtain the unique root from the likelihood equations below, where Ψ(·) denotes the digamma function and L denotes the number of coefficient samples (i.e., L = N):
\[
\frac{\partial L(x;\alpha,\beta)}{\partial\alpha}=-\frac{L}{\alpha}+\sum_{i=1}^{N}\frac{\beta\,|x_{i}|^{\beta}\,\alpha^{-\beta}}{\alpha}=0,
\tag{3}
\]
\[
\frac{\partial L(x;\alpha,\beta)}{\partial\beta}=\frac{L}{\beta}+\frac{L\,\Psi(1/\beta)}{\beta^{2}}-\sum_{i=1}^{N}\left(\frac{|x_{i}|}{\alpha}\right)^{\beta}\log\left(\frac{|x_{i}|}{\alpha}\right)=0.
\tag{4}
\]

Let β be fixed with β > 0; then (3) has a unique solution, which is real and positive:
\[
\hat{\alpha}=\left(\frac{\beta}{N}\sum_{i=1}^{N}|x_{i}|^{\beta}\right)^{1/\beta}.
\tag{5}
\]
Combining (4) and (5), the shape parameter β can be solved from the following transcendental equation:
\[
1+\frac{\Psi(1/\beta)}{\beta}-\frac{\sum_{i=1}^{N}|x_{i}|^{\beta}\log|x_{i}|}{\sum_{i=1}^{N}|x_{i}|^{\beta}}+\frac{\log\left((\beta/N)\sum_{i=1}^{N}|x_{i}|^{\beta}\right)}{\beta}=0.
\tag{6}
\]

In (6), the determination of β can be effectively solved using the Newton-Raphson iterative procedure [26, 28], and the algorithm is described in detail in [22]. Therefore, with only two parameters, we can accurately characterize the marginal distribution of the NSCT coefficients.
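As an illustration of this estimation step (our sketch, not the authors' code), the snippet below fits a GGD to one set of subband samples: it brackets the root of the transcendental equation (6) on a coarse grid and refines it by bisection (a Newton-Raphson step could replace the bisection), then recovers α̂ from (5):

```python
import math
import random

def digamma(x, h=1e-6):
    # numerical digamma via a central difference of log-Gamma
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2.0 * h)

def g(beta, xs):
    # left-hand side of the transcendental equation (6)
    n = len(xs)
    s = sum(abs(x) ** beta for x in xs)
    slog = sum(abs(x) ** beta * math.log(abs(x)) for x in xs)
    return 1.0 + digamma(1.0 / beta) / beta - slog / s + math.log(beta * s / n) / beta

def fit_ggd(xs, lo=0.2, hi=6.0, steps=60, iters=40):
    # bracket the unique root of (6) on a grid, then bisect
    grid = [lo + (hi - lo) * k / steps for k in range(steps + 1)]
    vals = [g(b, xs) for b in grid]
    k = next(k for k in range(steps) if vals[k] * vals[k + 1] <= 0.0)
    a, b, fa = grid[k], grid[k + 1], vals[k]
    for _ in range(iters):
        m = 0.5 * (a + b)
        fm = g(m, xs)
        if fa * fm <= 0.0:
            b = m
        else:
            a, fa = m, fm
    beta = 0.5 * (a + b)
    alpha = (beta / len(xs) * sum(abs(x) ** beta for x in xs)) ** (1.0 / beta)  # eq. (5)
    return alpha, beta

# sanity check on synthetic data: a Gaussian is a GGD with beta = 2, alpha = sigma * sqrt(2)
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(5000)]
alpha_hat, beta_hat = fit_ggd(samples)
```

With Gaussian test samples the estimates land near β = 2 and α = √2, which is the special case where the GGD reduces to the normal density.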

3.4. The Dependency of Different NSCT Subbands. The KLD is a common and justified way of measuring the distance between two distributions. D_KL(P‖Q) describes the deficiency of using one distribution q to represent the true distribution p, which is generally used for comparing two related distributions. The KLD between two distributions P and Q, whose PDFs are denoted by p(x; θ₁) and p(x; θ₂), respectively, is defined as
\[
D_{\mathrm{KL}}(P\,\|\,Q)=D\left(p(x;\theta_{1})\,\|\,p(x;\theta_{2})\right)=\int p(x;\theta_{1})\log\frac{p(x;\theta_{1})}{p(x;\theta_{2})}\,dx,
\tag{7}
\]

where θ₁ and θ₂ are the sets of estimated parameters. Given two GGD distributions of NSCT subbands, the similarity between the two GGDs can be defined by the parameters α and β. Substituting (1) into (7) and after some manipulations, the KLD between two GGD PDFs can be expressed as
\[
D_{\mathrm{KL}}(P\,\|\,Q)=D_{\mathrm{KL}}\left(p(\cdot;\alpha_{1},\beta_{1})\,\|\,p(\cdot;\alpha_{2},\beta_{2})\right)=\log\left(\frac{\beta_{1}\alpha_{2}\Gamma(1/\beta_{2})}{\beta_{2}\alpha_{1}\Gamma(1/\beta_{1})}\right)+\left(\frac{\alpha_{1}}{\alpha_{2}}\right)^{\beta_{2}}\frac{\Gamma\left((\beta_{2}+1)/\beta_{1}\right)}{\Gamma(1/\beta_{1})}-\frac{1}{\beta_{1}}.
\tag{8}
\]

However, the KLD has some deficiencies which make it less than ideal. First, the KLD is asymmetric; that is, D_KL(P‖Q) differs from D_KL(Q‖P). Second, if q(x) = 0 and p(x) ≠ 0 for some x, then D_KL(P‖Q) is undefined. Third, the KLD does not offer any nice upper bounds [14]. On the other hand, the JSD has the properties of nonnegativity, finiteness, symmetry, and boundedness [15, 29]. So we use the symmetric JSD to measure the similarity between two NSCT subbands in this study. The JSD between GGDs is derived from the KLD; mathematically, it is defined as
\[
D_{\mathrm{JS}}(P\,\|\,Q)=D_{\mathrm{JS}}(Q\,\|\,P)=\frac{1}{2}\left(D_{\mathrm{KL}}(P\,\|\,M)+D_{\mathrm{KL}}(Q\,\|\,M)\right),\qquad M=\frac{P+Q}{2}.
\tag{9}
\]

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is the averaging method; however, this method easily leads to a low contrast and blurred result. To extract more useful information from the source images, for the low frequency coefficients we employ a fusion rule based on two activity level measurements, namely the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients around a neighborhood; on the other hand, the entropy indicates how much information an image contains. Thus, combining the two can extract more of the complementary information present in the source images. The process is as follows.

(1) Compute the regional standard deviation D_λ(x, y):
\[
D_{\lambda}(x,y)=\sqrt{\sum_{m\in M,\,n\in N}\omega(m,n)\left[C_{\lambda}(x+m,y+n)-S_{\lambda}(x,y)\right]^{2}}.
\tag{10}
\]

(2) Calculate the normalized Shannon entropy:
\[
E_{\lambda}(x,y)=\frac{1}{|R|}\sum_{i,j}\left(C_{0}^{\lambda}(i,j)\right)^{2}\log\left(C_{0}^{\lambda}(i,j)\right)^{2}.
\tag{11}
\]


(3) Compute the weights (δ_λ, ξ_λ) from the standard deviation D_λ(x, y) and the information entropy E_λ(x, y), respectively:
\[
\delta_{\lambda}=\frac{\left|D_{\lambda}(x,y)\right|^{\alpha}}{\left|D_{A}(x,y)\right|^{\alpha}+\left|D_{B}(x,y)\right|^{\alpha}},\qquad
\xi_{\lambda}=\frac{E_{\lambda}(x,y)}{E_{A}(x,y)+E_{B}(x,y)},
\tag{12}
\]
where the parameter α is a constant which tunes the sharpness of the fused image by adjusting its value; it is set to 1.2 in our experiments.

Let C_0^λ(x, y) denote the low frequency subband coefficient at location (x, y), where λ is the input image A or B. Finally, the fused coefficients are obtained by
\[
C_{0}^{F}(x,y)=\sum_{\lambda=A,B}\left[\delta_{\lambda}C_{0}^{\lambda}(x,y)+\xi_{\lambda}C_{0}^{\lambda}(x,y)\right].
\tag{13}
\]
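A compact sketch of this low frequency rule (our interpretation, not the authors' code: uniform window weights ω are assumed in (10), a small epsilon guards against division by zero, and the measure of (11) is taken over the whole subband):

```python
import numpy as np

def regional_std(c, win=3):
    # eq. (10) with uniform window weights: std of each win x win neighborhood
    pad = win // 2
    p = np.pad(c, pad, mode='reflect')
    out = np.empty(c.shape, dtype=float)
    for i in range(c.shape[0]):
        for j in range(c.shape[1]):
            out[i, j] = p[i:i + win, j:j + win].std()
    return out

def shannon_energy(c, eps=1e-12):
    # eq. (11): entropy-like measure of the squared coefficients over the subband
    sq = c.astype(float) ** 2 + eps
    return float(np.mean(sq * np.log(sq)))

def fuse_low(cA, cB, alpha=1.2, eps=1e-12):
    dA, dB = regional_std(cA), regional_std(cB)   # dA, dB are nonnegative
    deltaA = dA ** alpha / (dA ** alpha + dB ** alpha + eps)   # eq. (12), pixelwise
    deltaB = 1.0 - deltaA
    eA, eB = shannon_energy(cA), shannon_energy(cB)
    xiA = eA / (eA + eB)                                       # eq. (12), global
    xiB = 1.0 - xiA
    # eq. (13): sum of the deviation- and entropy-weighted coefficients
    return (deltaA + xiA) * cA + (deltaB + xiB) * cB
```

Note that, as literally written in (13), the two per-image weights sum to 2 across the inputs, so the fused band comes out on roughly twice the scale of the sources; a practical implementation may renormalize.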

4.2. Fusion of the High Frequency Coefficients. The high frequency subbands correspond to detailed information in regions such as edges, lines, and corners. Because different imaging modalities contain redundant and complementary information about each other, the purpose of the selection rule is mainly to capture the salient information of the source images as much as possible. The maximum selection rule is not suitable for medical image fusion, because it works well only on the premise that a single original image provides the best pixel at each corresponding location; thus vast complementary information is lost when it is used for MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps, which are determined by the saliency maps. Based on the fact that there exist dependencies between the NSCT coefficients, the high frequency coefficients are first updated by utilizing the relationships between NSCT subbands and are then combined by using the weight maps. The process is described as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency jsd_{lθh} between coefficients with different directions at the same scale l as
\[
jsd_{l\theta h}(x,y)=\sum_{j=1,\,j\neq i}^{K}D_{\mathrm{JS}}\left(C_{l\theta_{i}}(x,y)\,\|\,C_{l\theta_{j}}(x,y)\right),
\tag{14}
\]

where K is the total number of subbands at the l-th scale. Then, we calculate the vertical dependency jsd_{lθv} between the specified subband's (for instance, subband i) parents and children:
\[
jsd_{l\theta v}(x,y)=\sum_{j=1}^{K}\left[D_{\mathrm{JS}}\left(C_{l\theta_{i}}(x,y)\,\|\,C_{l-1,\theta_{j}}(x,y)\right)+D_{\mathrm{JS}}\left(C_{l\theta_{i}}(x,y)\,\|\,C_{l+1,\theta_{j}}(x,y)\right)\right].
\tag{15}
\]

Further, the horizontal and vertical dependency components are normalized, respectively:
\[
\overline{jsd}_{l\theta h}(x,y)=\frac{jsd_{l\theta h}(x,y)}{jsd_{l\theta h}(x,y)+jsd_{l\theta v}(x,y)},\qquad
\overline{jsd}_{l\theta v}(x,y)=\frac{jsd_{l\theta v}(x,y)}{jsd_{l\theta h}(x,y)+jsd_{l\theta v}(x,y)}.
\tag{16}
\]

Finally, the high frequency NSCT coefficients are revised as
\[
\hat{C}_{l\theta}(x,y)=C_{l\theta}(x,y)\sqrt{1+\overline{jsd}_{l\theta h}(x,y)^{2}+\overline{jsd}_{l\theta v}(x,y)^{2}}.
\tag{17}
\]

(2) Construction of Weight Maps Weight maps are derivedfrom the saliency maps which describe each pixel by thesaliency level of salient information We apply Gaussianfilter to each high pass subband which tends to assign ahigh weight value to important elements such as edges andcorners A saliency map is constructed by the local average ofthe absolute value of the filter response

119878119897120579 (119909 119910) =1003816100381610038161003816119862119897120579 (119909 119910)

1003816100381610038161003816 lowast 119892119903119892 120579119892(119909 119910) (18)

where g(·) is a Gaussian low pass filter whose size is (2r_g + 1) × (2r_g + 1), and the parameters r_g and θ_g are both set to 5. Next, the weight maps are determined by comparison of the saliency maps S^n_{l,θ}(x, y), n ∈ {A, B}:

$$W^{n}_{l,\theta}(x,y)=\begin{cases}1, & \text{if } S^{n}_{l,\theta}(x,y)=\max\bigl(S^{A}_{l,\theta}(x,y),\,S^{B}_{l,\theta}(x,y)\bigr)\\ 0, & \text{otherwise}\end{cases} \tag{19}$$

Finally, the fused subband coefficients C^F_{l,θ}(x, y) can be obtained by the weighted summation

$$C^{F}_{l,\theta}(x,y)=W^{A}_{l,\theta}(x,y)\,\tilde{C}^{A}_{l,\theta}(x,y)+W^{B}_{l,\theta}(x,y)\,\tilde{C}^{B}_{l,\theta}(x,y) \tag{20}$$
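The high frequency pipeline of (18)–(20) can be sketched end to end as follows. This is an illustrative reading, not the authors' code: the Gaussian window is applied with explicit (slow) loops so the sketch stays dependency-free (in practice a library routine such as `scipy.ndimage.gaussian_filter` would replace them), and ties in (19) are resolved in favor of image A.

```python
import numpy as np

def gaussian_kernel(r, sigma):
    """(2r+1) x (2r+1) Gaussian low pass kernel, normalized to sum 1."""
    ax = np.arange(-r, r + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def saliency_map(C, r=5, sigma=5.0):
    """Local Gaussian-weighted average of |C|, eq. (18)."""
    k = gaussian_kernel(r, sigma)
    a = np.pad(np.abs(C), r, mode='reflect')
    out = np.empty(C.shape, dtype=float)
    for x in range(C.shape[0]):
        for y in range(C.shape[1]):
            out[x, y] = np.sum(a[x:x + 2*r + 1, y:y + 2*r + 1] * k)
    return out

def fuse_highpass(CA, CB, r=5, sigma=5.0):
    """Binary weight maps from saliency comparison, eqs. (19)-(20)."""
    SA, SB = saliency_map(CA, r, sigma), saliency_map(CB, r, sigma)
    WA = (SA >= SB).astype(float)     # eq. (19); ties go to image A
    return WA * CA + (1.0 - WA) * CB  # eq. (20)
```

Because the weight maps are binary and complementary, each fused coefficient is copied from exactly one source subband, the one with the larger local saliency.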

5. Experimental Results and Discussion

Five different data sets of human brain images are used, and the source images consist of two different modalities, including CT/MRI, MRI/PET, and MRI/SPECT images. All the images have the size of 256 × 256 pixels and have been registered by some kind of registration method as in [30]. To verify the effectiveness and applicability of the proposed fusion scheme, the results produced by the proposed method are respectively compared with the results of other state-of-the-art schemes, such as discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the Intensity, Hue, and Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and

8 Computational and Mathematical Methods in Medicine

Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted as NSCT-1, and the NSCT based MIF method of the scheme [36] is denoted as NSCT-2. The fusion method [37] based on neighborhood characteristic and regionalization in the NSCT domain and the NSCT based MIF method [10] in lαβ color space are denoted as NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of the literature [38] and the MIF method [39] utilizing features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the NSCT and NSST methods, we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by taking the absolute maximum. For all MSD methods, the original images are decomposed into 4 levels with the numbers of directions 2, 2, 3, 3. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment of CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms, in which the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results from Figures 5(c), 5(f), 5(g), 5(h), and 5(i) show improvement to various degrees and produce better visual effects on bone structures; however, the details of the soft tissue regions from these methods remain unsharp. By contrast, the proposed method can well preserve the detailed features of the original images without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen from Figures 6(c), 6(d), 6(e), and 6(f), their results have low contrast and lose a lot of details. What is worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). Accordingly, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast; the abundant information of the source images can be successfully transferred to the fused image. Both tests imply that the proposed method is suitable for fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected with Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET shows that regional cerebral metabolism is abnormal, with hypometabolism heavily affecting the anterior temporal and posterior parietal regions; meanwhile, the right hemisphere is slightly more affected than the left. Here the proposed method is compared with seven other fusion schemes. Obviously, the results of Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results of Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so that some parts are unidentifiable. Additionally, for contrast and color fidelity, the results from Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g), 7(h), and 7(i) have better fusion performance than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Through the comparison of these fused results, it is found that the proposed method can well extract the structural and functional information from the source images and fuse them with much less information distortion. As illustrated in the regions highlighted by red arrows and ellipses in Figure 7, the proposed method well preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 8 is a fusion experiment of MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). From Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot well inherit the SPECT image's functional information and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) gain better results in terms of contrast and color fidelity. However, for the spatial details, the fused result obtained by


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both original images but also hardly causes the problem of color distortion. Through the above examples, it can be seen that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), Q^{AB/F} [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortions, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher value of EN shows that the fused image has more information content, and a higher value of SF indicates that the final image is clearer. The index Q^{AB/F} measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
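Two of these metrics have simple textbook definitions that can be stated in code. A hedged sketch of EN and SF (the exact variants used in [41, 42] may differ in normalization; `bins=256` assumes 8-bit gray levels):

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon entropy (EN) of the gray-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) contributes nothing
    return -np.sum(p * np.log2(p))

def spatial_frequency(img):
    """SF = sqrt(RF^2 + CF^2) from row and column first differences."""
    img = img.astype(float)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return np.sqrt(rf**2 + cf**2)
```

A constant image scores zero on both metrics, while sharp, information-rich fused images push both values up, which is why larger EN and SF are read as better fusion in Tables 1–3.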

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most values rank first and only a few rank second). This shows that the objective results based on these quality metrics also coincide with the subjective visual perception. In particular, for the MI values of all tests, the proposed scheme always gets the largest value. This confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, among all fusion methods, the consumption time of the PCA based algorithm is the lowest; the reason is that it does not involve multiscale decomposition. Additionally, the DWT, GP, GF, and CT based methods also have low time consumption (all under 2 s). However, their performance is poor. Relatively, the three methods based on NSCT are slower, which is a common problem of NSCT based algorithms. Due to using the complex neural network together with the mechanism of NSCT, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar times (17.13 s and 15.29 s). Actually, the decomposition and inverse reconstruction cost almost 9/10 of the total time. Without exception, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

            DWT     GP      PCA     GF      CT      NSCT-1  NSCT-2  Proposed
Figure 5
  MI        1.274   1.279   1.365   1.279   1.324   1.450   1.490   1.891
  EN        5.008   4.843   4.050   5.107   5.183   4.978   5.371   5.675
  SF        3.322   3.108   2.052   2.983   2.767   2.352   3.341   3.486
  Q^AB/F    0.509   0.577   0.369   0.592   0.592   0.568   0.593   0.596
Figure 6
  MI        3.077   2.521   3.663   2.729   3.831   3.916   3.949   3.962
  EN        5.125   4.851   4.340   5.578   6.300   6.414   6.473   6.469
  SF        2.279   1.984   1.904   2.029   2.987   3.015   3.027   3.141
  Q^AB/F    0.618   0.694   0.665   0.682   0.682   0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

            DWT     PCA     IHS     GP      CT      NSST    NSST-1  Proposed
Figure 7
  MI        2.278   2.841   2.656   2.280   2.108   2.174   2.513   2.853
  EN        4.573   3.707   3.125   4.561   4.242   4.148   5.080   4.768
  SF        3.021   3.114   1.603   3.012   3.130   3.115   3.221   3.373
  Q^AB/F    0.525   0.498   0.483   0.526   0.551   0.584   0.617   0.607
  UID       0.752   0.737   0.725   0.753   0.789   0.797   0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

            DWT     GP      IHS     NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
Figure 8
  MI        2.167   2.028   2.255   2.399   2.688   2.376   2.520   2.695
  EN        4.424   3.973   4.105   4.870   4.507   3.533   4.499   4.531
  SF        3.189   2.957   3.506   3.528   4.017   3.479   3.978   4.120
  Q^AB/F    0.487   0.481   0.512   0.499   0.556   0.507   0.536   0.576
  UID       0.802   0.812   0.711   0.724   0.817   0.765   0.813   0.826

            DWT     GP      IHS     ST      CT      NSST-4  NSCT    Proposed
Figure 9
  MI        2.485   2.113   2.872   2.480   2.532   3.076   2.784   3.083
  EN        4.277   4.050   4.202   4.561   4.426   4.594   4.202   4.668
  SF        3.124   2.948   3.126   3.547   4.069   3.945   4.087   4.176
  Q^AB/F    0.518   0.506   0.502   0.537   0.536   0.538   0.519   0.540
  UID       0.793   0.679   0.714   0.783   0.799   0.806   0.801   0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT     GP      PCA     GF      CT      NSCT-1  NSCT-2  Proposed
Running time (s)  0.021   0.019   0.003   0.075   1.976   33.79   15.29   17.13

GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process. A fast image multiscale transform tool is the subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.
[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.
[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.
[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.
[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.
[6] A. L. da Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.
[7] J. Zhou, A. L. da Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I-469–I-472, September 2005.
[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.
[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.
[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.
[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.
[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.
[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.
[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.
[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.
[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.
[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.
[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, pp. 262–265, St. Louis, Mo, USA, September–October 2003.
[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.
[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.
[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.
[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.
[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.
[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.
[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.
[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1999.
[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.
[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.
[29] M. Chen and H. Jänicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.
[30] F. Šroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.
[31] V. S. Petrović and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.
[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.
[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.
[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.
[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, Wuhan, China, July 2009.
[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.
[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.
[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.
[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.
[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.
[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.
[42] C. S. Xydeas and V. Petrović, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.
[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.


Figure 4: Histograms of four distribution maps in Figure 3 (the first row) and the curves fitted with GGDs (the second row). The fitted GGD parameters and histogram kurtosis are: (a) α = 0.7474, β = 0.3720, kurtosis = 20.7422; (b) α = 1.1213, β = 0.4926, kurtosis = 8.3149; (c) α = 0.9625, β = 0.4120, kurtosis = 10.4347; (d) α = 0.8259, β = 0.3035, kurtosis = 20.5509.

Figure 5: The fusion results of different fusion methods for the first set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

be achieved by varying two parameters of the GGD, which is defined as

$$P(x;\alpha,\beta)=\frac{\beta}{2\alpha\,\Gamma(1/\beta)}\,e^{-(|x|/\alpha)^{\beta}} \tag{1}$$

where Γ(·) is the Gamma function, α is the scale parameter (width of the PDF peak), and β is the shape parameter, which tunes the decay rate of the density function. Normally, the parameters α and β are computed by the maximum likelihood (ML) estimator, which has been shown to be the desired estimator [27]. For each subband, the likelihood function of the sample x = (x_1, x_2, …, x_N) is defined as

$$L(x;\alpha,\beta)=\log\prod_{i=1}^{N}p(x_i;\alpha,\beta) \tag{2}$$


In this case, α and β are the parameters that need to be estimated. We can obtain the unique root from the likelihood equations below, where Ψ(·) denotes the digamma function:

$$\frac{\partial L(x;\alpha,\beta)}{\partial\alpha}=-\frac{N}{\alpha}+\sum_{i=1}^{N}\frac{\beta\,|x_i|^{\beta}\,\alpha^{-\beta}}{\alpha}=0 \tag{3}$$

$$\frac{\partial L(x;\alpha,\beta)}{\partial\beta}=\frac{N}{\beta}+\frac{N\,\Psi(1/\beta)}{\beta^{2}}-\sum_{i=1}^{N}\left(\frac{|x_i|}{\alpha}\right)^{\beta}\log\left(\frac{|x_i|}{\alpha}\right)=0 \tag{4}$$

Let β be fixed with β > 0; then (3) has a unique solution, which is real and positive:

$$\hat{\alpha}=\left(\frac{\beta}{N}\sum_{i=1}^{N}|x_i|^{\beta}\right)^{1/\beta} \tag{5}$$

Combining (4) and (5), the shape parameter β̂ can be solved from the following transcendental equation:

$$1+\frac{\Psi(1/\hat{\beta})}{\hat{\beta}}-\frac{\sum_{i=1}^{N}|x_i|^{\hat{\beta}}\log|x_i|}{\sum_{i=1}^{N}|x_i|^{\hat{\beta}}}+\frac{\log\left((\hat{\beta}/N)\sum_{i=1}^{N}|x_i|^{\hat{\beta}}\right)}{\hat{\beta}}=0 \tag{6}$$

In (6), the determination of β̂ can be carried out effectively using the Newton-Raphson iterative procedure [26, 28]; the algorithm is described in detail in [22]. Therefore, with only two parameters, we can accurately characterize the marginal distribution of the NSCT coefficients.
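The transcendental equation (6) is straightforward to solve numerically. The sketch below uses plain bisection instead of Newton-Raphson, and a stdlib digamma approximation in place of scipy.special.psi, so that it stays self-contained; the bracket (0.2, 5.0) is an assumption that covers typical subband shapes:

```python
import math

def digamma(x):
    """Psi(x) via recurrence plus an asymptotic series (a stand-in for
    scipy.special.psi; good to roughly 1e-10 for x > 0)."""
    r = 0.0
    while x < 10.0:         # shift the argument up until the series converges
        r -= 1.0 / x
        x += 1.0
    inv2 = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x \
           - inv2 * (1.0 / 12 - inv2 * (1.0 / 120 - inv2 / 252))

def ggd_shape(samples, lo=0.2, hi=5.0, iters=60):
    """Solve eq. (6) for the shape parameter beta by bisection."""
    ax = [abs(v) for v in samples if v != 0.0]   # 0^beta * log 0 -> 0
    n = len(ax)
    def g(beta):
        s1 = sum(a ** beta * math.log(a) for a in ax)
        s2 = sum(a ** beta for a in ax)
        return (1.0 + digamma(1.0 / beta) / beta
                - s1 / s2 + math.log(beta / n * s2) / beta)
    glo = g(lo)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if glo * g(mid) <= 0.0:      # root lies in the lower half
            hi = mid
        else:                        # root lies in the upper half
            lo, glo = mid, g(mid)
    return 0.5 * (lo + hi)

def ggd_scale(samples, beta):
    """Plug the estimated beta into eq. (5) to obtain alpha."""
    ax = [abs(v) for v in samples if v != 0.0]
    return (beta / len(ax) * sum(a ** beta for a in ax)) ** (1.0 / beta)
```

On Gaussian-distributed samples the estimate converges near β = 2, and on Laplacian samples near β = 1, which is the usual sanity check for ML fitting of the GGD.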

3.4. The Dependency of Different NSCT Subbands. The KLD is a common and justified way of measuring the distance between two distributions. D_KL(P‖Q) describes the deficiency of using one distribution q to represent the true distribution p, and it is generally used for comparing two related distributions. The KLD between two distributions P and Q, whose PDFs are respectively denoted by p(x; θ₁) and p(x; θ₂), is defined as

$$D_{\mathrm{KL}}(P\,\|\,Q)=D\bigl(p(x;\theta_1)\,\|\,p(x;\theta_2)\bigr)=\int p(x;\theta_1)\log\frac{p(x;\theta_1)}{p(x;\theta_2)}\,dx \tag{7}$$

where 1205791 and 1205792 are a set of estimated parameters Giventwo GGD distributions of NSCT subbands the similaritybetween two GGDs for NSCT subbands can be defined bythe parameters 120572 and 120573 Substitute (1) into (7) and after some

manipulations the KLD between two PDFs can be expressedas

119863KL (119875 119876) = 119863KL (119901 (sdot 1198861 1205731) 119901 (sdot 1205722 1205732))

= log(12057311205722Γ (11205732)

12057321205721Γ (11205731))

+ (1205721

1205722

)

1205732 Γ ((1205732 + 1) 1205731)

Γ (11205731)minus

1

1205731

(8)
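The closed form (8) translates directly into code; `math.lgamma` keeps the Gamma ratios numerically stable for small shape parameters. A sketch (not the authors' implementation):

```python
import math

def kld_ggd(alpha1, beta1, alpha2, beta2):
    """Kullback-Leibler divergence between two GGDs, eq. (8)."""
    lg = math.lgamma
    # log(beta1*alpha2*Gamma(1/beta2) / (beta2*alpha1*Gamma(1/beta1)))
    log_term = (math.log(beta1 * alpha2 / (beta2 * alpha1))
                + lg(1.0 / beta2) - lg(1.0 / beta1))
    # (alpha1/alpha2)^beta2 * Gamma((beta2+1)/beta1) / Gamma(1/beta1)
    ratio = (alpha1 / alpha2) ** beta2 \
            * math.exp(lg((beta2 + 1.0) / beta1) - lg(1.0 / beta1))
    return log_term + ratio - 1.0 / beta1
```

Note that `kld_ggd(a1, b1, a2, b2)` generally differs from `kld_ggd(a2, b2, a1, b1)`, which is exactly the asymmetry that motivates switching to the JSD.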

However, there are some deficiencies with the KLD which make it less ideal. First, the KLD is asymmetric; that is, D_KL(P‖Q) differs from D_KL(Q‖P). Second, if q(x) = 0 and p(x) ≠ 0 for some x, then D_KL(P‖Q) is undefined. Third, the KLD does not offer any nice upper bounds [14]. On the other hand, the JSD has the properties of nonnegativity, finiteness, symmetry, and boundedness [15, 29]. So we use the symmetric JSD to measure the similarity between two NSCT subbands in this study. The JSD between GGDs is derived from the KLD; mathematically, it is defined as

$$D_{\mathrm{JS}}(P\,\|\,Q)=D_{\mathrm{JS}}(Q\,\|\,P)=\frac{1}{2}\bigl(D_{\mathrm{KL}}(P\,\|\,M)+D_{\mathrm{KL}}(Q\,\|\,M)\bigr),\qquad M=\frac{P+Q}{2} \tag{9}$$

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is the averaging method; however, this method easily leads to a low contrast and blurred result. To extract more useful information from the source images, for the low frequency coefficients we employ a fusion rule based on two activity level measurements, consisting of the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients around a neighborhood. On the other hand, the entropy indicates how much information an image contains. Thus, combining the two can extract more of the complementary information present in the source images. The process is as follows.

(1) Computing the regional standard deviation D_λ(x, y):

$$D_{\lambda}(x,y)=\sqrt{\sum_{m\in M,\,n\in N}\omega(m,n)\times\bigl[C_{\lambda}(x+m,y+n)-S_{\lambda}(x,y)\bigr]^{2}} \tag{10}$$

where ω(m, n) is a weighting window over the M × N neighborhood and S_λ(x, y) denotes the corresponding local mean.

(2) Calculating the normalized Shannon entropy:

$$E_{\lambda}(x,y)=\frac{1}{|R|}\sum_{i,j}\bigl(C^{\lambda}_{0}(i,j)\bigr)^{2}\log\bigl(C^{\lambda}_{0}(i,j)\bigr)^{2} \tag{11}$$


(3) Computing the weights (δ_λ, ξ_λ) of the standard deviation D_λ(x, y) and the information entropy E_λ(x, y), respectively:

$$\delta_{\lambda}=\frac{\left|D_{\lambda}(x,y)\right|^{\alpha}}{\left|D_{A}(x,y)\right|^{\alpha}+\left|D_{B}(x,y)\right|^{\alpha}},\qquad \xi_{\lambda}=\frac{E_{\lambda}(x,y)}{E_{A}(x,y)+E_{B}(x,y)} \tag{12}$$

where the parameter 120572 is a constant which tunes thesharpness of fused image by adjusting the value of parameterit is set to 12 in our experiment

Let $C_0^\lambda(x, y)$ denote the low frequency subband coefficient at location $(x, y)$, where $\lambda \in \{A, B\}$ indexes the input images. The fused coefficients are then obtained by
$$C_0^F(x, y) = \sum_{\lambda = A, B} \left[\delta_\lambda C_0^\lambda(x, y) + \xi_\lambda C_0^\lambda(x, y)\right]. \tag{13}$$
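Steps (1)-(3) can be sketched compactly. This is a hedged illustration, not the paper's code: it assumes a 3×3 averaging window for $\omega$, takes the absolute value of the entropy sum so the weight stays positive, and normalizes the combined weights to sum to one across the two inputs (Eq. (13) as printed applies both weight sets directly):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def regional_std(c, size=3):
    # Eq. (10) with a uniform 3x3 window: windowed standard deviation
    mean = uniform_filter(c, size)
    mean_sq = uniform_filter(c * c, size)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

def shannon_entropy(c, eps=1e-12):
    # Eq. (11): entropy of squared coefficients; abs() keeps the weight
    # positive (an assumption, since the sum is negative for |c| < 1)
    e = np.abs(c.astype(np.float64)) ** 2
    return abs(np.sum(e * np.log(e + eps))) / c.size

def fuse_lowpass(ca, cb, alpha=1.2, eps=1e-12):
    da, db = regional_std(ca), regional_std(cb)
    delta_a = da ** alpha / (da ** alpha + db ** alpha + eps)  # Eq. (12), left
    ea, eb = shannon_entropy(ca), shannon_entropy(cb)
    xi_a = ea / (ea + eb + eps)                                # Eq. (12), right
    wa = 0.5 * (delta_a + xi_a)  # assumption: average the two activity weights
    return wa * ca + (1.0 - wa) * cb
```

With identical inputs the two weights balance and the input is returned unchanged, which is a useful sanity check for any weighted fusion rule.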

4.2. Fusion of the High Frequency Coefficients. High frequency subbands correspond to detailed information such as edges, lines, and corners. Because different imaging modalities contain information that is partly redundant and partly complementary, the purpose of the selection rule is to capture as much salient information from the source images as possible. The maximum selection rule is not suitable for medical image fusion: it works well only on the premise that a single source image provides the best pixel at each location, so a large amount of complementary information is lost when it is applied to MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps that are determined by saliency maps. Exploiting the dependencies that exist between NSCT coefficients, the high frequency coefficients are first updated using the relationships between NSCT subbands and then combined using the weight maps. The process is as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency $jsd_{l,\theta}^{h}$ between coefficients with different directions at the same scale $l$:
$$jsd_{l,\theta}^{h}(x, y) = \sum_{j=1,\, j \neq i}^{K} D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y),\, C_{l,\theta_j}(x, y)\right), \tag{14}$$

where $K$ is the total number of subbands at the $l$th scale. Then we calculate the vertical dependency $jsd_{l,\theta}^{v}$ between the specified subband's (for instance, subband $i$'s) parents and children:
$$jsd_{l,\theta}^{v}(x, y) = \sum_{j=1}^{K} \left[D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y),\, C_{l-1,\theta_j}(x, y)\right) + D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y),\, C_{l+1,\theta_j}(x, y)\right)\right]. \tag{15}$$

Further, the horizontal and vertical dependency components are normalized, respectively:
$$\overline{jsd}_{l,\theta}^{h}(x, y) = \frac{jsd_{l,\theta}^{h}(x, y)}{jsd_{l,\theta}^{h}(x, y) + jsd_{l,\theta}^{v}(x, y)}, \qquad \overline{jsd}_{l,\theta}^{v}(x, y) = \frac{jsd_{l,\theta}^{v}(x, y)}{jsd_{l,\theta}^{h}(x, y) + jsd_{l,\theta}^{v}(x, y)}. \tag{16}$$

Finally, the high frequency NSCT coefficients are revised as
$$\overline{C}_{l,\theta}(x, y) = C_{l,\theta}(x, y)\sqrt{1 + \overline{jsd}_{l,\theta}^{h}(x, y)^{2} + \overline{jsd}_{l,\theta}^{v}(x, y)^{2}}. \tag{17}$$
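Given precomputed dependency maps from Eqs. (14)-(15), the normalization and update of Eqs. (16)-(17) reduce to a few array operations. A minimal sketch (the $\varepsilon$ term is our addition to avoid division by zero where both dependencies vanish):

```python
import numpy as np

def update_highpass(c, jsd_h, jsd_v, eps=1e-12):
    # Eq. (16): normalize horizontal/vertical dependencies so they sum to 1
    total = jsd_h + jsd_v + eps
    nh, nv = jsd_h / total, jsd_v / total
    # Eq. (17): amplify coefficients in proportion to their dependency strength
    return c * np.sqrt(1.0 + nh ** 2 + nv ** 2)
```

Since the normalized components sum to one wherever a dependency exists, the gain lies between $\sqrt{1.5}$ and $\sqrt{2}$; the update boosts but never suppresses a coefficient.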

(2) Construction of Weight Maps. The weight maps are derived from saliency maps, which describe each pixel by its level of salient information. We apply a Gaussian filter to each high pass subband, which tends to assign high weight to important elements such as edges and corners. A saliency map is constructed from the local average of the absolute value of the filter response:
$$S_{l,\theta}(x, y) = \left|\overline{C}_{l,\theta}(x, y)\right| * g_{r_g, \theta_g}(x, y), \tag{18}$$
where $g(\cdot)$ is a Gaussian low pass filter of size $(2r_g + 1) \times (2r_g + 1)$, and the parameters $r_g$ and $\theta_g$ are both set to 5. Next, the weight maps are determined by comparing the saliency maps $S_{l,\theta}^{n}(x, y)$, $n \in \{A, B\}$:
$$W_{l,\theta}^{n}(x, y) = \begin{cases} 1, & \text{if } S_{l,\theta}^{n}(x, y) = \max\left(S_{l,\theta}^{A}(x, y),\, S_{l,\theta}^{B}(x, y)\right), \\ 0, & \text{otherwise.} \end{cases} \tag{19}$$

Finally, the fused subband coefficients $C_{l,\theta}^{F}(x, y)$ are obtained by the weighted summation
$$C_{l,\theta}^{F}(x, y) = W_{l,\theta}^{A}(x, y)\, \overline{C}_{l,\theta}^{A}(x, y) + W_{l,\theta}^{B}(x, y)\, \overline{C}_{l,\theta}^{B}(x, y). \tag{20}$$
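Eqs. (18)-(20) can be sketched as below. Two assumptions are ours: `gaussian_filter`'s `truncate` argument is set so the kernel radius equals $r_g$ (giving the $(2r_g+1)\times(2r_g+1)$ support), and $\theta_g$ is read as the filter's standard deviation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_highpass(ca, cb, rg=5, theta_g=5.0):
    # Eq. (18): saliency = Gaussian-smoothed absolute response,
    # kernel of size (2*rg + 1) x (2*rg + 1)
    sa = gaussian_filter(np.abs(ca), sigma=theta_g, truncate=rg / theta_g)
    sb = gaussian_filter(np.abs(cb), sigma=theta_g, truncate=rg / theta_g)
    # Eq. (19): binary weight map selects the more salient source per pixel
    wa = (sa >= sb).astype(ca.dtype)
    # Eq. (20): weighted summation of the (updated) subband coefficients
    return wa * ca + (1.0 - wa) * cb
```

Because the weight maps are binary and complementary, each output pixel is copied from exactly one source subband, which preserves the sharpness of the selected detail.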

5. Experimental Results and Discussion

Five data sets of human brain images are used; each consists of two modalities, covering CT/MRI, MRI/PET, and MRI/SPECT pairs. All images are 256 × 256 pixels and have been registered with the registration method of [30]. To verify the effectiveness and applicability of the proposed fusion scheme, its results are compared with those of other state-of-the-art schemes: the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the Intensity-Hue-Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on a pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted NSCT-1, and the NSCT based MIF method of [36] is denoted NSCT-2. The fusion method [37] based on neighborhood characteristic and regionalization in the NSCT domain and the NSCT based MIF method [10] in the $l\alpha\beta$ color space are denoted NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of [38] and the MIF method [39] using features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the plain NSCT and NSST methods we adopt the average-maximum fusion scheme: the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by absolute-maximum selection. For all MSD methods, the original images are decomposed into four levels, with 2, 2, 3, and 3 directions, respectively. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.

Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. The PCA based method gives a poor result relative to the other algorithms; the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) improve to various degrees and produce a better visual effect on bone structures; however, the details of the soft tissue regions from these methods remain unsharp. By contrast, the proposed method preserves the detailed features of the original images well without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen in Figures 6(c), 6(d), 6(e), and 6(f), these results have low contrast and lose many details; worse, there are undesirable artifacts on the edges of the final images (see the regions labeled by the red ellipses in Figure 6). By comparison, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast, and the abundant information of the source images is successfully transferred to the fused image. Both tests imply that the proposed method is suitable for the fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected by Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET image shows abnormal regional cerebral metabolism, with heavy hypometabolism in the anterior temporal and posterior parietal regions, the right hemisphere being slightly more affected than the left. Here the proposed method is compared with seven other fusion schemes. The results in Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results in Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so some parts are unidentifiable. Additionally, regarding contrast and color fidelity, the results in Figures 7(g), 7(h), and 7(i) perform better than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Comparing these fused results, the proposed method extracts the structural and functional information from the source images well and fuses them with much less information distortion. As illustrated in the regions highlighted by red arrows and ellipses in Figure 7, the proposed method preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

Figure 8 is a fusion experiment on MRI and SPECT images. The source MRI image shows tumors located in the left temporal region, visible as the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). In Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot inherit the SPECT image's functional information well and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) give better results in terms of contrast and color fidelity. However, regarding the spatial details, the fused result obtained by the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme of [20], an MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information of both original images but also hardly causes color distortion. These examples show that the proposed method extends to combining anatomical and functional medical images and achieves good visual effects.

Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), $Q^{AB/F}$ [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher EN value indicates that the fused image contains more information, and a higher SF value indicates a clearer final image. The index $Q^{AB/F}$ measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of intensity and color distribution, a higher UID meaning better color information.
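Two of these metrics, EN and SF, have widely used closed-form definitions that are easy to reproduce. The sketch below follows the common formulations (histogram entropy over 8-bit gray levels; row/column first-difference energies), which we assume match the cited metrics:

```python
import numpy as np

def entropy_metric(img):
    # EN: Shannon entropy of the 8-bit gray-level histogram
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def spatial_frequency(img):
    # SF: sqrt(RF^2 + CF^2) from horizontal and vertical first differences
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

A constant image scores zero on both metrics, while a high-contrast texture scores high on SF, which matches the interpretation that higher SF indicates a clearer image.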

The quantitative comparisons are listed in Tables 1, 2, and 3. For all indices the proposed method performs stably: most values rank first, and only a few rank second. The objective results based on these quality metrics thus coincide with the subjective visual perception. In particular, the proposed scheme obtains the largest MI value in every test, confirming that the proposed statistical model transfers more detailed information from the source images into the final image by exploiting the dependencies between NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and well suited to medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All tests are implemented in MATLAB 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, the PCA based algorithm consumes the least time among all fusion methods because it involves no multiscale decomposition. The DWT, GP, GF, and CT based methods also have low time consumption (under 0.08 s, except CT at about 1.98 s), but their performance is poor. By comparison, the three NSCT based methods are slower, a common problem of NSCT based algorithms. Owing to its complex neural network combined with the NSCT machinery, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar time (17.13 s and 15.29 s, respectively); in fact, the decomposition and inverse reconstruction account for almost 9/10 of the total time. As expected, the proposed method pays an increased computational cost for using the NSCT tool, yet it achieves better results than the previous methods.

Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

Figure 5:
Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
MI        1.274  1.279  1.365  1.279  1.324  1.450   1.490   1.891
EN        5.008  4.843  4.050  5.107  5.183  4.978   5.371   5.675
SF        3.322  3.108  2.052  2.983  2.767  2.352   3.341   3.486
Q^{AB/F}  0.509  0.577  0.369  0.592  0.592  0.568   0.593   0.596

Figure 6:
Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
MI        3.077  2.521  3.663  2.729  3.831  3.916   3.949   3.962
EN        5.125  4.851  4.340  5.578  6.300  6.414   6.473   6.469
SF        2.279  1.984  1.904  2.029  2.987  3.015   3.027   3.141
Q^{AB/F}  0.618  0.694  0.665  0.682  0.682  0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

Figure 7:
Index     DWT    PCA    IHS    GP     CT     NSST   NSST-1  Proposed
MI        2.278  2.841  2.656  2.280  2.108  2.174  2.513   2.853
EN        4.573  3.707  3.125  4.561  4.242  4.148  5.080   4.768
SF        3.021  3.114  1.603  3.012  3.130  3.115  3.221   3.373
Q^{AB/F}  0.525  0.498  0.483  0.526  0.551  0.584  0.617   0.607
UID       0.752  0.737  0.725  0.753  0.789  0.797  0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

Figure 8:
Index     DWT    GP     IHS    NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
MI        2.167  2.028  2.255  2.399   2.688   2.376   2.520   2.695
EN        4.424  3.973  4.105  4.870   4.507   3.533   4.499   4.531
SF        3.189  2.957  3.506  3.528   4.017   3.479   3.978   4.120
Q^{AB/F}  0.487  0.481  0.512  0.499   0.556   0.507   0.536   0.576
UID       0.802  0.812  0.711  0.724   0.817   0.765   0.813   0.826

Figure 9:
Index     DWT    GP     IHS    ST     CT     NSST-4  NSCT   Proposed
MI        2.485  2.113  2.872  2.480  2.532  3.076   2.784  3.083
EN        4.277  4.050  4.202  4.561  4.426  4.594   4.202  4.668
SF        3.124  2.948  3.126  3.547  4.069  3.945   4.087  4.176
Q^{AB/F}  0.518  0.506  0.502  0.537  0.536  0.538   0.519  0.540
UID       0.793  0.679  0.714  0.783  0.799  0.806   0.801  0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Running time (s)  0.021  0.019  0.003  0.075  1.976  33.79   15.29   17.13

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which uses the GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm effectively extracts the salient information from the source images and combines it well. Note that fusion methods based on the NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process; fast multiscale image transforms are a subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I469–I472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, vol. 15, pp. 262–265, IEEE, St. Louis, Mo, USA, September–October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou, et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.




However there are some deficiencies with the KLDwhich makes it less ideal First the KLD is asymmetricthat is (119863KL(119875 119876)) is different from (119863KL(119876 119875))Second if 119902(119909) = 0 and 119901(119909) = 0 for any 119909 then119863KL(119875 119876) is undefined Third the KLD does not offerany nice upper bounds [14] On the other hand the JSDhas the characteristics of nonnegativity finiteness symmetryand boundedness [15 29] So we use the symmetric JSD tomeasure the similarity between two NSCT subbands in thisstudy The JSD between GGDs is derived from the KLDmathematically it is defined as

$$D_{\mathrm{JS}}(P, Q) = D_{\mathrm{JS}}(Q, P) = \frac{1}{2}\left(D_{\mathrm{KL}}(P, M) + D_{\mathrm{KL}}(Q, M)\right), \qquad M = \frac{P + Q}{2}. \qquad (9)$$
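To make (8) and (9) concrete, the following sketch (function names are ours) evaluates the closed-form GGD KLD and then the symmetric JSD; because the mixture $M = (P + Q)/2$ is no longer a GGD, the JSD integral is approximated numerically here.

```python
import math

def ggd_pdf(x, alpha, beta):
    """Generalized Gaussian density with scale alpha and shape beta."""
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-((abs(x) / alpha) ** beta))

def kld_ggd(a1, b1, a2, b2):
    """Closed-form KLD between two GGDs, following (8)."""
    term1 = math.log((b1 * a2 * math.gamma(1.0 / b2)) /
                     (b2 * a1 * math.gamma(1.0 / b1)))
    term2 = ((a1 / a2) ** b2) * math.gamma((b2 + 1.0) / b1) / math.gamma(1.0 / b1)
    return term1 + term2 - 1.0 / b1

def jsd_ggd(a1, b1, a2, b2, lo=-50.0, hi=50.0, n=20001):
    """Symmetric JSD of (9) by numerical integration on a grid."""
    h = (hi - lo) / (n - 1)
    acc = 0.0
    for i in range(n):
        x = lo + i * h
        p = ggd_pdf(x, a1, b1)
        q = ggd_pdf(x, a2, b2)
        m = 0.5 * (p + q)
        if p > 0.0 and m > 0.0:
            acc += 0.5 * p * math.log(p / m) * h
        if q > 0.0 and m > 0.0:
            acc += 0.5 * q * math.log(q / m) * h
    return acc
```

The numeric JSD inherits the properties listed above: it vanishes for identical parameters, is symmetric in its two distributions, and stays below $\log 2$ nats.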

4. The Proposed Image Fusion Technique

4.1. Fusion of the Low Frequency Coefficients. The low frequency subbands represent the approximation components of the source images. The simplest way to combine subband coefficients is averaging; however, this easily leads to a low-contrast, blurred result. To extract more useful information from the source images, we fuse the low frequency coefficients with a rule based on two activity level measurements: the regional standard deviation and the Shannon entropy. In principle, the local texture features of an image are related to the variation of the coefficients in a neighborhood, while the entropy indicates how much information an image contains. Combining the two can therefore extract more of the complementary information present in the source images. The process is as follows.

(1) Compute the regional standard deviation $D_\lambda(x, y)$:
$$D_\lambda(x, y) = \sqrt{\sum_{m \in M,\, n \in N} \omega(m, n) \left[C_\lambda(x + m, y + n) - S_\lambda(x, y)\right]^2}, \qquad (10)$$
where $\omega(m, n)$ weights the $M \times N$ neighborhood and $S_\lambda(x, y)$ is the regional mean.

(2) Calculate the normalized Shannon entropy:
$$E_\lambda(x, y) = \frac{1}{|R|} \sum_{i, j} \left(C_0^\lambda(i, j)\right)^2 \log \left(C_0^\lambda(i, j)\right)^2. \qquad (11)$$

Computational and Mathematical Methods in Medicine 7

(3) Compute the weights $(\delta_\lambda, \xi_\lambda)$ of the standard deviation $D_\lambda(x, y)$ and the information entropy $E_\lambda(x, y)$, respectively:
$$\delta_\lambda = \frac{\left|D_\lambda(x, y)\right|^\alpha}{\left|D_A(x, y)\right|^\alpha + \left|D_B(x, y)\right|^\alpha}, \qquad \xi_\lambda = \frac{E_\lambda(x, y)}{E_A(x, y) + E_B(x, y)}, \qquad (12)$$

where the parameter $\alpha$ is a constant that tunes the sharpness of the fused image; it is set to 1.2 in our experiment.

Let $C_0^\lambda(x, y)$ denote the low frequency subband coefficient at location $(x, y)$, where $\lambda$ indexes the input images $A$ and $B$. Finally, the fused low frequency coefficients are obtained by
$$C_0^F(x, y) = \sum_{\lambda = A, B} \left[\delta_\lambda C_0^\lambda(x, y) + \xi_\lambda C_0^\lambda(x, y)\right]. \qquad (13)$$
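A minimal sketch of the low frequency rule (10)-(13) on plain Python lists, assuming a uniform $3 \times 3$ window for $\omega$ and coefficients $\geq 1$ (so the entropy terms of (11) stay nonnegative); the factor $1/2$ that normalizes the combined $\delta + \xi$ weights to sum to one is our addition, and all names are ours.

```python
import math

def window(img, x, y, r=1):
    """Values in a (2r+1) x (2r+1) neighborhood, clamped at the borders."""
    h, w = len(img), len(img[0])
    return [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
            for dy in range(-r, r + 1) for dx in range(-r, r + 1)]

def region_std(img, x, y):
    """Regional standard deviation of (10), with uniform weights omega."""
    v = window(img, x, y)
    mean = sum(v) / len(v)
    return math.sqrt(sum((c - mean) ** 2 for c in v) / len(v))

def region_entropy(img, x, y, eps=1e-12):
    """Energy-entropy of (11) over the neighborhood; nonnegative for
    coefficients >= 1, which this sketch assumes."""
    v = window(img, x, y)
    return sum(c * c * math.log(c * c + eps) for c in v) / len(v)

def fuse_low(A, B, alpha=1.2):
    """Low frequency fusion via (12)-(13); the factor 1/2 (our addition)
    makes the combined delta + xi weights sum to one."""
    h, w = len(A), len(A[0])
    F = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dA, dB = region_std(A, x, y), region_std(B, x, y)
            eA, eB = region_entropy(A, x, y), region_entropy(B, x, y)
            sd = dA ** alpha + dB ** alpha + 1e-12
            se = eA + eB + 1e-12
            wA = 0.5 * (dA ** alpha / sd + eA / se)
            wB = 0.5 * (dB ** alpha / sd + eB / se)
            F[y][x] = wA * A[y][x] + wB * B[y][x]
    return F
```

Because the normalized weights sum to one, each fused pixel stays between the two source values, preserving contrast better than plain averaging when one subband is clearly more active.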

4.2. Fusion of the High Frequency Coefficients. High frequency subbands correspond to detailed information such as edges, lines, and corners. Because different imaging modalities contain information that is both redundant and complementary, the purpose of the selection rule is mainly to capture as much salient information of the source images as possible. The maximum selection rule is not suitable for medical image fusion: it works well only on the premise that a single original image provides the best pixel at each location, so vast amounts of complementary information are lost when it is used for MIF. To improve the fusion performance for the high frequency subbands, we propose a fusion scheme based on weight maps, which are determined by saliency maps. Because there exist dependencies between the NSCT coefficients, the high frequency coefficients are first updated by utilizing the relationships between NSCT subbands and then combined by using weight maps. The process is described as follows.

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency $jsd_{l,\theta,h}$ between coefficients with different directions at the same scale $l$ as
$$jsd_{l,\theta,h}(x, y) = \sum_{j=1,\, j \neq i}^{K} D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l,\theta_j}(x, y)\right), \qquad (14)$$

where $K$ is the total number of subbands at the $l$th scale. Then we calculate the vertical dependency $jsd_{l,\theta,v}$ between the specified subband's (for instance, subband $i$) parents and children:
$$jsd_{l,\theta,v}(x, y) = \sum_{j=1}^{K} \left[D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l-1,\theta_j}(x, y)\right) + D_{\mathrm{JS}}\left(C_{l,\theta_i}(x, y), C_{l+1,\theta_j}(x, y)\right)\right]. \qquad (15)$$

Further, the horizontal and vertical dependency components are normalized, respectively (reusing the same symbols for the normalized quantities):
$$jsd_{l,\theta,h}(x, y) = \frac{jsd_{l,\theta,h}(x, y)}{jsd_{l,\theta,h}(x, y) + jsd_{l,\theta,v}(x, y)}, \qquad jsd_{l,\theta,v}(x, y) = \frac{jsd_{l,\theta,v}(x, y)}{jsd_{l,\theta,h}(x, y) + jsd_{l,\theta,v}(x, y)}. \qquad (16)$$

Finally, the high frequency NSCT coefficients are revised as
$$C_{l,\theta}(x, y) = C_{l,\theta}(x, y) \sqrt{1 + jsd_{l,\theta,h}(x, y)^2 + jsd_{l,\theta,v}(x, y)^2}. \qquad (17)$$

(2) Construction of Weight Maps. Weight maps are derived from the saliency maps, which describe each pixel by its level of salient information. We apply a Gaussian filter to each high pass subband, which tends to assign a high weight value to important elements such as edges and corners. A saliency map is constructed as the local average of the absolute value of the filter response:
$$S_{l,\theta}(x, y) = \left|C_{l,\theta}(x, y)\right| * g_{r_g, \theta_g}(x, y), \qquad (18)$$

where $g(\cdot)$ is a Gaussian low pass filter of size $(2r_g + 1) \times (2r_g + 1)$, and the parameters $r_g$ and $\theta_g$ are set to 5. Next, the weight maps are determined by comparison of the saliency maps $S^n_{l,\theta}(x, y)$, $n \in \{A, B\}$:
$$W^n_{l,\theta}(x, y) = \begin{cases} 1, & \text{if } S^n_{l,\theta}(x, y) = \max\left(S^A_{l,\theta}(x, y), S^B_{l,\theta}(x, y)\right), \\ 0, & \text{otherwise.} \end{cases} \qquad (19)$$

Finally, the fused subband coefficients $C^F_{l,\theta}(x, y)$ are obtained by the weighted summation
$$C^F_{l,\theta}(x, y) = W^A_{l,\theta}(x, y)\, C^A_{l,\theta}(x, y) + W^B_{l,\theta}(x, y)\, C^B_{l,\theta}(x, y). \qquad (20)$$

5. Experimental Results and Discussion

Five different data sets of human brain images are used; the source images comprise pairs of modalities, including CT/MRI, MRI/PET, and MRI/SPECT images. All the images have a size of 256 × 256 pixels and have been registered with a registration method such as [30]. To verify the effectiveness and applicability of the proposed fusion scheme, its results are compared with those of other state-of-the-art schemes, including the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the Intensity-Hue-Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and


Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on a pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted as NSCT-1, and the NSCT based MIF method of [36] is denoted as NSCT-2. The fusion method [37] based on neighborhood characteristic and regionalization in the NSCT domain and the NSCT based MIF method [10] in $l\alpha\beta$ color space are denoted as NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of [38] and the MIF method [39] utilizing features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the NSCT and NSST methods we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by the absolute maximum. For all MSD methods, the original images are decomposed into 4 levels with the numbers of directions 2, 2, 3, 3. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms: the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) show improvement to various degrees and produce a better visual effect on bone structures; however, the details of the soft tissue regions from these methods remain unsharp. By contrast, the proposed method preserves the detailed features of the original images well without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen from Figures 6(c), 6(d), 6(e), and 6(f), their results have low contrast and lose many details. What is worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). By comparison, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast; the abundant information of the source images is successfully transferred to the fused image. Both tests imply that the proposed method is suitable for fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected by Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET image shows that regional cerebral metabolism is abnormal, with heavy hypometabolism in the anterior temporal and posterior parietal regions, the right hemisphere being slightly more affected than the left. Here the proposed method is compared with seven other fusion schemes. Obviously, the results in Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results in Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so some parts are unidentifiable. Additionally, regarding contrast and color fidelity, the results in Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g), 7(h), and 7(i) show better fusion performance than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Comparing these fused results, we find that the proposed method can well extract the structural and functional information from the source images and fuse them with much less information distortion. As illustrated by the regions highlighted with red arrows and ellipses in Figure 7, the proposed method preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 8 is a fusion experiment on MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). From Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot well inherit the SPECT image's functional information and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) give better results in terms of contrast and color fidelity. However, regarding spatial details, the fused result obtained by


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme of [20], an MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information of both original images but also hardly causes color distortion. These examples show that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), $Q^{AB/F}$ [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher EN value shows that the fused image has more information content, and a higher SF value indicates that the final image is clearer. The index $Q^{AB/F}$ measures the amount of information transferred from the source images to the fused image, while UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
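For reference, two of these metrics, EN and SF, have simple and widely used formulations that can be sketched as follows; the exact definitions in [41, 42] may differ in detail, and the function names are ours.

```python
import math

def entropy_metric(img, levels=256):
    """EN: Shannon entropy (in bits) of the gray-level histogram."""
    hist = [0] * levels
    n = 0
    for row in img:
        for v in row:
            hist[int(v)] += 1
            n += 1
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def spatial_frequency(img):
    """SF: root of the summed row and column first-difference energies."""
    h, w = len(img), len(img[0])
    rf2 = sum((img[y][x] - img[y][x - 1]) ** 2
              for y in range(h) for x in range(1, w)) / (h * w)
    cf2 = sum((img[y][x] - img[y - 1][x]) ** 2
              for y in range(1, h) for x in range(w)) / (h * w)
    return math.sqrt(rf2 + cf2)
```

A constant image scores zero on both metrics, while images with rich gray-level content and sharp transitions score higher, matching the interpretations given above.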

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that the proposed method has a stable performance across all indices (most values rank first and only a few rank second). The objective results based on these quality metrics thus coincide with the subjective visual perception. In particular, the proposed scheme achieves the largest MI value in all tests, confirming that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms on the images of Figure 5 (see Table 4). All tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, the PCA based algorithm has the lowest time consumption among all fusion methods because it does not involve multiscale decomposition. The DWT, GP, and GF based methods also have low time consumption (less than 0.08 s), but their performance is poor. The three methods based on NSCT are comparatively slower, which is a common problem of NSCT based algorithms. Due to its complex neural network on top of the NSCT machinery, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar time (17.13 s and 15.29 s); in fact, the decomposition and reverse construction account for almost 9/10 of the total time. Like other NSCT based approaches, the proposed method pays a computational price for using the NSCT tool, yet it achieves a better effect than previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four metrics for the CT and MRI images.

Figure 5
Index     DWT     GP      PCA     GF      CT      NSCT-1   NSCT-2   Proposed
MI        1.274   1.279   1.365   1.279   1.324   1.450    1.490    1.891
EN        5.008   4.843   4.050   5.107   5.183   4.978    5.371    5.675
SF        3.322   3.108   2.052   2.983   2.767   2.352    3.341    3.486
Q^AB/F    0.509   0.577   0.369   0.592   0.592   0.568    0.593    0.596

Figure 6
Index     DWT     GP      PCA     GF      CT      NSCT-1   NSCT-2   Proposed
MI        3.077   2.521   3.663   2.729   3.831   3.916    3.949    3.962
EN        5.125   4.851   4.340   5.578   6.300   6.414    6.473    6.469
SF        2.279   1.984   1.904   2.029   2.987   3.015    3.027    3.141
Q^AB/F    0.618   0.694   0.665   0.682   0.682   0.694    0.699    0.703

Table 2: Objective evaluation results of the five metrics for the MRI and PET images.

Figure 7
Index     DWT     PCA     IHS     GP      CT      NSST     NSST-1   Proposed
MI        2.278   2.841   2.656   2.280   2.108   2.174    2.513    2.853
EN        4.573   3.707   3.125   4.561   4.242   4.148    5.080    4.768
SF        3.021   3.114   1.603   3.012   3.130   3.115    3.221    3.373
Q^AB/F    0.525   0.498   0.483   0.526   0.551   0.584    0.617    0.607
UID       0.752   0.737   0.725   0.753   0.789   0.797    0.836    0.814

Table 3: Objective evaluation results of the five metrics for the MRI and SPECT images.

Figure 8
Index     DWT     GP      IHS     NSST-2  NSST-3  NSCT-3   NSCT-4   Proposed
MI        2.167   2.028   2.255   2.399   2.688   2.376    2.520    2.695
EN        4.424   3.973   4.105   4.870   4.507   3.533    4.499    4.531
SF        3.189   2.957   3.506   3.528   4.017   3.479    3.978    4.120
Q^AB/F    0.487   0.481   0.512   0.499   0.556   0.507    0.536    0.576
UID       0.802   0.812   0.711   0.724   0.817   0.765    0.813    0.826

Figure 9
Index     DWT     GP      IHS     ST      CT      NSST-4   NSCT     Proposed
MI        2.485   2.113   2.872   2.480   2.532   3.076    2.784    3.083
EN        4.277   4.050   4.202   4.561   4.426   4.594    4.202    4.668
SF        3.124   2.948   3.126   3.547   4.069   3.945    4.087    4.176
Q^AB/F    0.518   0.506   0.502   0.537   0.536   0.538    0.519    0.540
UID       0.793   0.679   0.714   0.783   0.799   0.806    0.801    0.817

Table 4: Computational complexity comparison of the different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods            DWT     GP      PCA     GF      CT      NSCT-1   NSCT-2   Proposed
Running time (s)   0.021   0.019   0.003   0.075   1.976   33.79    15.29    17.13

the GGD to nicely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process; fast multiscale image transform tools are a subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184-190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273-286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1-3, pp. 203-211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31-39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089-3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749-752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I469-I472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40-51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114-123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014-1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85-92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105-1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29-35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254-273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145-151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886-902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338-1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, vol. 15, pp. 262-265, IEEE, St. Louis, Mo, USA, September-October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309-1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20-28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29-37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146-158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592-598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674-693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52-56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909-919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404-1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465-466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206-1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122-129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228-237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378-388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177-186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864-2875, 2013.

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451-454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143-152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980-4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802-803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414-424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066-1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540-1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308-309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945-961, 2008.


Page 7: Research Article The Nonsubsampled Contourlet Transform Based Statistical Medical ...downloads.hindawi.com/journals/cmmm/2015/262819.pdf · 2019-07-31 · Research Article The Nonsubsampled

Computational and Mathematical Methods in Medicine 7

(3) Computing the weights (120575120582 120585120582) of the standarddeviation 119863120582(119909 119910) and the information entropy 119864120582(119909 119910)respectively

120575120582 =

1003816100381610038161003816119863120582 (119909 119910)1003816100381610038161003816120572

1003816100381610038161003816119863119860 (119909 119910)1003816100381610038161003816120572+1003816100381610038161003816119863119861 (119909 119910)

1003816100381610038161003816120572

120585120582 =119864120582 (119909 119910)

119864119860 (119909 119910) + 119864119861 (119909 119910)

(12)

where the parameter 120572 is a constant which tunes thesharpness of fused image by adjusting the value of parameterit is set to 12 in our experiment

Let 119862120582

0(119909 119910) denote the low frequency subband coeffi-

cient at location (119909 119910)120582 is input image119860119861 Finally the fusedimage can be obtained by

119862119865

0(119909 119910) = sum

120582=119860119861

[120575120582119862120582

0(119909 119910) + 120585120582119862

120582

0(119909 119910)] (13)

42 Fusion of the High Frequency Coefficients High fre-quency subbands correspond to detailed information in theseregions such as edges lines and corners Because differentimaging modalities contain redundant and complementaryinformation of each other the purpose of selection rule ismainly to capture salient information of the source imagesas much as possible Maximum selection rule is not suitablefor medical image fusion because it works well on thispremise that only an original image provides good pixelat each corresponding location thus vast complementaryinformation will be lost when it is used for MIF To improvethe fusion performance for the high frequency subbands wepropose the fusion scheme based on weight maps which aredetermined by the saliency maps According to the fact thatthere exist dependencies between the NSCT coefficients thehigh frequency coefficients are first updated by utilizing therelationships between NSCT subbands and then combiningtogether by using weight maps The process is described asfollows

(1) Updating of the High Frequency Subband Coefficients. First, we calculate the horizontal dependency jsd_{l,\theta,h} between coefficients with different directions at the same scale l as

jsd_{l,\theta,h}(x,y) = \sum_{j=1, j \neq i}^{K} D_{JS}\left(C_{l,\theta_i}(x,y),\, C_{l,\theta_j}(x,y)\right),
\tag{14}

where K is the total number of subbands at the lth scale. Then we calculate the vertical dependency jsd_{l,\theta,v} between the specified subband's (for instance, subband i) parents and children:

jsd_{l,\theta,v}(x,y) = \sum_{j=1}^{K} \left[ D_{JS}\left(C_{l,\theta_i}(x,y),\, C_{l-1,\theta_j}(x,y)\right) + D_{JS}\left(C_{l,\theta_i}(x,y),\, C_{l+1,\theta_j}(x,y)\right) \right].
\tag{15}

Further, the horizontal and vertical dependency components are normalized, respectively:

jsd_{l,\theta,h}(x,y) = \frac{jsd_{l,\theta,h}(x,y)}{jsd_{l,\theta,h}(x,y) + jsd_{l,\theta,v}(x,y)}, \qquad
jsd_{l,\theta,v}(x,y) = \frac{jsd_{l,\theta,v}(x,y)}{jsd_{l,\theta,h}(x,y) + jsd_{l,\theta,v}(x,y)}.
\tag{16}

Finally, the high frequency NSCT coefficients are revised as

C_{l,\theta}(x,y) = C_{l,\theta}(x,y)\,\sqrt{1 + jsd_{l,\theta,h}(x,y)^2 + jsd_{l,\theta,v}(x,y)^2}.
\tag{17}
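The dependency-driven update of Eqs. (14)-(17) can be sketched as follows. Two simplifications are assumed: the Jensen-Shannon divergence is computed once per subband pair from global coefficient histograms, rather than from the per-pixel local GGD fits used in the paper, and the function names are illustrative.

```python
import numpy as np

def jsd_hist(p, q):
    """Jensen-Shannon divergence of two discrete distributions (base-2 log).
    The paper evaluates the JSD of two fitted GGDs; a histogram-based JSD
    is used here as a simplified stand-in."""
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float((a[mask] * np.log2(a[mask] / b[mask])).sum())
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def update_highpass(same_scale, parents, children, i, nbins=64):
    """Sketch of Eqs. (14)-(17) for directional subband i of one scale.
    same_scale: all directional subbands at scale l; parents/children:
    subbands at scales l-1 and l+1 (lists of 2-D arrays)."""
    bands = same_scale + parents + children
    lo = min(b.min() for b in bands)
    hi = max(b.max() for b in bands)
    edges = np.linspace(lo, hi + 1e-9, nbins + 1)
    hist = lambda b: np.histogram(b, bins=edges)[0].astype(float) + 1e-12

    h_i = hist(same_scale[i])
    # Eq. (14): dependency on the other directions at the same scale.
    jsd_h = sum(jsd_hist(h_i, hist(b)) for j, b in enumerate(same_scale) if j != i)
    # Eq. (15): dependency on parent and child subbands across scales.
    jsd_v = sum(jsd_hist(h_i, hist(b)) for b in parents + children)
    # Eq. (16): normalize the two dependency components.
    s = jsd_h + jsd_v + 1e-12
    jsd_h, jsd_v = jsd_h / s, jsd_v / s
    # Eq. (17): boost the coefficients according to the dependencies.
    return same_scale[i] * np.sqrt(1.0 + jsd_h ** 2 + jsd_v ** 2)
```

Since the normalized components are nonnegative, the multiplier in Eq. (17) is at least one, so the update never attenuates a coefficient.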

(2) Construction of Weight Maps. Weight maps are derived from the saliency maps, which describe each pixel by its level of salient information. We apply a Gaussian filter to each high pass subband, which tends to assign a high weight value to important elements such as edges and corners. A saliency map is constructed from the local average of the absolute value of the filter response:

S_{l,\theta}(x,y) = \left|C_{l,\theta}(x,y)\right| * g_{r_g,\theta_g}(x,y),
\tag{18}

where g(\cdot) is a Gaussian low pass filter of size (2r_g + 1) \times (2r_g + 1), and the parameters r_g and \theta_g are both set to 5. Next, the weight maps are determined by comparison of the saliency maps (S^n_{l,\theta}(x,y), n \in \{A, B\}):

W^n_{l,\theta}(x,y) =
\begin{cases}
1, & \text{if } S^n_{l,\theta}(x,y) = \max\left(S^A_{l,\theta}(x,y),\, S^B_{l,\theta}(x,y)\right), \\
0, & \text{otherwise.}
\end{cases}
\tag{19}

Finally, the fused subband coefficients C^F_{l,\theta}(x,y) can be obtained by the weighted summation

C^F_{l,\theta}(x,y) = W^A_{l,\theta}(x,y)\, C^A_{l,\theta}(x,y) + W^B_{l,\theta}(x,y)\, C^B_{l,\theta}(x,y).
\tag{20}
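Steps (18)-(20), Gaussian saliency, binary weight maps, and weighted summation, can be sketched in plain numpy. The reflect padding, the tie-breaking rule that prefers image A, and the reading of the second filter parameter as the Gaussian standard deviation are assumptions.

```python
import numpy as np

def _gaussian_kernel(r=5, sigma=5.0):
    """Normalized (2r+1) x (2r+1) Gaussian kernel."""
    ax = np.arange(-r, r + 1)
    g = np.exp(-(ax ** 2) / (2.0 * sigma ** 2))
    k = np.outer(g, g)
    return k / k.sum()

def _conv2(img, k):
    """2-D convolution with reflect padding (plain-numpy helper)."""
    r = k.shape[0] // 2
    p = np.pad(img, r, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for di in range(k.shape[0]):
        for dj in range(k.shape[1]):
            out += k[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

def fuse_highpass(cA, cB, r=5, sigma=5.0):
    """Sketch of Eqs. (18)-(20): saliency maps, binary weight maps,
    weighted summation of the two high frequency subbands."""
    k = _gaussian_kernel(r, sigma)
    sA = _conv2(np.abs(cA), k)        # Eq. (18) for image A
    sB = _conv2(np.abs(cB), k)        # Eq. (18) for image B
    wA = (sA >= sB).astype(float)     # Eq. (19); ties go to A (assumption)
    wB = 1.0 - wA
    return wA * cA + wB * cB          # Eq. (20)
```

Because the weight maps are binary and complementary, each output pixel is copied from exactly one source subband, the one that is locally more salient.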

5 Experimental Results and Discussion

Five different data sets of human brain images are used, and the source images consist of two different modalities, covering CT/MRI, MRI/PET, and MRI/SPECT pairs. All the images have a size of 256 x 256 pixels and have been registered by a registration method such as [30]. To verify the effectiveness and applicability of the proposed fusion scheme, the results produced by the proposed method are compared with those of other state-of-the-art schemes, namely, the discrete wavelet transform (DWT) [1], gradient pyramid (GP) [31], principal component analysis (PCA) [32], the Intensity, Hue, and Saturation color model (IHS) [33], guided filtering (GF) [34], the contourlet transform (CT) [3], NSCT, the shearlet transform (ST) [35], and

8 Computational and Mathematical Methods in Medicine

Figure 6: The fusion results of different fusion methods for the second set of CT and MRI images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) PCA, (f) GF, (g) CT, (h) NSCT-1, (i) NSCT-2, and (j) the proposed method.

the nonsubsampled shearlet transform (NSST) based methods. For simplicity, the MIF method [12] based on a pulse-coupled neural network and modified spatial frequency in the NSCT domain is denoted as NSCT-1, and the NSCT based MIF method of the scheme [36] is denoted as NSCT-2. The fusion method [37] based on neighborhood characteristic and regionalization in the NSCT domain and the NSCT based MIF method [10] in l, alpha, beta color space are denoted as NSCT-3 and NSCT-4, respectively. Accordingly, the NSST based MIF method using the GGD model [21] is termed NSST-1; the NSST based fusion scheme of the literature [38] and the MIF method [39] utilizing features in the NSST domain are termed NSST-2 and NSST-3, respectively; and the NSST based statistical MIF method [20] using the HMT model is termed NSST-4. For the plain NSCT and NSST methods, we adopt the average-maximum fusion scheme; namely, the low frequency coefficients are fused by averaging the corresponding coefficients, and the high frequency coefficients are fused by taking the absolute maximum. For all MSD methods, the original images are decomposed into 4 levels with the numbers of directions 2, 2, 3, 3. Additionally, a quantitative comparison based on five image fusion quality metrics is employed to demonstrate the fusion performance of the different methods.
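The average-maximum baseline used for the plain NSCT and NSST runs can be sketched as follows (a minimal numpy sketch; the function name and the per-subband array layout are illustrative assumptions):

```python
import numpy as np

def average_maximum_fuse(lowA, lowB, highA, highB):
    """Average-maximum baseline: mean of the low frequency subbands,
    absolute-maximum selection for the high frequency subbands."""
    low = 0.5 * (lowA + lowB)
    high = np.where(np.abs(highA) >= np.abs(highB), highA, highB)
    return low, high
```

In a full pipeline this rule would be applied to every directional high frequency subband at every decomposition level before the inverse transform.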

5.1. Experiments on CT-MRI Image Fusion. Figure 5 shows a fusion experiment on CT and MRI images. It can be seen that the PCA based method gives a poor result relative to the other algorithms, in which the bone structure of the original CT image is almost invisible. For the GP based method, the final image is darker and has lower contrast, and some detail information is unclear. The results in Figures 5(c), 5(f), 5(g), 5(h), and 5(i) improve to various degrees and produce better visual effects on bone structures; however, the details of the soft tissue regions from these methods still remain unsharp. By contrast, the proposed method can well preserve the detailed features of the original images without producing visible artifacts. Figure 6 is another example of CT and MRI image fusion. As seen in Figures 6(c), 6(d), 6(e), and 6(f), these results have low contrast and lose many details. What is worse, there are undesirable artifacts on the edges of these final images (see the regions labeled by the red ellipses in Figure 6). In contrast, the CT based method, the other NSCT based methods, and the proposed method provide better visual effects with good contrast, and the abundant information of the source images is successfully transferred to the fused image. Both tests imply that the proposed method is suitable for the fusion of CT and MRI images.

5.2. Experiments on MRI-PET and MRI-SPECT Image Fusion. In this section, a case of MRI and PET fusion for a 70-year-old man affected with Alzheimer's disease is shown. In Figure 7, the source MRI image shows that the hemispheric sulci are widened and more prominent in the parietal lobes; the corresponding PET image shows that regional cerebral metabolism is abnormal, with pronounced hypometabolism in the anterior temporal and posterior parietal regions; meanwhile, the right hemisphere is slightly more affected than the left. Here, the proposed method is compared with seven other fusion schemes. Obviously, the results in Figures 7(d) and 7(e) suffer from serious color distortion and low contrast. Although the results in Figures 7(c) and 7(f) improve the fusion performance to some extent, they are not saturated in brightness, so that some parts are unidentifiable. Additionally, in terms of contrast and color fidelity, the results in Figures


Figure 7: The fusion results of different fusion methods for the MRI and PET images. (a), (b) Source images; fused images by (c) DWT, (d) PCA, (e) IHS, (f) GP, (g) CT, (h) NSST, (i) NSST-1, and (j) the proposed method.

Figure 8: The fusion results of different fusion methods for the first set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) NSST-2, (g) NSST-3, (h) NSCT-3, (i) NSCT-4, and (j) the proposed method.

7(g), 7(h), and 7(i) show better fusion performance than the methods mentioned above, but the structural information of the MRI image is not successfully transferred to the fused images. Through the comparison of these fused results, it is found that the proposed method can well extract the structural and functional information from the source images and fuse them with much less information distortion. As illustrated in the regions highlighted by red arrows and ellipses in Figure 7, the proposed method well preserves the complementary information of the different modal medical images and achieves the best visual effect in terms of contrast, clarity, and color fidelity.

Figure 8 is a fusion experiment on MRI and SPECT images. The source MRI image demonstrates that tumors are located in the left temporal region, as shown in the high signal intensity region (the white region labeled by the right red arrow in Figure 8(a)). As seen in Figures 8(e), 8(f), and 8(h), the results produced by the IHS based methods show distinct color distortion in the lesion region. The results produced by the DWT and GP based methods cannot well inherit the SPECT image's functional information and produce low contrast images (see Figures 8(c) and 8(d)). By comparison, Figures 8(g) and 8(i) gain better results in terms of contrast and color fidelity. However, as for the spatial details, the fused result obtained by


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both original images but also hardly causes the problem of color distortion. Through the above examples, it can be seen that the proposed method extends naturally to combining anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), Q^{AB/F} [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher value of EN shows that the fused image has more information content, and a higher value of SF indicates that the final image is clearer. The index Q^{AB/F} measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most of the values rank first, and only a few rank second). This shows that the objective results based on these quality metrics coincide with the subjective visual perception. In particular, for the MI values of all tests, the proposed scheme always attains the largest value. This confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.
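Two of the reference-free metrics above, EN and SF, have standard definitions that can be sketched directly. The averaging of SF over the difference arrays is a common implementation choice and an assumption here; MI, Q^{AB/F}, and UID require the source images as well and are omitted.

```python
import numpy as np

def entropy_metric(img):
    """EN: Shannon entropy of the 8-bit grey-level histogram (bits)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist[hist > 0] / img.size
    return float(-(p * np.log2(p)).sum())

def spatial_frequency(img):
    """SF: root of the combined row- and column-difference energies."""
    img = np.asarray(img, dtype=float)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

A constant image scores zero on both metrics, while a two-level checkerboard attains EN of exactly one bit, which makes the definitions easy to sanity-check.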

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running times of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU at 3.5 GHz and 8 GB RAM. As shown in Table 4, among all the fusion methods, the running time of the PCA based algorithm is the lowest, because it does not involve multiscale decomposition. Additionally, the DWT, GP, GF, and CT based methods also have low time consumption (less than 2 s); however, their performance is poor. By comparison, the three methods based on NSCT are slower, which is a common problem of NSCT based algorithms. Due to the complex neural network and the mechanism of NSCT, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar times (17.13 s and 15.29 s). Actually, the decomposition and inverse reconstruction cost almost 9/10 of the total time. As expected, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than previous methods.

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

Index      DWT     GP      PCA     GF      CT      NSCT-1  NSCT-2  Proposed
Figure 5
  MI       1.274   1.279   1.365   1.279   1.324   1.450   1.490   1.891
  EN       5.008   4.843   4.050   5.107   5.183   4.978   5.371   5.675
  SF       3.322   3.108   2.052   2.983   2.767   2.352   3.341   3.486
  Q^AB/F   0.509   0.577   0.369   0.592   0.592   0.568   0.593   0.596
Figure 6
  MI       3.077   2.521   3.663   2.729   3.831   3.916   3.949   3.962
  EN       5.125   4.851   4.340   5.578   6.300   6.414   6.473   6.469
  SF       2.279   1.984   1.904   2.029   2.987   3.015   3.027   3.141
  Q^AB/F   0.618   0.694   0.665   0.682   0.682   0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

Index      DWT     PCA     IHS     GP      CT      NSST    NSST-1  Proposed
Figure 7
  MI       2.278   2.841   2.656   2.280   2.108   2.174   2.513   2.853
  EN       4.573   3.707   3.125   4.561   4.242   4.148   5.080   4.768
  SF       3.021   3.114   1.603   3.012   3.130   3.115   3.221   3.373
  Q^AB/F   0.525   0.498   0.483   0.526   0.551   0.584   0.617   0.607
  UID      0.752   0.737   0.725   0.753   0.789   0.797   0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

Index      DWT     GP      IHS     NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
Figure 8
  MI       2.167   2.028   2.255   2.399   2.688   2.376   2.520   2.695
  EN       4.424   3.973   4.105   4.870   4.507   3.533   4.499   4.531
  SF       3.189   2.957   3.506   3.528   4.017   3.479   3.978   4.120
  Q^AB/F   0.487   0.481   0.512   0.499   0.556   0.507   0.536   0.576
  UID      0.802   0.812   0.711   0.724   0.817   0.765   0.813   0.826

Index      DWT     GP      IHS     ST      CT      NSST-4  NSCT    Proposed
Figure 9
  MI       2.485   2.113   2.872   2.480   2.532   3.076   2.784   3.083
  EN       4.277   4.050   4.202   4.561   4.426   4.594   4.202   4.668
  SF       3.124   2.948   3.126   3.547   4.069   3.945   4.087   4.176
  Q^AB/F   0.518   0.506   0.502   0.537   0.536   0.538   0.519   0.540
  UID      0.793   0.679   0.714   0.783   0.799   0.806   0.801   0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Running time (s)  0.021  0.019  0.003  0.075  1.976  33.79   15.29   17.13

the GGD to concisely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process. A fast image multiscale transform tool is the subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from Major Special Project 2012A090300001 on the integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184-190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273-286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1-3, pp. 203-211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31-39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089-3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749-752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I-469-I-472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40-51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114-123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014-1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85-92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105-1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29-35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254-273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145-151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886-902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338-1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, pp. 262-265, IEEE, St. Louis, Mo, USA, September-October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309-1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20-28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29-37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146-158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592-598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674-693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52-56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909-919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404-1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465-466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206-1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122-129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228-237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378-388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177-186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864-2875, 2013.

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451-454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143-152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980-4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802-803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414-424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066-1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540-1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308-309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945-961, 2008.


Page 8: Research Article The Nonsubsampled Contourlet Transform Based Statistical Medical ...downloads.hindawi.com/journals/cmmm/2015/262819.pdf · 2019-07-31 · Research Article The Nonsubsampled

8 Computational and Mathematical Methods in Medicine

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 6 The fusion results of different fusion methods for the second set of CT and MRI images (a) (b) Source images fused images by(c) DWT (d) GP (e) PCA (f) GF (g) CT (h) NSCT-1 (i) NSCT-2 and (j) the proposed method

the nonsubsampled shearlet transform (NSST) based meth-ods For simplicity MIF method [12] based on pulse-coupledneural network and modified spatial frequency in NSCTdomain is denoted as NSCT-1 NSCT based MIF method inthe scheme [36] is denoted as NSCT-2 The fusion method[37] based on neighborhood characteristic and regionaliza-tion in NSCT domain and NSCT based MIF method [10]in 119897 120572 120573 color space are denoted as NSCT-3 and NSCT-4 respectively Accordingly NSST based MIF method usingGGD model [21] is termed as NSST-1 NSST based fusionscheme of the literature [38] and MIF method [39] byutilizing the features in NSST domain are termed as NSST-2andNSST-3 respectively NSST based statisticalMIFmethod[20] using HMT model is termed as NSST-4 For the NSCTand NSST methods we adopt the average-maximum fusionscheme namely the low frequency coefficients are fused bythe average of the corresponding coefficients and the highfrequency coefficients are fused by using absolute maximumFor allMSDmethods the original images are all decomposedinto 4 levels with the number of the directions 2 2 33 Additionally the quantitative comparison based on fiveimage fusion quality metrics is also employed to demonstratethe fusion performance of different methods

51 Experiments on CT-MRI Image Fusion Figure 5 showsa fusion experiment of CT and MRI image It can be seenthat PCA based method gives poor result relative to the otheralgorithms in which the bone structure of original CT imageis almost invisible For the GP based method the final imageis darker and has lower contrast some detail information isunclear The results from Figures 5(c) 5(f) 5(g) 5(h) and5(i) have some improvement to various degrees and produce

better visual effect on bone structures however the detailsof the soft tissue regions from these methods still retainunsharpness By contrast the proposed method can wellpreserve the detailed features of the original images withoutproducing visible artifacts Figure 6 is another example ofCT and MRI image fusion As seen from Figures 6(c) 6(d)6(e) and 6(f) their results have low contrast and lose a lotof details What is worse there are undesirable artifacts onthe edges of these final images (see regions labeled by the redellipses in Figure 6) Accordingly CT based method otherNSCT based methods and the proposed method providebetter visual effects with good contrast the abundant infor-mation of the source images can be successfully transferred tothe fused image Both tests imply that the proposed methodis suitable for fusion of CT and MRI images

52 Experiments onMRI-PET andMRI-SPECT Image FusionIn this section a case of MRI and PET fusion for a 70-year-old man affected with Alzheimerrsquos disease is shown InFigure 7 the source MRI image shows that the hemisphericsulci is widened andmore prominent in the parietal lobes thecorresponding PET shows that regional cerebral metabolismis abnormal and hypometabolism heavily happens in anteriortemporal and posterior parietal regions meanwhile the righthemisphere is slightly more affected than the left Herethe proposed method is compared with other seven fusionschemes Obviously the results of Figures 7(d) and 7(e) havedisadvantages with serious color distortion and low contrastAlthough the results of Figures 7(c) and 7(f) improve thefusion performance to some extent they are not saturated inbrightness so that some parts are unidentifiable Addition-ally for the contrast and color fidelity the results fromFigures

Computational and Mathematical Methods in Medicine 9

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 7 The fusion results of different fusion methods for the MRI and PET images (a) (b) Source images fused images by (c) DWT (d)PCA (e) IHS (f) GP (g) CT (h) NSST (i) NSST-1 and (j) the proposed method

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 8 The fusion results of different fusion methods for the first set of MRI and SPECT images (a) (b) Source images fused images by(c) DWT (d) GP (e) IHS (f) NSST-2 (g) NSST-3 (h) NSCT-3 (i) NSCT-4 and (j) the proposed method

7(g) 7(h) and 7(i) have better fusion performance than thesemethods mentioned above but the structural information oftheMRI image is not successfully transferred to fused imagesThrough the comparison of these fused results it is foundthat the proposed method can well extract the structuraland functional information from the source images and fusethem with much less information distortion As illustratedin these regions highlighted by red arrows and ellipses inFigure 7 the proposed method well preserves complemen-tary information of different modal medical images andachieves the best visual effect in terms of contrast clarity andcolor fidelity

Figure 8 is a fusion experiment of MRI and SPECTimages The source MRI image demonstrates that tumors arelocated in the left temporal region as shown in the high signalintensity region (the white region labeled by the right redarrow in Figure 8(a)) From Figures 8(e) 8(f) and 8(h) theresults produced by IHS based methods are distinctly colordistortion in the lesion region The results produced by theDWT andGP basedmethods cannot well inherit PET imagersquosfunctional information and produce the low contrast images(see Figures 8(c) and 8(d)) By comparison Figures 8(g) and8(i) gain better results in terms of contrast and color fidelityHowever for the spatial details the fused result obtained by



Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both the original images but also hardly causes the problem of color distortion. Through the above examples, it can be seen that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), Q^{AB/F} [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher value of EN shows that the fused image has more information content, and a higher value of SF indicates that the final image is clearer. The index Q^{AB/F} measures the amount of information transferred from the source images to the fused image, and the UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
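The EN, SF, and MI metrics are straightforward to compute from image histograms and gradients. The following is a minimal Python sketch of their standard definitions, not the authors' MATLAB implementation; the exact normalizations used in [40–42] may differ slightly:

```python
import numpy as np

def entropy(img, bins=256):
    """EN: Shannon entropy of the image histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins (0*log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

def spatial_frequency(img):
    """SF: root of combined row- and column-difference energy."""
    img = np.asarray(img, dtype=np.float64)
    rf2 = np.mean(np.diff(img, axis=1) ** 2)   # row-frequency energy
    cf2 = np.mean(np.diff(img, axis=0) ** 2)   # column-frequency energy
    return float(np.sqrt(rf2 + cf2))

def mutual_information(a, b, bins=256):
    """MI between a source image a and the fused image b, in bits."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(),
                                 bins=bins, range=[[0, bins], [0, bins]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)        # marginal of a
    py = pxy.sum(axis=0, keepdims=True)        # marginal of b
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

A useful sanity check is that the MI of an image with itself equals its entropy, and both SF and EN vanish for a constant image.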

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most values rank first and only a few rank second). This shows that the objective results based on these quality metrics also coincide with the subjective visual perception. In particular, for the MI values of all tests, the proposed scheme always gets the largest value. This confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.
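For reference, the statistical model invoked here is the one introduced earlier in the paper: each high frequency subband is fitted by a generalized Gaussian density, and two subbands are compared via the Jensen-Shannon divergence of their fitted densities. The standard definitions are:

```latex
% Generalized Gaussian density with scale \alpha and shape \beta
p(x;\alpha,\beta) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}
                    \exp\!\left(-\left(\lvert x\rvert/\alpha\right)^{\beta}\right),
% Jensen-Shannon divergence of two densities p_1, p_2
\mathrm{JSD}(p_1 \,\|\, p_2)
  = \tfrac{1}{2}\,\mathrm{KL}(p_1 \,\|\, m) + \tfrac{1}{2}\,\mathrm{KL}(p_2 \,\|\, m),
\qquad m = \tfrac{1}{2}(p_1 + p_2).
```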

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running time of the fusion algorithms used in Figure 5 (see Table 4). All the tests are implemented in MATLAB 2014a on a PC with an Intel Core i7-3770K CPU (3.5 GHz) and 8 GB RAM. As shown in Table 4, among all fusion methods, the time consumption of the PCA based algorithm is the lowest, the reason being that it does not involve multiscale decomposition. Additionally, the DWT, GP, GF, and CT based methods also have low time consumption (less than 0.08 s); however, their performance is poor. The three methods based on NSCT are relatively slower, which is a common problem of NSCT based algorithms. Due to using a complex neural network together with the NSCT mechanism, NSCT-1 needs the most time (about 34 s). The proposed method and NSCT-2 consume similar time (17.13 s and 15.29 s). Actually, the decomposition and inverse reconstruction cost almost 9/10 of the total time. Without exception, the proposed method increases the computational cost by utilizing the NSCT tool, yet it achieves a better effect than previous methods.
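Running-time comparisons of this kind can be reproduced with a simple wall-clock harness. The sketch below is illustrative; the `fuse` argument stands in for any fusion routine (none of the actual algorithms are shown here), and the median over several repeats is reported to damp OS scheduling jitter:

```python
import time

def median_runtime(fuse, img_a, img_b, repeats=5):
    """Median wall-clock time, in seconds, of one fusion call."""
    samples = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fuse(img_a, img_b)                 # fusion routine under test
        samples.append(time.perf_counter() - t0)
    samples.sort()
    return samples[len(samples) // 2]      # median of the repeats
```

For example, timing a trivial pixel-averaging fusion gives a baseline against which multiscale methods can be compared.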

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes


Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

Figure 5
Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
MI        1.274  1.279  1.365  1.279  1.324  1.450   1.490   1.891
EN        5.008  4.843  4.050  5.107  5.183  4.978   5.371   5.675
SF        3.322  3.108  2.052  2.983  2.767  2.352   3.341   3.486
Q^{AB/F}  0.509  0.577  0.369  0.592  0.592  0.568   0.593   0.596

Figure 6
Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
MI        3.077  2.521  3.663  2.729  3.831  3.916   3.949   3.962
EN        5.125  4.851  4.340  5.578  6.300  6.414   6.473   6.469
SF        2.279  1.984  1.904  2.029  2.987  3.015   3.027   3.141
Q^{AB/F}  0.618  0.694  0.665  0.682  0.682  0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

Figure 7
Index     DWT    PCA    IHS    GP     CT     NSST   NSST-1  Proposed
MI        2.278  2.841  2.656  2.280  2.108  2.174  2.513   2.853
EN        4.573  3.707  3.125  4.561  4.242  4.148  5.080   4.768
SF        3.021  3.114  1.603  3.012  3.130  3.115  3.221   3.373
Q^{AB/F}  0.525  0.498  0.483  0.526  0.551  0.584  0.617   0.607
UID       0.752  0.737  0.725  0.753  0.789  0.797  0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

Figure 8
Index     DWT    GP     IHS    NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
MI        2.167  2.028  2.255  2.399   2.688   2.376   2.520   2.695
EN        4.424  3.973  4.105  4.870   4.507   3.533   4.499   4.531
SF        3.189  2.957  3.506  3.528   4.017   3.479   3.978   4.120
Q^{AB/F}  0.487  0.481  0.512  0.499   0.556   0.507   0.536   0.576
UID       0.802  0.812  0.711  0.724   0.817   0.765   0.813   0.826

Figure 9
Index     DWT    GP     IHS    ST     CT     NSST-4  NSCT   Proposed
MI        2.485  2.113  2.872  2.480  2.532  3.076   2.784  3.083
EN        4.277  4.050  4.202  4.561  4.426  4.594   4.202  4.668
SF        3.124  2.948  3.126  3.547  4.069  3.945   4.087  4.176
Q^{AB/F}  0.518  0.506  0.502  0.537  0.536  0.538   0.519  0.540
UID       0.793  0.679  0.714  0.783  0.799  0.806   0.801  0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Running time (s)  0.021  0.019  0.003  0.075  1.976  33.79   15.29   17.13

GGD to closely fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that fusion methods based on NSCT lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process; a fast multiscale image transform tool is the subject of future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for giving us their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, which is the integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I-469–I-472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, pp. 262–265, IEEE, St. Louis, Mo, USA, September-October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

Page 9: Research Article The Nonsubsampled Contourlet Transform Based Statistical Medical ...downloads.hindawi.com/journals/cmmm/2015/262819.pdf · 2019-07-31 · Research Article The Nonsubsampled

Computational and Mathematical Methods in Medicine 9

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 7 The fusion results of different fusion methods for the MRI and PET images (a) (b) Source images fused images by (c) DWT (d)PCA (e) IHS (f) GP (g) CT (h) NSST (i) NSST-1 and (j) the proposed method

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 8 The fusion results of different fusion methods for the first set of MRI and SPECT images (a) (b) Source images fused images by(c) DWT (d) GP (e) IHS (f) NSST-2 (g) NSST-3 (h) NSCT-3 (i) NSCT-4 and (j) the proposed method

7(g) 7(h) and 7(i) have better fusion performance than thesemethods mentioned above but the structural information oftheMRI image is not successfully transferred to fused imagesThrough the comparison of these fused results it is foundthat the proposed method can well extract the structuraland functional information from the source images and fusethem with much less information distortion As illustratedin these regions highlighted by red arrows and ellipses inFigure 7 the proposed method well preserves complemen-tary information of different modal medical images andachieves the best visual effect in terms of contrast clarity andcolor fidelity

Figure 8 is a fusion experiment of MRI and SPECTimages The source MRI image demonstrates that tumors arelocated in the left temporal region as shown in the high signalintensity region (the white region labeled by the right redarrow in Figure 8(a)) From Figures 8(e) 8(f) and 8(h) theresults produced by IHS based methods are distinctly colordistortion in the lesion region The results produced by theDWT andGP basedmethods cannot well inherit PET imagersquosfunctional information and produce the low contrast images(see Figures 8(c) and 8(d)) By comparison Figures 8(g) and8(i) gain better results in terms of contrast and color fidelityHowever for the spatial details the fused result obtained by

10 Computational and Mathematical Methods in Medicine

(a) (b) (c) (d) (e)

(f) (g) (h) (i) (j)

Figure 9 The fusion results of different fusion methods for the second set of MRI and SPECT images (a) (b) Source images fused imagesby (c) DWT (d) GP (e) IHS (f) ST (g) CT (h) NSST-4 (i) NSCT and (j) the proposed method

the proposedmethod ismore close to the originalMRI image(see the region labeled by the left red arrow) and the spectralfeatures are also natural Figure 9 provides another exampleof MRI and SPECT image fusion In this test the proposedmethod is specifically compared with the typical schemes[20] which is the MIF method based on the statisticaldependencies between coefficients in NSST domain Fromall the fused results it is easily observed that the proposedmethod not only inherits the salient information existing inboth the original images but also hardly causes the problemofcolor distortion Through the above examples it can be seenthat the proposed method can be extended to combine theanatomical and functional medical images and achieves goodvisual effects

53 Objective Evaluation and Analysis In addition to thevisual analysis five fusion quality metrics namely mutualinformation (MI) [40] entropy (EN) [41] spatial frequency(SF) 119876

119860119861119865 [42] and the uniform intensity distribution(UID) [43] are employed to test the validity of the proposedmethod They reflect the fusion performance from claritycontrast color distortions and the amount of informationMI as an information measure for evaluating image fusionperformance represents how much information is obtainedfrom the source images The higher value of EN shows thefused image has more information contents and the highervalue of SF indicates the final image is clearer The index119876119860119861119865 measures the amount of information transferred from

source images to the fused image and the UID is used for thedescription of uniform intensity and color distribution andthe higher UID means better color information

The quantitative comparisons are listed in Tables 1 2 and3 It can be seen that for all the indices the proposedmethodhas a stable performance (most of values rank the first andonly a few rank the second) It shows that the objective

results based on these quality metrics also coincide with thesubjective visual perception Particularly for theMI values ofall tests the proposed scheme all gets the largest value It isconfirmed that the proposed statistical model can transformmore detailed information from the source images into thefinal image by exploiting dependencies between the NSCTcoefficients Therefore it can be concluded that the proposedmethod is effective and is suitable for medical image fusion

54 Computational Complexity Analysis To investigate thecomputational complexity of different schemes we recordthe running time of different fusion algorithms used inFigure 5 (see Table 4) All the tests are implemented byMatlab 2014a on a PC with double Intel core i7-3770k CPU35GHz 8GB RAM As shown in Table 4 among all fusionmethods the consumption time of PCA based algorithm isthe lowest the reason of which is that it does not involvemultiscale decomposition Additionally DWT GP GF andCT basedmethods have also low time consumption (less than008 s) However their performance is poor Relatively threemethods based on NSCT are slower which is also a commonproblem of the algorithms based on NSCT Due to using thecomplex neural network and mechanism of NSCT NSCT-1 needs the most time (about 34 s) The proposed methodand NSCT-2 consume the similar time (1713 s and 1529 s)Actually the decomposition and reverse construction almostcost 910 of the total time Without exception the proposedmethod increases the cost of computational complexity withutilizing NSCT tool yet it achieves the better effect thanprevious methods

6 Conclusions

In this paper we propose a novel NSCT based statisticalmultimodal medical image fusion method which utilizes

Computational and Mathematical Methods in Medicine 11

Table 1 Objective evaluation results of the four different metrics for CT and MRI images

Index DWT GP PCA GF CT NSCT-1 NSCT-2 Proposed

Figure 5

MI 1274 1279 1365 1279 1324 1450 1490 1891EN 5008 4843 4050 5107 5183 4978 5371 5675SF 3322 3108 2052 2983 2767 2352 3341 3486

119876119860119861119865 0509 0577 0369 0592 0592 0568 0593 0596

Figure 6

MI 3077 2521 3663 2729 3831 3916 3949 3962EN 5125 4851 4340 5578 6300 6414 6473 6469SF 2279 1984 1904 2029 2987 3015 3027 3141

119876119860119861119865 0618 0694 0665 0682 0682 0694 0699 0703

Table 2 Objective evaluation results of the five different metrics for MRI and PET images

Index DWT PCA IHS GP CT NSST NSST-1 Proposed

Figure 7

MI 2278 2841 2656 2280 2108 2174 2513 2853EN 4573 3707 3125 4561 4242 4148 5080 4768SF 3021 3114 1603 3012 3130 3115 3221 3373

119876119860119861119865 0525 0498 0483 0526 0551 0584 0617 0607UID 0752 0737 0725 0753 0789 0797 0836 0814

Table 3 Objective evaluation results of the five metrics for MRI and SPECT images

Index DWT GP IHS NSST-2 NSST-3 NSCT-3 NSCT-4 Proposed

Figure 8

MI 2167 2028 2255 2399 2688 2376 2520 2695EN 4424 3973 4105 4870 4507 3533 4499 4531SF 3189 2957 3506 3528 4017 3479 3978 4120

119876119860119861119865 0487 0481 0512 0499 0556 0507 0536 0576UID 0802 0812 0711 0724 0817 0765 0813 0826Index DWT GP IHS ST CT NSST-4 NSCT Proposed

Figure 9

MI 2485 2113 2872 2480 2532 3076 2784 3083EN 4277 4050 4202 4561 4426 4594 4202 4668SF 3124 2948 3126 3547 4069 3945 4087 4176

119876119860119861119865 0518 0506 0502 0537 0536 0538 0519 0540UID 0793 0679 0714 0783 0799 0806 0801 0817

Table 4 Computational complexity comparison of different fusion methods for CT-MRI dataset shown in Figure 5

Methods DWT GP PCA GF CT NSCT-1 NSCT-2 ProposedRunning time (s) 0021 0019 0003 0075 1976 3379 1529 1713

GGD to fit nicely marginal distributions of the high fre-quency coefficients and accurately measures the similaritybetween two NSCT subbands by the JSD of two GGDs Theproposed fusion rules make full use of the dependenciesbetween the coefficients and transfer them to the final imageExperimental results demonstrate that the proposed algo-rithm can effectively extract the salient information from thesource images and well combine them Note that the fusionmethods based on NSCT lack the competitive advantage intime consumption because of multilevel decomposition andreconstruction process Fast image multiscale transform toolis the subject of future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to thank all the reviewers for givingus their constructive suggestions and kind comments Thiswork was supported by grants from major special project2012A090300001 which is integration of enterprises univer-sities and research institutes in Guangdong province

12 Computational and Mathematical Methods in Medicine

References

[1] G Qu D Zhang and P Yan ldquoMedical image fusion by wavelettransform modulus maximardquo Optics Express vol 9 no 4 pp184ndash190 2001

[2] F E Ali I M El-Dokany A A Saad and F E Abd El-SamieldquoA curvelet transform approach for the fusion of MR and CTimagesrdquo Journal of Modern Optics vol 57 no 4 pp 273ndash2862010

[3] L Yang B L Guo and W Ni ldquoMultimodality medical imagefusion based on multiscale geometric analysis of contourlettransformrdquo Neurocomputing vol 72 no 1ndash3 pp 203ndash211 2008

[4] T Li and Y Wang ldquoMultiscaled combination of MR andSPECT images in neuroimaging a simplex method basedvariable-weight fusionrdquo Computer Methods and Programs inBiomedicine vol 105 no 1 pp 31ndash39 2012

[5] A L da Cunha J Zhou and M N Do ldquoThe nonsubsampledcontourlet transform theory design and applicationsrdquo IEEETransactions on Image Processing vol 15 no 10 pp 3089ndash31012006

[6] A L Cunha J Zhou and M N Do ldquoNonsubsampled con-tourlet transform filter design and applications in denoisingrdquo inProceedings of the 12th IEEE International Conference on ImageProcessing (ICIP rsquo05) pp 749ndash752 Genoa Italy September2005

[7] J Zhou A L Cunha and M N Do ldquoNonsubsampled con-tourlet transform construction and application in enhance-mentrdquo in Proceedings of the IEEE International Conference onImage Processing (ICIP rsquo05) pp I469ndashI472 September 2005

[8] H Li Y Chai and Z Li ldquoMulti-focus image fusion basedon nonsubsampled contourlet transform and focused regionsdetectionrdquo Optik vol 124 no 1 pp 40ndash51 2013

[9] S Daneshvar and H Ghassemian ldquoMRI and PET image fusionby combining IHS and retina-inspired modelsrdquo InformationFusion vol 11 no 2 pp 114ndash123 2010

[10] G Bhatnagar Q M J Wu and Z Liu ldquoDirective contrastbased multimodal medical image fusion in NSCT domainrdquoIEEE Transactions on Multimedia vol 15 no 5 pp 1014ndash10242013

[11] T Li and YWang ldquoBiological image fusion using aNSCT basedvariable-weight methodrdquo Information Fusion vol 12 no 2 pp85ndash92 2011

[12] S Das and M K Kundu ldquoNSCT-based multimodal medicalimage fusion using pulse-coupled neural network andmodifiedspatial frequencyrdquo Medical and Biological Engineering andComputing vol 50 no 10 pp 1105ndash1114 2012

[13] R Srivastava and A Khare ldquoMedical image fusion using localenergy in nonsubsampled contourlet transform domainrdquo inComputational Vision and Robotics vol 332 of Advances inIntelligent Systems and Computing pp 29ndash35 Springer NewDelhi India 2015

[14] C Wang and H-W Shen ldquoInformation theory in scientificvisualizationrdquo Entropy vol 13 no 1 pp 254ndash273 2011

[15] J Lin ldquoDivergence measures based on the Shannon entropyrdquoIEEE Transactions on InformationTheory vol 37 no 1 pp 145ndash151 1991

[16] M S Crouse R D Nowak and R G Baraniuk ldquoWavelet-basedstatistical signal processing using hiddenMarkovmodelsrdquo IEEETransactions on Signal Processing vol 46 no 4 pp 886ndash9021998

[17] J Portilla V Strela M J Wainwright and E P SimoncellildquoImage denoising using scale mixtures of Gaussians in the

wavelet domainrdquo IEEE Transactions on Image Processing vol 12no 11 pp 1338ndash1351 2003

[18] D D-Y Po and M N Do ldquoDirectional multiscale modelingof images using the contourlet transformrdquo in Proceedings of theIEEE Workshop on Statistical Signal Processing vol 15 pp 262ndash265 IEEE St Louis Mo USA September-October 2003

[19] H Choi and R G Baraniuk ldquoMultiscale image segmentationusing wavelet-domain hidden Markov modelsrdquo IEEE Transac-tions on Image Processing vol 10 no 9 pp 1309ndash1321 2001

[20] L Wang B Li and L-F Tian ldquoMulti-modal medical imagefusion using the inter-scale and intra-scale dependenciesbetween image shift-invariant shearlet coefficientsrdquo InformationFusion vol 19 no 1 pp 20ndash28 2014

[21] LWang B Li and L-F Tian ldquoEGGDD an explicit dependencymodel for multi-modal medical image fusion in shift-invariantshearlet transform domainrdquo Information Fusion vol 19 no 1pp 29ndash37 2014



Computational and Mathematical Methods in Medicine


Figure 9: The fusion results of different fusion methods for the second set of MRI and SPECT images. (a), (b) Source images; fused images by (c) DWT, (d) GP, (e) IHS, (f) ST, (g) CT, (h) NSST-4, (i) NSCT, and (j) the proposed method.

the proposed method is closer to the original MRI image (see the region labeled by the left red arrow), and the spectral features are also natural. Figure 9 provides another example of MRI and SPECT image fusion. In this test, the proposed method is specifically compared with the typical scheme of [20], which is the MIF method based on the statistical dependencies between coefficients in the NSST domain. From all the fused results, it is easily observed that the proposed method not only inherits the salient information existing in both original images but also hardly causes any color distortion. These examples show that the proposed method can be extended to combine anatomical and functional medical images and achieves good visual effects.

5.3. Objective Evaluation and Analysis. In addition to the visual analysis, five fusion quality metrics, namely, mutual information (MI) [40], entropy (EN) [41], spatial frequency (SF), Q^AB/F [42], and the uniform intensity distribution (UID) [43], are employed to test the validity of the proposed method. They reflect the fusion performance in terms of clarity, contrast, color distortion, and the amount of information. MI, as an information measure for evaluating image fusion performance, represents how much information is obtained from the source images. A higher EN value shows that the fused image has more information content, and a higher SF value indicates that the final image is clearer. The index Q^AB/F measures the amount of information transferred from the source images to the fused image, and UID describes the uniformity of the intensity and color distribution; a higher UID means better color information.
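To make two of these metrics concrete, the sketch below computes EN and SF for a grayscale image from their standard definitions (Shannon entropy of the 8-bit gray-level histogram, and the root-mean-square of row and column first differences). The function names and the 8-bit histogram assumption are ours, not taken from the paper.

```python
import numpy as np

def entropy(img):
    # EN metric: Shannon entropy of the 8-bit gray-level histogram (bits).
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins so log2 is defined
    return -np.sum(p * np.log2(p))

def spatial_frequency(img):
    # SF metric: sqrt(RF^2 + CF^2), where RF and CF are the RMS values of
    # horizontal (row-wise) and vertical (column-wise) first differences.
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean((img[:, 1:] - img[:, :-1]) ** 2))
    cf = np.sqrt(np.mean((img[1:, :] - img[:-1, :]) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)
```

A flat image scores zero on both metrics, while a high-contrast checker pattern scores high SF, matching the interpretation above that higher EN/SF indicate a more informative, sharper fused image.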

The quantitative comparisons are listed in Tables 1, 2, and 3. It can be seen that, for all the indices, the proposed method has a stable performance (most values rank first and only a few rank second). This shows that the objective results based on these quality metrics also coincide with the subjective visual perception. In particular, the proposed scheme obtains the largest MI value in every test, which confirms that the proposed statistical model can transfer more detailed information from the source images into the final image by exploiting the dependencies between the NSCT coefficients. Therefore, it can be concluded that the proposed method is effective and suitable for medical image fusion.

5.4. Computational Complexity Analysis. To investigate the computational complexity of the different schemes, we record the running time of the fusion algorithms applied to Figure 5 (see Table 4). All tests are implemented in Matlab 2014a on a PC with an Intel Core i7-3770K CPU (3.5 GHz) and 8 GB RAM. As shown in Table 4, the PCA based algorithm consumes the least time among all fusion methods because it does not involve multiscale decomposition. The DWT, GP, GF, and CT based methods also have low time consumption (less than 0.08 s), but their performance is poor. The three NSCT based methods are comparatively slower, which is a common problem of NSCT based algorithms. Because it combines a complex neural network with the NSCT machinery, NSCT-1 needs the most time (about 3.4 s). The proposed method and NSCT-2 consume similar time (1.713 s and 1.529 s); in fact, the decomposition and inverse reconstruction cost almost 9/10 of the total time. Although the proposed method increases the computational cost by using the NSCT tool, it achieves a better effect than the previous methods.
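The wall-clock comparison described above can be reproduced with a small timing harness such as the following sketch. The harness and the `fuse` callable are illustrative, not the authors' benchmarking code; taking the best of several repeats reduces timer noise.

```python
import time

def time_fusion(fuse, a, b, repeats=5):
    """Return the best-of-N wall-clock time (seconds) of a fusion callable.

    `fuse` is any function taking two source images; best-of-N is used
    because a single run can be inflated by OS scheduling jitter.
    """
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fuse(a, b)
        best = min(best, time.perf_counter() - t0)
    return best
```

Usage: `time_fusion(my_nsct_fuse, img_ct, img_mri)` would yield a number comparable to the per-method running times reported in Table 4.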

Table 1: Objective evaluation results of the four different metrics for CT and MRI images.

Figure 5
Index     DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
MI        1.274  1.279  1.365  1.279  1.324  1.450   1.490   1.891
EN        5.008  4.843  4.050  5.107  5.183  4.978   5.371   5.675
SF        3.322  3.108  2.052  2.983  2.767  2.352   3.341   3.486
Q^AB/F    0.509  0.577  0.369  0.592  0.592  0.568   0.593   0.596

Figure 6
MI        3.077  2.521  3.663  2.729  3.831  3.916   3.949   3.962
EN        5.125  4.851  4.340  5.578  6.300  6.414   6.473   6.469
SF        2.279  1.984  1.904  2.029  2.987  3.015   3.027   3.141
Q^AB/F    0.618  0.694  0.665  0.682  0.682  0.694   0.699   0.703

Table 2: Objective evaluation results of the five different metrics for MRI and PET images.

Figure 7
Index     DWT    PCA    IHS    GP     CT     NSST   NSST-1  Proposed
MI        2.278  2.841  2.656  2.280  2.108  2.174  2.513   2.853
EN        4.573  3.707  3.125  4.561  4.242  4.148  5.080   4.768
SF        3.021  3.114  1.603  3.012  3.130  3.115  3.221   3.373
Q^AB/F    0.525  0.498  0.483  0.526  0.551  0.584  0.617   0.607
UID       0.752  0.737  0.725  0.753  0.789  0.797  0.836   0.814

Table 3: Objective evaluation results of the five metrics for MRI and SPECT images.

Figure 8
Index     DWT    GP     IHS    NSST-2  NSST-3  NSCT-3  NSCT-4  Proposed
MI        2.167  2.028  2.255  2.399   2.688   2.376   2.520   2.695
EN        4.424  3.973  4.105  4.870   4.507   3.533   4.499   4.531
SF        3.189  2.957  3.506  3.528   4.017   3.479   3.978   4.120
Q^AB/F    0.487  0.481  0.512  0.499   0.556   0.507   0.536   0.576
UID       0.802  0.812  0.711  0.724   0.817   0.765   0.813   0.826

Figure 9
Index     DWT    GP     IHS    ST      CT      NSST-4  NSCT    Proposed
MI        2.485  2.113  2.872  2.480   2.532   3.076   2.784   3.083
EN        4.277  4.050  4.202  4.561   4.426   4.594   4.202   4.668
SF        3.124  2.948  3.126  3.547   4.069   3.945   4.087   4.176
Q^AB/F    0.518  0.506  0.502  0.537   0.536   0.538   0.519   0.540
UID       0.793  0.679  0.714  0.783   0.799   0.806   0.801   0.817

Table 4: Computational complexity comparison of different fusion methods for the CT-MRI dataset shown in Figure 5.

Methods           DWT    GP     PCA    GF     CT     NSCT-1  NSCT-2  Proposed
Running time (s)  0.021  0.019  0.003  0.075  1.976  3.379   1.529   1.713

6. Conclusions

In this paper, we propose a novel NSCT based statistical multimodal medical image fusion method, which utilizes the GGD to fit the marginal distributions of the high frequency coefficients and accurately measures the similarity between two NSCT subbands by the JSD of two GGDs. The proposed fusion rules make full use of the dependencies between the coefficients and transfer them to the final image. Experimental results demonstrate that the proposed algorithm can effectively extract the salient information from the source images and combine it well. Note that NSCT based fusion methods lack a competitive advantage in time consumption because of the multilevel decomposition and reconstruction process; a fast multiscale image transform is therefore a subject of future research.
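The two building blocks summarized in the conclusion, the GGD marginal model and the JSD similarity between two GGDs, can be sketched numerically as follows. This is an illustrative sketch only: it evaluates the standard GGD density p(x) = β / (2αΓ(1/β)) · exp(−(|x|/α)^β) on a grid and integrates the JSD numerically, rather than using any closed-form expression from the paper; all names are ours.

```python
import math
import numpy as np

def ggd_pdf(x, alpha, beta):
    # Generalized Gaussian density with scale alpha and shape beta:
    # beta = 2 gives a Gaussian shape, beta = 1 a Laplacian shape.
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * np.exp(-(np.abs(x) / alpha) ** beta)

def js_divergence(p, q, dx):
    # Jensen-Shannon divergence (nats) between two densities sampled on a
    # uniform grid with spacing dx: JSD = (KL(p||m) + KL(q||m)) / 2.
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0                  # m > 0 wherever a > 0, so this is safe
        return np.sum(a[mask] * np.log(a[mask] / b[mask])) * dx
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
p = ggd_pdf(x, alpha=1.0, beta=2.0)   # Gaussian-shaped subband model
q = ggd_pdf(x, alpha=1.0, beta=1.0)   # Laplacian-shaped subband model
d = js_divergence(p, q, dx)           # small positive value, bounded by ln 2
```

Because JSD is symmetric and bounded by ln 2, it behaves well as a subband similarity measure: identical GGDs give 0, and increasingly different shape/scale parameters give larger values.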

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank all the reviewers for their constructive suggestions and kind comments. This work was supported by grants from major special project 2012A090300001, an integration of enterprises, universities, and research institutes in Guangdong Province.


References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I-469–I-472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, pp. 262–265, IEEE, St. Louis, Mo, USA, September–October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1998.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.


[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

Page 11: Research Article The Nonsubsampled Contourlet Transform Based Statistical Medical ...downloads.hindawi.com/journals/cmmm/2015/262819.pdf · 2019-07-31 · Research Article The Nonsubsampled

Computational and Mathematical Methods in Medicine 11

Table 1 Objective evaluation results of the four different metrics for CT and MRI images

Index DWT GP PCA GF CT NSCT-1 NSCT-2 Proposed

Figure 5

MI 1274 1279 1365 1279 1324 1450 1490 1891EN 5008 4843 4050 5107 5183 4978 5371 5675SF 3322 3108 2052 2983 2767 2352 3341 3486

119876119860119861119865 0509 0577 0369 0592 0592 0568 0593 0596

Figure 6

MI 3077 2521 3663 2729 3831 3916 3949 3962EN 5125 4851 4340 5578 6300 6414 6473 6469SF 2279 1984 1904 2029 2987 3015 3027 3141

119876119860119861119865 0618 0694 0665 0682 0682 0694 0699 0703

Table 2 Objective evaluation results of the five different metrics for MRI and PET images

Index DWT PCA IHS GP CT NSST NSST-1 Proposed

Figure 7

MI 2278 2841 2656 2280 2108 2174 2513 2853EN 4573 3707 3125 4561 4242 4148 5080 4768SF 3021 3114 1603 3012 3130 3115 3221 3373

119876119860119861119865 0525 0498 0483 0526 0551 0584 0617 0607UID 0752 0737 0725 0753 0789 0797 0836 0814

Table 3 Objective evaluation results of the five metrics for MRI and SPECT images

Index DWT GP IHS NSST-2 NSST-3 NSCT-3 NSCT-4 Proposed

Figure 8

MI 2167 2028 2255 2399 2688 2376 2520 2695EN 4424 3973 4105 4870 4507 3533 4499 4531SF 3189 2957 3506 3528 4017 3479 3978 4120

119876119860119861119865 0487 0481 0512 0499 0556 0507 0536 0576UID 0802 0812 0711 0724 0817 0765 0813 0826Index DWT GP IHS ST CT NSST-4 NSCT Proposed

Figure 9

MI 2485 2113 2872 2480 2532 3076 2784 3083EN 4277 4050 4202 4561 4426 4594 4202 4668SF 3124 2948 3126 3547 4069 3945 4087 4176

119876119860119861119865 0518 0506 0502 0537 0536 0538 0519 0540UID 0793 0679 0714 0783 0799 0806 0801 0817

Table 4 Computational complexity comparison of different fusion methods for CT-MRI dataset shown in Figure 5

Methods DWT GP PCA GF CT NSCT-1 NSCT-2 ProposedRunning time (s) 0021 0019 0003 0075 1976 3379 1529 1713

GGD to fit nicely marginal distributions of the high fre-quency coefficients and accurately measures the similaritybetween two NSCT subbands by the JSD of two GGDs Theproposed fusion rules make full use of the dependenciesbetween the coefficients and transfer them to the final imageExperimental results demonstrate that the proposed algo-rithm can effectively extract the salient information from thesource images and well combine them Note that the fusionmethods based on NSCT lack the competitive advantage intime consumption because of multilevel decomposition andreconstruction process Fast image multiscale transform toolis the subject of future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to thank all the reviewers for givingus their constructive suggestions and kind comments Thiswork was supported by grants from major special project2012A090300001 which is integration of enterprises univer-sities and research institutes in Guangdong province

12 Computational and Mathematical Methods in Medicine

References

[1] G Qu D Zhang and P Yan ldquoMedical image fusion by wavelettransform modulus maximardquo Optics Express vol 9 no 4 pp184ndash190 2001

[2] F E Ali I M El-Dokany A A Saad and F E Abd El-SamieldquoA curvelet transform approach for the fusion of MR and CTimagesrdquo Journal of Modern Optics vol 57 no 4 pp 273ndash2862010

[3] L Yang B L Guo and W Ni ldquoMultimodality medical imagefusion based on multiscale geometric analysis of contourlettransformrdquo Neurocomputing vol 72 no 1ndash3 pp 203ndash211 2008

[4] T Li and Y Wang ldquoMultiscaled combination of MR andSPECT images in neuroimaging a simplex method basedvariable-weight fusionrdquo Computer Methods and Programs inBiomedicine vol 105 no 1 pp 31ndash39 2012

[5] A L da Cunha J Zhou and M N Do ldquoThe nonsubsampledcontourlet transform theory design and applicationsrdquo IEEETransactions on Image Processing vol 15 no 10 pp 3089ndash31012006

[6] A L Cunha J Zhou and M N Do ldquoNonsubsampled con-tourlet transform filter design and applications in denoisingrdquo inProceedings of the 12th IEEE International Conference on ImageProcessing (ICIP rsquo05) pp 749ndash752 Genoa Italy September2005

[7] J Zhou A L Cunha and M N Do ldquoNonsubsampled con-tourlet transform construction and application in enhance-mentrdquo in Proceedings of the IEEE International Conference onImage Processing (ICIP rsquo05) pp I469ndashI472 September 2005

[8] H Li Y Chai and Z Li ldquoMulti-focus image fusion basedon nonsubsampled contourlet transform and focused regionsdetectionrdquo Optik vol 124 no 1 pp 40ndash51 2013

[9] S Daneshvar and H Ghassemian ldquoMRI and PET image fusionby combining IHS and retina-inspired modelsrdquo InformationFusion vol 11 no 2 pp 114ndash123 2010

[10] G Bhatnagar Q M J Wu and Z Liu ldquoDirective contrastbased multimodal medical image fusion in NSCT domainrdquoIEEE Transactions on Multimedia vol 15 no 5 pp 1014ndash10242013

[11] T Li and YWang ldquoBiological image fusion using aNSCT basedvariable-weight methodrdquo Information Fusion vol 12 no 2 pp85ndash92 2011

[12] S Das and M K Kundu ldquoNSCT-based multimodal medicalimage fusion using pulse-coupled neural network andmodifiedspatial frequencyrdquo Medical and Biological Engineering andComputing vol 50 no 10 pp 1105ndash1114 2012

[13] R Srivastava and A Khare ldquoMedical image fusion using localenergy in nonsubsampled contourlet transform domainrdquo inComputational Vision and Robotics vol 332 of Advances inIntelligent Systems and Computing pp 29ndash35 Springer NewDelhi India 2015

[14] C Wang and H-W Shen ldquoInformation theory in scientificvisualizationrdquo Entropy vol 13 no 1 pp 254ndash273 2011

[15] J Lin ldquoDivergence measures based on the Shannon entropyrdquoIEEE Transactions on InformationTheory vol 37 no 1 pp 145ndash151 1991

[16] M S Crouse R D Nowak and R G Baraniuk ldquoWavelet-basedstatistical signal processing using hiddenMarkovmodelsrdquo IEEETransactions on Signal Processing vol 46 no 4 pp 886ndash9021998

[17] J Portilla V Strela M J Wainwright and E P SimoncellildquoImage denoising using scale mixtures of Gaussians in the

wavelet domainrdquo IEEE Transactions on Image Processing vol 12no 11 pp 1338ndash1351 2003

[18] D D-Y Po and M N Do ldquoDirectional multiscale modelingof images using the contourlet transformrdquo in Proceedings of theIEEE Workshop on Statistical Signal Processing vol 15 pp 262ndash265 IEEE St Louis Mo USA September-October 2003

[19] H Choi and R G Baraniuk ldquoMultiscale image segmentationusing wavelet-domain hidden Markov modelsrdquo IEEE Transac-tions on Image Processing vol 10 no 9 pp 1309ndash1321 2001

[20] L Wang B Li and L-F Tian ldquoMulti-modal medical imagefusion using the inter-scale and intra-scale dependenciesbetween image shift-invariant shearlet coefficientsrdquo InformationFusion vol 19 no 1 pp 20ndash28 2014

[21] LWang B Li and L-F Tian ldquoEGGDD an explicit dependencymodel for multi-modal medical image fusion in shift-invariantshearlet transform domainrdquo Information Fusion vol 19 no 1pp 29ndash37 2014

[22] MNDo andMVetterli ldquoWavelet-based texture retrieval usinggeneralized Gaussian density and Kullback-Leibler distancerdquoIEEE Transactions on Image Processing vol 11 no 2 pp 146ndash158 2002

[23] G Van deWouwer P Scheunders and D Van Dyck ldquoStatisticaltexture characterization from discrete wavelet representationsrdquoIEEE Transactions on Image Processing vol 8 no 4 pp 592ndash598 1999

[24] S G Mallat ldquoTheory for multiresolution signal decompositionthe wavelet representationrdquo IEEE Transactions on Pattern Anal-ysis and Machine Intelligence vol 11 no 7 pp 674ndash693 1989

[25] K Sharifi and A Leron-Garcia ldquoEstimation of shape parameterfor generalized Gaussian distributions in subband decomposi-tions of videordquo IEEE Transactions on Circuits and Systems forVideo Technology vol 5 no 1 pp 52ndash56 1995

[26] P Moulin and J Liu ldquoAnalysis of multiresolution image denois-ing schemes using generalizedGaussian and complexity priorsrdquoIEEETransactions on InformationTheory vol 45 no 3 pp 909ndash919 1998

[27] M K Varanasi and B Aazhang ldquoParametric generalized Gaus-sian density estimationrdquoThe Journal of the Acoustical Society ofAmerica vol 86 no 4 pp 1404ndash1415 1989

[28] S M Kay ldquoFundamentals of statistical signal processing esti-mation theoryrdquo Technometrics vol 37 no 4 pp 465ndash466 1995

[29] M Chen and H Janicke ldquoAn information-theoretic frameworkfor visualizationrdquo IEEE Transactions on Visualization and Com-puter Graphics vol 16 no 6 pp 1206ndash1215 2010

[30] F Sroubek and J Flusser ldquoRegistration and fusion of blurredimagesrdquo in Image Analysis and Recognition vol 3211 of LectureNotes in Computer Science pp 122ndash129 Springer Berlin Ger-many 2004

[31] V S Petrovic andC S Xydeas ldquoGradient-basedmultiresolutionimage fusionrdquo IEEE Transactions on Image Processing vol 13no 2 pp 228ndash237 2004

[32] O Rockinger and T Fechner ldquoPixel-level image fusion the caseof image sequencesrdquo in Signal Processing Sensor Fusion andTarget Recognition VII vol 3374 of Proceedings of SPIE pp 378ndash388 Orlando Fla USA April 1998

[33] T-M Tu S-C Su H-C Shyu and P S Huang ldquoA new look atIHS-like image fusion methodsrdquo Information Fusion vol 2 no3 pp 177ndash186 2001

[34] S Li X Kang and J Hu ldquoImage fusion with guided filteringrdquoIEEE Transactions on Image Processing vol 22 no 7 pp 2864ndash2875 2013

Computational and Mathematical Methods in Medicine 13

References

[1] G. Qu, D. Zhang, and P. Yan, "Medical image fusion by wavelet transform modulus maxima," Optics Express, vol. 9, no. 4, pp. 184–190, 2001.

[2] F. E. Ali, I. M. El-Dokany, A. A. Saad, and F. E. Abd El-Samie, "A curvelet transform approach for the fusion of MR and CT images," Journal of Modern Optics, vol. 57, no. 4, pp. 273–286, 2010.

[3] L. Yang, B. L. Guo, and W. Ni, "Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform," Neurocomputing, vol. 72, no. 1–3, pp. 203–211, 2008.

[4] T. Li and Y. Wang, "Multiscaled combination of MR and SPECT images in neuroimaging: a simplex method based variable-weight fusion," Computer Methods and Programs in Biomedicine, vol. 105, no. 1, pp. 31–39, 2012.

[5] A. L. da Cunha, J. Zhou, and M. N. Do, "The nonsubsampled contourlet transform: theory, design, and applications," IEEE Transactions on Image Processing, vol. 15, no. 10, pp. 3089–3101, 2006.

[6] A. L. da Cunha, J. Zhou, and M. N. Do, "Nonsubsampled contourlet transform: filter design and applications in denoising," in Proceedings of the 12th IEEE International Conference on Image Processing (ICIP '05), pp. 749–752, Genoa, Italy, September 2005.

[7] J. Zhou, A. L. da Cunha, and M. N. Do, "Nonsubsampled contourlet transform: construction and application in enhancement," in Proceedings of the IEEE International Conference on Image Processing (ICIP '05), pp. I469–I472, September 2005.

[8] H. Li, Y. Chai, and Z. Li, "Multi-focus image fusion based on nonsubsampled contourlet transform and focused regions detection," Optik, vol. 124, no. 1, pp. 40–51, 2013.

[9] S. Daneshvar and H. Ghassemian, "MRI and PET image fusion by combining IHS and retina-inspired models," Information Fusion, vol. 11, no. 2, pp. 114–123, 2010.

[10] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "Directive contrast based multimodal medical image fusion in NSCT domain," IEEE Transactions on Multimedia, vol. 15, no. 5, pp. 1014–1024, 2013.

[11] T. Li and Y. Wang, "Biological image fusion using a NSCT based variable-weight method," Information Fusion, vol. 12, no. 2, pp. 85–92, 2011.

[12] S. Das and M. K. Kundu, "NSCT-based multimodal medical image fusion using pulse-coupled neural network and modified spatial frequency," Medical and Biological Engineering and Computing, vol. 50, no. 10, pp. 1105–1114, 2012.

[13] R. Srivastava and A. Khare, "Medical image fusion using local energy in nonsubsampled contourlet transform domain," in Computational Vision and Robotics, vol. 332 of Advances in Intelligent Systems and Computing, pp. 29–35, Springer, New Delhi, India, 2015.

[14] C. Wang and H.-W. Shen, "Information theory in scientific visualization," Entropy, vol. 13, no. 1, pp. 254–273, 2011.

[15] J. Lin, "Divergence measures based on the Shannon entropy," IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 145–151, 1991.

[16] M. S. Crouse, R. D. Nowak, and R. G. Baraniuk, "Wavelet-based statistical signal processing using hidden Markov models," IEEE Transactions on Signal Processing, vol. 46, no. 4, pp. 886–902, 1998.

[17] J. Portilla, V. Strela, M. J. Wainwright, and E. P. Simoncelli, "Image denoising using scale mixtures of Gaussians in the wavelet domain," IEEE Transactions on Image Processing, vol. 12, no. 11, pp. 1338–1351, 2003.

[18] D. D.-Y. Po and M. N. Do, "Directional multiscale modeling of images using the contourlet transform," in Proceedings of the IEEE Workshop on Statistical Signal Processing, vol. 15, pp. 262–265, IEEE, St. Louis, Mo, USA, September–October 2003.

[19] H. Choi and R. G. Baraniuk, "Multiscale image segmentation using wavelet-domain hidden Markov models," IEEE Transactions on Image Processing, vol. 10, no. 9, pp. 1309–1321, 2001.

[20] L. Wang, B. Li, and L.-F. Tian, "Multi-modal medical image fusion using the inter-scale and intra-scale dependencies between image shift-invariant shearlet coefficients," Information Fusion, vol. 19, no. 1, pp. 20–28, 2014.

[21] L. Wang, B. Li, and L.-F. Tian, "EGGDD: an explicit dependency model for multi-modal medical image fusion in shift-invariant shearlet transform domain," Information Fusion, vol. 19, no. 1, pp. 29–37, 2014.

[22] M. N. Do and M. Vetterli, "Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance," IEEE Transactions on Image Processing, vol. 11, no. 2, pp. 146–158, 2002.

[23] G. Van de Wouwer, P. Scheunders, and D. Van Dyck, "Statistical texture characterization from discrete wavelet representations," IEEE Transactions on Image Processing, vol. 8, no. 4, pp. 592–598, 1999.

[24] S. G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 674–693, 1989.

[25] K. Sharifi and A. Leon-Garcia, "Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video," IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, no. 1, pp. 52–56, 1995.

[26] P. Moulin and J. Liu, "Analysis of multiresolution image denoising schemes using generalized Gaussian and complexity priors," IEEE Transactions on Information Theory, vol. 45, no. 3, pp. 909–919, 1999.

[27] M. K. Varanasi and B. Aazhang, "Parametric generalized Gaussian density estimation," The Journal of the Acoustical Society of America, vol. 86, no. 4, pp. 1404–1415, 1989.

[28] S. M. Kay, "Fundamentals of statistical signal processing: estimation theory," Technometrics, vol. 37, no. 4, pp. 465–466, 1995.

[29] M. Chen and H. Janicke, "An information-theoretic framework for visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 1206–1215, 2010.

[30] F. Sroubek and J. Flusser, "Registration and fusion of blurred images," in Image Analysis and Recognition, vol. 3211 of Lecture Notes in Computer Science, pp. 122–129, Springer, Berlin, Germany, 2004.

[31] V. S. Petrovic and C. S. Xydeas, "Gradient-based multiresolution image fusion," IEEE Transactions on Image Processing, vol. 13, no. 2, pp. 228–237, 2004.

[32] O. Rockinger and T. Fechner, "Pixel-level image fusion: the case of image sequences," in Signal Processing, Sensor Fusion, and Target Recognition VII, vol. 3374 of Proceedings of SPIE, pp. 378–388, Orlando, Fla, USA, April 1998.

[33] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.

[34] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2864–2875, 2013.

[35] C. Deng, S. Wang, and X. Chen, "Remote sensing images fusion algorithm based on shearlet transform," in Proceedings of the International Conference on Environmental Science and Information Application Technology (ESIAT '09), pp. 451–454, IEEE, Wuhan, China, July 2009.

[36] G. Bhatnagar, Q. M. J. Wu, and Z. Liu, "A new contrast based multimodal medical image fusion framework," Neurocomputing, vol. 157, pp. 143–152, 2015.

[37] Y. Chen, J. Xiong, H.-L. Liu, and Q. Fan, "Fusion method of infrared and visible images based on neighborhood characteristic and regionalization in NSCT domain," Optik, vol. 125, no. 17, pp. 4980–4984, 2014.

[38] W. W. Kong, "Multi-sensor image fusion based on NSST domain I2CM," Electronics Letters, vol. 49, no. 13, pp. 802–803, 2013.

[39] P. Ganasala and V. Kumar, "Multimodality medical image fusion based on new features in NSST domain," Biomedical Engineering Letters, vol. 4, no. 4, pp. 414–424, 2014.

[40] M. Hossny, S. Nahavandi, and D. Creighton, "Comments on 'Information measure for performance of image fusion'," Electronics Letters, vol. 44, no. 18, pp. 1066–1067, 2008.

[41] Q.-G. Miao, C. Shi, P.-F. Xu, M. Yang, and Y.-B. Shi, "A novel algorithm of image fusion using shearlets," Optics Communications, vol. 284, no. 6, pp. 1540–1547, 2011.

[42] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[43] Y. Liu, M. Liang, Y. Zhou et al., "Disrupted small-world networks in schizophrenia," Brain, vol. 131, no. 4, pp. 945–961, 2008.
