
International Journal of Applied Engineering Research ISSN 0973-4562 Volume 12, Number 19 (2017) pp. 8474-8486

© Research India Publications. http://www.ripublication.com


Color and Texture Based Image Retrieval Feature Descriptor using Local Mesh Maximum Edge Co-occurrence Pattern

Harpreet Kaur 1 and Vijay Dhir 2

1 Research Scholar, 2 Professor, 1,2 Department of Computer Science and Engineering,

Punjab Technical University, Kapurthala, Punjab, India.

Abstract

A novel descriptor, the local mesh maximum edge co-occurrence pattern (LMeMECoP), is designed and developed for extracting features from large image databases. For a given center (reference) pixel, local maximum edge information is first collected among its possible neighbors, so that the collected maximum edges form a mesh around the reference pixel. The extracted maximum edge information is used to form binary bits, and co-occurrence bits are then collected for the center pixel from the maximum edge responses. Finally, patterns are generated from the co-occurrence bits between each pixel and its neighbors. To enhance the performance of the proposed method, the texture information is combined with color information: the image is converted from RGB to the HSV color space and a histogram-based color feature is computed for each color channel. The final feature vector is generated by integrating the texture-based features with the color-based features. LMeMECoP is assessed by testing the proposed method on the Corel-5K, Corel-10K and MIT VisTex databases in terms of average retrieval precision (ARP). The investigation shows that LMeMECoP achieves a substantial improvement in ARP on the Corel-5K and Corel-10K databases.

Keywords: local mesh patterns; maximum edge patterns; pattern recognition; texture; image retrieval

INTRODUCTION

Image retrieval is a method for retrieving images from huge image databases according to the user's requirements. It is intended to search for images by their actual visual content rather than by metadata. An image retrieval system first extracts features from the images, then indexes them using suitable structures, and efficiently answers the user's query. The system takes an RGB image as input, performs feature extraction, computes similarity against the images stored in the database, and retrieves the output images according to the similarity computation. The fundamentals of image retrieval are commonly divided into three parts: feature extraction, multi-dimensional indexing, and the architecture of the retrieval system. The corresponding block diagram is shown in Figure 1.

The objective of this research is to design effective and competitive algorithms for extracting visual features and combining similar ones. A number of researchers have demonstrated improvements in this field [1-4].

Figure 1: Image Retrieval Block Diagram (blocks: Query Image, Feature Extraction, Database Image, Similarity Evaluation)


Texture characterizes an image in terms of intensity variation at definite scales and directions [5-7]. It captures significant information about the structural arrangement of surfaces such as fabrics, clouds and bricks, and their relationship to the surrounding environment. Textural information generally examines the behavior of groups of pixels rather than the nature of a single pixel. Natural images are prime examples of mixtures of texture and color. Many texture descriptors operate on the gray-scale image, but to optimize feature performance they may also be applied to color images. Various strategies have been followed for extracting texture features [8].

The concept of the wavelet-transform-based correlogram for image retrieval was given by Moghaddam et al. [9]. The efficiency of the wavelet correlogram was enhanced by using a metaheuristic genetic algorithm (GA) [10] to optimize the quantization thresholds. Color histogram (color) and wavelet transform (texture) features were integrated by Birgale et al. [11] and Subrahmanyam et al. [12] for content-based image retrieval (CBIR). A correlogram algorithm for image retrieval using wavelets and rotated wavelets (WC+RWC) was proposed in [13].

The local binary pattern (LBP) was proposed for describing textures by Ojala et al. [14]. The LBP features proposed by Ojala et al. are rotation variant; they were later transformed into rotation-invariant features for texture classification [15-16]. LBP features have been used for facial expression recognition and analysis in [17, 18]. Heikkila et al. presented background modeling and detection using LBP in [19]. Huang et al. presented an extended LBP for shape localization in [20]. Heikkila et al. used LBP for interest-region description in [21]. Li et al. used a combination of LBP and Gabor filters for texture segmentation in [22]. Zhang et al. defined the local derivative pattern (LDP) for face recognition in [23], treating LBP as a non-directional first-order local pattern, i.e., the binary outcome of the first-order derivative of the image. A block-based LBP texture feature was used as the basis of image description for content-based image retrieval in [24]. An enhanced version of the LBP feature, the center-symmetric local binary pattern (CS-LBP), was combined with the scale-invariant feature transform (SIFT) for describing interest regions in [25]. Two different types of local edge pattern (LEP) histograms, LEPSEG and LEPINV, were given by Yao et al. in [26]; LEPSEG is designed for image segmentation and LEPINV for image retrieval. LEPSEG is scale and rotation variant, whereas LEPINV is scale and rotation invariant.

The LBP and LDP described above cannot deal with the different appearance variations that occur in unconstrained natural images due to pose, aging, facial expression, illumination, partial occlusion, and so on. To address this problem, the local ternary pattern (LTP) was defined for face recognition under diverse lighting conditions in [27]. Among the pattern-based features proposed by Subrahmanyam et al. are local maximum edge binary patterns (LMEBP) [28], local tetra patterns (LTrP) [29] and directional local extrema patterns (DLEP) [30] for natural and texture image retrieval, and directional binary wavelet patterns (DBWP) [31], local mesh patterns (LMeP) [32] and local ternary co-occurrence patterns (LTCoP) [33] for medical image retrieval. Reddy et al. [34] enhanced the DLEP features by computing the magnitude information of the local gray values of the image. Jacob et al. [35] proposed the local oppugnant color texture pattern (LOCTP) for image retrieval. The literature thus provides a variety of features that, in different ways, capture the relationship of a referenced pixel with its neighbors and among the neighbors themselves for pattern recognition applications.

These features have motivated us to present local mesh maximum edge co-occurrence patterns (LMeMECoP) for image retrieval applications. The method collects the maximum edge information among the neighbors, compiles the co-occurrence of the maximum edge bits, and codes the result as patterns. The proposed algorithm is more discriminative than the existing algorithms. Its performance is further enhanced by integrating it with color features. The proposed method is evaluated on the benchmark Corel-10K (10,000 images) image database, among other databases. The paper is organized in the following manner.

Section I gives a concise overview of image retrieval. Section II reviews the existing local feature descriptors. Section III elaborates the proposed feature descriptor. Section IV presents the results obtained from the simulation of the proposed work. Section V draws the conclusion and future scope on the basis of the simulated work.

REVIEW OF EXISTING LOCAL FEATURE DESCRIPTORS

LBP (Local Binary Patterns)

The concept of LBP was defined by Ojala et al. for texture classification [14]. The method provides a robust way to describe the exact local patterns in a texture. The values of a 3×3 neighborhood are thresholded by the gray value of the center pixel and multiplied by binomial weights assigned to the corresponding pixels; the resulting values are summed to obtain the LBP number of the texture unit. LBP is gray-scale invariant and can easily be combined with a contrast measure, computed as the difference between the average gray level of the neighbors assigned 1 and of those assigned 0, as shown in Figure 2.

Page 3: Color and Texture Based Image Retrieval Feature Descriptor using … · 2017-10-18 · Color and Texture Based Image Retrieval Feature Descriptor using Local . Mesh Maximum Edge Co-occurrence

International Journal of Applied Engineering Research ISSN 0973-4562 Volume 12, Number 19 (2017) pp. 8474-8486

© Research India Publications. http://www.ripublication.com

8476

Figure 2: LBP Block Diagram

The local binary pattern is a two-valued code. The LBP value is computed by comparing the gray value of the center pixel with those of its neighbors using the following equations:

LBP_{Q,R} = \sum_{q=1}^{Q} 2^{(q-1)} \times f_1\big(g_q - g_c\big)    (1)

f_1(x) = \begin{cases} 1, & x \geq 0 \\ 0, & \text{otherwise} \end{cases}    (2)

where g_c is the gray value of the center pixel, g_q is the gray value of its q-th neighbor, Q is the number of neighbors and R is the neighborhood radius.

Figure 3: Example of LBP value computation, showing a 3×3 image block (gray values 6 5 2; 7 6 1; 9 3 7, center value 6), the thresholded binary pattern, the binomial weights assigned to the neighbors, and the resulting LBP value, LBP = 1 + 16 + 32 + 64 + 128 = 241.
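To make Eqs. (1)-(2) concrete, the following is a minimal sketch (in Python with NumPy) of the LBP code of a single 3×3 neighborhood; the function and variable names are ours, and the clockwise-from-top-left neighbor ordering is an assumption, so the exact code value depends on the ordering and weight convention used.

import numpy as np

def lbp_code(block3x3):
    """LBP code of one 3x3 neighborhood (Eqs. 1-2), neighbors taken clockwise from the top-left."""
    g_c = int(block3x3[1, 1])                               # center gray value g_c
    neighbors = [block3x3[0, 0], block3x3[0, 1], block3x3[0, 2],
                 block3x3[1, 2], block3x3[2, 2], block3x3[2, 1],
                 block3x3[2, 0], block3x3[1, 0]]            # g_1 ... g_8
    bits = [1 if int(g_q) - g_c >= 0 else 0 for g_q in neighbors]   # f_1 of Eq. (2)
    return sum(bit << q for q, bit in enumerate(bits))      # binomial weights 2^(q-1) of Eq. (1)

# Example with the 3x3 block of Figure 3
block = np.array([[6, 5, 2], [7, 6, 1], [9, 3, 7]])
print(lbp_code(block))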


Local Maximum Edge Binary Patterns (LMEBP)

The LMEBP operator gathers maximum edge information on the basis of the magnitudes of the local differences between the referenced pixel and its neighbors in an image [28]. The local differences are first computed as

I'(g_j) = I(g_j) - I(g_c), \quad j = 1, 2, \ldots, 8    (3)

and the maximum edges are obtained by ordering these differences by decreasing magnitude:

I_{max}(g_c) = \big[\beta_1, \beta_2, \ldots, \beta_8\big], \quad |\beta_1| \geq |\beta_2| \geq \ldots \geq |\beta_8|    (4)

If the sign of the corresponding edge is positive it is coded as '1', otherwise as '0', for the source pixel:

f_1(x) = \begin{cases} 1, & x \geq 0 \\ 0, & \text{otherwise} \end{cases}    (5)

LMEBP can then be described as

LMEBP(g_c) = \big[f_1(\beta_1), f_1(\beta_2), \ldots, f_1(\beta_8)\big]    (6)

On the basis of Eq. (6), the LMEBP code is obtained for every source pixel of the image. Finally, the LMEBP histogram is produced from the resulting LMEBP map [28].

Figure 4: Local Maximum Edge Binary Patterns
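Under the interpretation of Eqs. (3)-(6) given above (local differences sorted by magnitude, then sign-coded), a minimal Python/NumPy sketch of the LMEBP bit extraction for one 3×3 neighborhood could look as follows; the names and the sorting step are our reading, not code from [28].

import numpy as np

def lmebp_bits(block3x3):
    """Sign bits of the eight center-neighbor differences, ordered by decreasing magnitude (Eqs. 3-6)."""
    g_c = float(block3x3[1, 1])
    neighbors = np.array([block3x3[0, 0], block3x3[0, 1], block3x3[0, 2],
                          block3x3[1, 2], block3x3[2, 2], block3x3[2, 1],
                          block3x3[2, 0], block3x3[1, 0]], dtype=float)
    diffs = neighbors - g_c                          # Eq. (3): local differences I'(g_j)
    order = np.argsort(-np.abs(diffs))               # Eq. (4): largest-magnitude edges first
    return [1 if diffs[j] >= 0 else 0 for j in order]   # Eqs. (5)-(6): sign coding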

Local Mesh Patterns (LMeP)

Local mesh patterns were proposed by Subrahmanyam et al. in [38] for image indexing and retrieval. LMeP describes the local spatial structure of a texture using the relationship among the nearest neighbors of a given center pixel. For a 3×3 matrix, the local mesh pattern is obtained by computing the differences between pairs of surrounding pixels of the center pixel:

LMeP_{Q,R}^{j} = \sum_{i=1}^{Q} 2^{(i-1)} \times f_1\big(I(g_{\alpha}) - I(g_i)\big), \quad j = 1, 2, \ldots    (7)

\alpha = 1 + \operatorname{mod}\big(i + Q + j - 1,\, Q\big)    (8)

where j indexes the mesh patterns (the first three patterns are used), g_i and g_{\alpha} are nearest neighbors of the center pixel, Q is the number of neighbors and R is the neighborhood radius. Once the LMeP codes are calculated, the image is represented by the histogram

H_{LMeP}(\ell) = \frac{1}{M \times N} \sum_{m=1}^{M} \sum_{n=1}^{N} f_2\big(LMeP(m, n), \ell\big)    (9)

where f_2(x, y) = 1 if x = y and 0 otherwise, and M × N is the size of the image.

Figure 5: Local Mesh Patterns
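The sketch below illustrates one way to compute the three LMeP codes of a 3×3 neighborhood following Eqs. (7)-(8); the neighbor pairing rule alpha is our reading of the (garbled) original equation and is therefore an assumption (Python, names are ours).

def lmep_codes(block3x3, num_patterns=3):
    """LMeP codes for j = 1..num_patterns over the 8 neighbors of the center pixel (Eqs. 7-8)."""
    g = [float(v) for v in (block3x3[0, 0], block3x3[0, 1], block3x3[0, 2], block3x3[1, 2],
                            block3x3[2, 2], block3x3[2, 1], block3x3[2, 0], block3x3[1, 0])]
    Q = len(g)
    codes = []
    for j in range(1, num_patterns + 1):
        code = 0
        for i in range(1, Q + 1):
            alpha = 1 + (i + Q + j - 1) % Q                   # Eq. (8), assumed pairing rule
            bit = 1 if g[alpha - 1] - g[i - 1] >= 0 else 0    # f_1 applied to the neighbor difference
            code += bit << (i - 1)                            # weight 2^(i-1) of Eq. (7)
        codes.append(code)
    return codes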

Local Maximum Edge Co-occurrence Patterns (LMECoP)

For a given image, the LMEBP patterns are first computed, and the co-occurrence matrix is then taken over the LMEBP image. The resulting co-occurrence matrix is combined with an HSV color histogram [39]. The working of local maximum edge co-occurrence patterns is as follows. First, the RGB image is loaded and converted into the HSV color space for the extraction of the color features, and histograms of the H, S and V channels are constructed. The RGB image is then converted into gray scale for the extraction of the texture features: the maximum edge information between the center pixel and its neighbors is collected, and the maximum edge patterns are computed. A feature vector is formed by integrating the HSV histograms with the co-occurrence matrix. The best matches are retrieved from the database by computing the similarity measure between the query image and the target images.
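As an illustration of the color part of this pipeline, a minimal sketch converting an RGB (OpenCV BGR) image to HSV and building per-channel histograms is given below; the bin count and normalization are our assumptions, not values from the paper (Python with OpenCV and NumPy).

import cv2
import numpy as np

def hsv_histograms(bgr_image, bins=16):
    """Normalized H, S and V channel histograms concatenated into one color feature."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    feats = []
    for ch, upper in enumerate((180, 256, 256)):       # OpenCV ranges: H in [0,180), S and V in [0,256)
        h = cv2.calcHist([hsv], [ch], None, [bins], [0, upper]).ravel()
        feats.append(h / (h.sum() + 1e-12))            # normalize each channel histogram
    return np.concatenate(feats)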


Proposed Feature Descriptor

Local Mesh Maximum Edge Co-occurrence Patterns (LMeMECoP)

The proposed LMeMECoP descriptor, unlike LMEBP, gathers the relationships among the neighbors themselves for the calculation of the maximum edges. First, the local dissimilarity between the neighbors of a given source pixel is defined as

I^{k}(g_i) = I(g_{\alpha}) - I(g_i), \quad i = 1, 2, \ldots, Q    (9)

\alpha = 1 + \operatorname{mod}\big(i + Q + k - 1,\, Q\big)    (10)

where R is the neighborhood radius about the center pixel, Q is the number of neighbors, k is the local dissimilarity index, and mod(x, y) returns the remainder of the division of x by y.

The maximum edges are composed for the appropriate k by ordering the local dissimilarities by magnitude:

I_{max}^{k}(g_c) = \big[\beta_1^{k}, \beta_2^{k}, \ldots, \beta_Q^{k}\big], \quad |\beta_1^{k}| \geq |\beta_2^{k}| \geq \ldots \geq |\beta_Q^{k}|    (11)

When the sign of the corresponding edge is positive it is coded as 1, otherwise as 0, for the source pixel:

f_1(x) = \begin{cases} 1, & x \geq 0 \\ 0, & \text{otherwise} \end{cases}    (12)

The local mesh maximum edge (LMeME) bits are then integrated for the given center pixel for the appropriate i:

LMeME^{k}(g_c) = \big[f_1(\beta_1^{k}), f_1(\beta_2^{k}), \ldots, f_1(\beta_Q^{k})\big]    (13)

Finally, the co-occurrence pattern is taken for the given center pixel at the appropriate i:

LMeMECoP^{k}(g_c) = \sum_{i=1}^{Q} 2^{(i-1)} \times \big(LMeME^{k}_{i}(g_c) * LMeME^{k}_{i+1}(g_c)\big)    (14)

where x * y in Eq. (14) denotes the co-occurrence of the pattern bits x and y (neighbor indices are taken modulo Q).

Applying the above operators to a given image converts it into LMeMECoP maps with values from 0 to 255. After computing the LMeMECoP, the entire map is represented by a histogram:

H_{LMeMECoP}(\ell) = \frac{1}{M \times N} \sum_{m=1}^{M} \sum_{n=1}^{N} f_2\big(LMeMECoP(m, n), \ell\big), \quad \ell \in [0, 255]    (15)

where f_2(x, y) = 1 if x = y and 0 otherwise, and M × N is the size of the image.
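To make the above steps concrete, the following is a minimal per-pixel sketch under our reading of Eqs. (9)-(14); in particular the magnitude ordering in Eq. (11) and the adjacent-bit co-occurrence in Eq. (14) are assumptions, and all names are ours (Python/NumPy).

import numpy as np

def lmemecop_code(block3x3, k=1):
    """LMeMECoP code of one 3x3 neighborhood for dissimilarity index k (Eqs. 9-14)."""
    g = np.array([block3x3[0, 0], block3x3[0, 1], block3x3[0, 2], block3x3[1, 2],
                  block3x3[2, 2], block3x3[2, 1], block3x3[2, 0], block3x3[1, 0]], dtype=float)
    Q = len(g)
    alpha = [(i + Q + k - 1) % Q for i in range(1, Q + 1)]           # Eq. (10), as a 0-based index
    diffs = np.array([g[a] - g[i] for i, a in enumerate(alpha)])     # Eq. (9): neighbor dissimilarities
    order = np.argsort(-np.abs(diffs))                               # Eq. (11): maximum edges first
    bits = [1 if diffs[j] >= 0 else 0 for j in order]                # Eqs. (12)-(13): LMeME bits
    # Eq. (14): co-occurrence of adjacent bits, weighted by 2^(i-1); result lies in [0, 255] for Q = 8
    return sum((bits[i] * bits[(i + 1) % Q]) << i for i in range(Q))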

Similarity measure and query matching

After feature extraction, the feature vector of the query image I is given by

f_Q = \big(f_{Q_1}, f_{Q_2}, \ldots, f_{Q_{L}}\big)    (16)

Likewise, every image in the database is represented by a feature vector

f_{DB_i} = \big(f_{DB_{i1}}, f_{DB_{i2}}, \ldots, f_{DB_{iL}}\big), \quad i = 1, 2, \ldots, |DB|    (17)

The aim is to select the m images that best resemble the query image, i.e., the m most closely matched images according to the distance between the query image and each database image. For matching, the following similarity distance metric is used [30]:

D(Q, DB_i) = \sum_{k=1}^{L} \left| \frac{f_{DB_{ik}} - f_{Q_k}}{1 + f_{DB_{ik}} + f_{Q_k}} \right|    (18)

where f_{DB_{ik}} is the k-th feature of the i-th database image, L is the length of the feature vector and |DB| is the number of images in the database.
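A minimal sketch of the distance of Eq. (18) and of selecting the m closest database images follows (Python/NumPy; function names are ours).

import numpy as np

def d1_distance(f_query, f_db):
    """Similarity distance of Eq. (18) between the query feature vector and one database vector."""
    f_query = np.asarray(f_query, dtype=float)
    f_db = np.asarray(f_db, dtype=float)
    return np.sum(np.abs((f_db - f_query) / (1.0 + f_db + f_query)))

def retrieve(f_query, db_features, m=10):
    """Indices of the m database images with the smallest distance to the query."""
    dists = np.array([d1_distance(f_query, f) for f in db_features])
    return np.argsort(dists)[:m]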

Image Retrieval System framework

This research presents a novel feature descriptor, LMeMECoP (local mesh maximum edge co-occurrence patterns), and combines it with color features for color-based image indexing and retrieval applications.

The algorithm of the proposed image retrieval system is defined below.

Proposed Image Retrieval Algorithm

Input: an image; Output: the retrieval result.

Step 1: Load the color image and convert it into gray scale (g) and into HSV.
Step 2: For a specified center pixel, gather the neighbors from the gray-scale image.
Step 3: Compute the local dissimilarity between the neighbors.
Step 4: Gather the maximum edge bits from the local differences and code each bit as 1 or 0 on the basis of its sign.
Step 5: Apply Steps 2 to 4 to every pixel, taking each pixel in turn as the source pixel.
Step 6: Form the co-occurrence pattern using Eq. (14).
Step 7: Build the histogram of the LMeMECoP map for each dissimilarity index k.
Step 8: Compute the HSV histograms for the H, S and V channels of the color image.
Step 9: Build the feature vector by integrating all the histograms.
Step 10: Compare the query image with the images in the database using Eq. (18).
Step 11: Retrieve the images on the basis of the best matches.
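A compact sketch of this end-to-end pipeline, reusing the helper functions sketched in the earlier sections (hsv_histograms, lmemecop_code, d1_distance, retrieve), is given below; the map construction, the choice of dissimilarity indices k and the bin sizes are our assumptions, not the paper's exact settings (Python with OpenCV and NumPy).

import cv2
import numpy as np

def lmemecop_histogram(gray, k=1, levels=256):
    """Histogram of the LMeMECoP map of a gray-scale image for dissimilarity index k (Steps 2-7)."""
    H, W = gray.shape
    codes = [lmemecop_code(gray[r - 1:r + 2, c - 1:c + 2].astype(float), k)
             for r in range(1, H - 1) for c in range(1, W - 1)]
    hist = np.bincount(np.array(codes, dtype=int), minlength=levels).astype(float)
    return hist / (hist.sum() + 1e-12)

def extract_feature(bgr_image, ks=(1, 2, 3)):
    """Steps 1-9: LMeMECoP texture histograms for each k concatenated with the HSV color histograms."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    texture = np.concatenate([lmemecop_histogram(gray, k) for k in ks])
    color = hsv_histograms(bgr_image)
    return np.concatenate([texture, color])

def run_query(bgr_query, db_images, m=10):
    """Steps 10-11: rank the database images by the distance of Eq. (18) and keep the m best."""
    f_q = extract_feature(bgr_query)
    db_feats = [extract_feature(img) for img in db_images]
    return retrieve(f_q, db_feats, m)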

EXPERIMENTAL RESULTS

The effectiveness of the proposed work is tested by running experiments on the following benchmark databases:

i. Corel-5K
ii. Corel-10K
iii. MIT VisTex database

For Experiments 1 and 2, the Corel database has been used [36]. The Corel database contains a wide variety of content, ranging from animals to outdoor sports, in natural images. The images have been classified into different categories, each of size 100, by domain professionals. Some researchers consider that the Corel database fulfils all the requirements for assessing an image retrieval system because of its large size and diverse content. In each experiment, every image in the database is used as a query image. For each query, the system retrieves the database images with the smallest matching distance, measured using Eq. (18). If a retrieved image belongs to the same category as the query image, the system is considered to have correctly recognized the expected image; otherwise, the system has failed to find the expected image.

For the evaluation of the proposed method, the performance metrics precision, average retrieval precision (ARP), recall and average retrieval recall (ARR) are used. For a query image I_q they are defined as:

i. Precision: P(I_q) = (number of relevant images retrieved) / (total number of images retrieved)

ii. Average retrieval precision: ARP = (1 / |DB|) \sum_{i=1}^{|DB|} P(I_i)

iii. Recall: R(I_q) = (number of relevant images retrieved) / (total number of relevant images in the database)

iv. Average retrieval recall: ARR = (1 / |DB|) \sum_{i=1}^{|DB|} R(I_i)

where |DB| is the number of images in the database.
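For one query, these measures reduce to simple counts over the retrieved list; a minimal sketch, assuming that each image carries a category label and that every category contains 100 images as in the Corel sets, is shown below (Python; names are ours).

def precision_recall(retrieved_labels, query_label, category_size=100):
    """Precision and recall of one query, computed from the labels of the retrieved images."""
    relevant = sum(1 for lab in retrieved_labels if lab == query_label)
    precision = relevant / len(retrieved_labels)      # relevant retrieved / total retrieved
    recall = relevant / category_size                 # relevant retrieved / relevant in the database
    return precision, recall

def average_metric(per_query_values):
    """ARP or ARR: mean of the per-query precision or recall over all query images."""
    return sum(per_query_values) / len(per_query_values)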

Database 1: Corel-5K Database

The Corel-5K database consists of 5000 natural images from 50 different categories, each containing 100 images. The 50 categories cover natural images such as people, animals, buses, beaches and so on. The performance of the proposed image retrieval system is evaluated in terms of precision, recall, ARP and ARR.

Table I reports the average precision and recall of the existing methods and the proposed method on the Corel-5K and Corel-10K databases. The category-wise performance of the different methods on the Corel-5K database is shown in Figures 6(a) and 6(b), respectively. The performance of the different methods in terms of ARP and ARR on the Corel-5K database is shown in Figures 7(a) and 7(b), respectively.

Table I: Precision and recall of different methods on the Corel-5K and Corel-10K databases (MDLEP: DLEP+MDLEP; BLK_LBP: block-based LBP; proposed method: LMeMECoP)

Database   Measure         CS_LBP  LEPSEG  LEPINV  BLK_LBP  LBP    DLEP   MDLEP  LOCTP  LMECoP  LMeMECoP
Corel-5K   Precision (%)   32.9    41.5    35.1    45.7     43.6   48.8   54.4   56.4   58.7    62.2
Corel-5K   Recall (%)      14.0    18.3    14.8    20.3     19.2   21.1   24.1   26.1   28.5    30.2
Corel-10K  Precision (%)   26.4    34.0    28.9    38.1     37.6   40.0   45.4   47.3   __      52.9
Corel-10K  Recall (%)      10.1    13.8    11.2    15.3     14.9   15.7   18.4   20.3   __      22.9


From Table I, Figure 6 and Figure 7, the following points are observed:

i. In terms of ARP on the Corel-5K database, LMeMECoP gives improvements of 29.3, 27.1, 20.7, 18.6, 16.5, 13.4, 7.8 and 5.8 percentage points over CS-LBP, LEPINV, LEPSEG, LBP, BLK_LBP, DLEP, MDLEP and LOCTP, respectively.

ii. In terms of ARR on the Corel-5K database, LMeMECoP gives improvements of 16.2, 15.4, 11.9, 11.0, 9.9, 9.1, 6.1 and 4.1 percentage points over CS-LBP, LEPINV, LEPSEG, LBP, BLK_LBP, DLEP, MDLEP and LOCTP, respectively.

It is observed that the proposed method shows a major improvement over the state-of-the-art techniques on these measures on the Corel-5K database. Figure 8 shows example query retrieval results of LMeMECoP on the Corel-5K database.

Database 2: Corel-10K Database

The Corel-10K database consists of 10,000 natural images from 100 different categories, each containing 100 images. The performance of the proposed image retrieval system is again evaluated in terms of precision, recall, ARP and ARR.

The category-wise average precision and recall of the different methods on the Corel-10K database are shown in Figures 9(a) and 9(b), respectively. The performance of the different methods in terms of ARP and ARR on the Corel-10K database is shown in Figures 10(a) and 10(b), respectively. From Table I, Figure 9 and Figure 10 it is concluded that the proposed method shows an improvement over the state-of-the-art methods on the Corel-10K database.

Figure 6: Comparison of LMeMECoP with different existing methods on the Corel-5K database: (a) category-wise performance for precision, (b) category-wise performance for recall


Figure 7: Comparison of different methods in terms of ARP and ARR on the Corel-5K database


Figure 8: Image retrieval example by LMeMECoP on the Corel-5K database


Figure 9: Comparison of LMeMECoP with different existing methods on the Corel-10K database: (a) category-wise performance for precision, (b) category-wise performance for recall

Figure 10: Comparison of various methods in terms of ARP and ARR on the Corel-10K database


Figure 11: Comparison of the proposed method with existing methods in terms of ARR on the MIT VisTex database

Database 3: MIT VisTex Database

The MIT VisTex database contains 40 different textures [37]. Each 512×512 texture is divided into sixteen 128×128 non-overlapping sub-images, giving a database of 640 (40×16) images.

The performance of the proposed LMeMECoP method and of the other methods, as a function of the number of top matches considered, is depicted in Figure 11. The proposed method performs better than the other existing methods in terms of the average retrieval rate on the MIT VisTex database.

CONCLUSION

In this research, a novel descriptor, LMeMECoP (local mesh maximum edge co-occurrence patterns), has been proposed for feature extraction. The presented descriptor first gathers the local maximum edge information among the possible neighbors of a given reference (center) pixel. Binary bits are then produced from the extracted maximum edge information, and the co-occurrence binary bits are collected for the given center pixel from the maximum edge responses. The performance has been evaluated on three databases, Corel-5K, Corel-10K and MIT VisTex, in terms of precision, recall, ARP (average retrieval precision) and ARR (average retrieval rate). The results show an improvement over existing image retrieval methods.

ACKNOWLEDGEMENT

One of the authors, Harpreet Kaur, thanks I.K. Gujral Punjab Technical University (PTU), Kapurthala, India, for its help and support.

REFERENCES

[1]. Vipparthi, S. K., & Nagar, S. K. (2014). "Multi-joint histogram based modelling for image indexing and retrieval", Computers & Electrical Engineering, 40(8), 163-173.

[2]. Subrahmanyam, M., Wu, Q. J., Maheshwari, R. P., & Balasubramanian, R. (2013). "Modified color motif co-occurrence matrix for image indexing and retrieval", Computers & Electrical Engineering, 39(3), 762-774.

[3]. Fathima, A. A., Vasuhi, S., Babu, N. T., Vaidehi, V., & Treesa, T. M. (2014). "Fusion Framework for Multimodal Biometric Person Authentication System", IAENG International Journal of Computer Science, 41(1).

[4]. Lan, M. H., Xu, J., & Gao, W. (2016). "Ontology feature extraction via vector learning algorithm and applied to similarity measuring and ontology mapping", IAENG International Journal of Computer Science, 43(1), 10-19.

[5]. Poon, B., Amin, M. A., & Yan, H. (2016). "PCA Based Human Face Recognition with Improved Methods for Distorted Images due to Illumination and Color Background", IAENG International Journal of Computer Science, 43(3).

[6]. Xia, Z., Yuan, C., Sun, X., Sun, D., & Lv, R. (2016). "Combining wavelet transform and LBP related features for fingerprint liveness detection", IAENG International Journal of Computer Science, 43(3), 290-298.

[7]. Manjunath, B. S., & Ma, W. Y. (1996). "Texture features for browsing and retrieval of image data", IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(8), 837-842.

[8]. Kokare, M., Biswas, P. K., & Chatterji, B. N. (2007). "Texture image retrieval using rotated wavelet filters", Pattern Recognition Letters, 28(10), 1240-1249.

[9]. Moghaddam, H. A., Khajoie, T. T., & Rouhi, A. H. (2003). "A new algorithm for image indexing and retrieval using wavelet correlogram", In Proceedings of the 2003 International Conference on Image Processing (ICIP 2003), Vol. 3, pp. III-497. IEEE.

[10]. Saadatmand, M. T., & Moghaddam, H. A. (2005). "Enhanced wavelet correlogram methods for image indexing and retrieval", In Proceedings of the 2005 IEEE International Conference on Image Processing (ICIP 2005), Vol. 1, pp. I-541. IEEE.

[11]. Birgale, L., Kokare, M., & Doye, D. (2006). "Colour and texture features for content based image retrieval", In Proceedings of the 2006 International Conference on Computer Graphics, Imaging and Visualisation, pp. 146-149. IEEE.

[12]. Murala, S., Gonde, A. B., & Maheshwari, R. P. (2009). "Color and texture features for image indexing and retrieval", In Proceedings of the 2009 IEEE International Advance Computing Conference (IACC 2009), pp. 1411-1416. IEEE.

[13]. Subrahmanyam, M., Maheshwari, R. P., & Balasubramanian, R. (2011). "A correlogram algorithm for image indexing and retrieval using wavelet and rotated wavelet filters", International Journal of Signal and Imaging Systems Engineering, 4(1), 27-34.

[14]. Ojala, T., Pietikäinen, M., & Harwood, D. (1996). "A comparative study of texture measures with classification based on featured distributions", Pattern Recognition, 29(1), 51-59.

[15]. Ojala, T., Pietikäinen, M., & Mäenpää, T. (2002). "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns", IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(7), 971-987.

[16]. Pietikäinen, M., Ojala, T., & Xu, Z. (2000). "Rotation-invariant texture classification using feature distributions", Pattern Recognition, 33(1), 43-52.

[17]. Ahonen, T., Hadid, A., & Pietikäinen, M. (2006). "Face description with local binary patterns: Application to face recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(12), 2037-2041.

[18]. Zhao, G., & Pietikäinen, M. (2007). "Dynamic texture recognition using local binary patterns with an application to facial expressions", IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(6), 915-928.

[19]. Heikkilä, M., & Pietikäinen, M. (2006). "A texture-based method for modeling the background and detecting moving objects", IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(4), 657-662.

[20]. Huang, X., Li, S. Z., & Wang, Y. (2004). "Shape localization based on statistical method using extended local binary pattern", In Proceedings of the Third International Conference on Image and Graphics (ICIG'04), pp. 184-187. IEEE.

[21]. Heikkilä, M., Pietikäinen, M., & Schmid, C. (2009). "Description of interest regions with local binary patterns", Pattern Recognition, 42(3), 425-436.

[22]. Li, M., & Staunton, R. C. (2008). "Optimum Gabor filter design and local binary patterns for texture segmentation", Pattern Recognition Letters, 29(5), 664-672.

[23]. Zhang, B., Gao, Y., Zhao, S., & Liu, J. (2010). "Local derivative pattern versus local binary pattern: face recognition with high-order local pattern descriptor", IEEE Transactions on Image Processing, 19(2), 533-544.

[24]. Takala, V., Ahonen, T., & Pietikäinen, M. (2005). "Block-based methods for image retrieval using local binary patterns", Image Analysis, 13-181.

[25]. Heikkilä, M., Pietikäinen, M., & Schmid, C. (2009). "Description of interest regions with local binary patterns", Pattern Recognition, 42(3), 425-436.


[26]. Yao, C. H., & Chen, S. Y. (2003). "Retrieval of translated, rotated and scaled color textures", Pattern Recognition, 36(4), 913-929.

[27]. Tan, X., & Triggs, B. (2010). "Enhanced local texture feature sets for face recognition under difficult lighting conditions", IEEE Transactions on Image Processing, 19(6), 1635-1650.

[28]. Subrahmanyam, M., Maheshwari, R. P., & Balasubramanian, R. (2012). "Local maximum edge binary patterns: a new descriptor for image retrieval and object tracking", Signal Processing, 92(6), 1467-1479.

[29]. Murala, S., Maheshwari, R. P., & Balasubramanian, R. (2012). "Local tetra patterns: a new feature descriptor for content-based image retrieval", IEEE Transactions on Image Processing, 21(5), 2874-2886.

[30]. Murala, S., Maheshwari, R. P., & Balasubramanian, R. (2012). "Directional local extrema patterns: a new descriptor for content based image retrieval", International Journal of Multimedia Information Retrieval, 1(3), 191-203.

[31]. Murala, S., Maheshwari, R. P., & Balasubramanian, R. (2012). "Directional binary wavelet patterns for biomedical image indexing and retrieval", Journal of Medical Systems, 36(5), 2865-2879.

[32]. Murala, S., & Wu, Q. J. (2014). "Local mesh patterns versus local binary patterns: biomedical image indexing and retrieval", IEEE Journal of Biomedical and Health Informatics, 18(3), 929-938.

[33]. Murala, S., & Wu, Q. J. (2013). "Local ternary co-occurrence patterns: a new feature descriptor for MRI and CT image retrieval", Neurocomputing, 119, 399-412.

[34]. Reddy, P. V. B., & Reddy, A. R. M. (2014). "Content based image indexing and retrieval using directional local extrema and magnitude patterns", AEU-International Journal of Electronics and Communications, 68(7), 637-643.

[35]. Jacob, I. J., Srinivasagan, K. G., & Jayapriya, K. (2014). "Local oppugnant color texture pattern for image retrieval system", Pattern Recognition Letters, 42, 72-78.

[36]. Corel 1000 and Corel 10000 image database. [Online]. Available: http://wang.ist.psu.edu/docs/related.shtml

[37]. MIT Vision and Modeling Group, Vision Texture. [Online]. Available: http://vismod.www.media.mit.edu

[38]. Murala, S., & Wu, Q. J. (2014). "Local mesh patterns versus local binary patterns: biomedical image indexing and retrieval", IEEE Journal of Biomedical and Health Informatics, 18(3), 929-938.

[39]. Kaur, H., & Dhir, V. (2016). "Local maximum edge co-occurrence pattern for image retrieval", IEEE Xplore.