Hindawi Publishing Corporation
BioMed Research International
Volume 2013, Article ID 481431, 14 pages
http://dx.doi.org/10.1155/2013/481431

Research Article
Optimized Periocular Template Selection for Human Recognition

Sambit Bakshi, Pankaj K. Sa, and Banshidhar Majhi

Department of Computer Science and Engineering, National Institute of Technology Rourkela, Odisha 769008, India

Correspondence should be addressed to Sambit Bakshi; [email protected]

Received 8 April 2013; Revised 30 June 2013; Accepted 7 July 2013

Academic Editor: Tatsuya Akutsu

Copyright © 2013 Sambit Bakshi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A novel approach for selecting a rectangular template around the periocular region that is optimally potent for human recognition is proposed. A template of the periocular image larger than the optimal one can be slightly more potent for recognition, but the larger template heavily slows down the biometric system by making feature extraction computationally intensive and increasing the database size. A smaller template, on the contrary, cannot yield desirable recognition, though the smaller template performs faster due to low computation for feature extraction. These two contradictory objectives (namely, (a) to minimize the size of the periocular template and (b) to maximize the recognition through the template) are aimed to be optimized through the proposed research. This paper proposes four different approaches for dynamic optimal template selection from the periocular region. The proposed methods are tested on the publicly available unconstrained UBIRISv2 and FERET databases, and satisfactory results have been achieved. The template thus obtained can be used for recognition of individuals in an organization and can be generalized to recognize every citizen of a nation.

1. Introduction

A biometric system comprises a physical or behavioral trait of a person through which he or she can be recognized uniquely. Computer-aided identification of a person through the face biometric has grown in importance through the last decade, and researchers have attempted to find unique facial nodal points. However, change of facial data with expression and age makes recognition through the face challenging. A stringent necessity to identify a person on partial facial data has been felt in such scenarios. There are forensic applications where the antemortem information is a partial face. These motives led researchers to derive auxiliary biometric traits from the facial image, namely, iris, ear, lip, and periocular region. Recognizing humans through the iris captured under near-infrared (NIR) illumination and constrained scenarios yields satisfactory recognition accuracy, while recognition under the visual spectrum (VS) and unconstrained scenarios is relatively challenging. In particular, the VS periocular image has been exploited to examine its uniqueness, as there exist many nodal points. Classification and recognition through the periocular region show significant accuracy, given the fact that the periocular biometric uses only approximately 10% of the complete face data (illustrated in Section 4.1). Figure 1 illustrates the working model of a biometric system that employs the region around the eye (periocular region) as a trait for recognition. Face is one of the primitive means of human recognition.

The periocular (peripheral area of ocular) region refers to the immediate vicinity of the eye, including the eyebrow and lower eye fold, as depicted in Figure 2. Face recognition has been the main attention of biometric researchers due to its ease of unconstrained acquisition and its uniqueness. The face is proven to have approximately 18 feature points [1] which can comprise the formation of a unique template for authentication. The major challenges in face detection faced by researchers are due to the change of the human face with age, expression, and so forth. With the advent of low-cost hardware to fuse multiple biometrics in real time, the emphasis began to extract a subset of the face which can partially resolve the aforementioned issues listed in Table 1. Hence the investigation towards ear, lip, and periocular region has started gaining priority. Furthermore, capturing an eye or face image automatically acquires a periocular image. This gives the




Figure 1: Working model of the periocular biometric system. (The pipeline comprises a sensor, localization and preprocessing module, local feature extraction module with key point selection and descriptor generation, a matching module with final score generation against a threshold to decide genuine versus imposter, and a database module serving enrolment and recognition.)

Figure 2: Important features from a periocular image: eyebrow, upper eye fold, upper eyelid, blood vessels, tear duct, inner corner, lower eyelid, lower eye fold, outer corner, and skin texture and color.

flexibility of recognizing an individual using the periocular data along with iris data without extra storage or acquisition cost. Moreover, periocular features can be used when an iris image does not contain subtle details, which mostly occurs due to poor image quality. The periocular biometric also comes into play as a candidate for fusion with the face image for better recognition accuracy.

This paper approaches fitting an optimal boundary to the periocular region which is sufficient and necessary for recognition. Unlike other biometric traits, edge information is not the required criterion to exactly localize the periocular region. Rather, the periocular region can be localized where the periphery of the eye contains no further information. Researchers have considered a static rectangular boundary around the eye to recognize humans and termed the localized rectangle as the periocular region. However, this approach is naive, as the same static boundary does not work for every face image (e.g., when the face image is captured at different distances from the camera, or when there is a tilt of the face or camera during acquisition). So there is a need for deriving a dynamic boundary to describe the periocular region. While deciding the periocular boundary, the objective of achieving the highest recognition accuracy also needs to be maintained. The paper specifies a few metrics through which the periocular region can be optimally localized in a scale- and rotation-invariant manner.

The rest of the paper is organized as follows. Section 2 describes the landmark works in the direction of recognition and classification through the periocular region, and the need for optimizing the periocular region considered for recognition is analyzed in Section 3. In Section 4, four methods of template optimization are described, and subsequently Section 5 records the experimental results obtained to establish the proposed methods. Finally, Section 6 concludes by describing the decided periocular template which is optimal for human recognition and marks its importance for recognition from a large database.

2. Literature Review

Investigations have been made by researchers in the direction of localizing the iris from high-quality constrained eye images captured in NIR illumination. Table 2 summarizes the


Table 1: Comparison of biometric traits present in the human face.

Iris
  Advantages: high-dimensional features can be extracted; difficult to spoof; permanence of the iris, secured within the eye folds; can be captured in a noninvasive way.
  Possible challenges: yields better accuracy in NIR images than VS images; cost of NIR acquisition device is high; low recognition accuracy in unconstrained scenarios; low recognition accuracy for low resolution; occlusion due to use of lens; eye may close at the time of capture; does not work for keratoconus and keratitis patients.

Face
  Advantages: easy to acquire; yields accuracy in VS images; most available in criminal investigations.
  Possible challenges: not socially acceptable for some religions; full face image makes the database large; variation with expression and age.

Periocular
  Advantages: can be captured with the face/iris region without extra acquisition cost.
  Possible challenges: can be occluded by spectacles; fewer features in the case of infants.

Lip
  Advantages: existence of both global and local features.
  Possible challenges: difficult to acquire; less socially acceptable; shape changes with human expression.

Ear
  Advantages: easy segmentation due to presence of contrast in the vicinity.
  Possible challenges: difficult to acquire; can be partially occluded by hair.

comparative study of accuracy obtained by a few benchmark iris localization techniques. The results conclude that high localization accuracy has been achieved for NIR iris images. Several global and local matching techniques have been applied for matching NIR iris images, and researchers have obtained high accuracy. However, when it comes to recognizing a person only through his iris image captured under the visible spectrum, the results have been observed to be unsatisfactory. So researchers have been motivated to take into account not only the iris but also its peripheral regions while recognizing visible spectrum images.

The task of recognition is more challenging than classification and hence draws more attention. The most commonly used feature extraction techniques in the context of periocular recognition are the Scale Invariant Feature Transform and the Local Binary Pattern. Tables 3 and 4 outline the methods used and the performance obtained towards periocular classification and recognition in visual spectrum images, respectively. However, the portion of the eye on which they are applied is not computationally justified in the literature. Any arbitrary rectangular portion centering the eye has been taken into account without questioning the following.

(a) Will the accuracy obtained from this arbitrary boundary increase if a larger region is considered?

(b) How much of the considered periocular region is actually contributing to recognition?

(c) Is there any portion within this arbitrarily considered periocular region which can be removed while still achieving comparable accuracy?

The derivation of an optimal dynamic periocular region gives a simultaneous solution to the aforementioned questions.

3. Why an Optimal Template for the Periocular Region Is Required

Unlike other biometric traits, the periocular region has no boundary defined by any edge information. Hence the periocular region cannot be detected through differential change in pixel value in different directions. Rather, the location of the boundary is the region which is smooth in terms of pixel intensity, that is, a region with no information. The authors of [2] have localized the periocular region statically by taking a rectangle having dimension 6R_iris × 4R_iris centering the iris, where R_iris denotes the radius of the iris. But this localization method fails when the eye is tilted or the gaze is not frontal. Moreover, the method presumes the location of the iris center to be accurately detectable. However, the iris center cannot be detected for some eye images due to the low-resolution nature of the image.
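For concreteness, the static boundary of [2] amounts to a fixed crop around the iris center. The sketch below (Python/NumPy; the helper name and the border-clipping behavior are our assumptions, not code from [2]) illustrates one failure mode of such a fixed rectangle.

```python
import numpy as np

def static_periocular_crop(image, iris_center, iris_radius):
    """Hypothetical sketch of the static 6*R_iris x 4*R_iris crop of [2].

    image: 2-D gray (or 3-D color) array; iris_center: (row, col) of the
    iris center; iris_radius: R_iris in pixels.  The crop is clipped at
    the image border, so an off-center or tilted eye yields a truncated,
    non-representative template -- the failure mode discussed above.
    """
    cy, cx = iris_center
    half_w = 3 * iris_radius            # width  = 6 * R_iris
    half_h = 2 * iris_radius            # height = 4 * R_iris
    top, bottom = max(cy - half_h, 0), min(cy + half_h, image.shape[0])
    left, right = max(cx - half_w, 0), min(cx + half_w, image.shape[1])
    return image[top:bottom, left:right]
```

For a 200 × 300 image with the iris centered at (100, 150) and R_iris = 20, the crop is the full 80 × 120 rectangle; move the center toward a border and the template silently shrinks.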

The objective of the paper is to attain a dynamic boundary around the eye that defines the periocular region. The region hence derived should have the following properties: (a) it should be able to recognize humans uniquely, (b) it should be achievable for low-quality VS images, (c) it should contain the main features of the eye region identifiable by a human being, and (d) no subset of the derived periocular region should be equally potent as the derived region for recognition.

The optimally selected periocular template can be a template to hold the identity of an individual. If such a template can be generated for the whole nation, it can serve as an authorized identity (i.e., a biometric passport [23]) of every citizen of the nation.

4. Proposed Periocular Template Selection Methods

To achieve the above stated properties, four different dynamic models are proposed through which the periocular region can be segmented out. These models are based on (a) human anthropometry, (b) the demanded accuracy of the biometric system, (c) human expert judgement, and (d) a subdivision approach.

4.1. Through Human Anthropometry. In a given face image, the face can be extracted by neural training of the system or by fast color-segmentation methods. The color-segmentation methods detect the skin region in the image and find the


Table 2: Performance comparison of some benchmark NIR iris localization approaches.

2002, Camus and Wildes [3]: multiresolution coarse-to-fine strategy; constrained iris images (640 without glasses, 30 with glasses); overall 98% (99.5% for subjects without glasses and 66.6% for subjects wearing glasses).

2004, Sung et al. [4]: bisection method, Canny edge-map detector, and histogram equalization; 3176 images acquired through a CCD camera; 100% for the inner boundary and 94.5% for the collarette boundary.

2004, Bonney et al. [5]: least significant bit plane and standard deviations; 108 images from CASIAv1 and 104 images from UNSA; pupil detection 99.1% and limbic detection 66.5%.

2005, Liu et al. [6]: modification to Masek's segmentation algorithm; 317 gallery and 4249 probe images acquired using the Iridian LG 2200 iris imaging system; 97.08% rank-1 recognition.

2006, Proenca and Alexandre [7]: moment functions dependent on fuzzy clustering; 1214 good-quality and 663 noisy images from 241 subjects in two sessions; 98.02% on the good data set and 97.88% on the noisy data set.

2008, Pundlik et al. [8]: Markov random field and graph cut; WVU nonideal database; pixel label error rate 5.9%.

2009, He et al. [9]: Adaboost-cascade iris detector for iris center prediction; NIST Iris Challenge Evaluation (ICE) v1.0, CASIA-Iris-V3-lamp, UBIRISv1.0; 0.53% EER for ICEv1.0 and 0.75% EER for CASIA-Iris-V3-lamp.

2010, Liu et al. [10]: K-means cluster; CASIAv3 and UBIRISv2; 0.19% false positive and 2.13% false negative (on a fresh data set not used to tune the system).

2010, Tan et al. [11]: gray distribution features and gray projection; CASIAv1; 99.14% accuracy (processing time 0.484 s/image).

2011, Bakshi et al. [12]: image morphology and connected component analysis; CASIAv3; 95.76% accuracy (processing time 0.396 s/image).

Table 3: Survey on classification through periocular biometric.

Abiantun and Savvides [13]: left versus right eye; Adaboost, Haar/Gabor features, LDA, SVM; ICE; 89.95%.

Bhat and Savvides [14]: left versus right eye; ASM, SVM; ICE, LG; left eye 91%, right eye 89%.

Merkow et al. [15]: gender; LBP, LDA, SVM, PCA; images downloaded from the web; 84.9%.

Lyle et al. [16]: gender and ethnicity; LBP, SVM; FRGC; gender 93%, ethnicity 91%.

connected components in such a region. Among the connected components having skin color, the system labels the largest component as the face. Algorithm 1 proposes a binary-component-analysis-based skin detection. The thresholds are experimentally fitted to obtain the highest accuracy in segmenting the skin region in face images comprising different skin tones. The algorithm takes an RGB face image as input. It first converts the face image to the YCbCr color space and normalizes the pixel values. In the next step, the average luminance value is calculated by summing up the Y component values of each pixel and dividing by the total number of pixels in the image. A brightness-compensated image is generated depending on the value of the average luminance, as specified in the algorithm. In the obtained brightness-compensated image, a compound condition is applied, and a thresholding is performed to finally obtain the skin map. Through connected component analysis of the skin map in the YCbCr color space, the open eye region can be obtained, as explained in Algorithm 2. The reason for segmenting the open eye region is to obtain the non-skin region within the detected face which can be labeled as eye, and thus to achieve the approximate location of the eye center.

Once the eye region is detected, the iris center can be obtained using conventional pupil detection and the integrodifferential approach for finding the iris boundary, and a static boundary can be fitted. As described earlier, the authors of [2] bounded the periocular region with a 6R_iris × 4R_iris rectangle centering the iris center. But no justification is produced in the paper regarding the empirically taken height and width of this periocular boundary. This process of finding the periocular


Table 4: Survey on recognition through periocular biometric.

2010, Hollingsworth et al. [17]: human analysis of the eye region; NIR images of 120 subjects; accuracy of 92%.

2010, Woodard et al. [18]: LBP fused with iris matching (skin); MBGC NIR images from 88 subjects; left-eye rank-1 recognition rate: iris 13.8%, periocular 92.5%, both 96.5%; right-eye rank-1 recognition rate: iris 10.1%, periocular 88.7%, both 92.4%.

2010, Miller et al. [19]: LBP (color information, skin texture); FRGC, neutral expression, different session: rank-1 rate periocular 94.10%, face 94.38%; FRGC, alternate expression, same session: rank-1 rate periocular 99.50%, face 99.75%; FRGC, alternate expression, different session: rank-1 rate periocular 94.90%, face 90.37%.

2010, Miller et al. [20]: LBP, city block distance (skin); FRGC VS images from 410 subjects: rank-1 rate left eye 84.39%, right eye 83.90%, both eyes 89.76%; FERET VS images from 54 subjects: rank-1 rate left eye 72.22%, right eye 70.37%, both eyes 74.07%.

2010, Adams et al. [21]: LBP, GE to select features (skin); FRGC VS images from 410 subjects: rank-1 rate left eye 86.85%, right eye 86.26%, both eyes 92.16%; FERET VS images from 54 subjects: rank-1 rate left eye 80.25%, right eye 80.80%, both eyes 85.06%.

2011, Woodard et al. [22]: LBP, color histograms (skin); FRGC, neutral expression, different session: rank-1 rate left eye 87.1%, right eye 88.3%, both eyes 91.0%; FRGC, alternate expression, same session: rank-1 rate left eye 96.8%, right eye 96.8%, both eyes 98.3%; FRGC, alternate expression, different session: rank-1 rate left eye 87.1%, right eye 87.1%, both eyes 91.2%.

boundary has the prerequisite of knowledge of the coordinates of the iris center and the radius of the iris.

Anthropometric analysis [24] of the human face and eye region gives information regarding the ratio of eye and iris and the ratio of the widths of face and eye. A typical block diagram in Figure 6 depicts the ratios of different parts of the human face with respect to the height or width of the face. From the analysis it is found that

width_periocular = width_eyebrow = 0.67 × (height_face / 2),

height_periocular = 2 × d(eyebrow, eye center)
                  = 2 × (0.21 + 0.07/2) × (width_face / 2)
                  = 0.49 × (width_face / 2),     (1)

where d(eyebrow, eye center) denotes the distance between the center of the eyebrow and the eye center. Consider

height_periocular / width_periocular = (0.49 / 0.67) × ((width_face / 2) / (height_face / 2))
                                     = 0.73 × (width_face / height_face),     (2)

area_periocular = width_periocular × height_periocular
                = 0.67 × (height_face / 2) × 0.49 × (width_face / 2)
                = 0.33 × (height_face / 2) × (width_face / 2)
                = (0.33 / π) × (π × (height_face / 2) × (width_face / 2))
                = 0.11 × area_face.     (3)


Require: I, an RGB face image of size m × n.
Ensure: S, a binary face image indicating the skin map.

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value for the pixel (i, j).
(3) Compute the average luminance value of image I as
      I_Yavg = (1 / mn) × Σ_{i=1}^{m} Σ_{j=1}^{n} I_Y(i, j).
(4) The brightness-compensated image IC′ is obtained as IC′(i, j) = (I_R′(i, j), I_G′(i, j), I_B(i, j)), where I_R′(i, j) = (I_R(i, j))^τ, I_G′(i, j) = (I_G(i, j))^τ, and
      τ = 1.5 if I_Yavg < 64; 0.7 if I_Yavg > 190; 1 otherwise.
(5) The skin map S is detected from IC′ as
      S(i, j) = 0 if (R(i, j) + 1)/(G(i, j) + 1) > 1.08 and (R(i, j) + 1)/(B(i, j) + 1) > 1.08 and G(i, j) > 30 and G(i, j) < 140; 1 otherwise,
    where S(i, j) = 0 indicates skin region and S(i, j) = 1 indicates non-skin regions.
(6) return S.

Algorithm 1: Skin detection.
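A minimal NumPy sketch of Algorithm 1 follows. It assumes a float RGB image scaled to [0, 255] and uses the standard BT.601 weights to obtain the Y (luminance) channel, since the paper does not spell out its YCbCr conversion; the compound-condition thresholds are those of the algorithm above.

```python
import numpy as np

def detect_skin(rgb):
    """Sketch of Algorithm 1.  rgb: (m, n, 3) float array in [0, 255].
    Returns a uint8 map with 0 = skin, 1 = non-skin (as in the paper)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b        # BT.601 luminance (assumed)
    y_avg = y.mean()                             # step 3: average luminance
    # step 4: brightness-compensation exponent tau
    tau = 1.5 if y_avg < 64 else 0.7 if y_avg > 190 else 1.0
    rc, gc = r ** tau, g ** tau                  # compensate R and G channels
    # step 5: compound condition marks skin pixels
    skin = ((rc + 1) / (gc + 1) > 1.08) & ((rc + 1) / (b + 1) > 1.08) \
        & (gc > 30) & (gc < 140)
    return np.where(skin, 0, 1).astype(np.uint8)
```

A skin-toned pixel such as (R, G, B) = (150, 100, 80) satisfies the compound condition, while a neutral gray pixel does not.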

Require: I, an RGB face image of size m × n; S, a binary face image indicating the skin map.
Ensure: EM, a binary face image indicating open eye regions.

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value for the pixel (i, j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat Steps 6 to 8.
(6) EB_p = 0.
(7) For each pixel (i, j) in EM_p, the value of EB_p is updated as
      EB_p = EB_p + 0 if 65 < I_Y(i, j) < 80; EB_p + 1 otherwise.
(8) If EB_p = number of pixels in EM_p, then EM = EM − EM_p (removal of the p-th connected component).
(9) return EM.

Algorithm 2: Open eye detection.
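In effect, Algorithm 2 keeps a non-skin component only if at least one of its pixels has luminance in the open-eye band 65 < Y < 80, since a component whose every pixel falls outside the band is removed. A sketch follows, using a plain BFS flood fill in place of whatever connected-component routine the authors used.

```python
import numpy as np
from collections import deque

def detect_open_eye(y_channel, skin_map):
    """Sketch of Algorithm 2.  y_channel: Y values normalized to [0, 255];
    skin_map: output of Algorithm 1 (0 = skin, 1 = non-skin).
    Returns a boolean map of retained (open-eye) components."""
    non_skin = skin_map == 1
    band = (y_channel > 65) & (y_channel < 80)      # open-eye luminance band
    m, n = non_skin.shape
    seen = np.zeros((m, n), dtype=bool)
    eye_map = np.zeros((m, n), dtype=bool)
    for i in range(m):
        for j in range(n):
            if non_skin[i, j] and not seen[i, j]:
                comp, queue = [], deque([(i, j)])   # flood-fill one 4-connected component
                seen[i, j] = True
                while queue:
                    a, b = queue.popleft()
                    comp.append((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        x, yy = a + da, b + db
                        if 0 <= x < m and 0 <= yy < n and non_skin[x, yy] and not seen[x, yy]:
                            seen[x, yy] = True
                            queue.append((x, yy))
                if any(band[a, b] for a, b in comp):  # keep components touching the band
                    for a, b in comp:
                        eye_map[a, b] = True
    return eye_map
```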

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the periocular region are expressed as a function of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to have knowledge of the iris radius. However, knowledge of the coordinates of the iris center is necessary. From this information, a bounding box can be fit composing all visible portions of the periocular region, for example, eyebrow, eyelashes, tear duct, eye fold, eye corner, and so forth. This approach is crude and dependent on human supervision or intelligent detection of these nodal points in the human eye.

Further, from (2) it is observable that information about either the height or the width of the periocular region is sufficient to derive the other parameter, provided that the aspect ratio of the face is known. This aspect of the localization of the periocular region is used in Section 4.2. Equation (3) considers an elliptical model to represent the face while finding the ratio of the area of the periocular region to the area of the human face. It justifies the usefulness of an optimally selected periocular template for human recognition rather than a full-face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence it is suitable for localization of the periocular region in unconstrained images, where the iris radius is not detectable by machines due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can be achieved through sclera detection for the scenario when the iris cannot be properly located due to unconstrained acquisition of the eye, or when the captured image is a low-quality color face image taken from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S component of each pixel can be determined by

      S = 1 − (3 / (R + G + B)) × min(R, G, B),     (4)

where R, G, and B denote the red, green, and blue color components of a particular pixel. Let the image hence formed, containing the S component of each pixel, be saturation_im.

(3) If S < τ, where τ is a predefined threshold, then that pixel is marked as sclera region, else as a non-sclera region. The authors in [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as follows: sclera_noisy = saturation_im < τ. Only a noisy binary map of sclera, sclera_noisy, can be found through this process, in which white pixels denote noisy sclera region and black pixels denote non-sclera region.

(4) im_bin is formed as follows: for every nonzero pixel (i, j) in sclera_noisy,

      im_bin(i, j) = average intensity of the 17 × 17 window around the pixel (i, j) in im_gray;     (5)

for every zero pixel (i, j) in sclera_noisy,

      im_bin(i, j) = 0.     (6)

(5) sclera_adaptive is formed as follows:

      sclera_adaptive(i, j) = 0 if sclera_noisy(i, j) = 1 or im_gray(i, j) < im_bin(i, j); 1 otherwise.     (7)

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, it is interpreted that the largest component is the single detected sclera, and the second largest connected component is hence removed. Else, both components are retained as the binary map of the sclera.

After processing the above specified steps, the binary image contains only one or two components describing the sclera region after noise removal.
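Steps (1)-(5) above can be sketched as follows (Python/NumPy; an RGB image in [0, 1] is assumed for the saturation computation, the grayscale image is rescaled to [0, 255], and BT.601 weights stand in for the unspecified grayscale conversion).

```python
import numpy as np

def sclera_maps(rgb, tau=0.21, win=17):
    """Sketch of Section 4.1.1, steps (1)-(5).  rgb: (m, n, 3) floats in [0, 1].
    Returns (sclera_noisy, sclera_adaptive)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    im_gray = (0.299 * r + 0.587 * g + 0.114 * b) * 255.0   # assumed conversion
    s = 1.0 - (3.0 / (r + g + b + 1e-9)) * np.minimum(np.minimum(r, g), b)  # eq. (4)
    sclera_noisy = s < tau                # low saturation -> candidate sclera
    m, n = s.shape
    h = win // 2
    im_bin = np.zeros((m, n))
    for i, j in zip(*np.nonzero(sclera_noisy)):
        window = im_gray[max(i - h, 0):i + h + 1, max(j - h, 0):j + h + 1]
        im_bin[i, j] = window.mean()      # eqs. (5)-(6): local mean at sclera pixels
    # eq. (7): 0 where the pixel is candidate sclera or darker than its local mean
    sclera_adaptive = np.where(sclera_noisy | (im_gray < im_bin), 0, 1)
    return sclera_noisy, sclera_adaptive
```

The connected-component pruning of steps (6)-(7) would then keep only the one or two largest components of the adaptive map.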

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to retrieve information about the sclera: whether the two parts of the sclera on the two sides of the iris are separately visible, only one of them is detected, or both parts of the sclera are detected as a single component.

There can be three exhaustive cases in the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is found to be two, then it is classified as the aforementioned Case b (as shown in Figures 3(a), 3(b), and 3(c)), and the two components are treated as two portions of the sclera. Else, if a single connected component is obtained, it is checked for the ratio of the length and breadth of the best fitted oriented bounding rectangle. If the ratio is greater than 1.25, then it belongs to the aforementioned Case a; else it belongs to Case c (shown in Figure 3(e)). For the aforementioned Case a, the region is subdivided into two components (through detecting the minimal cut that divides the joined sclera into two parts) as shown in Figure 3(d), and further processing is performed.
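The three-way case analysis can be sketched as below; for brevity an axis-aligned bounding box stands in for the oriented bounding rectangle of the text, and the function name is ours.

```python
import numpy as np

def classify_sclera(components):
    """Sketch of Section 4.1.2.  components: list of boolean masks, one per
    connected component of the denoised sclera map.  Returns 'a' (joined),
    'b' (two separate parts), or 'c' (one visible side)."""
    if len(components) == 2:
        return 'b'                        # two sclera parts found separately
    ys, xs = np.nonzero(components[0])
    # axis-aligned box as a stand-in for the oriented bounding rectangle
    length = max(ys.ptp(), xs.ptp()) + 1
    breadth = min(ys.ptp(), xs.ptp()) + 1
    return 'a' if length / breadth > 1.25 else 'c'   # elongated -> joined sclera
```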

4.1.3. Nodal Points Extraction from Sclera Region. Each sclera is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) the center of the sclera, (b) the center of the concave region of the sclera, and (c) the eye corner. So, in general cases where two parts of the sclera are detected, six nodal points will be detected. The method of nodal point extraction is illustrated below.

(1) Finding the Center of the Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating a pixel belonging to the sclera) is replaced by its minimum distance from any black pixel. The pixel that is farthest from all black pixels has the highest value after this transformation; that pixel is labeled as the center of the sclera.

(2) Finding the Center of the Concave Region of the Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. The midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected; the midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, the midpoints of many lines drawn in this way overlap on a single pixel. A separate matrix of the same size as the sclera image is introduced, with every pixel initially set to zero. For every valid midpoint, the value of the corresponding pixel in this new matrix is incremented. Once this process is over, more than one connected component of nonzero values may be obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having the maximum value

8 BioMed Research International

Figure 3: Results of nodal point detection through sclera segmentation; (a)–(e) sample outputs 1–5 from the UBIRISv2 database.

Figure 4: Cropped images (a)–(d) from an iris image, centered at the pupil center.

in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel that is farthest from the center of the sclera is labeled as the eye corner.
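Nodal points (1) and (3) above can be sketched with SciPy's distance transform; a simplified sketch assuming `scipy` is available, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage

def sclera_center_and_corner(mask):
    """Nodal points (1) and (3): the sclera center is the white pixel
    farthest from any black pixel (distance transform); the eye corner is
    the boundary pixel farthest from that center."""
    dist = ndimage.distance_transform_edt(mask)
    center = np.unravel_index(np.argmax(dist), mask.shape)
    # boundary = sclera pixels lost under a one-pixel erosion
    boundary = np.argwhere(mask & ~ndimage.binary_erosion(mask))
    d = np.linalg.norm(boundary - np.array(center), axis=1)
    corner = tuple(boundary[np.argmax(d)])
    return center, corner
```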

Extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained by processing them through the aforementioned nodal point extraction technique. This information can be useful in the localization of the periocular region.

4.2. Through the Demanded Accuracy of the Biometric System. Beginning with the center of the eye (the pupil center), a bounding rectangular box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped around the pupil center and the bounding size is gradually increased; the recognition accuracy for every cropped size is tested. In subsequent steps, the coverage of this bounding box is increased in steps of 3% of the diameter of the iris, and the change in accuracy is observed. After a certain number of iterations of this procedure, the bounding box reaches a portion of the periocular region where there is no more change in intensity; hence the region is low


Figure 5: Method of formation of the concave region of a binarized sclera component (chords drawn between border-pixel pairs P1–P2 and P3–P4).

Figure 6: Different ratios of portions of the face from human anthropometry.

entropic. Hence, no more local features can be extracted from this region even if the bounding box is enlarged. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy, the corresponding minimum bounding box is considered the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of the periocular template, using LBP + SIFT on subsets (50 samples of 12 subjects) of the UBIRISv2 and FERET datasets; the plot separates the eye region contributing to recognition from the region not contributing.

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow steps (2) to (4).

(2) For each image in the database, find the approximate iris location in the eye image.

(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.

(4) Find the accuracy of the system with this image size.

(5) Observe the change in accuracy with w.
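The five steps can be sketched as a saturation search; here `accuracy_of` and the tolerance `tol` are hypothetical stand-ins (the paper reads the saturation point off Figure 7 rather than using a numeric stopping rule):

```python
def optimal_template_width(images, accuracy_of, tol=0.1):
    """Steps (1)-(5): grow the crop width w from 100% to 400% of the iris
    diameter in 3% steps and stop once accuracy saturates (changes by less
    than `tol` percentage points between consecutive widths).
    `accuracy_of(images, w)` is a stand-in for the full crop/extract/match
    pipeline: crop each image to width w% of its iris diameter (height 73%
    of w) and return the recognition accuracy."""
    prev = None
    for i in range(101):                   # w = 100, 103, ..., 400
        w = 100 + 3 * i
        acc = accuracy_of(images, w)
        if prev is not None and abs(acc - prev) < tol:
            return w, acc                  # saturation reached
        prev = acc
    return w, acc
```

The synthetic curve below merely illustrates the shape of the search; its numbers are made up:

```python
curve = lambda imgs, w: min(85.64, 40 + 0.3 * w)   # toy saturating curve
w, acc = optimal_template_width(None, curve)
```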

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box; increasing the box further does not increase the accuracy. To carry out this experiment, the Local Binary Pattern (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] is employed as the feature extractor for the eye images: first LBP is applied, and the resulting image is subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is considered. To validate the experiment run on the sample, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This establishes that a subset of a large database can be employed to find the optimal template size, and the result can then be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest rectangle achieving the saturation accuracy can be used as the localization boundary of the periocular region. It is also observed


Figure 8: Change of accuracy of periocular recognition with change in size of the periocular template, using LBP + SIFT on the full UBIRISv2 and FERET datasets.

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: to make the recognition process fast.

To validate this experiment, it has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, confirm the experimental objective: there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The distributions of imposter and genuine scores are shown in Figures 9 and 10.
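A minimal sketch of the LBP-then-SIFT feature pipeline used in these experiments: only the basic 3×3 LBP encoding is implemented here in NumPy, and the subsequent SIFT stage (e.g. OpenCV's `cv2.SIFT_create`) is noted but omitted, so this is a simplification rather than the authors' exact code:

```python
import numpy as np

def lbp_3x3(gray):
    """Basic 8-neighbour Local Binary Pattern [26]: each interior pixel is
    encoded by thresholding its eight neighbours against it.  The resulting
    LBP image is what the experiment then passes to SIFT [27] for local
    descriptors (e.g. cv2.SIFT_create().detectAndCompute -- omitted here)."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]                                  # interior pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code
```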

4.3. Human Expert Judgement on the Importance of Portions of the Eye. Human expertise has been utilized to decide a sorted order of importance of the different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of average 1:1 matching time with change in size of the periocular template, using LBP + SIFT on the UBIRISv2 and FERET datasets.

detect the section of the eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and is not used for recognition. Hence, a predecision on the quality of a live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query arrives: the human expert has to verify whether the most important portion of the eye is visible in the image and guide the biometric system accordingly.

4.4. Through a Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region for UBIRISv2.

(from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence, there is no need for a human expert to verify the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy into the localization of the periocular region is the automatic detection of portions of the human eye such as the eyelid, eye corner, tear duct, lower eye fold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from the FERET database are performed to support the claim of optimality.

The anthropometry-based approach performs accurately provided skin detection and sclera detection in the eye region are proper. Sample outputs are shown in Figure 3 and are found to be correct when evaluated against ground truth.

The saturation-accuracy-based approach achieves an accuracy of more than 80% on the noisy, low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is computed when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the decision threshold changes. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region for FERET.

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is evident that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence, the ROC curves reveal that the portions of the eye lying between 200% and 300% of the diameter of the iris are largely responsible for recognition and form the feature-dense part of a periocular image. Furthermore, for a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also computed when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

d' = \frac{\sqrt{2}\,\left|\mu_{\mathrm{genuine}} - \mu_{\mathrm{imposter}}\right|}{\sqrt{\sigma^{2}_{\mathrm{genuine}} + \sigma^{2}_{\mathrm{imposter}}}},   (8)

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of the values of d′ for w = 100% to 300%, together with the insignificant change in d′ for w = 300% to 400%, also establishes the existence of a boundary between the regions contributing and not contributing to recognition.
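Eq. (8) translates directly into code; the score lists in the usage check below are made-up toy numbers, not values from the experiments:

```python
import numpy as np

def d_prime(genuine, imposter):
    """Decidability index d' of Eq. (8): separation of the genuine and
    imposter score means in pooled standard-deviation units."""
    g = np.asarray(genuine, dtype=float)
    i = np.asarray(imposter, dtype=float)
    return np.sqrt(2) * abs(g.mean() - i.mean()) / np.sqrt(g.var() + i.var())
```

A wider gap between the genuine and imposter means, relative to their spreads, yields a larger d′, which is exactly the monotone growth Table 6 reports as w increases.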

Human expert judgement was examined by Hollingsworth et al. [17], and the results are used here in the direction of optimal periocular localization. Human subjects were asked which part of the eye they feel is the most important for recognition. Most of the subjects answered that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of the publicly available testing databases.

UBIRIS v1 [30]: developed by the Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal; 1877 images; 241 subjects; resolution 800 × 600; RGB color model.
UBIRIS v2 [28]: same developer; 11102 images; 261 subjects; resolution 400 × 300; sRGB color model.
FERET v4 [29]: developed by the National Institute of Standards and Technology (NIST), Gaithersburg, Maryland; 14126 images; 1191 subjects; resolutions 768 × 512, 384 × 256, and 192 × 128; RGB color model.

Table 6: Change of the d′ index with change of cropping of the periocular region.

Width of periocular region (w):     100%   150%   200%   250%   300%   350%   400%
d′ index (UBIRISv2 dataset):        1.23   1.60   2.05   2.34   2.61   2.72   2.85
d′ index (FERET dataset):           1.19   1.55   2.01   2.29   2.53   2.66   2.69

Figure 14: Cumulative Match Characteristic (CMC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region for UBIRISv2.

for it to be a candidate for recognition; removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6. Conclusions

Recent research shows why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size: any redundant region in the template will increase the matching time but will not contribute to the accuracy of matching. Hence,

Figure 15: Cumulative Match Characteristic (CMC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region for FERET.

removal of the redundant region of the template should be accomplished before the matching procedure. As the recognition time for identification depends on the database size n, a decrease of t in the 1:1 matching time will decrease the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying the expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suited for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings: Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.




Figure 1: Working model of a periocular biometric system, comprising a sensor and preprocessing module (localization), a local feature extraction module (key point selection and descriptor generation), a database module, and a matching module (match generation, final score generation, and thresholding of the score into genuine or imposter, for both enrolment and recognition).

Figure 2: Important features in a periocular image: eyebrow, upper eye fold, upper eyelid, blood vessels, tear duct, inner corner, lower eyelid, lower eye fold, outer corner, and skin texture and color.

flexibility of recognizing an individual using periocular data along with iris data without extra storage or acquisition cost. Moreover, periocular features can be used when an iris image does not contain subtle details, which mostly occurs due to poor image quality. Periocular biometrics also comes into play as a candidate for fusion with the face image for better recognition accuracy.

This paper attempts to fit an optimal boundary to the periocular region which is sufficient and necessary for recognition. Unlike other biometric traits, edge information is not the criterion required to exactly localize the periocular region; rather, the periocular region can be localized where the periphery of the eye contains no further information. Researchers have

considered a static rectangular boundary around the eye to recognize humans and have termed the localized rectangle the periocular region. However, this approach is naive, as the same static boundary does not work for every face image (e.g., when the face image is captured at different distances from the camera, or when there is a tilt of the face or camera during acquisition). So there is a need to derive a dynamic boundary describing the periocular region. While deciding the periocular boundary, the objective of achieving the highest recognition accuracy also needs to be maintained. The paper specifies a few metrics through which the periocular region can be optimally localized in a scale- and rotation-invariant manner.

The rest of the paper is organized as follows. Section 2 describes the landmark works in the direction of recognition and classification through the periocular region, and the need for optimizing the periocular region considered for recognition is pointed out in Section 3. In Section 4, four methods of template optimization are described, and subsequently Section 5 records the experimental results obtained to establish the proposed methods. Finally, Section 6 concludes by describing the derived periocular template, which is optimal for human recognition, and marks its importance for recognition from a large database.

2. Literature Review

Investigations have been made by researchers in the direction of localizing the iris in high-quality constrained eye images captured under NIR illumination. Table 2 summarizes the


Table 1: Comparison of biometric traits present in the human face.

Iris. Advantages: high-dimensional features can be extracted; difficult to spoof; permanence (secured within the eye folds); can be captured in a noninvasive way. Possible challenges: yields better accuracy in NIR images than in VS images; the cost of an NIR acquisition device is high; low recognition accuracy in unconstrained scenarios and at low resolution; occlusion due to use of lenses; the eye may close at the time of capture; does not work for keratoconus and keratitis patients.

Face. Advantages: easy to acquire; yields accuracy in VS images; most available in criminal investigations. Possible challenges: not socially acceptable for some religions; a full face image makes the database large; variation with expression and age.

Periocular. Advantages: can be captured with the face/iris region without extra acquisition cost. Possible challenges: can be occluded by spectacles; fewer features in the case of infants.

Lip. Advantages: existence of both global and local features. Possible challenges: difficult to acquire; less socially acceptable; shape changes with human expression.

Ear. Advantages: easy segmentation due to the presence of contrast in the vicinity. Possible challenges: difficult to acquire; can be partially occluded by hair.

comparative accuracy obtained by a few benchmark iris localization techniques. The results show that high localization accuracy has been achieved for NIR iris images. Several global and local matching techniques have been applied for matching NIR iris images, and researchers have achieved high accuracy. However, when it comes to recognizing a person only through an iris image captured under the visible spectrum, the results have been observed to be unsatisfactory. So researchers have been motivated to take into account not only the iris but also its peripheral regions while recognizing visible spectrum images.

The task of recognition is more challenging than classification and hence draws more attention. The most commonly used feature extraction techniques in the context of periocular recognition are the Scale Invariant Feature Transform and the Local Binary Pattern. Tables 3 and 4 outline the methods used and the performance obtained for periocular classification and recognition in visual spectrum images, respectively. However, the portion of the eye on which they are applied is not computationally justified in the literature: an arbitrary rectangular portion centered on the eye has been taken into account without questioning the following.

(a) Will the accuracy obtained from this arbitrary boundary increase if a larger region is considered?

(b) How much of the considered periocular region actually contributes to recognition?

(c) Is there any portion within this arbitrarily considered periocular region which can be removed while still achieving comparable accuracy?

The derivation of an optimal dynamic periocular region gives a simultaneous solution to the aforementioned questions.

3. Why Optimal Template for Periocular Region Is Required

Unlike other biometric traits, the periocular region has no boundary defined by any edge information. Hence the periocular region cannot be detected through differential change in pixel value in different directions. Rather, the location of the boundary is the region which is smooth in terms of pixel intensity, that is, a region with no information. The authors of [2] have localized the periocular region statically by taking a rectangle of dimension 6R_iris × 4R_iris centering the iris, where R_iris denotes the radius of the iris. But this localization method fails when the eye is tilted or the gaze is not frontal. Moreover, the method presumes the location of the iris center to be accurately detectable. However, the iris center cannot be detected for some eye images due to the low-resolution nature of the image.

The objective of the paper is to attain a dynamic boundary around the eye that defines the periocular region. The region hence derived should have the following properties: (a) it should be able to recognize humans uniquely, (b) it should be achievable for low-quality VS images, (c) it should contain the main features of the eye region identifiable by a human being, and (d) no subset of the derived periocular region should be equally potent for recognition as the derived region.

The optimally selected periocular template can be a template to hold the identity of an individual. If such a template can be generated for a whole nation, it can serve as an authorized identity (i.e., biometric passport [23]) of every citizen of the nation.

4. Proposed Periocular Template Selection Methods

To achieve the above stated properties, four different dynamic models are proposed through which the periocular region can be segmented out. These models are based on (a) human anthropometry, (b) demand of accuracy of the biometric system, (c) human expert judgement, and (d) a subdivision approach.

4.1. Through Human Anthropometry. In a given face image, the face can be extracted by neural training of the system or by fast color-segmentation methods. The color-segmentation methods detect the skin region in the image and find the


Table 2: Performance comparison of some benchmark NIR iris localization approaches.

| Year | Authors | Approach | Testing database | Accuracy results |
|---|---|---|---|---|
| 2002 | Camus and Wildes [3] | Multiresolution coarse-to-fine strategy | Constrained iris images (640 without glasses, 30 with glasses) | Overall 98% (99.5% for subjects without glasses and 66.6% for subjects wearing glasses) |
| 2004 | Sung et al. [4] | Bisection method, Canny edge-map detector, and histogram equalization | 3176 images acquired through a CCD camera | 100% inner boundary and 94.5% for collarette boundary |
| 2004 | Bonney et al. [5] | Least significant bit plane and standard deviations | 108 images from CASIAv1 and 104 images from UNSA | Pupil detection 99.1% and limbic detection 66.5% |
| 2005 | Liu et al. [6] | Modification to Masek's segmentation algorithm | 317 gallery and 4249 probe images acquired using the Iridian LG 2200 iris imaging system | 97.08% rank-1 recognition |
| 2006 | Proenca and Alexandre [7] | Moment functions dependent on fuzzy clustering | 1214 good quality, 663 noisy images from 241 subjects in two sessions | 98.02% on good data set and 97.88% on noisy data set |
| 2008 | Pundlik et al. [8] | Markov random field and graph cut | WVU nonideal database | Pixel label error rate 5.9% |
| 2009 | He et al. [9] | Adaboost-cascade iris detector for iris center prediction | NIST Iris Challenge Evaluation (ICE) v1.0, CASIA-Iris-V3-lamp, UBIRISv1.0 | 0.53% EER for ICEv1.0 and 0.75% EER for CASIA-Iris-V3-lamp |
| 2010 | Liu et al. [10] | K-means cluster | CASIAv3 and UBIRISv2 | 0.19% false positive and 2.13% false negative (on a fresh data set not used to tune the system) |
| 2010 | Tan et al. [11] | Gray distribution features and gray projection | CASIAv1 | 99.14% accuracy (processing time 0.484 s/image) |
| 2011 | Bakshi et al. [12] | Image morphology and connected component analysis | CASIAv3 | 95.76% accuracy (processing time 0.396 s/image) |

Table 3: Survey on classification through periocular biometric.

| Authors | Classification type | Algorithm | Classifier | Testing database | Accuracy (%) |
|---|---|---|---|---|---|
| Abiantun and Savvides [13] | Left versus right eye | Adaboost, Haar, Gabor features | LDA, SVM | ICE | 89.95 |
| Bhat and Savvides [14] | Left versus right eye | ASM | SVM | ICE, LG | Left eye 91, right eye 89 |
| Merkow et al. [15] | Gender | LBP | LDA, SVM, PCA | Downloaded from web | 84.9 |
| Lyle et al. [16] | Gender and ethnicity | LBP | SVM | FRGC | Gender 93, ethnicity 91 |

connected components in such a region. Depending on the connected components having skin color, the system labels the largest component as the face. Algorithm 1 proposes a binary component analysis based skin detection. The thresholds are experimentally fitted to obtain the highest accuracy in segmenting the skin region in face images comprising different skin tones. The algorithm takes an RGB face image as input. It first converts the face image to the YCbCr color space and normalizes the pixel values. In the next step, the average luminance value is calculated by summing up the Y component values of each pixel and dividing by the total number of pixels in the image. A brightness compensated image is generated depending on the value of the average luminance, as specified in the algorithm. On the obtained brightness compensated image, a compound condition is applied and thresholding is performed to finally obtain the skin map. Through connected component analysis of the skin map in the YCbCr color space, the open eye region can be obtained, as explained in Algorithm 2. The reason for segmenting the open eye region is to obtain the non-skin region within the detected face which can be labeled as eye, and thus to achieve the approximate location of the eye center.

Once the eye region is detected, the iris center can be obtained using conventional pupil detection and the integrodifferential approach for finding the iris boundary, and a static boundary can be fitted. As described earlier, the authors of [2] bounded the periocular region with a 6R_iris × 4R_iris rectangle centering the iris center. But no justification is produced in the paper regarding the empirically taken height and width of this periocular boundary. This process of finding the periocular


Table 4: Survey on recognition through periocular biometric.

| Year | Authors | Algorithm | Features | Testing database | Performance results |
|---|---|---|---|---|---|
| 2010 | Hollingsworth et al. [17] | Human analysis | Eye region | NIR images of 120 subjects | Accuracy of 92% |
| 2010 | Woodard et al. [18] | LBP fused with iris matching | Skin | MBGC NIR images from 88 subjects | Left eye rank-1 recognition rate: iris 13.8%, periocular 92.5%, both 96.5%; right eye rank-1 recognition rate: iris 10.1%, periocular 88.7%, both 92.4% |
| 2010 | Miller et al. [19] | LBP | Color information, skin texture | FRGC, neutral expression, different session | Rank-1 recognition rate: periocular 94.10%, face 94.38% |
| | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: periocular 99.50%, face 99.75% |
| | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: periocular 94.90%, face 90.37% |
| 2010 | Miller et al. [20] | LBP, city block distance | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 84.39%, right eye 83.90%, both eyes 89.76% |
| | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 72.22%, right eye 70.37%, both eyes 74.07% |
| 2010 | Adams et al. [21] | LBP, GE to select features | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 86.85%, right eye 86.26%, both eyes 92.16% |
| | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 80.25%, right eye 80.80%, both eyes 85.06% |
| 2011 | Woodard et al. [22] | LBP, color histograms | Skin | FRGC, neutral expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 88.3%, both eyes 91.0% |
| | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: left eye 96.8%, right eye 96.8%, both eyes 98.3% |
| | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 87.1%, both eyes 91.2% |

boundary has the prerequisite of knowledge of the coordinates of the iris center and the radius of the iris.

Anthropometric analysis [24] of the human face and eye region gives information regarding the ratio of eye and iris and the ratio of the widths of face and eye. A typical block diagram in Figure 6 depicts the ratios of different parts of the human face with respect to the height or width of the face. From the analysis it is found that

width_periocular = width_eyebrow = 0.67 × height_face / 2,

height_periocular = 2 × d(eyebrow, eye center)
                  = 2 × (0.21 + 0.07/2) × width_face / 2
                  = 0.49 × width_face / 2,   (1)

where d(eyebrow, eye center) denotes the distance between the center of the eyebrow and the eye center.

height_periocular / width_periocular = (0.49 / 0.67) × (width_face / 2) / (height_face / 2)
                                     = 0.73 × width_face / height_face.   (2)

area_periocular = width_periocular × height_periocular
                = 0.67 × (height_face / 2) × 0.49 × (width_face / 2)
                = 0.33 × (height_face / 2) × (width_face / 2)
                = (0.33 / π) × (π × (height_face / 2) × (width_face / 2))
                = 0.11 × area_face.   (3)
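Treating the face as an upright ellipse, the ratios in (1)-(3) reduce to a few multiplications. A minimal sketch in Python (the function name and returned dictionary are illustrative; only the constants come from the equations):

```python
import math

def periocular_template(height_face, width_face):
    """Periocular template dimensions from face dimensions, per (1)-(3)."""
    width = 0.67 * height_face / 2                     # eq. (1): template width
    height = 2 * (0.21 + 0.07 / 2) * width_face / 2    # eq. (1): = 0.49 * width_face / 2
    # eq. (3) models the face as an ellipse with semi-axes height_face/2, width_face/2
    area_face = math.pi * (height_face / 2) * (width_face / 2)
    return {"width": width,
            "height": height,
            "area_ratio": width * height / area_face}  # eq. (3): about 0.11
```

For a 400 × 300 face, this yields a 134 × 73.5 template covering roughly 10.5% of the elliptical face area, consistent with the ≈0.11 ratio of (3).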


Require: I, RGB face image of size m × n
Ensure: S, binary face image indicating skin-map

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of pixel (i, j).
(3) Compute the average luminance value of image I as I_Yavg = (1/mn) Σ_{i=1}^{m} Σ_{j=1}^{n} I_Y(i, j).
(4) Obtain the brightness compensated image IC' as IC'(i, j) = (IR'(i, j), IG'(i, j), IB(i, j)), where IR'(i, j) = IR(i, j)^τ, IG'(i, j) = IG(i, j)^τ, and
    τ = 1.5 if I_Yavg < 64; τ = 0.7 if I_Yavg > 190; τ = 1 otherwise.
(5) Detect the skin map S from IC' as
    S(i, j) = 0 if (R(i, j) + 1)/(G(i, j) + 1) > 1.08, (R(i, j) + 1)/(B(i, j) + 1) > 1.08, G(i, j) > 30, and G(i, j) < 140; S(i, j) = 1 otherwise,
    where S(i, j) = 0 indicates skin and S(i, j) = 1 indicates non-skin regions.
(6) return S

Algorithm 1: Skin detection.
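Algorithm 1 maps directly onto array operations. A sketch in NumPy follows; the BT.601 luma weights and the normalization of R and G to [0, 1] before applying the exponent τ are assumptions not spelled out in the algorithm, while the thresholds are the ones it states:

```python
import numpy as np

def detect_skin(rgb):
    """Sketch of Algorithm 1. rgb: (m, n, 3) uint8 array.
    Returns S with 0 = skin and 1 = non-skin, as in the paper's convention."""
    rgb = rgb.astype(np.float64)
    # Steps (1)-(3): luminance plane (BT.601 weights assumed) and its average
    Y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    y_avg = Y.mean()
    # Step (4): brightness compensation of R and G with exponent tau;
    # channels are normalized to [0, 1] before exponentiation (an assumption)
    tau = 1.5 if y_avg < 64 else (0.7 if y_avg > 190 else 1.0)
    R = (rgb[..., 0] / 255.0) ** tau * 255.0
    G = (rgb[..., 1] / 255.0) ** tau * 255.0
    B = rgb[..., 2]
    # Step (5): compound thresholding with the experimentally fitted values
    skin = ((R + 1) / (G + 1) > 1.08) & ((R + 1) / (B + 1) > 1.08) \
           & (G > 30) & (G < 140)
    return np.where(skin, 0, 1).astype(np.uint8)
```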

Require: I, RGB face image of size m × n; S, binary face image indicating skin-map
Ensure: EM, binary face image indicating open eye regions

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of pixel (i, j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat steps (6) to (8).
(6) EB_p = 0.
(7) For each pixel (i, j) in EM_p, update EB_p as EB_p = EB_p if 65 < I_Y(i, j) < 80, and EB_p = EB_p + 1 otherwise.
(8) If EB_p = number of pixels in EM_p, then EM = EM − EM_p (removal of the p-th connected component).
(9) return EM

Algorithm 2: Open eye detection.
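Algorithm 2 can be sketched with a connected-component labeling routine; here `scipy.ndimage.label` stands in for step (3), and it is assumed that S is passed with 1 marking the candidate (non-skin) regions to be labeled:

```python
import numpy as np
from scipy import ndimage

def open_eye_regions(Y, S):
    """Sketch of Algorithm 2. Y: luminance plane normalized to [0, 255];
    S: binary map with 1 marking candidate (non-skin) regions.
    Components containing no pixel with 65 < Y < 80 are discarded."""
    labels, n = ndimage.label(S)            # step (3): connected components
    EM = np.zeros(S.shape, dtype=np.uint8)
    for p in range(1, n + 1):
        comp = labels == p
        # steps (6)-(8): keep the component only if at least one of its
        # pixels falls in the eye-luminance band
        if np.any((Y[comp] > 65) & (Y[comp] < 80)):
            EM[comp] = 1
    return EM
```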

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the periocular region are expressed as functions of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to have knowledge of the iris radius. However, knowledge of the coordinates of the iris center is necessary. From this information a bounding box can be fit composing all visible portions of the periocular region, for example, eyebrow, eyelashes, tear duct, eye fold, eye corner, and so forth. This approach is crude and dependent on human supervision or on intelligent detection of these nodal points in the human eye.

Further, from (2) it is observable that either the height or the width of the periocular region is sufficient to derive the other parameter, provided that the aspect ratio of the face is known. This aspect of the localization of the periocular region is used in Section 4.2. Equation (3) considers an elliptical model to represent the face while finding the ratio of the periocular region to the area of a human face. It justifies the usefulness of using an optimally selected periocular template for human recognition rather than a full face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence it is suitable for localization of the periocular region in unconstrained images where the iris radius is not detectable by machines due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can be achieved through sclera detection for the scenario when the iris cannot be properly located due to unconstrained acquisition of the eye, or when the captured image is a low-quality color face image captured from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) im is also converted to the HSI color model, where the S (saturation) component of each pixel can be determined by

    S = 1 − (3 / (R + G + B)) min(R, G, B),   (4)

where R, G, and B denote the red, green, and blue components of that pixel. Let saturation_im be the image formed by the S component of each pixel.

(3) If S < τ, where τ is a predefined threshold, the pixel is marked as sclera region, else as non-sclera region. The authors in [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as follows: sclera_noisy = (saturation_im < τ). Only a noisy binary map of sclera, sclera_noisy, can be found through this process, in which white pixels denote noisy sclera region and black pixels denote non-sclera region.

(4) im_bin is formed as follows. For every nonzero pixel (i, j) in sclera_noisy,

    im_bin(i, j) = average intensity of the 17 × 17 window around pixel (i, j) in im_gray,   (5)

and for every zero pixel (i, j) in sclera_noisy,

    im_bin(i, j) = 0.   (6)

(5) sclera_adaptive is formed as

    sclera_adaptive(i, j) = 0 if sclera_noisy(i, j) = 1 or im_gray(i, j) < im_bin(i, j), and 1 otherwise.   (7)

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, the largest component is interpreted as the single detected sclera and the second largest connected component is removed. Else both components are retained as the binary map of sclera.

After the above steps, the binary image contains only one or two components describing the sclera region, with noise removed.
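Steps (1)-(3) reduce to a per-pixel saturation test. A sketch (the small epsilon guarding division by zero is an added safeguard):

```python
import numpy as np

def sclera_map(rgb, tau=0.21):
    """Steps (1)-(3) of Section 4.1.1: HSI saturation thresholding, eq. (4).
    Returns the noisy binary sclera map (1 = candidate sclera pixel)."""
    rgb = rgb.astype(np.float64)
    # eq. (4): S = 1 - 3/(R+G+B) * min(R,G,B); epsilon avoids division by zero
    s = 1.0 - 3.0 / (rgb.sum(axis=-1) + 1e-9) * rgb.min(axis=-1)
    return (s < tau).astype(np.uint8)   # low saturation -> whitish sclera
```

A whitish pixel such as (250, 250, 250) has saturation near 0 and is marked as sclera, while a strongly colored pixel such as (200, 50, 50) has saturation 0.5 and is rejected.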

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to determine whether the two parts of sclera on either side of the iris are separately visible, only one of them is detected, or both parts are detected as a single component.

There can be three exhaustive cases in the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components found is two, the image is classified as Case (b) (as shown in Figures 3(a), 3(b), and 3(c)), and the two components are treated as two portions of sclera. Else, if a single connected component is obtained, the ratio of the length and breadth of the best-fitted oriented bounding rectangle is checked. If the ratio is greater than 1.25, the component belongs to Case (a); otherwise it belongs to Case (c) (shown in Figure 3(e)). For Case (a), the region is subdivided into two components (by detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.

4.1.3. Nodal Points Extraction from Sclera Region. Each sclera is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) center of sclera, (b) center of concave region of sclera, and (c) eye corner. So in the general case, where two parts of the sclera are detected, six nodal points will be detected. The method of nodal point extraction is illustrated below.

(1) Finding Center of Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating pixels belonging to sclera) is replaced by its minimum distance from any black pixel. The pixel which is farthest from all black pixels will have the highest value after this transformation. That pixel is labeled as the center of sclera.

(2) Finding Center of Concave Region of Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. The midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected. The midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, the midpoints of many straight lines drawn in this way overlap on a single pixel. A separate matrix of the same size as the sclera is introduced, with every pixel initialized to zero. For every valid midpoint, the value of the corresponding pixel in this new matrix is incremented. Once this process is over, more than one connected component of nonzero values will be obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having maximum value


Figure 3: Result of nodal point detection through sclera segmentation; (a)-(e) sample outputs 1-5 from the UBIRISv2 database.

Figure 4: Cropped images (a)-(d) from an iris image centering at the pupil center.

in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel which is farthest from the center of the sclera is labeled as the eye corner.
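Steps (1) and (3) above can be sketched with a Euclidean distance transform; `scipy.ndimage` routines stand in for the transforms, and the boundary is taken as the component minus its morphological erosion:

```python
import numpy as np
from scipy import ndimage

def sclera_center_and_corner(sclera):
    """Steps (1) and (3) of Section 4.1.3 for one binary sclera component."""
    sclera = np.asarray(sclera, dtype=bool)
    # Step (1): distance transform; the sclera pixel farthest from any
    # background pixel is labeled as the center of sclera.
    dist = ndimage.distance_transform_edt(sclera)
    center = np.unravel_index(np.argmax(dist), dist.shape)
    # Step (3): boundary = component minus its erosion; the boundary pixel
    # farthest from the center is labeled as the eye corner.
    boundary = sclera & ~ndimage.binary_erosion(sclera)
    ys, xs = np.nonzero(boundary)
    d2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    corner = (int(ys[np.argmax(d2)]), int(xs[np.argmax(d2)]))
    return center, corner
```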

The result of extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained from processing them through the aforementioned nodal point extraction technique. This information can be useful in localization of the periocular region.

4.2. Through Demand of Accuracy of Biometric System. Beginning with the center of the eye (pupil center), a bounding rectangular box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped at the pupil center and the bounding size is gradually increased. The corresponding accuracy for every cropped size is tested. In subsequent steps, the coverage of this bounding box is increased by a width of 3% of the diameter of the iris, and the change in accuracy is observed. After certain iterations of this procedure, the bounding box will come to a portion of the periocular region where there is no more change in intensity; hence the region is low


Figure 5: Method of formation of the concave region of a binarized sclera component (border pixel pairs P1-P2 and P3-P4).

Figure 6: Different ratios of portions of the face from human anthropometry.

entropic. Hence no more local features can be extracted from this region even if the bounding box is increased. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy the corresponding minimum bounding box is considered as the desired periocular region. As the demand of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each); the plot separates the eye region contributing to recognition from the region not contributing.

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow steps (2) to (4).
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.
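The sweep of steps (1)-(5) can be sketched as follows; the LBP + SIFT matching experiment is abstracted away, and `saturation_width` is a hypothetical helper for reading off the smallest width at which accuracy saturates:

```python
def template_sizes(iris_diameter, steps=101):
    """Yield (width, height) of the candidate crop for i = 0, ..., steps - 1."""
    for i in range(steps):
        w = (1.00 + 0.03 * i) * iris_diameter   # width = (100 + 3i)% of iris diameter
        h = 0.73 * w                            # height = 73% of width, cf. eq. (2)
        yield w, h

def saturation_width(widths, accuracies, eps=0.1):
    """Smallest width whose accuracy is within eps of everything that follows."""
    for k in range(len(widths) - 1):
        if max(accuracies[k + 1:]) - accuracies[k] <= eps:
            return widths[k]
    return widths[-1]
```

With accuracies that climb and then flatten, for example [20, 40, 60, 80, 85.6, 85.6, 85.6] at widths [100, ..., 400], the helper returns 300, matching the saturation behavior reported below.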

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box. Increasing the box further does not increase the accuracy. To carry out this experiment, the Local Binary Pattern (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors on the eye images. First LBP is applied, and the resulting image is subjected to extraction of local features through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with randomly chosen 50 eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for randomly chosen 50 eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample strongly, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This concludes that a subset of a large database can be employed to find the optimal template size, and the result found can be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest wide rectangle with saturation accuracy can be used as the localization boundary of the periocular region. It is also observed


Figure 8: Change of accuracy of periocular recognition with change in size of periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT); the plot separates the eye region contributing to recognition from the region not contributing.

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason to remove the redundant eye region and make the recognition process fast.

To validate this experiment, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, ensure the experimental objective that there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The score distributions of imposter and genuine matchings are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1 : 1 matching time with change in size of periocular template, tested on the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each).

detect only the most important section of the human eye towards recognition. If that section is not found in the captured eye region, the image is marked as Failure to Acquire (FTA) and not used for recognition. Hence a predecision on the quality of the live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query comes. The human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curves for different template widths (w = 200%, 250%, and 300%) of the periocular region for UBIRISv2.

(from the human expert knowledge already discussed), and an automated FTA detection system can be made. Hence there is no need of a human expert for verifying the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy into localization of the periocular region is the automatic detection of portions of the human eye like the eyelid, eye corner, tear duct, lower eyefold, and so forth. An attempt at subdivision detection in the eye region can be made through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods are explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from FERET are experimented to claim the proposition of optimality.

The anthropometry based approach performs accurately along with proper skin detection and sclera detection in the eye region. The sample outputs shown in Figure 3 are found to be proper when evaluated against ground truth.

The saturation accuracy based approach performs with an accuracy of more than 80% on the noisy and low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is plotted when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) for changes in the value of the threshold. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curves for different template widths (w = 200%, 250%, and 300%) of the periocular region for FERET.

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is obvious to conclude that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portion of the eye lying between 200% and 300% of the diameter of the iris is very much responsible for recognition and is a feature-dense part of a periocular image. Furthermore, to have a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves representing the probability of identification at various ranks are also plotted when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d' index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

    d' = √2 |μ_genuine − μ_imposter| / √(σ²_genuine + σ²_imposter),   (8)

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d' index of recognition when the width of the periocular region is varied. The value of d' increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of the values of d' for w = 100% to 300%, together with the insignificant change in d' for w = 300% to 400%, also establishes the existence of a boundary between regions contributing and not contributing to recognition.

Human expert judging is experimented by Hollingsworth et al. [17], and the results are used towards the direction of optimal periocular localization. Human subjects were asked which part of the eye they feel to be the most important for recognition. Most of the subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which sub-portions of the eye must belong to the optimal periocular region


Table 5: Detail of publicly available testing databases.

| Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model |
|---|---|---|---|---|---|---|
| UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] | 1877 | 241 | 800 × 600 | RGB |
| | | v2 [28] | 11102 | 261 | 400 × 300 | sRGB |
| FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB |

Table 6: Change of d′ index with change of cropping of periocular region.

Width of periocular region (w) | 100% | 150% | 200% | 250% | 300% | 350% | 400%
d′ index (UBIRISv2 dataset) | 1.23 | 1.60 | 2.05 | 2.34 | 2.61 | 2.72 | 2.85
d′ index (FERET dataset) | 1.19 | 1.55 | 2.01 | 2.29 | 2.53 | 2.66 | 2.69

Figure 14: Cumulative Match Characteristic (CMC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for UBIRISv2. (Axes: rank, 10 to 70, versus probability of identification, 0.65 to 1.)
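A CMC curve such as the ones plotted here can be derived from a probe-versus-gallery similarity matrix. The sketch below is not the paper's implementation; it is a generic numpy routine, with a hypothetical score matrix, returning the probability of identification at each rank:

```python
import numpy as np

def cmc_curve(scores, true_idx):
    """scores: (n_probes, n_gallery) similarity matrix; true_idx[i] is
    the gallery index of probe i's true identity. Returns an array whose
    k-th entry is P(true identity within the top k+1 matches)."""
    order = np.argsort(-scores, axis=1)                 # best match first
    # 0-based rank position of the true identity for each probe
    ranks = np.argmax(order == np.asarray(true_idx)[:, None], axis=1)
    n_probes, n_gallery = scores.shape
    hits = np.bincount(ranks, minlength=n_gallery)      # probes resolved at each rank
    return np.cumsum(hits) / n_probes
```

The curve is monotonically nondecreasing and reaches 1.0 at rank n_gallery, matching the shape of the plots above.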

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size. The existence of any redundant region in a template will increase the matching time but will not contribute to increasing the accuracy of matching. Hence

Figure 15: Cumulative Match Characteristic (CMC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for FERET. (Axes: rank, 0 to 80, versus probability of identification, 0.65 to 1.)

removal of redundant regions of the template should be accomplished before the matching procedure. As the recognition time of identification is dependent on the database size n, a decrease of t in the 1 : 1 matching time will actually decrease the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suitable for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.



Table 1: Comparison of biometric traits present in the human face.

Trait | Advantages | Possible challenges
Iris | High-dimensional feature can be extracted; difficult to spoof; permanence; iris secured within eye folds; can be captured in a noninvasive way | Yields better accuracy in NIR images than VS images; cost of NIR acquisition device is high; low recognition accuracy in unconstrained scenarios; low recognition accuracy for low resolution; occlusion due to use of lens; eye may close at the time of capture; does not work for keratoconus and keratitis patients
Face | Easy to acquire; yields accuracy in VS images; most available in criminal investigations | Not socially acceptable for some religions; full face image makes the database large; variation with expression and age
Periocular | Can be captured with face/iris region without extra acquisition cost | Can be occluded by spectacles; fewer features in the case of infants
Lip | Existence of both global and local features | Difficult to acquire; less acceptable socially; shape changes with human expression
Ear | Easy segmentation due to presence of contrast in the vicinity | Difficult to acquire; can be partially occluded by hair

comparative study of the accuracy obtained by a few benchmark iris localization techniques. The results conclude that high localization accuracy has been achieved for NIR iris images. Several global and local matching techniques have been applied for matching NIR iris images, and researchers have obtained high accuracy. However, when it comes to recognizing a person only through his iris image captured under the visible spectrum, the results have been observed to be unsatisfactory. So researchers have been motivated to take into account not only the iris but also its peripheral regions while recognizing visible spectrum images.

The task of recognition is more challenging than classification and hence draws more attention. The most commonly used feature extraction techniques in the context of periocular recognition are the Scale Invariant Feature Transform and the Local Binary Pattern. Tables 3 and 4 outline the methods used and the performance obtained towards periocular classification and recognition in visual spectrum images, respectively. However, the portion of the eye on which a method is applied is not computationally justified in the literature. Any arbitrary rectangular portion centering the eye has been taken into account without questioning the following.

(a) Will the accuracy obtained from this arbitrary boundary increase if a larger region is considered?

(b) How much of the considered periocular region is actually contributing to recognition?

(c) Is there any portion within this arbitrarily considered periocular region which can be removed while still achieving comparable accuracy?

The derivation of an optimal dynamic periocular region gives a simultaneous solution to the aforementioned questions.

3. Why an Optimal Template for the Periocular Region Is Required

Unlike other biometric traits, the periocular region has no boundary defined by any edge information. Hence the periocular region cannot be detected through differential change in pixel value in different directions. Rather, the location of the boundary is the region which is smooth in terms of pixel intensity, that is, a region with no information. The authors of [2] have localized the periocular region statically by taking a rectangle of dimension 6R_iris × 4R_iris centering the iris, where R_iris denotes the radius of the iris. But this localization method fails when the eye is tilted or the gaze is not frontal. Moreover, the method presumes the location of the iris center to be accurately detectable. However, the iris center cannot be detected for some eye images due to the low-resolution nature of the image.
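The static 6R_iris × 4R_iris crop of [2] can be sketched as a simple array slice; the helper below is a hypothetical illustration (function name and the border-clipping behavior are ours, not from [2]):

```python
import numpy as np

def crop_periocular(img, cx, cy, r):
    """Static periocular crop of width 6*r and height 4*r centered on
    the iris center (cx, cy), clipped to the image border."""
    h, w = img.shape[:2]
    x0, x1 = max(0, cx - 3 * r), min(w, cx + 3 * r)
    y0, y1 = max(0, cy - 2 * r), min(h, cy + 2 * r)
    return img[y0:y1, x0:x1]
```

Its key weakness, as noted above, is that it presumes an accurately detected iris center and radius.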

The objective of the paper is to attain a dynamic boundary around the eye that defines the periocular region. The region hence derived should have the following properties: (a) it should be able to recognize humans uniquely, (b) it should be achievable for low-quality VS images, (c) it should contain the main identifiable features of the eye region identifiable by a human being, and (d) no subset of the derived periocular region should be equally potent as the derived region for recognition.

The optimally selected periocular template can be a template to hold the identity of an individual. If such a template can be generated for the whole nation, it can serve as an authorized identity (i.e., biometric passport [23]) of every citizen of the nation.

4. Proposed Periocular Template Selection Methods

To achieve the above stated properties, four different dynamic models are proposed through which the periocular region can be segmented out. These models are based on (a) human anthropometry, (b) demand of the accuracy of the biometric system, (c) human expert judgement, and (d) a subdivision approach.

4.1. Through Human Anthropometry. In a given face image, the face can be extracted by neural training of the system or by fast color-segmentation methods. The color-segmentation methods detect the skin region in the image and find the


Table 2: Performance comparison of some benchmark NIR iris localization approaches.

Year | Authors | Approach | Testing database | Accuracy results
2002 | Camus and Wildes [3] | Multiresolution coarse-to-fine strategy | Constrained iris images (640 without glasses, 30 with glasses) | Overall 98% (99.5% for subjects without glasses and 66.6% for subjects wearing glasses)
2004 | Sung et al. [4] | Bisection method, Canny edge-map detector, and histogram equalization | 3176 images acquired through a CCD camera | 100% inner boundary and 94.5% for collarette boundary
2004 | Bonney et al. [5] | Least significant bit plane and standard deviations | 108 images from CASIAv1 and 104 images from UNSA | Pupil detection 99.1% and limbic detection 66.5%
2005 | Liu et al. [6] | Modification to Masek's segmentation algorithm | 317 gallery and 4249 probe images acquired using Iridian LG 2200 iris imaging system | 97.08% rank-1 recognition
2006 | Proenca and Alexandre [7] | Moment functions dependent on fuzzy clustering | 1214 good quality and 663 noisy images from 241 subjects in two sessions | 98.02% on good data set and 97.88% on noisy data set
2008 | Pundlik et al. [8] | Markov random field and graph cut | WVU nonideal database | Pixel label error rate 5.9%
2009 | He et al. [9] | Adaboost-cascade iris detector for iris center prediction | NIST Iris Challenge Evaluation (ICE) v1.0, CASIA-Iris-V3-lamp, UBIRISv1.0 | 0.53% EER for ICEv1.0 and 0.75% EER for CASIA-Iris-V3-lamp
2010 | Liu et al. [10] | K-means cluster | CASIAv3 and UBIRISv2 | 0.19% false positive and 2.13% false negative (on a fresh data set not used to tune the system)
2010 | Tan et al. [11] | Gray distribution features and gray projection | CASIAv1 | 99.14% accuracy (processing time 0.484 s/image)
2011 | Bakshi et al. [12] | Image morphology and connected component analysis | CASIAv3 | 95.76% accuracy (processing time 0.396 s/image)

Table 3: Survey on classification through periocular biometric.

Authors | Classification type | Algorithm | Classifier | Testing database | Accuracy (%)
Abiantun and Savvides [13] | Left versus right eye | Adaboost, Haar/Gabor features | LDA, SVM | ICE | 89.95
Bhat and Savvides [14] | Left versus right eye | ASM | SVM | ICE, LG | Left eye 91, right eye 89
Merkow et al. [15] | Gender | LBP | LDA, SVM, PCA | Downloaded from web | 84.9
Lyle et al. [16] | Gender and ethnicity | LBP | SVM | FRGC | Gender 93, ethnicity 91

connected components in such a region. Depending on the connected components having skin color, the system labels the component largest in size as the face. Algorithm 1 proposes a binary component analysis based skin detection. The thresholds are experimentally fitted to obtain the highest accuracy in segmenting the skin region in face images comprising skin colors with different skin tones. The algorithm takes an RGB face image as input. It first converts the face image to the YCbCr color space and normalizes the pixel values. In the next step, the average luminance value is calculated by summing up the Y component values of each pixel and dividing by the total number of pixels in the image. A brightness compensated image is generated depending on the value of the average luminance, as specified in the algorithm. In the obtained brightness compensated image, a compound condition is applied and a thresholding is performed to finally obtain the skin map. Through connected component analysis of the skin map in the YCbCr color space, the open eye region can be obtained, as explained in Algorithm 2. The reason for segmenting the open eye region is to obtain the nonskin region within the detected face, which can be labeled as eye, and thus to achieve the approximate location of the eye center.

Once the eye region is detected, the iris center can be obtained using conventional pupil detection and the integrodifferential approach for finding the iris boundary, and a static boundary can be fitted. As described earlier, the authors of [2] bounded the periocular region with a 6R_iris × 4R_iris rectangle centering the iris center. But no justification is produced in the paper regarding the empirically taken height and width of this periocular boundary. This process of finding the periocular


Table 4: Survey on recognition through periocular biometric.

Year | Authors | Algorithm | Features | Testing database | Performance results
2010 | Hollingsworth et al. [17] | Human analysis | Eye region | NIR images of 120 subjects | Accuracy of 92%
2010 | Woodard et al. [18] | LBP fused with iris matching | Skin | MBGC NIR images from 88 subjects | Left eye rank-1 recognition rate: iris 13.8%, periocular 92.5%, both 96.5%; right eye rank-1 recognition rate: iris 10.1%, periocular 88.7%, both 92.4%
2010 | Miller et al. [19] | LBP | Color information, skin texture | FRGC, neutral expression, different session | Rank-1 recognition rate: periocular 94.10%, face 94.38%
 | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: periocular 99.50%, face 99.75%
 | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: periocular 94.90%, face 90.37%
2010 | Miller et al. [20] | LBP, city block distance | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 84.39%, right eye 83.90%, both eyes 89.76%
 | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 72.22%, right eye 70.37%, both eyes 74.07%
2010 | Adams et al. [21] | LBP, GE to select features | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 86.85%, right eye 86.26%, both eyes 92.16%
 | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 80.25%, right eye 80.80%, both eyes 85.06%
2011 | Woodard et al. [22] | LBP, color histograms | Skin | FRGC, neutral expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 88.3%, both eyes 91.0%
 | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: left eye 96.8%, right eye 96.8%, both eyes 98.3%
 | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 87.1%, both eyes 91.2%

boundary has the prerequisite of knowledge of the coordinates of the iris center and the radius of the iris.

Anthropometric analysis [24] of the human face and eye region gives information regarding the ratio of eye and iris and the ratio of width of face and eye. A typical block diagram in Figure 6 depicts the ratios of different parts of the human face with respect to the height or width of the face. From the analysis it is found that

width_periocular = width_eyebrow = 0.67 × (height_face / 2),

height_periocular = 2 × d(eyebrow, eyecenter)
                  = 2 × (0.21 + 0.07/2) × (width_face / 2)
                  = 0.49 × (width_face / 2),   (1)

where d(eyebrow, eyecenter) denotes the distance between the center of the eyebrow and the eye center.

height_eye / width_eye = (0.49 / 0.67) × (width_face / 2) / (height_face / 2)
                       = 0.73 × (width_face / height_face),   (2)

area_periocular = width_periocular × height_periocular
                = 0.67 × (height_face / 2) × 0.49 × (width_face / 2)
                = 0.33 × (height_face / 2) × (width_face / 2)
                = (0.33 / π) × (π × (height_face / 2) × (width_face / 2))
                = 0.11 × area_face.   (3)
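Equations (1)-(3) can be sketched numerically as below; the function name is ours, and the final step assumes the same elliptical face model as (3):

```python
import math

def periocular_dimensions(face_height, face_width):
    """Periocular template size from face dimensions via (1); the
    area ratio of (3) follows, ~11% of an elliptical face."""
    width = 0.67 * face_height / 2.0    # eq. (1): equals the eyebrow width
    height = 0.49 * face_width / 2.0    # eq. (1): twice eyebrow-to-eye-center distance
    # elliptical face area = pi * (height_face/2) * (width_face/2), as in (3)
    area_ratio = (width * height) / (math.pi * (face_height / 2.0) * (face_width / 2.0))
    return width, height, area_ratio
```

For any face dimensions the area ratio evaluates to 0.33/π ≈ 0.105, consistent with the "approximately 10% of a complete face" claim in the introduction.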


Require: I: RGB face image of size m × n
Ensure: S: binary face image indicating skin map
(1) Convert RGB image I to the YCbCr color space.
(2) Normalize I_Y(i,j) to [0, 255], where I_Y(i,j) denotes the Y value of the pixel (i,j).
(3) Compute the average luminance value of image I as
    I_Yavg = (1 / (mn)) × Σ_{i=1..m} Σ_{j=1..n} I_Y(i,j).
(4) The brightness compensated image IC′ is obtained as IC′(i,j) = (IR′(i,j), IG′(i,j), IB(i,j)), where IR′(i,j) = (IR(i,j))^τ, IG′(i,j) = (IG(i,j))^τ, and
    τ = 1.5 if I_Yavg < 64; 0.7 if I_Yavg > 190; 1 otherwise.
(5) The skin map S is detected from IC′ as
    S(i,j) = 0 if (R(i,j) + 1)/(G(i,j) + 1) > 1.08, (R(i,j) + 1)/(B(i,j) + 1) > 1.08, G(i,j) > 30, and G(i,j) < 140; 1 otherwise,
    where S(i,j) = 0 indicates skin regions and S(i,j) = 1 indicates non-skin regions.
(6) return S

Algorithm 1: Skin detection.
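A minimal numpy sketch of Algorithm 1 follows. It is an interpretation, not the authors' code: the luma is computed with the standard BT.601 weights (the paper only says "convert to YCbCr"), and we assume the G-channel thresholds apply to the brightness-compensated channel:

```python
import numpy as np

def detect_skin(img):
    """Sketch of Algorithm 1 on a float RGB image in [0, 255]:
    brightness compensation by a gamma on R and G chosen from the
    average luma, then ratio thresholds. Returns S with 0 = skin and
    1 = non-skin, matching the paper's convention."""
    img = img.astype(float)
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    # Y of YCbCr (BT.601 luma), averaged over the image
    Y = 0.299 * R + 0.587 * G + 0.114 * B
    y_avg = Y.mean()
    tau = 1.5 if y_avg < 64 else (0.7 if y_avg > 190 else 1.0)
    Rc, Gc = R ** tau, G ** tau          # brightness-compensated channels
    skin = ((Rc + 1) / (Gc + 1) > 1.08) & ((Rc + 1) / (B + 1) > 1.08) \
           & (Gc > 30) & (Gc < 140)
    return np.where(skin, 0, 1)
```

The +1 offsets in the ratios, taken from step (5), avoid division by zero on dark pixels.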

Require: I: RGB face image of size m × n; S: binary face image indicating skin map
Ensure: EM: binary face image indicating open eye regions
(1) Convert RGB image I to the YCbCr color space.
(2) Normalize I_Y(i,j) to [0, 255], where I_Y(i,j) denotes the Y value of the pixel (i,j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat Steps 6 to 8.
(6) EB_p = 0.
(7) For each pixel (i,j) in EM_p, the value of EB_p is updated as
    EB_p = EB_p if 65 < I_Y(i,j) < 80; EB_p + 1 otherwise.
(8) If EB_p = number of pixels in EM_p, then do EM = EM − EM_p (removal of the p-th connected component).
(9) return EM

Algorithm 2: Open eye detection.
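Algorithm 2 can be sketched with scipy's connected-component labeling; this is an illustration under the assumption that a component is kept whenever at least one of its pixels has luma in the open-eye band (equivalently, it is removed when all of its pixels fall outside the band, as step (8) states):

```python
import numpy as np
from scipy import ndimage

def open_eye_regions(Y, S):
    """Sketch of Algorithm 2: Y is the luma image, S the binary map
    with 1 = non-skin. Connected components of S are kept only if
    they contain a pixel with 65 < Y < 80."""
    labels, n = ndimage.label(S == 1)
    eye_map = np.zeros_like(S)
    for p in range(1, n + 1):
        comp = labels == p
        # step (8): drop the component if no pixel lies in the eye band
        if np.any((Y[comp] > 65) & (Y[comp] < 80)):
            eye_map[comp] = 1
    return eye_map
```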

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the eye are expressed as a function of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to have knowledge of the iris radius. However, knowledge of the coordinates of the iris center is necessary. From this information, a bounding box can be fit composing all visible portions of the periocular region, for example, eyebrow, eyelashes, tear duct, eye fold, eye corner, and so forth. This approach is crude and dependent on human supervision or intelligent detection of these nodal points in the human eye.

Further, from (2) it is observable that information about either the height or the width of the periocular region is sufficient to derive the other parameter, provided that the aspect ratio of the face is known. This aspect of the localization of the periocular region is used in Section 4.2. Equation (3) considers an elliptical model to represent the face while finding the ratio of the periocular region to the area of a human face. It justifies the usefulness of using an optimally selected periocular template for human recognition rather than a full face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence it is suitable for localization of the periocular region in unconstrained images, where the iris radius is not detectable by machines due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can be achieved through sclera detection for the scenario when the iris cannot be properly located due to unconstrained acquisition of the eye, or when the captured image is a low-quality color face image captured from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S component of each pixel can be determined by

S = 1 − (3 / (R + G + B)) × min(R, G, B),   (4)

where R, G, B denote the red, green, and blue color components of a particular pixel. Let the image hence formed, containing the S component of each pixel, be saturation_im.

(3) If S < τ, where τ is a predefined threshold, then that pixel is marked as sclera region, else as non-sclera region. The authors of [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as follows: sclera_noisy = saturation_im < τ. Only a noisy binary map of sclera, sclera_noisy, can be found through this process, in which white pixels denote noisy sclera region and black pixels denote non-sclera region.

(4) im_bin is formed as follows: for every nonzero pixel (i, j) in sclera_noisy,

im_bin(i, j) = average intensity of the 17 × 17 window around the pixel (i, j) in im_gray;   (5)

for every zero pixel (i, j) in sclera_noisy,

im_bin(i, j) = 0.   (6)

(5) sclera_adaptive is formed as follows:

sclera_adaptive(i, j) = 0 if sclera_noisy(i, j) = 1 or im_gray(i, j) < im_bin(i, j); 1 otherwise.   (7)

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, it is interpreted that the largest component is the single detected sclera, and the second largest connected component is hence removed. Else both components are retained as the binary map of the sclera.

After processing the above specified steps, the binary image contains only one or two components describing the sclera region after noise removal.
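The saturation thresholding of steps (2)-(3) can be sketched as below; a minimal numpy illustration, with the small epsilon added by us to guard against black pixels (R + G + B = 0):

```python
import numpy as np

def sclera_map(img, tau=0.21):
    """Sketch of steps (2)-(3): HSI saturation S = 1 - 3*min(R,G,B)/(R+G+B)
    per (4), thresholded at tau = 0.21 [25]. Low-saturation (whitish)
    pixels are flagged as candidate sclera."""
    rgb = img.astype(float) / 255.0
    s = 1.0 - 3.0 * rgb.min(axis=-1) / (rgb.sum(axis=-1) + 1e-9)
    return s < tau          # True = candidate sclera pixel (noisy map)
```

The returned map corresponds to sclera_noisy; the adaptive refinement of steps (4)-(5) and the component filtering of steps (6)-(7) would follow.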

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to retrieve information about the sclera: whether the two parts of the sclera on the two sides of the iris are separately visible, only one of them is detected, or both parts of the sclera are detected as a single component.

There can be three exhaustive cases in the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is found to be two, then it is classified as the aforementioned Case b (as shown in Figures 3(a), 3(b), and 3(c)), and the two components are treated as two portions of sclera. Else, if a single connected component is obtained, it is checked for the ratio of length and breadth of the best fitted oriented bounding rectangle. If the ratio is greater than 1.25, then it belongs to the aforementioned Case a, else it belongs to Case c (shown in Figure 3(e)). For the aforementioned Case a, the region is subdivided into two components (through detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.

413 Nodal Points Extraction from Sclera Region Each sclerais subjected to following processing through which threenodal points are detected from each sclera region namely (a)center of sclera (b) center of concave region of sclera and (c)eye corner So in general cases where two parts of the scleraare detected six nodal points will be detectedThemethod ofnodal point extraction is illustrated below

(1) Finding Center of Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating pixels belonging to sclera) is replaced by its minimum distance from any black pixel. The pixel which is farthest from all black pixels will have the highest value after this transformation. That pixel is labeled as the center of the sclera.
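Step (1) is exactly a Euclidean distance transform followed by an argmax; a minimal sketch, assuming `scipy.ndimage.distance_transform_edt` as the transform (the paper does not name a particular implementation):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def sclera_center(component):
    """Center of sclera: the white pixel farthest from any black pixel,
    i.e. the argmax of the Euclidean distance transform."""
    dist = distance_transform_edt(component)           # zero on black pixels
    return np.unravel_index(np.argmax(dist), dist.shape)
```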

(2) Finding Center of Concave Region of Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found out, as shown in Figure 5. The midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected. The midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, midpoints of many straight lines drawn in the aforementioned way overlap on a single pixel. A separate matrix having the same size as the sclera image itself is introduced, with each pixel initially having zero value. For every valid midpoint, the value of the corresponding pixel in this new matrix is incremented. Once this process is over, more than one connected component of nonzero values will be obtained in the matrix, signifying concave regions. The largest connected component is retained, while the others are removed. The pixel having the maximum value

8 BioMed Research International

Figure 3: Result of nodal point detection through sclera segmentation. (a)–(e) Sample outputs 1–5 from the UBIRISv2 database.

Figure 4: Cropped images (a)–(d) from an iris image, centered at the pupil center.

in the largest component is labeled as the center of the concave region.
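The midpoint-voting procedure of step (2) can be sketched directly (quadratic in the number of border pixels, which is acceptable for a single sclera component); the border extraction via 4-neighbourhoods and the helper name are choices of this sketch:

```python
import numpy as np
from scipy.ndimage import label

def concave_region_center(component):
    """Accumulate midpoints of chords between border pixels that fall
    outside the component; the hottest pixel of the largest nonzero
    vote cluster approximates the center of the concave region."""
    comp = component.astype(bool)
    # Border pixels: white pixels with at least one black 4-neighbour.
    padded = np.pad(comp, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    border = np.argwhere(comp & ~interior)
    votes = np.zeros(comp.shape, dtype=int)
    for i in range(len(border)):
        for j in range(i + 1, len(border)):
            r = (border[i][0] + border[j][0]) // 2
            c = (border[i][1] + border[j][1]) // 2
            if not comp[r, c]:              # only midpoints off the component count
                votes[r, c] += 1
    labels, n = label(votes > 0)
    if n == 0:
        return None
    sizes = np.bincount(labels.ravel())[1:]
    votes[labels != (np.argmax(sizes) + 1)] = 0   # keep the largest vote cluster
    return np.unravel_index(np.argmax(votes), votes.shape)
```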

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel which is farthest from the center of the sclera is labeled as the eye corner.
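Step (3) is a one-line farthest-point search once the boundary pixels and the center are known; a minimal sketch (the function signature is an assumption of this sketch):

```python
import numpy as np

def eye_corner(border_pixels, center):
    """Eye corner: the sclera boundary pixel farthest from the sclera center."""
    pts = np.asarray(border_pixels, dtype=float)
    d = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1])
    return tuple(border_pixels[int(np.argmax(d))])
```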

The result of extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained from their processing through the aforementioned nodal point extraction technique. This information can be useful in localization of the periocular region.

4.2. Through Demand of Accuracy of Biometric System. Beginning with the center of the eye (pupil center), a bounding rectangular box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped at the pupil center and the bounding size is gradually increased. The corresponding accuracy of every cropped image is tested. In subsequent steps, the coverage of this bounding box is increased in width by 3% of the diameter of the iris, and the change in accuracy is observed. After certain iterations of this procedure, the bounding box will come to a portion of the periocular region where there is no more change in intensity; hence the region is low


Figure 5: Method of formation of the concave region of a binarized sclera component (chords between boundary points P1–P2 and P3–P4).

Figure 6: Different ratios of portions of the face from human anthropometry (expressed as fractions of face height a and face width b).

entropic. Hence no more local features can be extracted from this region even if the bounding box is increased. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy, the corresponding minimum bounding box is considered as the desired periocular region. As the demand of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of the periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects from each). Recognition accuracy (%) is plotted against the width of the periocular region (w), marking the eye region contributing to recognition and the eye region not contributing to recognition.

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow steps 2 to 4.

(2) For each image in the database, find the approximate iris location in the eye image.

(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.

(4) Find the accuracy of the system with this image size.

(5) Observe the change in accuracy with w.
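The sweep of steps (1)–(5) can be sketched as below; `estimate_accuracy` is a hypothetical stand-in for the LBP + SIFT evaluation the paper uses, not part of the paper's code:

```python
def template_size(iris_diameter, i):
    """Bounding box at iteration i: width w = (100 + 3i)% of the iris
    diameter, height h = 73% of w (step (3) above)."""
    w = (100 + 3 * i) / 100.0 * iris_diameter
    h = 0.73 * w
    return w, h

def sweep(iris_diameter, estimate_accuracy):
    """Record accuracy for every candidate template size (steps (4)-(5))."""
    return [(i,) + template_size(iris_diameter, i) + (estimate_accuracy(i),)
            for i in range(0, 101)]
```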

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box. Increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Pattern (LBP) [26] along with Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors from the eye images. First LBP is applied, and the resulting image is subjected to extraction of local features through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample strongly, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This concludes that a subset of a large database can be employed to find the optimal template size, and the result found can be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest wide rectangle with saturation accuracy can be used as the localization boundary of the periocular region. It is also observed


Figure 8: Change of accuracy of periocular recognition with change in size of the periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT; recognition accuracy (%) plotted against the width of the periocular region (w), marking the eye region contributing to recognition and the eye region not contributing to recognition).

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: to make the recognition process fast.

To validate this experiment, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, ensure the experimental objective that there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The score distributions of imposter and genuine matchings are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1:1 matching time with change in size of the periocular template, tested on the UBIRISv2 and FERET datasets (average 1:1 matching time in seconds plotted against the width of the periocular region (w), for LBP + SIFT on 50 samples of 12 subjects from each).

detect only the section of the human eye that is most important towards recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and not used for recognition. Hence a predecision on the quality of the live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and while a live query comes. The human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region, for UBIRISv2.

(from the human expert knowledge already discussed), and an automated FTA detection system can be made. Hence there is no need of a human expert for verifying the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy in localization of the periocular region is the automatic detection of portions of the human eye like the eyelid, eye corner, tear duct, lower eyefold, and so forth. An attempt at subdivision detection in the eye region can be made through color detection and analysis and by applying different transformations.

5. Experimental Results

There are four methods explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from the FERET database are experimented to claim the proposition of optimality.
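The quoted matching counts follow directly from the binomial coefficient over the database sizes:

```python
from math import comb

# Every unordered pair of images yields one (genuine or imposter) matching.
ubiris_pairs = comb(11102, 2)   # UBIRISv2: 11102 images
feret_pairs = comb(14126, 2)    # FERET: 14126 images
print(ubiris_pairs, feret_pairs)  # 61621651 99764875
```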

The anthropometry based approach performs accurately along with proper skin detection and sclera detection in the eye region. The sample outputs, shown in Figure 3, are found to be proper when evaluated against ground truth.

The saturation accuracy based approach performs with an accuracy of more than 80% on the noisy and low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is experimented when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) for changes in the value of the threshold. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region, for FERET.

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is obvious to conclude that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portion of the eye lying between 200% and 300% of the diameter of the iris is very much responsible for recognition and is a feature-dense part of a periocular image. Furthermore, to have a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also experimented when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

d' = \frac{\sqrt{2}\,|\mu_{\text{genuine}} - \mu_{\text{imposter}}|}{\sqrt{\sigma^2_{\text{genuine}} + \sigma^2_{\text{imposter}}}}, \quad (8)

where μ and σ are the mean and standard deviation of genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of the values of d′ for w = 100% to 300%, and the insignificant change in the value of d′ for w = 300% to 400%, also establish the existence of a boundary between regions contributing and not contributing to recognition.
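The d′ index of (8) is straightforward to compute from the two score populations; a minimal sketch using the standard-library `statistics` module:

```python
from statistics import mean, stdev
from math import sqrt

def d_prime(genuine, imposter):
    """Decidability index d' of (8): separation of the genuine and
    imposter score means in pooled standard-deviation units."""
    mg, mi = mean(genuine), mean(imposter)
    sg, si = stdev(genuine), stdev(imposter)
    return sqrt(2) * abs(mg - mi) / sqrt(sg ** 2 + si ** 2)
```

Larger d′ means the two score distributions overlap less, which is why the saturation of d′ beyond w = 300% signals that the extra eye region adds no discriminative information.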

Human expert judging is experimented by Hollingsworth et al. [17], and the results are used towards the direction of optimal periocular localization. Human subjects are asked which part of the eye they feel to be the most important for recognition. Most of the subjects voted that blood vessels are the most important feature to recognize an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of publicly available testing databases.

Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model
UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] / v2 [28] | 1877 / 11102 | 241 / 261 | 800 × 600 / 400 × 300 | RGB / sRGB
FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB

Table 6: Change of d′ index with change of cropping of the periocular region.

Width of periocular region (w) | 100% | 150% | 200% | 250% | 300% | 350% | 400%
d′ index (UBIRISv2 dataset) | 1.23 | 1.60 | 2.05 | 2.34 | 2.61 | 2.72 | 2.85
d′ index (FERET dataset) | 1.19 | 1.55 | 2.01 | 2.29 | 2.53 | 2.66 | 2.69

Figure 14: Cumulative Match Characteristic (CMC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region, for UBIRISv2 (probability of identification against rank).

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of proper labeling of the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yielded proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is a crucial factor to optimize the template size. The existence of any redundant region in the template will increase the matching time but will not contribute to increasing the accuracy of matching. Hence

Figure 15: Cumulative Match Characteristic (CMC) curves for different template sizes (w = 200%, 250%, and 300%) of the periocular region, for FERET (probability of identification against rank).

removal of the redundant region of the template should be accomplished before the matching procedure. As the recognition time of identification is dependent on the database size n, a decrease of t in 1:1 matching time will actually decrease the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suitable for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. J. Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.




Table 2: Performance comparison of some benchmark NIR iris localization approaches.

Year | Authors | Approach | Testing database | Accuracy results
2002 | Camus and Wildes [3] | Multiresolution coarse-to-fine strategy | Constrained iris images (640 without glasses, 30 with glasses) | Overall 98% (99.5% for subjects without glasses and 66.6% for subjects wearing glasses)
2004 | Sung et al. [4] | Bisection method, Canny edge-map detector, and histogram equalization | 3176 images acquired through a CCD camera | 100% for inner boundary and 94.5% for collarette boundary
2004 | Bonney et al. [5] | Least significant bit plane and standard deviations | 108 images from CASIA v1 and 104 images from UNSA | Pupil detection 99.1% and limbic detection 66.5%
2005 | Liu et al. [6] | Modification to Masek's segmentation algorithm | 317 gallery and 4249 probe images acquired using the Iridian LG 2200 iris imaging system | 97.08% rank-1 recognition
2006 | Proenca and Alexandre [7] | Moment functions dependent on fuzzy clustering | 1214 good quality and 663 noisy images from 241 subjects in two sessions | 98.02% on the good data set and 97.88% on the noisy data set
2008 | Pundlik et al. [8] | Markov random field and graph cut | WVU nonideal database | Pixel label error rate 5.9%
2009 | He et al. [9] | Adaboost-cascade iris detector for iris center prediction | NIST Iris Challenge Evaluation (ICE) v1.0, CASIA-Iris-V3-Lamp, UBIRIS v1.0 | 0.53% EER for ICE v1.0 and 0.75% EER for CASIA-Iris-V3-Lamp
2010 | Liu et al. [10] | K-means cluster | CASIA v3 and UBIRIS v2 | 0.19% false positive and 2.13% false negative (on a fresh data set not used to tune the system)
2010 | Tan et al. [11] | Gray distribution features and gray projection | CASIA v1 | 99.14% accuracy (processing time 0.484 s/image)
2011 | Bakshi et al. [12] | Image morphology and connected component analysis | CASIA v3 | 95.76% accuracy (processing time 0.396 s/image)

Table 3: Survey on classification through periocular biometric.

Authors | Classification type | Algorithm | Classifier | Testing database | Accuracy (%)
Abiantun and Savvides [13] | Left versus right eye | Adaboost, Haar, Gabor features | LDA, SVM | ICE | 89.95
Bhat and Savvides [14] | Left versus right eye | ASM | SVM | ICE, LG | Left eye 91, right eye 89
Merkow et al. [15] | Gender | LBP | LDA, SVM, PCA | Downloaded from web | 84.9
Lyle et al. [16] | Gender and ethnicity | LBP | SVM | FRGC | Gender 93, ethnicity 91

connected components in such a region. Depending on the connected components having skin color, the system labels the largest component as face. Algorithm 1 proposes a binary component analysis based skin detection. The thresholds are experimentally fitted to obtain the highest accuracy in segmenting the skin region in face images comprising different skin tones. The algorithm takes an RGB face image as input. It first converts the face image to the YCbCr color space and normalizes the pixel values. In the next step, the average luminance value is calculated by summing up the Y component values of all pixels and dividing by the total number of pixels in the image. A brightness compensated image is generated depending on the value of the average luminance, as specified in the algorithm. In the obtained brightness compensated image, a compound condition is applied and a thresholding is performed to finally obtain the skin-map. Through connected component analysis of the skin map in the YCbCr color space, the open eye region can be obtained, as explained in Algorithm 2. The reason for segmenting the open eye region is to obtain the non-skin region within the detected face, which can be labeled as eye, and thus to achieve the approximate location of the eye center.

thresholding is performed to obtain the skin-map finallyThrough connected component analysis of the skin mapin 119884119862119887119862119903 color space open eye region can be obtained asexplained in Algorithm 2The reason of segmenting open eyeregion is to obtain the nonskin region within detected facewhich can be labeled as eye and thus to achieve approximatelocation of eye center

Once the eye region is detected the iris center can beobtained using conventional pupil detection and integrodif-ferential approach for finding the iris boundary and a staticboundary can be fitted As described earlier the authors of[2] bounded periocular region with 6119877iris times 4119877iris rectanglecentering the iris center But no justification is produced inthe paper regarding the empirically taken height and width ofthis periocular boundary This process of finding periocular


Table 4: Survey on recognition through periocular biometric.

Year | Authors | Algorithm | Features | Testing database | Performance results
2010 | Hollingsworth et al. [17] | Human analysis | Eye region | NIR images of 120 subjects | Accuracy of 92%
2010 | Woodard et al. [18] | LBP fused with iris matching | Skin | MBGC NIR images from 88 subjects | Left eye rank-1 recognition rate: iris 13.8%, periocular 92.5%, both 96.5%; right eye rank-1 recognition rate: iris 10.1%, periocular 88.7%, both 92.4%
2010 | Miller et al. [19] | LBP | Color information, skin texture | FRGC, neutral expression, different session | Rank-1 recognition rate: periocular 94.10%, face 94.38%
 | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: periocular 99.50%, face 99.75%
 | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: periocular 94.90%, face 90.37%
2010 | Miller et al. [20] | LBP, city block distance | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 84.39%, right eye 83.90%, both eyes 89.76%
 | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 72.22%, right eye 70.37%, both eyes 74.07%
2010 | Adams et al. [21] | LBP, GE to select features | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 86.85%, right eye 86.26%, both eyes 92.16%
 | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 80.25%, right eye 80.80%, both eyes 85.06%
2011 | Woodard et al. [22] | LBP, color histograms | Skin | FRGC, neutral expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 88.3%, both eyes 91.0%
 | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: left eye 96.8%, right eye 96.8%, both eyes 98.3%
 | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 87.1%, both eyes 91.2%

boundary therefore presupposes knowledge of the coordinates of the iris center and of the iris radius.

Anthropometric analysis [24] of the human face and eye region gives information regarding the ratio of eye to iris and the ratio of the widths of face and eye. A typical block diagram in Figure 6 depicts the ratios of different parts of the human face with respect to the height or width of the face. From the analysis it is found that

\[
\text{width}_{\text{periocular}} = \text{width}_{\text{eyebrow}} = 0.67 \times \frac{\text{height}_{\text{face}}}{2},
\qquad
\text{height}_{\text{periocular}} = 2\,d_{(\text{eyebrow},\,\text{eye center})} = 2\left(0.21 + \frac{0.07}{2}\right)\frac{\text{width}_{\text{face}}}{2} = 0.49 \times \frac{\text{width}_{\text{face}}}{2},
\tag{1}
\]

where \(d_{(\text{eyebrow},\,\text{eye center})}\) denotes the distance between the center of the eyebrow and the eye center;

\[
\frac{\text{height}_{\text{periocular}}}{\text{width}_{\text{periocular}}} = \frac{0.49}{0.67} \times \frac{\text{width}_{\text{face}}/2}{\text{height}_{\text{face}}/2} = 0.73 \times \frac{\text{width}_{\text{face}}}{\text{height}_{\text{face}}};
\tag{2}
\]

\[
\begin{aligned}
\text{area}_{\text{periocular}} &= \text{width}_{\text{periocular}} \times \text{height}_{\text{periocular}} \\
&= 0.67 \times \frac{\text{height}_{\text{face}}}{2} \times 0.49 \times \frac{\text{width}_{\text{face}}}{2} \\
&= 0.33 \times \frac{\text{height}_{\text{face}}}{2} \times \frac{\text{width}_{\text{face}}}{2} \\
&= \frac{0.33}{\pi}\left(\pi \times \frac{\text{height}_{\text{face}}}{2} \times \frac{\text{width}_{\text{face}}}{2}\right) \\
&= 0.11 \times \text{area}_{\text{face}}.
\end{aligned}
\tag{3}
\]
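The ratios in (1)-(3) can be turned into a small helper for sizing the template; this is an illustrative sketch (the function name is not from the paper), modeling the face as an ellipse as in (3).

```python
import math

def periocular_bounds(face_height, face_width):
    """Estimate periocular template size from face dimensions via (1)-(3)."""
    width = 0.67 * (face_height / 2)                    # eq. (1): periocular width = eyebrow width
    height = 2 * (0.21 + 0.07 / 2) * (face_width / 2)   # eq. (1): = 0.49 * face_width / 2
    # eq. (3): face modeled as an ellipse with semi-axes face_height/2 and face_width/2
    area_ratio = (width * height) / (math.pi * (face_height / 2) * (face_width / 2))
    return width, height, area_ratio
```

For a 200 × 150 face this gives a 67 × 36.75 template covering roughly 10.5% of the elliptical face area, consistent with the 0.11 factor in (3).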


Require: I, an RGB face image of size m × n
Ensure: S, a binary face image indicating the skin-map
(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of the pixel (i, j).
(3) Compute the average luminance of image I as

    I_Yavg = (1 / (mn)) Σ_{i=1}^{m} Σ_{j=1}^{n} I_Y(i, j).

(4) Obtain the brightness-compensated image I_C' as I_C'(i, j) = (I_R'(i, j), I_G'(i, j), I_B(i, j)), where I_R'(i, j) = (I_R(i, j))^τ, I_G'(i, j) = (I_G(i, j))^τ, and

    τ = 1.5 if I_Yavg < 64; 0.7 if I_Yavg > 190; 1 otherwise.

(5) Detect the skin map S from I_C' as

    S(i, j) = 0 if (R(i, j) + 1)/(G(i, j) + 1) > 1.08 and (R(i, j) + 1)/(B(i, j) + 1) > 1.08 and 30 < G(i, j) < 140; 1 otherwise,

    where S(i, j) = 0 indicates skin regions and S(i, j) = 1 indicates non-skin regions.
(6) return S

Algorithm 1: Skin detection.
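A minimal NumPy sketch of Algorithm 1's per-pixel rule follows. The BT.601 luminance weights, the function name, and the boolean convention (True = skin, inverted from the algorithm's S = 0) are assumptions for illustration.

```python
import numpy as np

def detect_skin(rgb):
    """Skin-map detection sketch after Algorithm 1 (thresholds as reconstructed:
    ratio tests > 1.08, 30 < G < 140). Input: uint8 H x W x 3 RGB array.
    Returns a boolean array, True where skin."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    # Average luminance (ITU-R BT.601 Y), used to pick the gamma value tau
    y_avg = (0.299 * r + 0.587 * g + 0.114 * b).mean()
    # Brightness compensation: gamma applied to the R and G channels only
    tau = 1.5 if y_avg < 64 else (0.7 if y_avg > 190 else 1.0)
    r, g = r ** tau, g ** tau
    # Per-pixel skin rule of step 5
    return ((r + 1) / (g + 1) > 1.08) & ((r + 1) / (b + 1) > 1.08) & (g > 30) & (g < 140)
```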

Require: I, an RGB face image of size m × n; S, a binary face image indicating the skin-map
Ensure: EM, a binary face image indicating open eye regions
(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of the pixel (i, j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat steps 6 to 8.
(6) EB_p = 0.
(7) For each pixel (i, j) in EM_p, the value of EB_p is updated as

    EB_p = EB_p if 65 < I_Y(i, j) < 80; EB_p + 1 otherwise.

(8) If EB_p = number of pixels in EM_p, then EM = EM − EM_p (removal of the pth connected component).
(9) return EM

Algorithm 2: Open eye detection.
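Algorithm 2's component filter can be sketched with SciPy's connected-component labeling; the use of scipy.ndimage and the helper name are assumptions, and the luma band (65, 80) follows step 7 as reconstructed above.

```python
import numpy as np
from scipy import ndimage

def open_eye_regions(luma, nonskin):
    """Keep only non-skin connected components that contain at least one
    pixel whose Y (luma) value falls in the open-eye band 65..80."""
    labels, count = ndimage.label(nonskin)      # label 4-connected components
    keep = np.zeros_like(nonskin, dtype=bool)
    for p in range(1, count + 1):
        comp = labels == p
        # A component survives unless *every* pixel lies outside the luma band
        if np.any((luma[comp] > 65) & (luma[comp] < 80)):
            keep |= comp
    return keep
```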

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the periocular template are expressed as functions of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to know the iris radius; however, knowledge of the coordinates of the iris center is necessary. From this information, a bounding box can be fitted comprising all visible portions of the periocular region, for example, the eyebrow, eyelashes, tear duct, eye fold, eye corner, and so forth. This approach is crude and depends on human supervision or on intelligent detection of these nodal points in the human eye.

Further, from (2) it is observable that either the height or the width of the periocular region is sufficient to derive the other parameter, provided the aspect ratio of the face is known. This aspect of periocular localization is used in Section 4.2. Equation (3) considers an elliptical model to represent the face while finding the ratio of the area of the periocular region to the area of the human face. It justifies the usefulness of an optimally selected periocular template for human recognition rather than a full face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence, it is suitable for localizing the periocular region in unconstrained images, where the iris radius is not detectable by machines due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can also be derived through sclera detection, for scenarios in which the iris cannot be properly located due to unconstrained acquisition of the eye, or when the captured image is a low-quality color face image taken from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S component of each pixel is determined by

\[ S = 1 - \frac{3}{R + G + B}\left[\min(R, G, B)\right], \tag{4} \]

where R, G, B denote the red, green, and blue color components of the pixel. Let the image formed by the S component of each pixel be saturation_im.

(3) If S < τ, where τ is a predefined threshold, the pixel is marked as sclera region; otherwise, as non-sclera region. The authors in [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as follows: sclera_noisy = (saturation_im < τ). Only a noisy binary map of the sclera, sclera_noisy, can be found through this process, in which white pixels denote the noisy sclera region and black pixels denote the non-sclera region.

(4) im_bin is formed as follows: for every nonzero pixel (i, j) in sclera_noisy,

\[ im\_bin(i, j) = \text{average intensity of the } 17 \times 17 \text{ window around the pixel } (i, j) \text{ in } im\_gray, \tag{5} \]

and for every zero pixel (i, j) in sclera_noisy,

\[ im\_bin(i, j) = 0. \tag{6} \]

(5) sclera_adaptive is formed as follows:

\[ sclera\_adaptive(i, j) = \begin{cases} 0 & \text{if } sclera\_noisy(i, j) = 1 \text{ or } im\_gray(i, j) < im\_bin(i, j), \\ 1 & \text{otherwise.} \end{cases} \tag{7} \]

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, the largest component is interpreted as the single detected sclera and the second largest component is removed; otherwise, both components are retained as the binary map of the sclera.

After the above steps, the binary image contains only one or two components describing the sclera region, with noise removed.
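The saturation test of steps 2-3 can be sketched in a few NumPy lines; the threshold 0.21 follows [25] as quoted above, while the function name and the epsilon guard are illustrative assumptions.

```python
import numpy as np

def sclera_noisy_map(rgb, tau=0.21):
    """Mark low-saturation pixels as candidate sclera, per eq. (4):
    S = 1 - 3 * min(R, G, B) / (R + G + B)."""
    f = rgb.astype(np.float64)
    # Small epsilon avoids division by zero on pure black pixels
    s = 1.0 - 3.0 * f.min(axis=-1) / (f.sum(axis=-1) + 1e-9)
    return s < tau
```

A near-gray (low-saturation) pixel such as the white of the eye passes the test, while a saturated iris or skin pixel does not.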

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to retrieve information about the sclera: whether the two parts of the sclera on the two sides of the iris are separately visible, only one of them is detected, or both parts are detected as a single component.

There are three exhaustive cases for the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is two, the image is classified as Case (b) (as shown in Figures 3(a), 3(b), and 3(c)) and the two components are treated as the two portions of the sclera. Else, if a single connected component is obtained, the ratio of the length and breadth of the best-fitted oriented bounding rectangle is checked: if the ratio is greater than 1.25, the image belongs to Case (a); otherwise it belongs to Case (c) (shown in Figure 3(e)). For Case (a), the region is subdivided into two components (by detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.

4.1.3. Nodal Points Extraction from Sclera Region. Each sclera is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) the center of the sclera, (b) the center of the concave region of the sclera, and (c) the eye corner. So, in general cases where both parts of the sclera are detected, six nodal points are detected. The method of nodal point extraction is illustrated below.

(1) Finding the Center of the Sclera. The sclera component is subjected to a distance transform, in which the value of each white pixel (indicating a pixel belonging to the sclera) is replaced by its minimum distance from any black pixel. The pixel that is farthest from all black pixels has the highest value after this transformation; that pixel is labeled as the center of the sclera.

(2) Finding the Center of the Concave Region of the Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. The midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected; the midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, the midpoints of many straight lines drawn in this way overlap on a single pixel. A separate matrix of the same size as the sclera image is introduced, with every pixel initially zero; for every valid midpoint, the value of the corresponding pixel in this new matrix is incremented. Once this process is over, more than one connected component of nonzero values is obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having the maximum value


Figure 3: Result of nodal point detection through sclera segmentation; (a)-(e) show five sample outputs from the UBIRISv2 database.

Figure 4: (a)-(d) Cropped images from an iris image centering at the pupil center.

in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel that is farthest from the center of the sclera is labeled as the eye corner.

The result of extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained from their processing through the aforementioned nodal point extraction technique. This information can be useful in the localization of the periocular region.
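A brute-force sketch of step 1 (the sclera center via a distance transform) is shown below for a small binary mask; the function name is illustrative, and a real implementation would use an optimized distance transform. The eye corner of step 3 is then the boundary pixel farthest from this center.

```python
import math

def sclera_center(mask):
    """Return the white pixel whose minimum distance to any black pixel is
    largest (a brute-force distance transform; mask is a list of 0/1 rows)."""
    h, w = len(mask), len(mask[0])
    blacks = [(i, j) for i in range(h) for j in range(w) if mask[i][j] == 0]
    best, best_d = None, -1.0
    for i in range(h):
        for j in range(w):
            if mask[i][j] == 1:
                # Distance-transform value of this white pixel
                d = min(math.hypot(i - bi, j - bj) for bi, bj in blacks)
                if d > best_d:
                    best, best_d = (i, j), d
    return best
```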

4.2. Through Demand of Accuracy of Biometric System. Beginning with the center of the eye (the pupil center), a rectangular bounding box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped about the pupil center and the bounding size is gradually increased. The corresponding accuracy for every cropped image is tested. In subsequent steps, the coverage of this bounding box is increased by a width of 3% of the diameter of the iris, and the change in accuracy is observed. After a certain number of iterations of this procedure, the bounding box reaches a portion of the periocular region where there is no more change in intensity; hence the region is low


Figure 5: Method of formation of the concave region of a binarized sclera component (border pixel pairs P1, P2 and P3, P4 with their midpoints).

Figure 6: Different ratios of portions of the face from human anthropometry (with a and b denoting face dimensions, the marked ratios include 0.42a, 0.67a, 0.19a, and 0.7a, and 0.05b, 0.07b, 0.13b, 0.21b, 0.32b, and 0.57b).

entropic. Hence, no more local features can be extracted from this region even if the bounding box is increased. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy the corresponding minimum bounding box is considered as the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of the periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each); recognition accuracy (%) is plotted against the width of the periocular region (w), separating the eye region contributing to recognition from the region not contributing.

The exact method of obtaining the dynamic boundary is as follows:

(1) For i = 0 to 100, follow steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, crop a bounding box centered at the iris center whose width is w = (100 + 3i)% of the diameter of the iris and whose height is h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.
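The crop geometry of step 3 can be sketched as follows; the function name is illustrative, and the accuracy evaluation of step 4 is left to whatever feature pipeline (here, LBP + SIFT) the system uses.

```python
def template_box(cx, cy, iris_diameter, i):
    """Bounding box for iteration i: width w = (100 + 3*i)% of the iris
    diameter, height h = 73% of w, centered on the iris center (cx, cy).
    Returns (left, top, right, bottom)."""
    w = (1.0 + 0.03 * i) * iris_diameter
    h = 0.73 * w
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
```

At i = 0 the box just spans the iris; at i = 67 or beyond the width exceeds 300% of the iris diameter, the region where accuracy saturates in the experiments below.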

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box; increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Patterns (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors for the eye images: first LBP is applied, and the resulting image is then subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample strongly, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This shows that a subset of a large database can be employed to find the optimal template size, and the result can then be used for cropping the images of the whole dataset. So, to minimize template size without compromising accuracy, the smallest such rectangle reaching the saturation accuracy can be used as the localization boundary of the periocular region. It is also observed


Figure 8: Change of accuracy of periocular recognition with change in size of the periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT); recognition accuracy (%) is plotted against the width of the periocular region (w), separating the eye region contributing to recognition from the region not contributing.

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: to make the recognition process fast.

To validate this experiment, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, confirm the experimental objective that there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The distributions of imposter and genuine scores are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1:1 matching time with change in size of the periocular template (average 1:1 matching time in seconds against the width of the periocular region, w = 300% to 400%, for LBP + SIFT on UBIRISv2 and FERET).

detect the section of the human eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and not used for recognition. Hence, a pre-decision on the quality of the live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query arrives: the human expert has to verify whether the most important portion of the eye is visible in the image and guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for UBIRISv2, plotting false rejection rate (FRR) against false acceptance rate (FAR).

(from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence, there is no need of a human expert to verify the existence of the important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy in the localization of the periocular region is the automatic detection of portions of the human eye such as the eyelid, eye corner, tear duct, lower eye fold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from FERET are carried out to support the claim of optimality.

The anthropometry-based approach performs accurately when paired with proper skin detection and sclera detection in the eye region. The sample outputs shown in Figure 3 are found to be proper when evaluated against ground truth.

The saturation accuracy based approach performs with an accuracy of more than 80% on the noisy, low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, Receiver Operating Characteristic (ROC) curves are computed when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. An ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the threshold varies. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for FERET, plotting false rejection rate (FRR) against false acceptance rate (FAR).

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is evident that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence, the ROC curves reveal that the portions of the eye lying between 200% and 300% of the diameter of the iris are largely responsible for recognition and form the feature-dense part of a periocular image. Furthermore, to have a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also computed when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d' index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

\[ d' = \frac{\sqrt{2}\,\left|\mu_{\text{genuine}} - \mu_{\text{imposter}}\right|}{\sqrt{\sigma^2_{\text{genuine}} + \sigma^2_{\text{imposter}}}}, \tag{8} \]

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d' index of recognition when the width of the periocular region is varied. The value of d' increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of d' for w = 100% to 300%, together with the insignificant change in d' for w = 300% to 400%, also establishes the existence of a boundary between the regions contributing and not contributing to recognition.
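The d' computation of (8) is a short function; the sketch below uses population statistics, an assumption, since the paper does not state sample versus population variance.

```python
import math
import statistics

def d_prime(genuine, imposter):
    """Decidability index d' of eq. (8): separation of the genuine and
    imposter score distributions in pooled standard-deviation units."""
    mu_g, mu_i = statistics.fmean(genuine), statistics.fmean(imposter)
    var_g, var_i = statistics.pvariance(genuine), statistics.pvariance(imposter)
    return math.sqrt(2) * abs(mu_g - mu_i) / math.sqrt(var_g + var_i)
```

A larger d' means better-separated genuine and imposter score distributions, which is why the saturation of d' beyond w = 300% mirrors the saturation of recognition accuracy.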

Human expert judgement is experimented with by Hollingsworth et al. [17], and the results are used in the direction of optimal periocular localization. Human subjects were asked which part of the eye they feel is the most important for recognition. Most of the subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of the publicly available testing databases.

Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model
UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] / v2 [28] | 1877 / 11102 | 241 / 261 | 800 × 600 / 400 × 300 | RGB / sRGB
FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB

Table 6: Change of the d' index with change of cropping of the periocular region.

Width of periocular region (w) | 100% | 150% | 200% | 250% | 300% | 350% | 400%
d' index (UBIRISv2 dataset) | 1.23 | 1.60 | 2.05 | 2.34 | 2.61 | 2.72 | 2.85
d' index (FERET dataset) | 1.19 | 1.55 | 2.01 | 2.29 | 2.53 | 2.66 | 2.69

Figure 14: Cumulative Match Characteristic (CMC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for UBIRISv2, plotting the probability of identification against rank.

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, optimizing the template size is a crucial factor. Any redundant region in the template increases the matching time without contributing to the accuracy of matching. Hence,

Figure 15: Cumulative Match Characteristic (CMC) curves for different template sizes of the periocular region (w = 200%, 250%, and 300%) for FERET, plotting the probability of identification against rank.

removal of the redundant region of the template should be accomplished before the matching procedure. As the time for identification depends on the database size n, a decrease of t in the 1:1 matching time will decrease the total identification time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of visual spectrum periocular images and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region best suited for recognition can be selected in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189-200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96-106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389-394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857-860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582-586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118-123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199-205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1-6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670-1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194-198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719-722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335-338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1-4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228-5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201-204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496-1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205-208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443-455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425-2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261-269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51-59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529-1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090-1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970-977, Springer, Cagliari, Italy, 2005.

[31] A K Jain P Flynn and A A Ross Handbook of BiometricsSpringer New York NY USA 2008


Table 4: Survey on recognition through periocular biometric.

| Year | Authors | Algorithm | Features | Testing database | Performance results |
|------|---------|-----------|----------|------------------|---------------------|
| 2010 | Hollingsworth et al. [17] | Human analysis | Eye region | NIR images of 120 subjects | Accuracy of 92% |
| 2010 | Woodard et al. [18] | LBP fused with iris matching | Skin | MBGC NIR images from 88 subjects | Left eye rank-1 recognition rate: iris 13.8%, periocular 92.5%, both 96.5%. Right eye rank-1 recognition rate: iris 10.1%, periocular 88.7%, both 92.4% |
| 2010 | Miller et al. [19] | LBP | Color information, skin texture | FRGC, neutral expression, different session | Rank-1 recognition rate: periocular 94.10%, face 94.38% |
| | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: periocular 99.50%, face 99.75% |
| | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: periocular 94.90%, face 90.37% |
| 2010 | Miller et al. [20] | LBP, city block distance | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 84.39%, right eye 83.90%, both eyes 89.76% |
| | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 72.22%, right eye 70.37%, both eyes 74.07% |
| 2010 | Adams et al. [21] | LBP, GE to select features | Skin | FRGC VS images from 410 subjects | Rank-1 recognition rate: left eye 86.85%, right eye 86.26%, both eyes 92.16% |
| | | | | FERET VS images from 54 subjects | Rank-1 recognition rate: left eye 80.25%, right eye 80.80%, both eyes 85.06% |
| 2011 | Woodard et al. [22] | LBP, color histograms | Skin | FRGC, neutral expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 88.3%, both eyes 91.0% |
| | | | | FRGC, alternate expression, same session | Rank-1 recognition rate: left eye 96.8%, right eye 96.8%, both eyes 98.3% |
| | | | | FRGC, alternate expression, different session | Rank-1 recognition rate: left eye 87.1%, right eye 87.1%, both eyes 91.2% |

boundary requires prior knowledge of the coordinates of the iris center and the iris radius.

Anthropometric analysis [24] of the human face and eye region gives information regarding the ratio of eye to iris and the ratio of the widths of face and eye. The block diagram in Figure 6 depicts the ratios of different parts of the human face with respect to the height or width of the face. From this analysis it is found that

width_periocular = width_eyebrow = 0.67 × (height_face / 2),
height_periocular = 2 × d(eyebrow, eye center) = 2 × (0.21 + 0.07/2) × (width_face / 2) = 0.49 × (width_face / 2),   (1)

where d(eyebrow, eye center) denotes the distance between the center of the eyebrow and the eye center. Consequently,

height_periocular / width_periocular = (0.49 / 0.67) × (width_face / 2) / (height_face / 2) = 0.73 × (width_face / height_face),   (2)

area_periocular = width_periocular × height_periocular
= 0.67 × (height_face / 2) × 0.49 × (width_face / 2)
= 0.33 × (height_face / 2) × (width_face / 2)
= (0.33 / π) × (π × (height_face / 2) × (width_face / 2))
= 0.11 × area_face.   (3)


Require: I, an RGB face image of size m × n.
Ensure: S, a binary face image indicating the skin map.
(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of pixel (i, j).
(3) Compute the average luminance of image I as

I_Yavg = (1 / mn) Σ_{i=1..m} Σ_{j=1..n} I_Y(i, j).

(4) Obtain the brightness-compensated image IC′ as IC′(i, j) = (I_R′(i, j), I_G′(i, j), I_B(i, j)), where I_R′(i, j) = (I_R(i, j))^τ, I_G′(i, j) = (I_G(i, j))^τ, and

τ = 1.5 if I_Yavg < 64; τ = 0.7 if I_Yavg > 190; τ = 1 otherwise.

(5) Detect the skin map S from IC′ as

S(i, j) = 0 if (R(i, j) + 1)/(G(i, j) + 1) > 1.08, (R(i, j) + 1)/(B(i, j) + 1) > 1.08, G(i, j) > 30, and G(i, j) < 140; S(i, j) = 1 otherwise,

where S(i, j) = 0 indicates a skin pixel and S(i, j) = 1 a non-skin pixel.
(6) return S.

Algorithm 1: Skin detection.
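Algorithm 1 can be sketched in pure Python as below. Assumptions labeled here, not taken from the paper: the image is a list of rows of (R, G, B) tuples, the Y value is computed with the standard BT.601 luma weights, and the compensated channels are used directly without renormalization.

```python
def luminance(r, g, b):
    # BT.601 luma, i.e. the Y of YCbCr (assumed; the paper only says "YCbCr")
    return 0.299 * r + 0.587 * g + 0.114 * b

def detect_skin(image):
    """Per-pixel skin map following Algorithm 1: 0 = skin, 1 = non-skin."""
    pixels = [p for row in image for p in row]
    y_avg = sum(luminance(*p) for p in pixels) / len(pixels)
    # Step 4: gamma-style brightness compensation of the R and G channels
    tau = 1.5 if y_avg < 64 else 0.7 if y_avg > 190 else 1.0
    skin_map = []
    for row in image:
        out_row = []
        for (r, g, b) in row:
            rc, gc = r ** tau, g ** tau   # compensated R and G
            is_skin = ((rc + 1) / (gc + 1) > 1.08 and
                       (rc + 1) / (b + 1) > 1.08 and
                       30 < gc < 140)
            out_row.append(0 if is_skin else 1)
        skin_map.append(out_row)
    return skin_map
```

For example, a reddish mid-luminance pixel such as (150, 90, 60) is marked as skin (0), while a bluish pixel such as (40, 120, 200) is not (1).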

Require: I, an RGB face image of size m × n; S, a binary face image indicating the skin map.
Ensure: EM, a binary face image indicating open-eye regions.
(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i, j) to [0, 255], where I_Y(i, j) denotes the Y value of pixel (i, j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat Steps 6 to 8.
(6) EB_p = 0.
(7) For each pixel (i, j) in EM_p, update EB_p as EB_p = EB_p if 65 < I_Y(i, j) < 80, and EB_p = EB_p + 1 otherwise.
(8) If EB_p equals the number of pixels in EM_p, then EM = EM − EM_p (removal of the p-th connected component).
(9) return EM.

Algorithm 2: Open eye detection.
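A minimal sketch of Algorithm 2 follows, under two labeled assumptions: the candidate components are the non-skin pixels (value 1 in the skin map of Algorithm 1), and a component is kept as an open eye if at least one of its pixels has a luminance in (65, 80) — which is exactly the removal condition of Steps 6–8 turned around.

```python
from collections import deque

def connected_components(mask, value=1):
    """4-connected components of pixels equal to `value` in a 2D list."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    comps = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] == value and not seen[i][j]:
                comp, q = [], deque([(i, j)])
                seen[i][j] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and
                                mask[ny][nx] == value and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                comps.append(comp)
    return comps

def open_eye_components(luma, skin_map):
    """Keep a non-skin component only if some pixel has Y in (65, 80)."""
    return [c for c in connected_components(skin_map)
            if any(65 < luma[i][j] < 80 for (i, j) in c)]
```

Components in which every pixel fails the luminance test are discarded, mirroring Step 8's removal of EM_p.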

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the eye region are expressed as functions of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to know the iris radius; however, knowledge of the coordinates of the iris center is still necessary. From this information a bounding box can be fitted enclosing all visible portions of the periocular region, for example, eyebrow, eyelashes, tear duct, eye fold, and eye corner. This approach is crude and depends on human supervision or on intelligent detection of these nodal points in the human eye.

Further, from (2) it is observable that either the height or the width of the periocular region is sufficient to derive the other parameter, provided that the aspect ratio of the face is known. This aspect of periocular localization is used in Section 4.2. Equation (3) considers an elliptical model of the face while finding the ratio of the periocular area to the area of a human face. It justifies the usefulness of an optimally selected periocular template for human recognition rather than a full face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence it is suitable for localizing the periocular region in unconstrained images, where the iris radius is not machine-detectable due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can also be obtained through sclera detection, for the scenario when the iris cannot be properly located due to unconstrained acquisition, or when the captured image is a low-quality color face image taken from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S component of each pixel is determined by

S = 1 − (3 / (R + G + B)) × min(R, G, B),   (4)

where R, G, B denote the red, green, and blue color components of the pixel. Let saturation_im be the image formed from the S component of each pixel.

(3) If S < τ, where τ is a predefined threshold, the pixel is marked as sclera; otherwise it is marked as non-sclera. The authors in [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as sclera_noisy = (saturation_im < τ). Only a noisy binary map sclera_noisy is obtained through this process, in which white pixels denote the noisy sclera region and black pixels denote the non-sclera region.

(4) im_bin is formed as follows: for every nonzero pixel (i, j) in sclera_noisy,

im_bin(i, j) = average intensity of the 17 × 17 window around pixel (i, j) in im_gray,   (5)

and for every zero pixel (i, j) in sclera_noisy,

im_bin(i, j) = 0.   (6)

(5) sclera_adaptive is formed as

sclera_adaptive(i, j) = 1 if sclera_noisy(i, j) = 1 and im_gray(i, j) ≥ im_bin(i, j); 0 otherwise,   (7)

that is, a noisy sclera pixel is retained only if it is at least as bright as its local average.

(6) All binary connected components present in sclera_adaptive are removed except the largest and the second largest.

(7) If the size of the second largest connected component is less than 25% of that of the largest, the largest component is interpreted as the single detected sclera and the second largest component is removed. Otherwise, both components are retained as the binary map of the sclera.

After the steps specified above, the binary image contains only one or two components describing the sclera region, with noise removed.
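The saturation test of Steps 2 and 3 can be sketched directly from (4). This is an illustrative pure-Python version; the function names are ours, and the image is assumed to be a list of rows of (R, G, B) tuples.

```python
def saturation(r, g, b):
    """S component of the HSI model, eq. (4): S = 1 - 3*min(R,G,B)/(R+G+B)."""
    total = r + g + b
    if total == 0:
        return 0.0  # pure black: treat as fully unsaturated
    return 1.0 - 3.0 * min(r, g, b) / total

def sclera_mask(image, tau=0.21):
    """Noisy sclera map: 1 where saturation falls below the threshold of [25]."""
    return [[1 if saturation(*p) < tau else 0 for p in row] for row in image]
```

A near-achromatic (whitish) pixel such as (200, 200, 210) has S ≈ 0.02 and is marked as sclera, while a skin-toned pixel such as (150, 90, 60) has S = 0.4 and is not.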

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to determine whether the two parts of the sclera on the two sides of the iris are separately visible, whether only one of them is detected, or whether both parts are detected as a single component.

There are three exhaustive cases for the binary sclera image: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is two, the image is classified as Case (b) (as shown in Figures 3(a), 3(b), and 3(c)) and the two components are treated as the two portions of the sclera. Otherwise, if a single connected component is obtained, the ratio of length to breadth of the best-fitted oriented bounding rectangle is checked: if the ratio is greater than 1.25, the component belongs to Case (a); otherwise it belongs to Case (c) (shown in Figure 3(e)). For Case (a), the region is subdivided into two components (by detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.

4.1.3. Nodal Points Extraction from Sclera Region. Each sclera is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) the center of the sclera, (b) the center of the concave region of the sclera, and (c) the eye corner. So, in the general case where two parts of the sclera are detected, six nodal points are obtained. The method of nodal point extraction is as follows.

(1) Finding the Center of Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating pixels belonging to the sclera) is replaced by its minimum distance from any black pixel. The pixel that is farthest from all black pixels has the highest value after this transformation; that pixel is labeled as the center of the sclera.

(2) Finding the Center of the Concave Region of Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. Midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected; midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, the midpoints of many lines drawn in this way overlap on a single pixel. A separate matrix of the same size as the sclera image is introduced, with every pixel initially zero. For every valid midpoint, the value of the corresponding pixel in this matrix is incremented. Once this process is over, more than one connected component of nonzero values is obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having the maximum value in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (computed as described above) are calculated. The boundary pixel that is farthest from the center of the sclera is labeled as the eye corner.

Extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained by processing them through the aforementioned nodal point extraction technique. This information can be useful in the localization of the periocular region.

Figure 3: Result of nodal point detection through sclera segmentation ((a)–(e) show five sample outputs from the UBIRISv2 database).

Figure 4: Cropped images from an iris image centering at the pupil center.
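The distance-transform step used to find the sclera center can be sketched with a multi-source breadth-first search (a 4-connected, L1 approximation of the Euclidean distance transform; the function name and this approximation are our choices, not the paper's).

```python
from collections import deque

def sclera_center(mask):
    """Return the white pixel (value 1) farthest from any black pixel (value 0),
    using a multi-source BFS distance transform (L1 / city-block metric)."""
    h, w = len(mask), len(mask[0])
    dist = [[None] * w for _ in range(h)]
    q = deque()
    for i in range(h):
        for j in range(w):
            if mask[i][j] == 0:          # all black pixels are BFS sources
                dist[i][j] = 0
                q.append((i, j))
    best, best_d = None, -1
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                if dist[ny][nx] > best_d:   # track the deepest interior pixel
                    best_d, best = dist[ny][nx], (ny, nx)
                q.append((ny, nx))
    return best
```

In production one would typically use an exact Euclidean transform (e.g., `scipy.ndimage.distance_transform_edt`) instead of this L1 sketch.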

4.2. Through Demand of Accuracy of Biometric System. Beginning at the center of the eye (pupil center), a bounding rectangular box is taken which initially encloses only the iris. Figure 4 shows how the eye image changes when it is cropped around the pupil center and the bounding size is gradually increased; the recognition accuracy of every cropped image is tested. In subsequent steps, the coverage of this bounding box is increased in increments of 3% of the iris diameter, and the change in accuracy is observed. After a certain number of iterations, the bounding box reaches a portion of the periocular region where there is no more change in intensity; the region is low-entropic, and no more local features can be extracted from it even if the bounding box is enlarged. In this scenario the saturation accuracy is reached, and on the basis of the saturation accuracy the corresponding minimum bounding box is taken as the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 5: Method of formation of the concave region of a binarized sclera component (midpoints of chords such as P1P2 and P3P4 are examined).

Figure 6: Different ratios of portions of the face from human anthropometry (annotated with fractions of the face dimensions a and b, e.g., 0.67a, 0.42a, 0.21b, 0.07b).

Figure 7: Change of accuracy of periocular recognition with change in size of periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each). X-axis: width of periocular region (w); Y-axis: recognition accuracy (%). The plot separates the eye region contributing to recognition from the region not contributing to recognition.

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow Steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, centering at the iris center, crop a bounding box whose width is w = (100 + 3i)% of the iris diameter and whose height h is 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box; increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Patterns (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors on the eye images: LBP is applied first, and the resulting image is then subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved when testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the iris diameter, or a wider rectangular eye area, is considered. To validate the experiment run on the sample, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This indicates that a subset of a large database can be employed to find the optimal template size, and the result can then be used for cropping images of the whole dataset. So, to minimize template size without compromising accuracy, the smallest wide rectangle achieving the saturation accuracy can be used as the localization boundary of the periocular region. It is also observed

Figure 8: Change of accuracy of periocular recognition with change in size of periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT). X-axis: width of periocular region (w); Y-axis: recognition accuracy (%).

Figure 9: Distribution of scores for imposter and genuine matchings, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width equal to 300% of the iris diameter.

that the region beyond 300% of the iris diameter, though it does not contribute to recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: it makes the recognition process fast.

To validate this observation, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, depicted in Figure 8, confirm the experimental objective that there is no significant feature in the periocular region beyond 300% of the iris diameter which can contribute to recognition. The distributions of imposter and genuine scores are shown in Figures 9 and 10.
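The saturation search of Steps 1–5 can be sketched as below. Hedged assumptions: `accuracy_at` stands in for running the LBP + SIFT matcher at a given template width, the 0.1-point tolerance is our illustrative choice, and the synthetic curve merely mimics the shape of Figure 7 (it is not the paper's data).

```python
def saturation_width(accuracy_at, widths, tol=0.1):
    """Return the smallest width (in % of iris diameter) beyond which
    accuracy stops improving by more than `tol` percentage points."""
    prev = accuracy_at(widths[0])
    for w_prev, w in zip(widths, widths[1:]):
        cur = accuracy_at(w)
        if cur - prev <= tol:
            return w_prev          # accuracy has saturated at the previous width
        prev = cur
    return widths[-1]

def curve(w):
    # Synthetic accuracy curve that plateaus near w = 300% (illustrative only)
    return min(85.64, 20.0 + 0.328 * (w - 100))

widths = list(range(100, 401, 3))  # bounding-box widths in steps of 3%
```

On this synthetic curve the search stops just past 300%, mirroring the saturation observed on UBIRISv2 and FERET.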

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matchings, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width equal to 300% of the iris diameter.

Figure 11: Change of 1 : 1 matching time with change in size of periocular template (LBP + SIFT on UBIRISv2 and FERET). X-axis: width of periocular region (w), 300%–400% of the iris diameter; Y-axis: average 1 : 1 matching time (s).

detect the section of the human eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and is not used for recognition. Hence a pre-decision on the quality of the live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query comes: the human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification

Figure 12: Receiver Operating Characteristic (ROC) curve for different template sizes (w = 200%, 250%, 300%) of the periocular region for UBIRISv2. X-axis: false acceptance rate (FAR); Y-axis: false rejection rate (FRR); logarithmic scales.

(from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence there is no need of a human expert for verifying the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy into the localization of the periocular region is the automatic detection of portions of the human eye such as the eyelid, eye corner, tear duct, lower eye fold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from FERET are carried out to support the claim of optimality.
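The pair counts quoted above follow directly from the binomial coefficient C(n, 2) = n(n − 1)/2 over the database sizes of Table 5; a quick check:

```python
import math

# All-pairs genuine + imposter matchings among n images: C(n, 2) = n(n-1)/2
ubiris_pairs = math.comb(11102, 2)   # UBIRISv2: 11102 images
feret_pairs = math.comb(14126, 2)    # FERET: 14126 images
# ubiris_pairs == 61621651 and feret_pairs == 99764875
```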

The anthropometry based approach performs accurately together with proper skin detection and sclera detection in the eye region. Sample outputs are shown in Figure 3 and are found to be proper when evaluated against ground truth.

The saturation accuracy based approach achieves an accuracy of more than 80% on the noisy and low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is obtained when the width of the periocular region is 200%, 250%, and 300% of the iris diameter, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the threshold value changes. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curve for different template sizes (w = 200%, 250%, 300%) of the periocular region for FERET. X-axis: false acceptance rate (FAR); Y-axis: false rejection rate (FRR).

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is evident that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portion of the eye lying between 200% and 300% of the iris diameter is largely responsible for recognition and is the feature-dense part of a periocular image. Furthermore, for a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also obtained when the width of the periocular region is 200%, 250%, and 300% of the iris diameter, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as

d′ = (√2 × |μ_genuine − μ_imposter|) / √(σ²_genuine + σ²_imposter),   (8)

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of d′ for w = 100% to 300%, and the insignificant change in d′ for w = 300% to 400%, also establish the existence of a boundary between regions contributing and not contributing to recognition.
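Equation (8) can be computed directly from lists of genuine and imposter scores. A minimal sketch (the use of population standard deviations is our assumption; the paper does not specify sample versus population):

```python
import math

def d_prime(genuine, imposter):
    """Decidability index d' of eq. (8): separation of genuine and
    imposter score distributions in standard-deviation units."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)  # population variance
    return (math.sqrt(2) * abs(mean(genuine) - mean(imposter))
            / math.sqrt(var(genuine) + var(imposter)))
```

Larger d′ means better-separated score distributions, which is why the monotone rise of d′ with w in Table 6 (and its flattening beyond w = 300%) supports the chosen template boundary.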

Human expert judgement was studied by Hollingsworth et al. [17], and the results are used in the direction of optimal periocular localization. Human subjects were asked which part of the eye they consider most important for recognition; most subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region

Table 5: Details of the publicly available testing databases.

| Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model |
|----------|-----------|---------|------------------|--------------------|------------|-------------|
| UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] | 1877 | 241 | 800 × 600 | RGB |
| | | v2 [28] | 11102 | 261 | 400 × 300 | sRGB |
| FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB |

Table 6: Change of the d′ index with change of cropping of the periocular region.

| Width of periocular region (w) | 100% | 150% | 200% | 250% | 300% | 350% | 400% |
|--------------------------------|------|------|------|------|------|------|------|
| d′ index (UBIRISv2 dataset) | 1.23 | 1.60 | 2.05 | 2.34 | 2.61 | 2.72 | 2.85 |
| d′ index (FERET dataset) | 1.19 | 1.55 | 2.01 | 2.29 | 2.53 | 2.66 | 2.69 |

Figure 14: Cumulative Match Characteristic (CMC) curve for different template sizes (w = 200%, 250%, 300%) of the periocular region for UBIRISv2. X-axis: rank; Y-axis: probability of identification.

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision for proper labeling of the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6 Conclusions

Recent research shows why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size: any redundant region in the template increases the matching time but does not contribute to the accuracy of matching. Hence,

Figure 15: Cumulative Match Characteristic (CMC) curve for different template sizes (w = 200%, 250%, 300%) of the periocular region for FERET. X-axis: rank; Y-axis: probability of identification.

removal of the redundant region of the template should be accomplished before the matching procedure. As the time of identification depends on the database size n, a decrease of t in the 1 : 1 matching time actually decreases the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suitable for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.

BioMed Research International 13

Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings: Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. J. Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.


Require: I, an RGB face image of size m × n.
Ensure: S, a binary face image indicating the skin-map.

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i,j) to [0, 255], where I_Y(i,j) denotes the Y value of pixel (i,j).
(3) Compute the average luminance value of image I as

    I_Yavg = (1 / mn) Σ_{i=1..m} Σ_{j=1..n} I_Y(i,j).

(4) The brightness-compensated image I_C′ is obtained as I_C′(i,j) = (I_R′(i,j), I_G′(i,j), I_B(i,j)), where I_R′(i,j) = (I_R(i,j))^τ, I_G′(i,j) = (I_G(i,j))^τ, and

    τ = 1.5 if I_Yavg < 64; 0.7 if I_Yavg > 190; 1 otherwise.

(5) The skin map S is detected from I_C′ as

    S(i,j) = 0 if (R(i,j) + 1)/(G(i,j) + 1) > 1.08, (R(i,j) + 1)/(B(i,j) + 1) > 1.08, G(i,j) > 30, and G(i,j) < 140; 1 otherwise,

    where S(i,j) = 0 indicates a skin region and S(i,j) = 1 indicates a non-skin region.
(6) return S.

Algorithm 1: Skin Detection.
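Algorithm 1 can be sketched in NumPy as follows. This is a minimal interpretation, not the authors' implementation: the Y channel is computed with the standard BT.601 weights, the compensated R and G channels are used directly in the ratio tests, and the function and variable names are illustrative.

```python
import numpy as np

def detect_skin(img_rgb):
    """Sketch of Algorithm 1 (skin detection); img_rgb is an (m, n, 3) uint8 array."""
    rgb = img_rgb.astype(np.float64)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Steps 1-3: luminance (Y of YCbCr, BT.601 weights) and its average
    Y = 0.299 * R + 0.587 * G + 0.114 * B
    y_avg = Y.mean()
    # Step 4: gamma-style brightness compensation on the R and G channels only
    tau = 1.5 if y_avg < 64 else (0.7 if y_avg > 190 else 1.0)
    Rc, Gc = R ** tau, G ** tau
    # Step 5: skin where both color ratios exceed 1.08 and G is mid-range
    skin = (((Rc + 1) / (Gc + 1) > 1.08) & ((Rc + 1) / (B + 1) > 1.08)
            & (Gc > 30) & (Gc < 140))
    # Paper's convention: 0 = skin, 1 = non-skin
    return np.where(skin, 0, 1).astype(np.uint8)
```

Note that for τ ≠ 1 the compensated channels leave the [0, 255] range; the paper presumably renormalizes them, which this sketch omits.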

Require: I, an RGB face image of size m × n; S, a binary face image indicating the skin-map.
Ensure: EM, a binary face image indicating open eye regions.

(1) Convert the RGB image I to the YCbCr color space.
(2) Normalize I_Y(i,j) to [0, 255], where I_Y(i,j) denotes the Y value of pixel (i,j).
(3) FC = set of connected components in S.
(4) EM = FC.
(5) For each connected component EM_p in S, repeat Steps 6 to 8.
(6) EB_p = 0.
(7) For each pixel (i,j) in EM_p, the value of EB_p is updated as

    EB_p = EB_p if 65 < I_Y(i,j) < 80; EB_p + 1 otherwise.

(8) If EB_p = number of pixels in EM_p, then EM = EM − EM_p (removal of the p-th connected component).
(9) return EM.

Algorithm 2: Open Eye Detection.
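A sketch of Algorithm 2 follows, under the assumption (not stated explicitly in the algorithm) that the connected components are taken over the non-skin regions of S, since eyes are non-skin areas inside the face. The flood-fill labeling and names are illustrative.

```python
import numpy as np
from collections import deque

def detect_open_eyes(luma, skin_map):
    """Sketch of Algorithm 2 (open eye detection). luma is the normalized Y
    channel in [0, 255]; skin_map follows Algorithm 1's convention
    (0 = skin, 1 = non-skin). A non-skin connected component is kept as an
    open-eye candidate only if at least one pixel has 65 < Y < 80."""
    h, w = skin_map.shape
    seen = np.zeros((h, w), bool)
    eye_mask = np.zeros((h, w), np.uint8)
    for sy in range(h):
        for sx in range(w):
            if skin_map[sy, sx] != 1 or seen[sy, sx]:
                continue
            # Flood-fill one 4-connected non-skin component.
            comp, q = [], deque([(sy, sx)])
            seen[sy, sx] = True
            while q:
                y, x = q.popleft()
                comp.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and \
                            skin_map[ny, nx] == 1 and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            # Steps 6-8: EB_p counts pixels OUTSIDE the (65, 80) luminance band;
            # a component where every pixel falls outside the band is removed.
            eb = sum(1 for y, x in comp if not (65 < luma[y, x] < 80))
            if eb < len(comp):
                for y, x in comp:
                    eye_mask[y, x] = 1
    return eye_mask
```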

This information can be used to decide the boundary of the periocular region. In (1), the width and height of the eye are expressed as a function of the height and width of the human face. Hence, to gauge the width and height of the periocular template boundary, there is no need to have knowledge of the iris radius. However, knowledge of the coordinates of the iris center is necessary. From this information, a bounding box can be fit composing all visible portions of the periocular region, for example, eyebrow, eyelashes, tear duct, eye fold, eye corner, and so forth. This approach is crude and dependent on human supervision or intelligent detection of these nodal points in the human eye.

Further, from (2), it is observable that either the height or the width of the periocular region is sufficient to derive the other parameter, provided that the aspect ratio of the face is known. This aspect of the localization of the periocular region is used in Section 4.2. Equation (3) considers an elliptical model to represent the face while finding the ratio of the periocular region to the area of a human face. It justifies the usefulness of an optimally selected periocular template for human recognition rather than a full face recognition system.

This method achieves periocular localization without knowledge of the iris radius. Hence, it is suitable for localization of the periocular region in unconstrained images where the iris radius is not detectable by machines due to low quality, partial closure of the eye, or the luminance of the visible spectrum eye image.

However, to make the system work in a more unconstrained environment, the periocular boundary can be obtained through sclera detection for scenarios where the iris cannot be properly located due to unconstrained acquisition of the eye, or where the captured image is a low-quality color face image taken from a distance.


4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S component of each pixel is determined by

    S = 1 − (3 / (R + G + B)) · min(R, G, B),   (4)

where R, G, B denote the red, green, and blue color components of a particular pixel. Let the image hence formed, containing the S component of each pixel, be saturation_im.

(3) If S < τ, where τ is a predefined threshold, then that pixel is marked as sclera region, else as non-sclera region. Authors in [25] have experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im, as follows: sclera_noisy = saturation_im < τ. Only a noisy binary map of sclera, sclera_noisy, can be found through this process, in which white pixels denote noisy sclera region and black pixels denote non-sclera region.

(4) im_bin is formed as follows: for every nonzero pixel (i, j) in sclera_noisy,

    im_bin(i, j) = average intensity of the 17 × 17 window around pixel (i, j) in im_gray,   (5)

and for every zero pixel (i, j) in sclera_noisy,

    im_bin(i, j) = 0.   (6)

(5) sclera_adaptive is formed as follows:

    sclera_adaptive(i, j) = 0 if sclera_noisy(i, j) = 1 or im_gray(i, j) < im_bin(i, j); 1 otherwise.   (7)

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, it is interpreted that the largest component is the single detected sclera, and the second largest connected component is hence removed. Else, both components are retained as the binary map of sclera.

After processing the above steps, the binary image contains only one or two components describing the sclera region, with noise removed.
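Steps 1–5 above can be sketched as follows; this is an illustrative reading, with the connected-component filtering of Steps 6–7 omitted, grayscale taken as the plain channel mean, and function names invented for the sketch. The output follows the paper's 0-marks-sclera convention from Eq. (7).

```python
import numpy as np

def detect_sclera(img_rgb, tau=0.21, win=17):
    """Sketch of Section 4.1.1, Steps 1-5: threshold the HSI saturation channel,
    then refine with an adaptive local-mean comparison. img_rgb is an
    (m, n, 3) uint8 array."""
    rgb = img_rgb.astype(np.float64)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    im_gray = rgb.mean(axis=2)                              # Step 1: grayscale
    # Step 2, Eq. (4): HSI saturation (epsilon guards division by zero)
    S = 1.0 - 3.0 * rgb.min(axis=2) / (R + G + B + 1e-9)
    sclera_noisy = S < tau                                  # Step 3: low saturation
    # Step 4, Eq. (5)-(6): local mean of im_gray in a win x win window
    # at every candidate pixel, zero elsewhere
    pad = win // 2
    padded = np.pad(im_gray, pad, mode="edge")
    im_bin = np.zeros_like(im_gray)
    for i, j in zip(*np.nonzero(sclera_noisy)):
        im_bin[i, j] = padded[i:i + win, j:j + win].mean()
    # Step 5, Eq. (7): 0 where the pixel was a candidate or is darker
    # than its local mean, 1 otherwise
    return np.where(sclera_noisy | (im_gray < im_bin), 0, 1).astype(np.uint8)
```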

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to retrieve information about the sclera: whether the two parts of the sclera on the two sides of the iris are separately visible, only one of them is detected, or both parts of the sclera are detected as a single component.

There can be three exhaustive cases in the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is found to be two, then it is classified as the aforementioned Case (b) (as shown in Figures 3(a), 3(b), and 3(c)), and the two components are treated as two portions of sclera. Else, if a single connected component is obtained, the ratio of length to breadth of the best fitted oriented bounding rectangle is checked. If the ratio is greater than 1.25, then it belongs to the aforementioned Case (a); else it belongs to Case (c) (shown in Figure 3(e)). For Case (a), the region is subdivided into two components (by detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.

4.1.3. Nodal Points Extraction from Sclera Region. Each sclera is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) center of sclera, (b) center of the concave region of sclera, and (c) eye corner. So, in the general case where two parts of the sclera are detected, six nodal points will be found. The method of nodal point extraction is illustrated below.

(1) Finding Center of Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating pixels belonging to sclera) is replaced by its minimum distance from any black pixel. The pixel which is farthest from all black pixels will have the highest value after this transformation. That pixel is labeled as the center of sclera.

(2) Finding Center of Concave Region of Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. The midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected. The midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to discrete computation of straight lines, the midpoints of many straight lines drawn in the aforementioned way overlap on a single pixel. A separate matrix having the same size as the sclera itself is introduced, with each pixel initially zero. For every valid midpoint, the value of the corresponding pixel in this new matrix is incremented. Once this process is over, more than one connected component of nonzero values will be obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having the maximum value

Figure 3: Result of nodal point detection through sclera segmentation; sample outputs 1–5 from the UBIRISv2 database (panels (a)–(e)).

Figure 4: Cropped images from an iris image centering at the pupil center (panels (a)–(d)).

in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel which is farthest from the center of the sclera is labeled as the eye corner.
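Nodal points (1) and (3) can be sketched as below; the concave-region point (2) is omitted, SciPy's Euclidean distance transform stands in for the paper's unspecified transform, and the boundary test is a simple 4-neighbour check introduced for the sketch.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def sclera_nodal_points(sclera_mask):
    """Sketch of nodal points (1) and (3): sclera center via distance transform
    and eye corner as the boundary pixel farthest from that center.
    sclera_mask is a boolean array, True inside the sclera component."""
    # (1) Each white pixel gets its minimum distance to any black pixel;
    #     the pixel farthest from the background is the sclera center.
    dist = distance_transform_edt(sclera_mask)
    center = np.unravel_index(np.argmax(dist), dist.shape)
    # (3) Boundary pixels: sclera pixels with at least one background 4-neighbour.
    padded = np.pad(sclera_mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    ys, xs = np.nonzero(sclera_mask & ~interior)
    d2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    corner = (ys[np.argmax(d2)], xs[np.argmax(d2)])
    return center, corner
```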

The result of extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained from processing them through the aforementioned nodal point extraction technique. This information can be useful in localization of the periocular region.

4.2. Through Demand of Accuracy of Biometric System

Beginning with the center of the eye (pupil center), a bounding rectangular box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped at the pupil center and the bounding size is gradually increased. The corresponding accuracy of every cropped image is tested. In subsequent steps, the coverage of this bounding box is increased by a width of 3% of the diameter of the iris, and the change in accuracy is observed. After certain iterations of this procedure, the bounding box will reach a portion of the periocular region where there is no more change in intensity; hence the region is low

Figure 5: Method of formation of the concave region of a binarized sclera component (midpoints of chords P1–P2 and P3–P4).

Figure 6: Different ratios of portions of the face from human anthropometry, expressed as fractions of the face dimensions a and b (e.g., 0.42a, 0.67a, 0.19a, 0.7a; 0.05b, 0.07b, 0.13b, 0.21b, 0.32b, 0.57b).

entropic. Hence, no more local features can be extracted from this region even if the bounding box is increased. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy, the corresponding minimum bounding box is considered as the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of the periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each). The plot of recognition accuracy (%) against width of periocular region (w) marks the boundary between eye regions contributing and not contributing to recognition.

The exact method of obtaining the dynamic boundary is as follows:

(1) For i = 0 to 100, follow Steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box. Increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Patterns (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors from the eye images. First, LBP is applied, and the resulting image is subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To strongly validate the experiment run on the sample, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This shows that a subset of a large database can be employed to find the optimal template size, and the result can be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest wide rectangle with saturation accuracy can be used as the localization boundary of the periocular region. It is also observed

Figure 8: Change of accuracy of periocular recognition with change in size of the periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT); recognition accuracy (%) versus width of periocular region (w).

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: to make the recognition process fast.

To validate this experiment, it has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, confirm the experimental objective that there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The score distributions of imposter and genuine matchings are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye

Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1:1 matching time with change in size of the periocular template (average 1:1 matching time in seconds versus width of periocular region, LBP + SIFT on UBIRISv2 and FERET).

detect the section of the human eye that is most important towards recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and not used for recognition. Hence, a pre-decision on the quality of a live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query comes. The human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise

During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of different sections of an eye, it can be stated which portion of the eye is necessary for identification

Figure 12: Receiver Operating Characteristic (ROC) curve (false rejection rate versus false acceptance rate) for different template sizes of the periocular region (w = 200%, 250%, 300%) for UBIRISv2.

(from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence, there is no need of a human expert to verify the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy into the localization of the periocular region is the automatic detection of portions of the human eye such as the eyelid, eye corner, tear duct, lower eyefold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from the FERET database are experimented with to support the proposition of optimality.

The anthropometry-based approach performs accurately along with proper skin detection and sclera detection in the eye region. The sample outputs, shown in Figure 3, are found to be proper when evaluated against ground truth.

The saturation-accuracy-based approach performs with an accuracy of more than 80% on the noisy and low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is obtained when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the value of the threshold changes. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curve (false rejection rate versus false acceptance rate) for different template sizes of the periocular region (w = 200%, 250%, 300%) for FERET.

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, the system performs better, with low FAR, when w = 300% than when w = 200% or 250%. Hence, the ROC curves reveal that the portions of the eye lying between 200% and 300% of the diameter of the iris are largely responsible for recognition and constitute the feature-dense part of a periocular image. Furthermore, to have a 1:n matching analysis, Cumulative Match Characteristic (CMC) curves representing the probability of identification at various ranks are also obtained when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

    d′ = √2 |μ_genuine − μ_imposter| / √(σ²_genuine + σ²_imposter),   (8)

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of the values of d′ for w = 100% to 300%, together with the insignificant change in the value of d′ for w = 300% to 400%, also establishes the existence of a boundary between regions contributing and not contributing to recognition.
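Equation (8) translates directly into code; this small helper (names illustrative) computes d′ from two lists of match scores:

```python
import math

def d_prime(genuine, imposter):
    """Decision index d' of Eq. (8): separation of the genuine and imposter
    score distributions in pooled standard-deviation units."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return (math.sqrt(2) * abs(mean(genuine) - mean(imposter))
            / math.sqrt(var(genuine) + var(imposter)))
```

A larger d′ means the genuine and imposter score distributions overlap less, so the monotone rise of d′ with w in Table 6 mirrors the accuracy saturation seen in Figures 7 and 8.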

Human expert judging is experimented with by Hollingsworth et al. [17], and the results are used towards optimal periocular localization. Human subjects were asked which part of the eye they feel to be the most important for recognition. Most of the subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region

Table 5: Details of the publicly available testing databases.

Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model
UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] | 1877 | 241 | 800 × 600 | RGB
UBIRIS | (same as above) | v2 [28] | 11102 | 261 | 400 × 300 | sRGB
FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB

Table 6 Change of 1198891015840 index with change of cropping of periocular region

Width of periocular region (119908) 100 150 200 250 300 350 400Value of 1198891015840 index (for UBIRISv2 dataset) 123 160 205 234 261 272 285Value of 1198891015840 index (for FERET dataset) 119 155 201 229 253 266 269

10 20 30 40 50 60 70065

07

075

08

085

09

095

1

Rank

Prob

abili

ty o

f ide

ntifi

catio

n

CMC curve for w = 300

CMC curve for w = 250

CMC curve for w = 200

Figure 14 Cumulative Match Characteristic (CMC) curve fordifferent template sizes of periocular region for UBIRISv2

for it to be a candidate for recognition Removal of thoseimportant regions will lead to rejection of the template

Subdivision approach needs manual supervision in theprocess of proper labeling of the different portions of humaneye Once the enrolled templates are labeled by the expert anoptimal part of the template can be selected for recognitionThe method is tested on FERET database and yielded properlocalization of periocular region

6 Conclusions

Recent research signifies why recognition through visualspectrum periocular image has gained so much importanceand how the present approaches work While developingrecognition system for a large database it is a crucial factorto optimize the template size Existence of any redundantregion in template will increase the matching time but willnot contribute to increase the accuracy of matching Hence

0 10 20 30 40 50 60 70 80065

07

075

08

085

09

095

1

Rank

Prob

abili

ty o

f ide

ntifi

catio

n

CMC curve for w = 300

CMC curve for w = 250

CMC curve for w = 200

Figure 15 Cumulative Match Characteristic (CMC) curve fordifferent template sizes of periocular region for FERET

removal of redundant region of the template should beaccomplished before the matching procedure As recognitiontime of identification is dependent on database size n hencea decrease of 1 1 matching time of t will actually decreasent matching time for identification in total As n is large (inthe range of 109 practical cases) nt is a significant amount oftime especially when concurrent matching is implementedin distributed biometric systems The paper prescribes fourmetrics for the optimization of visual spectrum periocularimage and experimentally establishes their relevance in termsof satisfying expected recognition accuracy These methodscan be used to localize the periocular region dynamically sothat an optimized region can be selectedwhich is best suitablefor recognition in terms of two contradictory objectives(a) minimal template size and (b) maximal recognitionaccuracy

BioMed Research International 13

Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.



4.1.1. Detection of Sclera Region and Noise Removal

(1) The input RGB iris image im is converted to a grayscale image im_gray.

(2) The input RGB iris image im is converted to the HSI color model, where the S (saturation) component of each pixel can be determined by

S = 1 − (3 / (R + G + B)) · min(R, G, B), (4)

where R, G, and B denote the red, green, and blue color components of a particular pixel. Let the image hence formed, containing the S component of each pixel, be saturation_im.

(3) If S < τ, where τ is a predefined threshold, the pixel is marked as sclera region; otherwise it is marked as non-sclera region. The authors in [25] experimented with τ = 0.21 to get a binary map of the sclera region through binarization of saturation_im as follows: sclera_noisy = saturation_im < τ. Only a noisy binary map of the sclera, sclera_noisy, can be found through this process, in which white pixels denote noisy sclera region and black pixels denote non-sclera region.

(4) im_bin is formed as follows. For every nonzero pixel (i, j) in sclera_noisy,

im_bin(i, j) = average intensity of the 17 × 17 window around pixel (i, j) in im_gray, (5)

and for every zero pixel (i, j) in sclera_noisy,

im_bin(i, j) = 0. (6)

(5) sclera_adaptive is formed as follows:

sclera_adaptive(i, j) = 0, if sclera_noisy(i, j) = 1 or im_gray(i, j) < im_bin(i, j); 1, otherwise. (7)

(6) All binary connected components present in sclera_adaptive are removed except the largest and second largest components.

(7) If the size of the second largest connected component is less than 25% of that of the largest one, it is interpreted that the largest component is the single detected sclera, and the second largest component is hence removed. Otherwise, both components are retained as the binary map of the sclera.

After processing the above steps, the binary image contains only one or two components describing the sclera region, with the noise removed.
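The saturation thresholding of steps (2)–(3) can be sketched in a few lines of NumPy; this is a minimal illustration using the threshold τ = 0.21 reported in [25], and the function names (`hsi_saturation`, `noisy_sclera_map`) are mine, not the paper's:

```python
import numpy as np

TAU = 0.21  # saturation threshold experimented with in [25]

def hsi_saturation(im_rgb):
    """HSI saturation per pixel: S = 1 - 3*min(R,G,B)/(R+G+B)."""
    rgb = im_rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1e-9           # guard against division by zero
    return 1.0 - 3.0 * rgb.min(axis=2) / total

def noisy_sclera_map(im_rgb, tau=TAU):
    """Binary map: True where saturation is below tau (candidate sclera)."""
    return hsi_saturation(im_rgb) < tau
```

Near-white (low-saturation) pixels such as the sclera fall below τ and are marked True; strongly colored skin or iris pixels are marked False.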

4.1.2. Content Retrieval of Sclera Region. After a denoised binary map of the sclera region within an eye image is obtained, it is necessary to determine whether the two parts of the sclera on the two sides of the iris are separately visible, only one of them is detected, or both parts are detected as a single component.

There can be three exhaustive cases in the binary image found as sclera: (a) the two sides of the sclera are connected and found as a single connected component, (b) the two sclera regions are found as two different connected components, and (c) only one side of the sclera is detected due to the pose of the eye in the image. If the number of connected components is two, the image is classified as the aforementioned Case b (as shown in Figures 3(a), 3(b), and 3(c)), and the two components are treated as two portions of the sclera. Else, if a single connected component is obtained, the ratio of the length and breadth of the best-fitted oriented bounding rectangle is checked. If the ratio is greater than 1.25, the component belongs to the aforementioned Case a; otherwise it belongs to Case c (shown in Figure 3(e)). For Case a, the region is subdivided into two components (by detecting the minimal cut that divides the joined sclera into two parts), as shown in Figure 3(d), and further processing is performed.
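The three-way case analysis above can be expressed as a small decision function; a sketch, with the 1.25 elongation threshold taken from the text and the function name my own:

```python
def classify_sclera_map(num_components, length_breadth_ratio=None):
    """Classify the denoised binary sclera map into the three cases.

    Case 'b': two connected components -> two sclera portions.
    Case 'a': one elongated component (ratio > 1.25) -> joined sclera,
              to be split by a minimal cut.
    Case 'c': one compact component -> only one side of the sclera visible.
    """
    if num_components == 2:
        return 'b'
    if num_components == 1:
        if length_breadth_ratio is None:
            raise ValueError("ratio of the fitted bounding rectangle required")
        return 'a' if length_breadth_ratio > 1.25 else 'c'
    raise ValueError("expected one or two components after denoising")
```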

4.1.3. Nodal Point Extraction from Sclera Region. Each sclera component is subjected to the following processing, through which three nodal points are detected from each sclera region, namely, (a) the center of the sclera, (b) the center of the concave region of the sclera, and (c) the eye corner. So, in the general case where both parts of the sclera are detected, six nodal points are detected. The method of nodal point extraction is described below.

(1) Finding Center of Sclera. The sclera component is subjected to a distance transform, where the value of each white pixel (indicating a pixel belonging to the sclera) is replaced by its minimum distance from any black pixel. The pixel which is farthest from all black pixels has the highest value after this transformation; that pixel is labeled as the center of the sclera.

(2) Finding Center of Concave Region of Sclera. The midpoints of every straight line joining any two border pixels of the detected sclera component are found, as shown in Figure 5. Midpoints lying on the component itself (shown by the red point between P1 and P2 in Figure 5) are neglected. Midpoints lying outside the component (shown by the yellow point between P3 and P4 in Figure 5) are taken into account. Due to the discrete computation of straight lines, the midpoints of many lines drawn in this way overlap on a single pixel. A separate matrix of the same size as the sclera image is introduced, with every value initialized to zero. For every valid midpoint, the value of the corresponding pixel in this matrix is incremented. Once this process is over, more than one connected component of nonzero values is obtained in the matrix, signifying concave regions. The largest connected component is retained while the others are removed. The pixel having the maximum value in the largest component is labeled as the center of the concave region.

Figure 3: Result of nodal point detection through sclera segmentation (five sample outputs from the UBIRISv2 database).

Figure 4: Cropped images from an iris image centering at the pupil center.

(3) Finding the Eye Corner. The distances of all pixels lying on the boundary of the sclera region from the sclera center (found as described above) are calculated. The boundary pixel which is farthest from the center of the sclera is labeled as the eye corner.

Extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained by processing them through the aforementioned nodal point extraction technique. This information can be useful in localization of the periocular region.
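Nodal points (1) and (3) can be sketched with a brute-force distance transform (a library routine such as `scipy.ndimage.distance_transform_edt` would normally replace the inner loop); the function names and the toy mask below are illustrative only:

```python
import numpy as np

def sclera_center(mask):
    """Center of sclera: the white pixel farthest from any black pixel
    (brute-force distance transform, adequate for small masks)."""
    ys, xs = np.nonzero(mask)
    bys, bxs = np.nonzero(~mask)
    black = np.stack([bys, bxs], axis=1).astype(np.float64)
    best, center = -1.0, None
    for y, x in zip(ys, xs):
        d = np.sqrt(((black - (y, x)) ** 2).sum(axis=1)).min()
        if d > best:
            best, center = d, (int(y), int(x))
    return center

def eye_corner(mask, center):
    """Boundary pixel of the sclera farthest from the sclera center."""
    pad = np.pad(mask, 1)
    best, corner = -1.0, None
    for y, x in zip(*np.nonzero(mask)):
        # a white pixel touching the background is a boundary pixel
        if pad[y:y + 3, x:x + 3].sum() < 9:
            d = np.hypot(y - center[0], x - center[1])
            if d > best:
                best, corner = d, (int(y), int(x))
    return corner
```

On a square sclera-like blob, the center lands in the middle and the corner at the blob's extremity, mirroring the description above.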

4.2. Through Demand of Accuracy of Biometric System. Beginning with the center of the eye (the pupil center), a bounding rectangular box is taken which encloses only the iris. Figure 4 shows how the eye image changes when it is cropped about the pupil center and the bounding size is gradually increased. The corresponding accuracy of every cropped image is tested. In subsequent steps, the coverage of this bounding box is increased by a width of 3% of the diameter of the iris, and the change in accuracy is observed. After a certain number of iterations of this procedure, the bounding box reaches a portion of the periocular region where there is no more change in intensity; hence the region is of low entropy. No more local features can be extracted from this region even if the bounding box is enlarged. In such a scenario, the saturation accuracy is achieved, and on the basis of the saturation accuracy, the corresponding minimum bounding box is considered as the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 5: Method of formation of the concave region of a binarized sclera component (midpoints of chords P1P2 and P3P4).

Figure 6: Different ratios of portions of the face from human anthropometry.

Figure 7: Change of accuracy of periocular recognition with change in size of the periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each); the plot separates the eye region contributing to recognition from the region not contributing to recognition.

The exact method of obtaining the dynamic boundary is as follows:

(1) For i = 0 to 100, follow steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris, with height h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.
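The five steps can be sketched as below; `evaluate_accuracy` is a hypothetical caller-supplied function standing in for the full enrolment-and-matching run on the database, and the early-stopping rule is a simplification of "observe the change in accuracy":

```python
def crop_box(iris_center, iris_diameter, i):
    """Bounding box for iteration i: width w = (100 + 3*i)% of the iris
    diameter, height h = 73% of w, centered at the iris center."""
    w = (100 + 3 * i) / 100.0 * iris_diameter
    h = 0.73 * w
    cy, cx = iris_center
    return (cx - w / 2, cy - h / 2, w, h)   # (left, top, width, height)

def find_saturation_width(evaluate_accuracy, iris_center, iris_diameter,
                          eps=1e-3):
    """Grow the box (steps 1-5) until accuracy stops improving, and
    return the saturation width as a percentage of the iris diameter."""
    prev = -1.0
    for i in range(101):                      # i = 0 .. 100
        acc = evaluate_accuracy(crop_box(iris_center, iris_diameter, i))
        if i > 0 and acc - prev < eps:
            return 100 + 3 * (i - 1)
        prev = acc
    return 400
```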

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box. Increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Pattern (LBP) [26] along with Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors for the eye images. First, LBP is applied, and the resulting image is subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, yielding 85.43% and 78.01% accuracy, respectively. This establishes that a subset of a large database can be employed to find the optimal template size, and the result can be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest wide rectangle attaining the saturation accuracy can be used as the localization boundary of the periocular region. It is also observed that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: to make the recognition process fast.

Figure 8: Change of accuracy of periocular recognition with change in size of the periocular template, tested on the full UBIRISv2 and FERET datasets.

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.
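The feature pipeline applies LBP first and then extracts SIFT keypoints from the LBP image. A minimal sketch of the LBP stage alone follows (the SIFT stage would come from a library such as OpenCV); the 3 × 3, 8-neighbour formulation is the basic operator of [26]:

```python
import numpy as np

def lbp8(gray):
    """Basic 3x3 Local Binary Pattern: each interior pixel is encoded by
    comparing its 8 neighbours against it (neighbour >= center -> bit 1)."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    # 8 neighbours, clockwise from top-left
    shifts = [g[:-2, :-2], g[:-2, 1:-1], g[:-2, 2:],
              g[1:-1, 2:], g[2:, 2:], g[2:, 1:-1],
              g[2:, :-2], g[1:-1, :-2]]
    code = np.zeros_like(c)
    for bit, n in enumerate(shifts):
        code |= ((n >= c).astype(np.int32) << bit)
    return code
```

On a flat (constant-intensity) region every neighbour equals the center, so all eight bits are set and the code is 255; textured regions produce a spread of codes from which SIFT can then pick keypoints.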

To validate this experiment, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, confirm the experimental objective: there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The distributions of imposter and genuine scores are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of the different sections of the periocular region towards recognition [17]. This information can be used to detect the single section of the human eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and is not used for recognition. Hence, a pre-decision on the quality of a live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query arrives: the human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1:1 matching time with change in size of the periocular template, tested on full UBIRISv2 and FERET datasets.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification (from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence, there is no need of a human expert for verifying the existence of important portions of the human eye in an acquired eye image.

Figure 12: Receiver Operating Characteristic (ROC) curve for different template sizes of the periocular region for UBIRISv2.

The challenge in incorporating this strategy in localization of the periocular region is the automatic detection of portions of the human eye such as the eyelid, eye corner, tear duct, lower eye fold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from FERET are performed to support the claim of optimality.

The anthropometry-based approach performs accurately, given proper skin detection and sclera detection in the eye region. The sample outputs, shown in Figure 3, are found to be proper when evaluated against ground truth.

The saturation-accuracy-based approach performs with an accuracy of more than 80% on the noisy, low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, Receiver Operating Characteristic (ROC) curves are computed when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the decision threshold varies. The curve is plotted using linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, the system performs better, with low FAR, when w = 300% than when w = 200% or 250%. Hence, the ROC curves reveal that the portion of the eye lying between 200% and 300% of the diameter of the iris is largely responsible for recognition and constitutes the feature-dense part of a periocular image. Furthermore, to obtain a 1:n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also computed when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15).

Figure 13: Receiver Operating Characteristic (ROC) curve for different template sizes of the periocular region for FERET.

The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard-deviation units, is defined as follows:

d′ = √2 · |μ_genuine − μ_imposter| / √(σ²_genuine + σ²_imposter), (8)

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 lists the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of d′ for w = 100% to 300%, and the insignificant change in d′ for w = 300% to 400%, also establishes the existence of a boundary between regions contributing and not contributing to recognition.
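Equation (8) is straightforward to compute from raw score lists; a sketch using population statistics (the paper does not state whether sample or population variance is used):

```python
import math

def d_prime(genuine, imposter):
    """Decidability index d' per equation (8): sqrt(2) * |mu_g - mu_i|
    divided by sqrt(var_g + var_i)."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    num = math.sqrt(2.0) * abs(mean(genuine) - mean(imposter))
    return num / math.sqrt(var(genuine) + var(imposter))
```

Larger d′ means better-separated genuine and imposter score distributions, which is why the saturation of d′ beyond w = 300% signals the boundary of the useful periocular region.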

Human expert judgement was studied by Hollingsworth et al. [17], and the results are used here towards optimal periocular localization. Human subjects were asked which part of the eye they feel to be the most important for recognition. Most of the subjects stated that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

Table 5: Details of the publicly available testing databases.

Database     Developer                                           Version   Images   Subjects   Resolution    Color model
UBIRIS       Soft Computing and Image Analysis (SOCIA) Group,    v1 [30]   1877     241        800 × 600     RGB
             Department of Computer Science, University of       v2 [28]   11102    261        400 × 300     sRGB
             Beira Interior, Portugal
FERET [29]   National Institute of Standards and Technology      v4        14126    1191       768 × 512,    RGB
             (NIST), Gaithersburg, Maryland                                                    384 × 256,
                                                                                               192 × 128

Table 6: Change of the d′ index with change of cropping of the periocular region.

Width of periocular region (w)     100%   150%   200%   250%   300%   350%   400%
d′ index (UBIRISv2 dataset)        1.23   1.60   2.05   2.34   2.61   2.72   2.85
d′ index (FERET dataset)           1.19   1.55   2.01   2.29   2.53   2.66   2.69

Figure 14: Cumulative Match Characteristic (CMC) curve for different template sizes of the periocular region for UBIRISv2.

Subdivision approach needs manual supervision in theprocess of proper labeling of the different portions of humaneye Once the enrolled templates are labeled by the expert anoptimal part of the template can be selected for recognitionThe method is tested on FERET database and yielded properlocalization of periocular region

6 Conclusions

Recent research signifies why recognition through visualspectrum periocular image has gained so much importanceand how the present approaches work While developingrecognition system for a large database it is a crucial factorto optimize the template size Existence of any redundantregion in template will increase the matching time but willnot contribute to increase the accuracy of matching Hence

0 10 20 30 40 50 60 70 80065

07

075

08

085

09

095

1

Rank

Prob

abili

ty o

f ide

ntifi

catio

n

CMC curve for w = 300

CMC curve for w = 250

CMC curve for w = 200

Figure 15 Cumulative Match Characteristic (CMC) curve fordifferent template sizes of periocular region for FERET

removal of redundant region of the template should beaccomplished before the matching procedure As recognitiontime of identification is dependent on database size n hencea decrease of 1 1 matching time of t will actually decreasent matching time for identification in total As n is large (inthe range of 109 practical cases) nt is a significant amount oftime especially when concurrent matching is implementedin distributed biometric systems The paper prescribes fourmetrics for the optimization of visual spectrum periocularimage and experimentally establishes their relevance in termsof satisfying expected recognition accuracy These methodscan be used to localize the periocular region dynamically sothat an optimized region can be selectedwhich is best suitablefor recognition in terms of two contradictory objectives(a) minimal template size and (b) maximal recognitionaccuracy

BioMed Research International 13

Abbreviations

NIR Near infraredVS Visual spectrumLBP Local Binary PatternSIFT Scale Invariant Feature TransformROC Receiver Operating CharacteristicCMC Cumulative Match CharacteristicFTA Failure to AcquireFRR False rejection rateFAR False acceptance rate

References

[1] A Sohail and P Bhattacharya ldquoDetection of facial feature pointsusing anthropometric face modelrdquo Signal Processing for ImageEnhancement and Multimedia Processing vol 31 pp 189ndash2002008

[2] U Park R R Jillela A Ross and A K Jain ldquoPeriocularbiometrics in the visible spectrumrdquo IEEE Transactions onInformation Forensics and Security vol 6 no 1 pp 96ndash106 2011

[3] T Camus and R Wildes ldquoReliable and fast eye finding in close-up imagesrdquo in Proceedings of the 16th International Conferenceon Pattern Recognition vol 1 pp 389ndash394 2002

[4] H Sung J Lim J-H Park and Y Lee ldquoIris recognition usingcollarette boundary localizationrdquo in Proceedings of the 17thInternational Conference on Pattern Recognition (ICPR rsquo04) vol4 pp 857ndash860 August 2004

[5] B Bonney R Ives D Etter and Y Du ldquoIRIS pattern extractionusing bit planes and standard deviationsrdquo in Proceedings of the38th Asilomar Conference on Signals Systems and Computersvol 1 pp 582ndash586 November 2004

[6] X Liu K W Bowyer and P J Flynn ldquoExperiments withan improved iris segmentation algorithmrdquo in Proceedings ofthe 4th IEEE Workshop on Automatic Identification AdvancedTechnologies (AUTO ID rsquo05) pp 118ndash123 October 2005

[7] H Proenca and L A Alexandre ldquoIris segmentation methodol-ogy for non-cooperative recognitionrdquo IEE Proceedings VisionImage and Signal Processing vol 153 no 2 pp 199ndash205 2006

[8] S J Pundlik D L Woodard and S T Birchfield ldquoNon-idealiris segmentation using graph cutsrdquo in Proceedings of the IEEEComputer Society Conference on Computer Vision and PatternRecognition Workshops (CVPR rsquo08) pp 1ndash6 June 2008

[9] Z He T Tan Z Sun and X Qiu ldquoToward accurate and fast irissegmentation for iris biometricsrdquo IEEE Transactions on PatternAnalysis and Machine Intelligence vol 31 no 9 pp 1670ndash16842009

[10] J Liu X Fu and H Wang ldquoIris image segmentation basedon K-means clusterrdquo in Proceedings of the IEEE InternationalConference on Intelligent Computing and Intelligent Systems(ICIS rsquo10) vol 3 pp 194ndash198 October 2010

[11] F Tan Z Li and X Zhu ldquoIris localization algorithm based ongray distribution featuresrdquo in Proceedings of the 1st IEEE Inter-national Conference on Progress in Informatics and Computing(PIC rsquo10) vol 2 pp 719ndash722 December 2010

[12] S Bakshi H Mehrotra and B Majhi ldquoReal-time iris seg-mentation based on image morphologyrdquo in Proceedings of theInternational Conference on Communication Computing andSecurity (ICCCS rsquo11) pp 335ndash338 February 2011

[13] R Abiantun and M Savvides ldquoTear-duct detector for identify-ing left versus right iris imagesrdquo in Proceedings of the 37th IEEE

Applied Imagery Pattern Recognition Workshop (AIPR rsquo08) pp1ndash4 October 2008

[14] S Bhat and M Savvides ldquoEvaluating active shape models foreye-shape classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Acoustics Speech and Signal Processing(ICASSP rsquo08) pp 5228ndash5231 April 2008

[15] J Merkow B Jou and M Savvides ldquoAn exploration of genderidentification using only the periocular regionrdquo in Proceedingsof the 4th IEEE International Conference on Biometrics TheoryApplications and Systems (BTAS rsquo10) September 2010

[16] J R Lyle P E Miller S J Pundlik and D L Woodard ldquoSoftbiometric classification using periocular region featuresrdquo inProceedings of the 4th IEEE International Conference on Bio-metricsTheory Applications and Systems (BTAS rsquo10) September2010

[17] K Hollingsworth K W Bowyer and P J Flynn ldquoIdentifyinguseful features for recognition in near-infrared periocularimagesrdquo in Proceedings of the 4th IEEE International Conferenceon Biometrics Theory Applications and Systems (BTAS rsquo10)September 2010

[18] D L Woodard S Pundlik P Miller R Jillela and A RossldquoOn the fusion of periocular and iris biometrics in non-idealimageryrdquo in Proceedings of the 20th International Conference onPattern Recognition (ICPR rsquo10) pp 201ndash204 August 2010

[19] P E Miller J R Lyle S J Pundlik and D L WoodardldquoPerformance evaluation of local appearance based periocularrecognitionrdquo in Proceedings of the 4th IEEE International Con-ference on Biometrics Theory Applications and Systems (BTASrsquo10) September 2010

[20] P E Miller A W Rawls S J Pundlik and D L WoodardldquoPersonal identification using periocular skin texturerdquo in Pro-ceedings of the 25th Annual ACM Symposium on AppliedComputing (SAC rsquo10) pp 1496ndash1500 March 2010

[21] J Adams D L Woodard G Dozier P Miller K Bryant and GGlenn ldquoGenetic-based type II feature extraction for periocularbiometric recognition less is morerdquo in Proceedings of the 20thInternational Conference on Pattern Recognition (ICPR rsquo10) pp205ndash208 August 2010

[22] D L Woodard S J Pundlik P E Miller and J R LyleldquoAppearance-based periocular features in the context of faceand non-ideal iris recognitionrdquo Signal Image andVideo Process-ing vol 5 no 4 pp 443ndash455 2011

[23] DMalcik andMDrahansky ldquoAnatomy of biometric passportsrdquoJournal of Biomedicine and Biotechnology vol 2012 Article ID490362 8 pages 2012

[24] V Ramanathan and H Wechsler ldquoRobust human authentica-tion using appearance and holistic anthropometric featuresrdquoPattern Recognition Letters vol 31 no 15 pp 2425ndash2435 2010

[25] Y Chen M Adjouadi C Han et al ldquoA highly accurateand computationally efficient approach for unconstrained irissegmentationrdquo Image and Vision Computing vol 28 no 2 pp261ndash269 2010

[26] T Ojala M Pietikainen and D Harwood ldquoA comparativestudy of texture measures with classification based on featuredistributionsrdquo Pattern Recognition vol 29 no 1 pp 51ndash59 1996

[27] D G Lowe ldquoDistinctive image features from scale-invariantkeypointsrdquo International Journal of Computer Vision vol 60 no2 pp 91ndash110 2004

[28] H Proenca S Filipe R Santos J Oliveira and L A AlexandreldquoThe UBIRISv2 a database of visible wavelength iris imagescaptured on-the-move and at-a-distancerdquo IEEE Transactions on

14 BioMed Research International

Pattern Analysis and Machine Intelligence vol 32 no 8 pp1529ndash1535 2010

[29] P Jonathon Phillips H Moon S A Rizvi and P J RaussldquoTheFERET evaluationmethodology for face-recognition algo-rithmsrdquo IEEE Transactions on Pattern Analysis and MachineIntelligence vol 22 no 10 pp 1090ndash1104 2000

[30] H Proenca and L A Alexandre ldquoUBIRIS a noisy iris imagedatabaserdquo in Proceedings of the 13th International Conferenceon Image Analysis and Processing vol 3617 of Lecture Notes inComputer Science pp 970ndash977 Springer Cagliari Italy 2005

[31] A K Jain P Flynn and A A Ross Handbook of BiometricsSpringer New York NY USA 2008



Figure 3: Result of nodal point detection through sclera segmentation (five sample outputs, (a)–(e), from the UBIRISv2 database).

Figure 4: Cropped images from an iris image, centered at the pupil center (four crops, (a)–(d), of increasing size).

in the largest component is labeled as the center of the concave region.

(3) Finding the Eye Corner. The distances from the sclera center (found as described above) to all pixels lying on the boundary of the sclera region are calculated. The boundary pixel which is farthest from the center of the sclera is labeled as the eye corner.

The result of extracting these nodal points from the eye image helps in finding the tilt of the eye along with the position of the iris in the eye. Figure 3 depicts five sample images from the UBIRISv2 dataset and the outputs obtained by processing them through the aforementioned nodal point extraction technique. This information can be useful in localization of the periocular region.

4.2. Through Demand of Accuracy of Biometric System. Beginning with the center of the eye (pupil center), a bounding rectangular box is taken which initially encloses only the iris. Figure 4 shows how the eye image changes when it is cropped about the pupil center and the bounding size is gradually increased; the corresponding accuracy of every cropped image is tested. In subsequent steps the coverage of this bounding box is increased in increments of 3% of the diameter of the iris, and the change in accuracy is observed. After certain iterations of this procedure the bounding box reaches a portion of the periocular region where there is no more change in intensity; hence the region is low


Figure 5: Method of formation of concave region of a binarized sclera component (marked boundary points P1–P4).

Figure 6: Different ratios of portions of the face from human anthropometry (proportions expressed as fractions of the face dimensions a and b, e.g., 0.42a, 0.67a, 0.13b).

entropic. Hence no more local features can be extracted from this region even if the bounding box is increased. In such a scenario the saturation accuracy is achieved, and on the basis of the saturation accuracy the corresponding minimum bounding box is considered as the desired periocular region. As the demand of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects from each; axes: recognition accuracy (%) versus width of periocular region w). The plot separates the eye region contributing to recognition from the region not contributing.

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, centering at the iris center, crop a bounding box whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.
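The five steps above can be sketched directly. This is a hedged outline, not the authors' code: `locate_iris` and `accuracy_of` are hypothetical placeholders for the iris detector and the recognition-accuracy evaluation that the paper performs with its LBP + SIFT pipeline.

```python
def template_box(iris_center, iris_diameter, i):
    """Bounding box for iteration i of the sweep (steps 1-3).

    Width w runs from 100% to 400% of the iris diameter in 3% steps;
    height is 73% of w, as prescribed in the text.
    """
    w = (100 + 3 * i) / 100.0 * iris_diameter
    h = 0.73 * w
    cx, cy = iris_center
    # (left, top, right, bottom) of the crop, centered at the iris center.
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def sweep(images, locate_iris, accuracy_of):
    """Steps 1-5: grow the box and record accuracy at each width."""
    curve = []
    for i in range(101):                          # step 1: i = 0 .. 100
        crops = []
        for img in images:                        # step 2: approximate iris location
            center, diameter = locate_iris(img)
            crops.append(template_box(center, diameter, i))   # step 3
        curve.append(accuracy_of(crops))          # step 4
    return curve                                  # step 5: accuracy vs. w
```

The returned curve is what Figure 7 plots; the saturation point of this curve fixes the optimal template width.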

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box; increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Pattern (LBP) [26] along with Scale Invariant Feature Transform (SIFT) [27] is employed as the feature extractor for the eye images. First LBP is applied, and the resulting image is subjected to local feature extraction through SIFT. In the process a maximum accuracy of 85.64% is achieved while testing with randomly chosen 50 eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for randomly chosen 50 eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the diameter of the iris, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample more strongly, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This concludes that a subset of a large database can be employed to find the optimal template size, and the result found can be used on the whole dataset for cropping of images. So, to minimize template size without compromising accuracy, the smallest wide rectangle with saturation accuracy can be used as the localization boundary of the periocular region. It is also observed


Figure 8: Change of accuracy of periocular recognition with change in size of periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT; axes: recognition accuracy (%) versus width of periocular region w).

Figure 9: Distribution of imposter and genuine matching scores (×10⁵, logarithmic scale), tested on the full UBIRISv2 dataset applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

that the region beyond 300% of the diameter of the iris, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason to remove the redundant eye region and make the recognition process fast.
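The LBP stage of the LBP + SIFT pipeline used in this experiment can be sketched minimally as below. This is a basic 8-neighbour LBP in the spirit of [26], not the authors' exact implementation; the paper then applies SIFT [27] to the resulting code image (e.g., via an OpenCV SIFT detector), which is omitted here.

```python
import numpy as np

def lbp8(img: np.ndarray) -> np.ndarray:
    """Basic 3x3 Local Binary Pattern code image (interior pixels only).

    Each pixel is replaced by an 8-bit code: one bit per neighbour,
    set when the neighbour is >= the centre pixel.
    """
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]                              # centre pixels
    # 8 neighbours, clockwise from top-left; bit weights 1, 2, 4, ...
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]       # shifted neighbour view
        code |= (nb >= c).astype(np.int32) << bit
    return code
```

A pixel darker than all its neighbours gets code 255, a local bright peak gets code 0; texture statistics over these codes are what make LBP discriminative.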

To validate this experiment, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, as depicted in Figure 8, confirm the experimental objective that there is no significant feature in the periocular region beyond 300% of the diameter of the iris which can contribute to recognition. The score distributions of imposter and genuine matchings are shown in Figures 9 and 10.
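Given the two score distributions, the FAR/FRR operating points used for the ROC analysis in Section 5 follow from sweeping a decision threshold. A sketch, with illustrative names and the assumption that a higher score means a better match:

```python
def roc_points(genuine, imposter, thresholds):
    """(threshold, FAR, FRR) triples, accepting when score >= threshold."""
    points = []
    for t in thresholds:
        far = sum(s >= t for s in imposter) / len(imposter)  # imposters wrongly accepted
        frr = sum(s < t for s in genuine) / len(genuine)     # genuines wrongly rejected
        points.append((t, far, frr))
    return points
```

Plotting FRR against FAR over all thresholds yields curves like those in Figures 12 and 13.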

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of imposter and genuine matching scores (×10⁵, logarithmic scale), tested on the full FERET dataset applying LBP + SIFT on a periocular template having width 300% of the iris diameter.

Figure 11: Change of 1:1 matching time with change in size of periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT; axes: average 1:1 matching time (s) versus width of periocular region w, from 300% to 400%).

detect the single section of the human eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and not used for recognition. Hence a pre-decision on the quality of a live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query comes: the human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curve for different template sizes of the periocular region (w = 200%, 250%, 300%) for UBIRISv2 (FRR versus FAR, logarithmic scales).

(from the human expert knowledge already discussed), and an automated FTA detection system can be made. Hence there is no need of a human expert for verifying the existence of important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy in the localization of the periocular region is the automatic detection of portions of the human eye like the eyelid, eye corner, tear duct, lower eyefold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.

5. Experimental Results

There are four methods explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61621651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99764875 genuine and imposter matchings among images from the FERET database are carried out to support the claim of optimality.
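The matching counts above are simply the number of unordered image pairs; assuming all-pairs matching, they can be checked with Python's `math.comb`:

```python
from math import comb

# Every unordered pair of images is matched once
# (genuine and imposter matchings taken together).
ubiris_pairs = comb(11102, 2)   # UBIRISv2: 11102 images
feret_pairs = comb(14126, 2)    # FERET: 14126 images

print(ubiris_pairs, feret_pairs)  # 61621651 99764875
```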

The anthropometry-based approach performs accurately, given proper skin detection and sclera detection in the eye region. The sample outputs shown in Figure 3 are found to be proper when evaluated against ground truth.

The saturation-accuracy-based approach achieves an accuracy of more than 80% on the noisy and low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, the Receiver Operating Characteristic (ROC) curve is obtained when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) for change in the value of the threshold. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curve for different template sizes of the periocular region (w = 200%, 250%, 300%) for FERET (FRR versus FAR, logarithmic scales).

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is evident that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portion of the eye lying between 200% and 300% of the diameter of the iris is largely responsible for recognition and constitutes the feature-dense part of a periocular image. Furthermore, for a 1 : n matching analysis, Cumulative Match Characteristic (CMC) curves representing the probability of identification at various ranks are also obtained when the width of the periocular region is 200%, 250%, and 300% of the iris region, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

$$ d' = \frac{\sqrt{2}\,\left|\mu_{\mathrm{genuine}} - \mu_{\mathrm{imposter}}\right|}{\sqrt{\sigma^{2}_{\mathrm{genuine}} + \sigma^{2}_{\mathrm{imposter}}}} \qquad (8) $$

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of the values of d′ for w = 100% to 300%, and the insignificant change in the value of d′ for w = 300% to 400%, also establish the existence of a boundary between the regions contributing and not contributing to recognition.
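Equation (8) can be computed directly from the two score sets. A minimal sketch, assuming population statistics (the paper does not state whether sample or population standard deviation is used):

```python
from math import sqrt
from statistics import mean, pstdev

def d_prime(genuine, imposter):
    """Decidability index d' of equation (8): separation of the genuine and
    imposter score means, in pooled standard-deviation units."""
    mu_g, mu_i = mean(genuine), mean(imposter)
    var_g = pstdev(genuine) ** 2       # population variance of genuine scores
    var_i = pstdev(imposter) ** 2      # population variance of imposter scores
    return sqrt(2) * abs(mu_g - mu_i) / sqrt(var_g + var_i)
```

Larger d′ means the two score distributions overlap less, which is why Table 6 reads monotonically rising d′ as the template widens toward its feature-dense boundary.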

Human expert judging is experimented with by Hollingsworth et al. [17], and the results are used towards the direction of optimal periocular localization. Human subjects are asked which part of the eye they feel to be the most important for recognition. Most of the subjects voted that blood vessels are the most important feature to recognize an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of the publicly available testing databases.

Database | Developer | Version | Number of images | Number of subjects | Resolution | Color model
UBIRIS | Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal | v1 [30] | 1877 | 241 | 800 × 600 | RGB
UBIRIS | (same) | v2 [28] | 11102 | 261 | 400 × 300 | sRGB
FERET [29] | National Institute of Standards and Technology (NIST), Gaithersburg, Maryland | v4 | 14126 | 1191 | 768 × 512, 384 × 256, 192 × 128 | RGB

Table 6: Change of d′ index with change of cropping of the periocular region.

Width of periocular region (w) | 100% | 150% | 200% | 250% | 300% | 350% | 400%
d′ index (UBIRISv2 dataset)    | 1.23 | 1.60 | 2.05 | 2.34 | 2.61 | 2.72 | 2.85
d′ index (FERET dataset)       | 1.19 | 1.55 | 2.01 | 2.29 | 2.53 | 2.66 | 2.69

Figure 14: Cumulative Match Characteristic (CMC) curve for different template sizes of the periocular region (w = 200%, 250%, 300%) for UBIRISv2 (probability of identification versus rank).

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size. Any redundant region in the template will increase the matching time but will not contribute to the accuracy of matching. Hence

Figure 15: Cumulative Match Characteristic (CMC) curve for different template sizes of the periocular region (w = 200%, 250%, 300%) for FERET (probability of identification versus rank).

removal of the redundant region of the template should be accomplished before the matching procedure. As the recognition time for identification is dependent on the database size n, a decrease of t in the 1:1 matching time will decrease the total identification time by nt. As n is large (in the range of 10⁹ in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically so that an optimized region can be selected which is best suitable for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.

[31] A K Jain P Flynn and A A Ross Handbook of BiometricsSpringer New York NY USA 2008

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Anatomy Research International

PeptidesInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

International Journal of

Volume 2014

Zoology

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Molecular Biology International

GenomicsInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioinformaticsAdvances in

Marine BiologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Signal TransductionJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

Evolutionary BiologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Biochemistry Research International

ArchaeaHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Genetics Research International

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Advances in

Virolog y

Hindawi Publishing Corporationhttpwwwhindawicom

Nucleic AcidsJournal of

Volume 2014

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Enzyme Research

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Microbiology

Page 9: Research Article Optimized Periocular Template Selection ...downloads.hindawi.com/journals/bmri/2013/481431.pdf · uniquely. Computer aided identi cation of a person through face


Figure 5: Method of formation of the concave region of a binarized sclera component (points P1 through P4).

Figure 6: Different ratios of portions of the face from human anthropometry (dimensions annotated as fractions of face width a and face height b, e.g. 0.42a, 0.67a, 0.19a, 0.7a and 0.05b through 0.57b).

entropic. Hence, no more local features can be extracted from this region even if the bounding box is enlarged. In such a scenario the saturation accuracy is achieved, and on the basis of this saturation accuracy the corresponding minimum bounding box is considered the desired periocular region. As the demands of different biometric systems may vary, the bounding box corresponding to a certain predefined accuracy can also be segmented as the periocular region. Similar results have also been observed for the FERET database.

Figure 7: Change of accuracy of periocular recognition with change in size of periocular template, tested on subsets of the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each; x-axis: width of periocular region w, y-axis: recognition accuracy (%); annotations mark the eye region contributing and not contributing to recognition).

The exact method of obtaining the dynamic boundary is as follows.

(1) For i = 0 to 100, follow steps 2 to 4.
(2) For each image in the database, find the approximate iris location in the eye image.
(3) For each image in the database, crop a bounding box centered at the iris center, whose width w = (100 + 3i)% of the diameter of the iris and whose height h = 73% of w.
(4) Find the accuracy of the system with this image size.
(5) Observe the change in accuracy with w.
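The cropping rule of step 3 can be sketched in a few lines; the iris center and diameter are assumed to come from a separate (hypothetical) iris localization step, and the accuracy evaluation of step 4 is not shown.

```python
# Sketch of the template-width sweep (steps 1-5 above).
def periocular_box(iris_center, iris_diameter, i):
    """Bounding box for step 3: width w = (100 + 3*i)% of the iris
    diameter, height h = 73% of w, centered at the iris center.
    Returns (top, left, height, width) in pixels."""
    w = round((100 + 3 * i) / 100.0 * iris_diameter)
    h = round(0.73 * w)
    cy, cx = iris_center
    return (cy - h // 2, cx - w // 2, h, w)

# Sweep i = 0..100 (w = 100%..400% of the iris diameter) and record
# template widths; a real run would crop each box and measure the
# LBP + SIFT recognition accuracy instead.
widths = [periocular_box((200, 300), 100, i)[3] for i in range(0, 101)]
print(widths[0], widths[-1])  # 100-pixel up to 400-pixel wide templates
```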

Figure 7 illustrates a plot of accuracy against w, which shows that the accuracy of the biometric system saturates after a particular size of the bounding box; increasing the box further does not increase the accuracy. To carry out this experiment, Local Binary Patterns (LBP) [26] along with the Scale Invariant Feature Transform (SIFT) [27] are employed as feature extractors on the eye images. First, LBP is applied, and the resulting image is subjected to local feature extraction through SIFT. In the process, a maximum accuracy of 85.64% is achieved while testing with 50 randomly chosen eye images of 12 subjects from the UBIRISv2 dataset [28]. When the same experiment is executed for 50 randomly chosen eye images of 12 subjects from the FERET dataset [29], a maximum accuracy of 78.29% is achieved. These saturation accuracy values are obtained when a rectangular boundary of width 300% of the iris diameter, or a wider rectangular eye area, is taken into consideration. To validate the experiment run on the sample more strongly, the same experiment was conducted on the complete UBIRISv2 and FERET datasets, which yielded 85.43% and 78.01% accuracy, respectively. This shows that a subset of a large database can be employed to find the optimal template size, and the result can then be used for cropping images of the whole dataset. So, to minimize template size without compromising accuracy, the smallest wide rectangle achieving the saturation accuracy can be used as the localization boundary of the periocular region. It is also observed
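The LBP stage of this pipeline can be illustrated with a minimal 3 × 3 Local Binary Pattern operator in the spirit of Ojala et al. [26]; this is a simplified sketch, not the authors' exact implementation, and the subsequent SIFT stage is omitted.

```python
# Minimal LBP sketch: each interior pixel is replaced by an 8-bit
# code comparing its 8 neighbors against it. SIFT keypoints would
# then be extracted from this LBP image.
def lbp_image(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    # 8 neighbors in clockwise order starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= img[y][x]:
                    code |= 1 << bit
            out[y][x] = code
    return out

img = [[10, 10, 10],
       [10, 20, 10],
       [10, 10, 10]]
print(lbp_image(img)[1][1])  # center brighter than all neighbors -> code 0
```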


Figure 8: Change of accuracy of periocular recognition with change in size of periocular template, tested on the full UBIRISv2 and FERET datasets (LBP + SIFT; x-axis: width of periocular region w, y-axis: recognition accuracy (%); annotations mark the eye region contributing and not contributing to recognition).

Figure 9: Distribution of scores for imposter and genuine matching, tested on the full UBIRISv2 dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter (x-axis: matching score, y-axis: score distribution ×10^5, logarithmic scale).

that the region beyond 300% of the iris diameter, though it does not participate in recognition, increases the matching time, as shown in Figure 11. This is another reason for removing the redundant eye region: it makes the recognition process fast.

To validate this observation, the same experiment has been carried out once again on the full UBIRISv2 and FERET databases. The obtained accuracy values, depicted in Figure 8, confirm the experimental objective: there is no significant feature in the periocular region beyond 300% of the iris diameter that can contribute to recognition. The distributions of imposter and genuine scores are shown in Figures 9 and 10.

4.3. Human Expert Judgement on Importance of Portions of Eye. Human expertise has been utilized to decide a sorted order of importance of the different sections of the periocular region towards recognition [17]. This information can be used to

Figure 10: Distribution of scores for imposter and genuine matching, tested on the full FERET dataset, applying LBP + SIFT on a periocular template having width 300% of the iris diameter (x-axis: matching score, y-axis: score distribution ×10^5, logarithmic scale).

Figure 11: Change of 1:1 matching time with change in size of periocular template, tested on the UBIRISv2 and FERET datasets (LBP + SIFT, 50 samples of 12 subjects each; x-axis: width of periocular region w from 300 to 400, y-axis: average 1:1 matching time (s)).

detect the section of the human eye that is most important for recognition. If that section is not found in the eye region, the captured image is marked as Failure to Acquire (FTA) and is not used for recognition. Hence, a predecision on the quality of the live query template can increase the accuracy of the system by reducing false rejections. However, this technique is human-supervised, both while enrolling an image in the database and when a live query arrives: the human expert has to verify whether the most important portion of the eye is visible in the image and has to guide the biometric system accordingly.

4.4. Through Subdivision Approach and Automation of Human Expertise. During the enrolment phase of a biometric system, a human expert needs to verify manually whether the captured image includes the expected region of interest. Through automated labeling of the different sections of an eye, it can be stated which portion of the eye is necessary for identification


Figure 12: Receiver Operating Characteristic (ROC) curve for different template sizes of periocular region for UBIRISv2 (curves for w = 200%, 250%, and 300%; x-axis: false acceptance rate (FAR), y-axis: false rejection rate (FRR), logarithmic scale).

(from the human expert knowledge already discussed), and an automated FTA detection system can be built. Hence there is no need for a human expert to verify the existence of the important portions of the human eye in an acquired eye image.

The challenge in incorporating this strategy into localization of the periocular region is the automatic detection of portions of the human eye, such as the eyelid, eye corner, tear duct, lower eye fold, and so forth. Subdivision detection in the eye region can be attempted through color detection and analysis and by applying different transformations.
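As one deliberately simplified sketch of the color-analysis idea, the check below flags an eye crop as FTA when too few sclera-like (bright, low color spread) pixels are present. The thresholds and helper names are illustrative assumptions, not values from the paper.

```python
# Hypothetical color-based FTA pre-check on an RGB eye crop.
def is_sclera_like(r, g, b, min_bright=170, max_spread=30):
    # Sclera pixels are bright and nearly gray-white: all channels
    # high and close to each other. Thresholds are illustrative.
    return min(r, g, b) >= min_bright and max(r, g, b) - min(r, g, b) <= max_spread

def fta_check(pixels, min_fraction=0.05):
    """pixels: iterable of (r, g, b) tuples; True means mark as FTA."""
    n = hits = 0
    for r, g, b in pixels:
        n += 1
        hits += is_sclera_like(r, g, b)
    return n == 0 or hits / n < min_fraction

dark_crop = [(40, 30, 30)] * 100                          # no visible sclera
good_crop = [(40, 30, 30)] * 90 + [(210, 205, 200)] * 10  # 10% sclera-like
print(fta_check(dark_crop), fta_check(good_crop))
```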

5. Experimental Results

Four methods have been explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from the FERET database are performed to support the claim of optimality.
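The pairing counts above follow directly from the number of unordered image pairs, C(n, 2); a quick check:

```python
# All genuine + imposter matchings among n images is C(n, 2).
from math import comb

print(comb(11102, 2))  # UBIRISv2: 61621651
print(comb(14126, 2))  # FERET: 99764875
```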

The anthropometry-based approach performs accurately, given proper skin detection and sclera detection in the eye region. Sample outputs are shown in Figure 3 and are found to be proper when evaluated against ground truth.

The saturation accuracy based approach achieves an accuracy of more than 80% with the noisy, low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, Receiver Operating Characteristic (ROC) curves are obtained when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris region, respectively. The ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the decision threshold changes. The curve is plotted using

Figure 13: Receiver Operating Characteristic (ROC) curve for different template sizes of periocular region for FERET (curves for w = 200%, 250%, and 300%; x-axis: false acceptance rate (FAR), y-axis: false rejection rate (FRR), logarithmic scale).

linear, logarithmic, or semilogarithmic scales. As plotted in Figures 12 and 13, it is evident that the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portion of the eye lying between 200% and 300% of the iris diameter is largely responsible for recognition and constitutes the feature-dense part of a periocular image. Furthermore, for a 1:n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also obtained when the width of the periocular region is 200%, 250%, and 300% of the iris diameter, respectively (shown in Figures 14 and 15). The d' index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

d' = \sqrt{2} \, |\mu_{genuine} - \mu_{imposter}| / \sqrt{\sigma^2_{genuine} + \sigma^2_{imposter}},  (8)

where \mu and \sigma are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d' index of recognition as the width of the periocular region is varied. The value of d' increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of d' for w = 100% to 300%, together with the insignificant change in d' for w = 300% to 400%, also establishes the existence of a boundary between the regions contributing and not contributing to recognition.
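A minimal sketch of this score analysis, covering the FAR/FRR sweep that underlies the ROC curves and the d' index of (8), on hypothetical similarity scores:

```python
# FAR/FRR at a threshold t, and the d' (decidability) index of Eq. (8).
from statistics import mean, pvariance
from math import sqrt

def far_frr(genuine, imposter, t):
    far = sum(s >= t for s in imposter) / len(imposter)  # false accepts
    frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
    return far, frr

def d_prime(genuine, imposter):
    num = sqrt(2) * abs(mean(genuine) - mean(imposter))
    return num / sqrt(pvariance(genuine) + pvariance(imposter))

gen = [0.9, 0.8, 0.85, 0.95]  # toy genuine similarity scores
imp = [0.2, 0.3, 0.25, 0.35]  # toy imposter similarity scores
print(far_frr(gen, imp, 0.55))  # a threshold separating both sets cleanly
print(round(d_prime(gen, imp), 2))
```

Sweeping the threshold t over the score range traces out the full ROC curve; well-separated score distributions, as here, give a large d'.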

Human expert judgement was investigated by Hollingsworth et al. [17], and the results are used here towards optimal periocular localization. Human subjects were asked which part of the eye they feel is the most important for recognition. Most of the subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of the publicly available testing databases.

UBIRIS (developer: Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal)
  v1 [30]: 1877 images, 241 subjects, resolution 800 × 600, color model RGB
  v2 [28]: 11102 images, 261 subjects, resolution 400 × 300, color model sRGB

FERET [29] (developer: National Institute of Standards and Technology (NIST), Gaithersburg, Maryland)
  v4: 14126 images, 1191 subjects, resolutions 768 × 512, 384 × 256, and 192 × 128, color model RGB

Table 6: Change of d' index with change of cropping of the periocular region.

Width of periocular region (w):   100%   150%   200%   250%   300%   350%   400%
d' index (UBIRISv2 dataset):      1.23   1.60   2.05   2.34   2.61   2.72   2.85
d' index (FERET dataset):         1.19   1.55   2.01   2.29   2.53   2.66   2.69
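The CMC curves of Figures 14 and 15 report the probability of identification at each rank; a minimal computation from hypothetical per-query retrieval ranks:

```python
# CMC(k) = fraction of queries whose true identity was retrieved at
# rank <= k, for k = 1..max_rank.
def cmc(ranks, max_rank):
    n = len(ranks)
    return [sum(r <= k for r in ranks) / n for k in range(1, max_rank + 1)]

ranks = [1, 1, 2, 3, 5]  # hypothetical retrieval ranks for 5 queries
print(cmc(ranks, 5))     # probability of identification at ranks 1..5
```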

Figure 14: Cumulative Match Characteristic (CMC) curve for different template sizes of periocular region for UBIRISv2 (curves for w = 200%, 250%, and 300%; x-axis: rank, y-axis: probability of identification).

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision for proper labeling of the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yielded proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size. Any redundant region in the template will increase the matching time but will not contribute to increasing the accuracy of matching. Hence,

Figure 15: Cumulative Match Characteristic (CMC) curve for different template sizes of periocular region for FERET (curves for w = 200%, 250%, and 300%; x-axis: rank, y-axis: probability of identification).

removal of the redundant region of the template should be accomplished before the matching procedure. As the identification time depends on the database size n, a decrease of t in the 1:1 matching time will decrease the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of visual spectrum periocular images and experimentally establishes their relevance in terms of satisfying the expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suitable for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "IRIS pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han, et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.


Page 10: Research Article Optimized Periocular Template Selection ...downloads.hindawi.com/journals/bmri/2013/481431.pdf · uniquely. Computer aided identi cation of a person through face

10 BioMed Research International

100 150 200 250 300 350 40010

20

30

40

50

60

70

80

90

Width of periocular region (w)

Reco

gniti

on ac

cura

cy (

)

LBP + SIFT on UBIRISv2 (complete dataset)LBP + SIFT on FERET (complete dataset)

Eye region contributing to recognition Eye region not contributing to recognition

Figure 8 Change of accuracy of periocular recognition with changein size of periocular template tested on full UBIRISv2 and FERETdatasets

0 10 20 30 40 50 60 70 80 9001

1

10

100

Matching score

Scor

e dist

ribut

ion

ImpostersGenuine

times105

Figure 9 Distribution of scores for imposter and genuine matchingtested on full UBIRISv2 dataset applying LBP + SIFT on perioculartemplate having width as 300 of the iris diameter

that the region beyond 300 of diameter of iris though doesnot participate in recognition increases the matching time asshown in Figure 11 This is also another reason of removingthe redundant eye region to make the recognition processfast

To validate this experiment the same experiment hasbeen carried out once again on full database of UBIRISv2 andFERETThe obtained accuracy values as depicted in Figure 8ensure the experimental objective that there is no significantfeature in periocular region beyond 300 of diameter of iriswhich can contribute to recognition The score distributionof imposter and genuine scores is shown in Figures 9 and 10

43 Human Expert Judgement on Importance of Portions ofEye Human expertise has been utilized to decide a sortedorder of importance of different sections of periocular regiontowards recognition [17] This information can be used to

0 10 20 30 40 50 6001

1

10

100

Matching score

Scor

e dist

ribut

ion

ImpostersGenuine

times105

Figure 10Distribution of scores for imposter and genuinematchingtested on full FERET dataset applying LBP + SIFT on perioculartemplate having width as 300 of the iris diameter

300 310 320 330 340 350 360 370 380 390 4000

00501

01502

02503

03504

04505

Width of periocular region (w)

Aver

age 1

1 m

atch

ing

time (

s)

LBP + SIFT on UBIRISv2 (50 samples of 12 subjects)LBP + SIFT on FERET (50 samples of 12 subjects)

Figure 11 Change of 1 1 matching time with change in size ofperiocular template tested on full UBIRISv2 and FERET datasets

detect only the most important section in human eye thatis most important towards recognition If that section is notfound in human eye region the captured image is markedas Failure to Acquire (FTA) and not used for recognitionHence a predecision on the quality of live query templatecan increase the accuracy of the system by reducing falserejections However this technique is human-supervisedwhile enrolling an image in the database and while a livequery comes The human expert has to verify whether themost important portion of eye is visible in the image and hasto guide the biometric system accordingly

44Through SubdivisionApproach andAutomation ofHumanExpertise During enrolment phase of a biometric systema human expert needs to verify manually whether thecaptured image includes expected region of interestThroughautomated labeling different sections of an eye it can bestated which portion of eye is necessary for identification

BioMed Research International 11

0102030405060708090

100

False acceptance rate (FAR)

False

reje

ctio

n ra

te (F

RR)

10minus2

10minus1

100

101

102

ROC curve for w = 300

ROC curve for w = 250

ROC curve for w = 200

Figure 12 Receiver Operating Characteristic (ROC) curve fordifferent template sizes of periocular region for UBIRISv2

(from human expert knowledge already discussed) and anautomated FTA detection system can be made Hence thereis no need of a human expert for verifying the existence ofimportant portions of human eye in an acquired eye image

The challenge in incorporating this strategy in local-ization of periocular region is the automatic detection ofportions of human eye like eyelid eye corner tear duct lower-eyefold and so forth An attempt to do subdivision detectionin eye region can be achieved through color detection andanalysis and applying different transformations

5 Experimental Results

There are four methods explained through which an optimal periocular template can be selected for biometric recognition. The first two methods, explained in Sections 4.1 and 4.2, are experimentally evaluated using the publicly available FERET and UBIRISv2 databases. A brief description of the two databases used for evaluation is given in Table 5. A total of C(11102, 2) = 61,621,651 genuine and imposter matchings among images from UBIRISv2 and C(14126, 2) = 99,764,875 genuine and imposter matchings among images from the FERET database are carried out to support the claim of optimality.
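These pair counts are simply binomial coefficients over the number of images in each database and can be verified directly:

```python
from math import comb

# Number of distinct image pairs (genuine + imposter matchings) is C(N, 2).
print(comb(11102, 2))  # UBIRISv2: 61621651
print(comb(14126, 2))  # FERET:    99764875
```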

The anthropometry based approach performs accurately when paired with proper skin detection and sclera detection in the eye region. Sample outputs are shown in Figure 3 and are found to be proper when evaluated against ground truth.

The saturation accuracy based approach achieves an accuracy of more than 80% on the noisy, low-resolution images of UBIRISv2 and FERET, which marks the efficiency of the proposed approach. To analyse the performance more deeply, Receiver Operating Characteristic (ROC) curves are computed when the width of the periocular region is 200%, 250%, and 300% of the diameter of the iris, respectively. An ROC curve depicts the dependence of the false rejection rate (FRR) on the false acceptance rate (FAR) as the decision threshold is varied. The curve can be plotted using linear, logarithmic, or semilogarithmic scales.

Figure 13: Receiver Operating Characteristic (ROC) curve for different template sizes of periocular region for FERET (FRR vs. FAR; curves for w = 200%, 250%, and 300%).
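How FAR and FRR trade off as the decision threshold sweeps can be sketched in a few lines. The score lists below are toy assumptions (higher score = better match), not values from the paper:

```python
# Toy genuine and imposter match scores (illustrative assumptions only).
genuine = [0.80, 0.85, 0.90, 0.70, 0.95]
imposter = [0.30, 0.40, 0.55, 0.20, 0.60, 0.45]

def roc_points(genuine, imposter, thresholds):
    """Trace (threshold, FAR, FRR) triples by sweeping the accept threshold."""
    points = []
    for t in thresholds:
        far = sum(s >= t for s in imposter) / len(imposter)  # imposters wrongly accepted
        frr = sum(s < t for s in genuine) / len(genuine)     # genuine users wrongly rejected
        points.append((t, far, frr))
    return points

for t, far, frr in roc_points(genuine, imposter, [i / 10 for i in range(11)]):
    print(f"threshold={t:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
```

Raising the threshold drives FAR toward 0 and FRR toward 1; the ROC curve is the locus of these (FAR, FRR) pairs.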

As plotted in Figures 12 and 13, the system performs better, with lower FAR, when w = 300% than when w = 200% or 250%. Hence the ROC curves reveal that the portions of the eye lying between 200% and 300% of the iris diameter are largely responsible for recognition and constitute the feature-dense part of a periocular image. Furthermore, to perform a 1:n matching analysis, Cumulative Match Characteristic (CMC) curves, representing the probability of identification at various ranks, are also computed when the width of the periocular region is 200%, 250%, and 300% of the iris diameter, respectively (shown in Figures 14 and 15). The d′ index [31], which measures the separation between the arithmetic means of the genuine and imposter probability distributions in standard deviation units, is defined as follows:

    d′ = √2 · |μ_genuine − μ_imposter| / √(σ²_genuine + σ²_imposter)    (8)
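Equation (8) can be computed directly from raw score lists. The sketch below uses only the standard library; the score values are illustrative assumptions (population variances are assumed, since the paper does not specify the estimator):

```python
import statistics

def d_prime(genuine, imposter):
    """Decidability index d' of Eq. (8): mean separation in pooled std-dev units."""
    mu_g, mu_i = statistics.mean(genuine), statistics.mean(imposter)
    var_g = statistics.pvariance(genuine)   # population variance (assumption)
    var_i = statistics.pvariance(imposter)
    return (2 ** 0.5) * abs(mu_g - mu_i) / ((var_g + var_i) ** 0.5)

# Toy score distributions: well-separated genuine vs. imposter scores
# yield a large d'.
genuine = [0.82, 0.88, 0.91, 0.78, 0.86]
imposter = [0.35, 0.42, 0.50, 0.30, 0.44]
print(round(d_prime(genuine, imposter), 2))
```

A larger d′ means the two score distributions overlap less, i.e. a threshold exists that separates genuine from imposter scores more cleanly.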

where μ and σ are the mean and standard deviation of the genuine and imposter scores. Table 6 shows the change of the d′ index of recognition when the width of the periocular region is varied. The value of d′ increases monotonically from 1.23 to 2.85 for the UBIRISv2 dataset and from 1.19 to 2.69 for the FERET dataset with incremental change in w. The incremental nature of d′ for w = 100% to 300%, together with the insignificant change in d′ for w = 300% to 400%, also establishes the existence of a boundary between regions contributing and not contributing to recognition.
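The 1:n CMC analysis mentioned above (the probability that the true identity appears within the top k ranked gallery matches) can be sketched as follows. The 3 × 3 score matrix is a toy assumption in which probe i's true match is gallery entry i and higher scores are better:

```python
# Toy probe-vs-gallery score matrix (rows = probes, columns = gallery
# identities; the true match for probe i is column i). Illustrative only.
scores = [
    [0.9, 0.4, 0.3],   # true match ranked 1st
    [0.5, 0.6, 0.7],   # true match ranked 2nd (0.7 beats 0.6)
    [0.2, 0.3, 0.8],   # true match ranked 1st
]

def cmc(scores):
    """Identification probability at ranks 1..n (Cumulative Match Characteristic)."""
    n = len(scores)
    # 1-based rank of the true (diagonal) score within each probe's row
    ranks = [1 + sum(s > row[i] for s in row) for i, row in enumerate(scores)]
    return [sum(r <= k for r in ranks) / n for k in range(1, n + 1)]

print(cmc(scores))
```

By construction the CMC values are non-decreasing in rank and reach 1.0 at rank n, which is the shape seen in Figures 14 and 15.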

Human expert judgment is experimented with by Hollingsworth et al. [17], and the results are used in the direction of optimal periocular localization. Human subjects are asked which part of the eye they feel is the most important for recognition. Most of the subjects voted that blood vessels are the most important feature for recognizing an individual from a VS eye image. This information is used to infer which subportions of the eye must belong to the optimal periocular region


Table 5: Details of the publicly available testing databases.

UBIRIS (Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal):
  v1 [30]: 1877 images, 241 subjects, 800 × 600 resolution, RGB color model
  v2 [28]: 11102 images, 261 subjects, 400 × 300 resolution, sRGB color model
FERET [29] (National Institute of Standards and Technology (NIST), Gaithersburg, Maryland):
  v4: 14126 images, 1191 subjects, resolutions 768 × 512, 384 × 256, and 192 × 128, RGB color model

Table 6: Change of d′ index with change of cropping of the periocular region.

Width of periocular region (w):   100%   150%   200%   250%   300%   350%   400%
d′ index (UBIRISv2 dataset):      1.23   1.60   2.05   2.34   2.61   2.72   2.85
d′ index (FERET dataset):         1.19   1.55   2.01   2.29   2.53   2.66   2.69

Figure 14: Cumulative Match Characteristic (CMC) curve for different template sizes of periocular region for UBIRISv2 (probability of identification vs. rank; curves for w = 200%, 250%, and 300%).

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yields proper localization of the periocular region.

6. Conclusions

Recent research shows why recognition through visual spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, it is crucial to optimize the template size. Any redundant region in the template will increase the matching time without contributing to the accuracy of matching. Hence

Figure 15: Cumulative Match Characteristic (CMC) curve for different template sizes of periocular region for FERET (probability of identification vs. rank; curves for w = 200%, 250%, and 300%).

removal of the redundant region of the template should be accomplished before the matching procedure. As the recognition time for identification depends on the database size n, a decrease of t in the 1:1 matching time actually decreases the total identification matching time by n·t. As n is large (in the range of 10^9 in practical cases), n·t is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of the visual spectrum periocular image and experimentally establishes their relevance in terms of satisfying the expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suited for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to Acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han, et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.


BioMed Research International 11

0102030405060708090

100

False acceptance rate (FAR)

False

reje

ctio

n ra

te (F

RR)

10minus2

10minus1

100

101

102

ROC curve for w = 300

ROC curve for w = 250

ROC curve for w = 200

Figure 12 Receiver Operating Characteristic (ROC) curve fordifferent template sizes of periocular region for UBIRISv2

(from human expert knowledge already discussed) and anautomated FTA detection system can be made Hence thereis no need of a human expert for verifying the existence ofimportant portions of human eye in an acquired eye image

The challenge in incorporating this strategy in local-ization of periocular region is the automatic detection ofportions of human eye like eyelid eye corner tear duct lower-eyefold and so forth An attempt to do subdivision detectionin eye region can be achieved through color detection andanalysis and applying different transformations

5 Experimental Results

There are four methods explained through which an optimalperiocular template can be selected for biometric recognitionThe first two methods explained in Sections 41 and 42 areexperimentally evaluated using publicly available FERET andUBIRISv2 databases A brief description of the two databasesused for evaluation are illustrated in Table 5 A total of(111022

) = 61621651 genuine and imposter matching amongimages from UBIRISv2 and (

141262

) = 99764875 genuine andimposter matching among images from FERET database areexperimented to claim the proposition of optimality

Anthropometry based approach performs accuratelyalong with proper skin detection and sclera detection in eyeregion The sample outputs are shown in Figure 3 which arefound to be proper when evaluated against ground truth

Saturation accuracy based approach performs with anaccuracy more than 80 with noisy and low-resolutionimages of UBIRISv2 and FERET which marks the efficiencyof the proposed approach To analyse the performance moredeeply Receiver Operating Characteristic (ROC) curve isexperimented out when the width of the periocular regionis 200 250 and 300 of the diameter of iris regionrespectively ROC curve depicts the dependence of falserejection rate (FRR) with false acceptance rate (FAR) forchange in the value of threshold The curve is plotted using

0 10 20 30 40 50 60 70 80 90 1000

102030405060708090

100

010203040506070

False acceptance rate (FAR)

False

reje

ctio

n ra

te (F

RR)

ROC curve for w = 300

ROC curve for w = 250

ROC curve for w = 200

10minus1

100

101

Figure 13 Receiver Operating Characteristic (ROC) curve fordifferent template sizes of periocular region for FERET

linear logarithmic or semilogarithmic scales As plotted inFigures 12 and 13 it is obvious to conclude that the systemperforms better with low FAR when 119908 = 300 than when119908 = 200 and 250 Hence the ROC curve reveals that theportions of eye lying between 200 and 300 of diameterof iris are very much responsible for the recognition andfeature-dense part of a periocular image Furthermore to havea 1 119899 matching analysis Cumulative Match Characteristic(CMC) curves representing the probability of identificationat various ranks are also experimented out when the widthof the periocular region is 200 250 and 300 of the irisregion respectively (shown in Figures 14 and 15)The119889

1015840 index[31]measures the separation between the arithmeticmeans ofthe genuine and imposter probability distribution in standarddeviation units is defined as follows

1198891015840=

radic210038161003816100381610038161003816120583genuine minus 120583imposter

10038161003816100381610038161003816

radic1205902

genuine + 1205902

imposter

(8)

where 120583 and 120590 are mean and standard deviation of genuineand imposter scores Table 6 yields the change of 119889

1015840 indexof recognition when the width of periocular region is variedThe value of 1198891015840 increases monotonically from 123 to 285 forUBIRISv2 dataset and from 119 to 269 for FERET datasetwith incremental change in 119908 An incremental nature in thevalues of 119889

1015840 for 119908 = 100 to 300 and an insignificant changein the value of 119889

1015840 for 119908 = 300 to 400 also establishes theexistence of a boundary between regions contributing andnotcontributing to recognition

Human expert judging is experimented byHollingsworthet al [17] and the results are used towards the direction ofoptimal periocular localization Human subjects are askedwhich part of eye they feel to be the most important forrecognition Most of the subjects voted that blood vessels arethe most important feature to recognize an individual fromVS eye image This information is used to infer which sub-portions of eye must belong to the optimal periocular region

12 BioMed Research International

Table 5 Detail of publicly available testing databases

Database Developer Version Number ofimages

Number ofsubjects Resolution Color model

UBIRIS

Soft Computing and ImageAnalysis (SOCIA) GroupDepartment of ComputerScience University of BeiraInterior Portugal

v1 [30]v2 [28]

187711102

241261

800 times 600

400 times 300

RGBsRGB

FERET [29]National Institute of Standardsand Technology (NIST)Gaithersburg Maryland

v4 14126 1191768 times 512

384 times 256

192 times 128

RGB

Table 6 Change of 1198891015840 index with change of cropping of periocular region

Width of periocular region (119908) 100 150 200 250 300 350 400Value of 1198891015840 index (for UBIRISv2 dataset) 123 160 205 234 261 272 285Value of 1198891015840 index (for FERET dataset) 119 155 201 229 253 266 269

10 20 30 40 50 60 70065

07

075

08

085

09

095

1

Rank

Prob

abili

ty o

f ide

ntifi

catio

n

CMC curve for w = 300

CMC curve for w = 250

CMC curve for w = 200

Figure 14 Cumulative Match Characteristic (CMC) curve fordifferent template sizes of periocular region for UBIRISv2

for it to be a candidate for recognition Removal of thoseimportant regions will lead to rejection of the template

Subdivision approach needs manual supervision in theprocess of proper labeling of the different portions of humaneye Once the enrolled templates are labeled by the expert anoptimal part of the template can be selected for recognitionThe method is tested on FERET database and yielded properlocalization of periocular region

6 Conclusions

Recent research signifies why recognition through visualspectrum periocular image has gained so much importanceand how the present approaches work While developingrecognition system for a large database it is a crucial factorto optimize the template size Existence of any redundantregion in template will increase the matching time but willnot contribute to increase the accuracy of matching Hence

0 10 20 30 40 50 60 70 80065

07

075

08

085

09

095

1

Rank

Prob

abili

ty o

f ide

ntifi

catio

n

CMC curve for w = 300

CMC curve for w = 250

CMC curve for w = 200

Figure 15 Cumulative Match Characteristic (CMC) curve fordifferent template sizes of periocular region for FERET

removal of redundant region of the template should beaccomplished before the matching procedure As recognitiontime of identification is dependent on database size n hencea decrease of 1 1 matching time of t will actually decreasent matching time for identification in total As n is large (inthe range of 109 practical cases) nt is a significant amount oftime especially when concurrent matching is implementedin distributed biometric systems The paper prescribes fourmetrics for the optimization of visual spectrum periocularimage and experimentally establishes their relevance in termsof satisfying expected recognition accuracy These methodscan be used to localize the periocular region dynamically sothat an optimized region can be selectedwhich is best suitablefor recognition in terms of two contradictory objectives(a) minimal template size and (b) maximal recognitionaccuracy

BioMed Research International 13

Abbreviations

NIR Near infraredVS Visual spectrumLBP Local Binary PatternSIFT Scale Invariant Feature TransformROC Receiver Operating CharacteristicCMC Cumulative Match CharacteristicFTA Failure to AcquireFRR False rejection rateFAR False acceptance rate

References

[1] A Sohail and P Bhattacharya ldquoDetection of facial feature pointsusing anthropometric face modelrdquo Signal Processing for ImageEnhancement and Multimedia Processing vol 31 pp 189ndash2002008

[2] U Park R R Jillela A Ross and A K Jain ldquoPeriocularbiometrics in the visible spectrumrdquo IEEE Transactions onInformation Forensics and Security vol 6 no 1 pp 96ndash106 2011

[3] T Camus and R Wildes ldquoReliable and fast eye finding in close-up imagesrdquo in Proceedings of the 16th International Conferenceon Pattern Recognition vol 1 pp 389ndash394 2002

[4] H Sung J Lim J-H Park and Y Lee ldquoIris recognition usingcollarette boundary localizationrdquo in Proceedings of the 17thInternational Conference on Pattern Recognition (ICPR rsquo04) vol4 pp 857ndash860 August 2004

[5] B Bonney R Ives D Etter and Y Du ldquoIRIS pattern extractionusing bit planes and standard deviationsrdquo in Proceedings of the38th Asilomar Conference on Signals Systems and Computersvol 1 pp 582ndash586 November 2004

[6] X Liu K W Bowyer and P J Flynn ldquoExperiments withan improved iris segmentation algorithmrdquo in Proceedings ofthe 4th IEEE Workshop on Automatic Identification AdvancedTechnologies (AUTO ID rsquo05) pp 118ndash123 October 2005

[7] H Proenca and L A Alexandre ldquoIris segmentation methodol-ogy for non-cooperative recognitionrdquo IEE Proceedings VisionImage and Signal Processing vol 153 no 2 pp 199ndash205 2006

[8] S J Pundlik D L Woodard and S T Birchfield ldquoNon-idealiris segmentation using graph cutsrdquo in Proceedings of the IEEEComputer Society Conference on Computer Vision and PatternRecognition Workshops (CVPR rsquo08) pp 1ndash6 June 2008

[9] Z He T Tan Z Sun and X Qiu ldquoToward accurate and fast irissegmentation for iris biometricsrdquo IEEE Transactions on PatternAnalysis and Machine Intelligence vol 31 no 9 pp 1670ndash16842009

[10] J Liu X Fu and H Wang ldquoIris image segmentation basedon K-means clusterrdquo in Proceedings of the IEEE InternationalConference on Intelligent Computing and Intelligent Systems(ICIS rsquo10) vol 3 pp 194ndash198 October 2010

[11] F Tan Z Li and X Zhu ldquoIris localization algorithm based ongray distribution featuresrdquo in Proceedings of the 1st IEEE Inter-national Conference on Progress in Informatics and Computing(PIC rsquo10) vol 2 pp 719ndash722 December 2010

[12] S Bakshi H Mehrotra and B Majhi ldquoReal-time iris seg-mentation based on image morphologyrdquo in Proceedings of theInternational Conference on Communication Computing andSecurity (ICCCS rsquo11) pp 335ndash338 February 2011

[13] R Abiantun and M Savvides ldquoTear-duct detector for identify-ing left versus right iris imagesrdquo in Proceedings of the 37th IEEE

Applied Imagery Pattern Recognition Workshop (AIPR rsquo08) pp1ndash4 October 2008

[14] S Bhat and M Savvides ldquoEvaluating active shape models foreye-shape classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Acoustics Speech and Signal Processing(ICASSP rsquo08) pp 5228ndash5231 April 2008

[15] J Merkow B Jou and M Savvides ldquoAn exploration of genderidentification using only the periocular regionrdquo in Proceedingsof the 4th IEEE International Conference on Biometrics TheoryApplications and Systems (BTAS rsquo10) September 2010

[16] J R Lyle P E Miller S J Pundlik and D L Woodard ldquoSoftbiometric classification using periocular region featuresrdquo inProceedings of the 4th IEEE International Conference on Bio-metricsTheory Applications and Systems (BTAS rsquo10) September2010

[17] K Hollingsworth K W Bowyer and P J Flynn ldquoIdentifyinguseful features for recognition in near-infrared periocularimagesrdquo in Proceedings of the 4th IEEE International Conferenceon Biometrics Theory Applications and Systems (BTAS rsquo10)September 2010

[18] D L Woodard S Pundlik P Miller R Jillela and A RossldquoOn the fusion of periocular and iris biometrics in non-idealimageryrdquo in Proceedings of the 20th International Conference onPattern Recognition (ICPR rsquo10) pp 201ndash204 August 2010

[19] P E Miller J R Lyle S J Pundlik and D L WoodardldquoPerformance evaluation of local appearance based periocularrecognitionrdquo in Proceedings of the 4th IEEE International Con-ference on Biometrics Theory Applications and Systems (BTASrsquo10) September 2010

[20] P E Miller A W Rawls S J Pundlik and D L WoodardldquoPersonal identification using periocular skin texturerdquo in Pro-ceedings of the 25th Annual ACM Symposium on AppliedComputing (SAC rsquo10) pp 1496ndash1500 March 2010

[21] J Adams D L Woodard G Dozier P Miller K Bryant and GGlenn ldquoGenetic-based type II feature extraction for periocularbiometric recognition less is morerdquo in Proceedings of the 20thInternational Conference on Pattern Recognition (ICPR rsquo10) pp205ndash208 August 2010

[22] D L Woodard S J Pundlik P E Miller and J R LyleldquoAppearance-based periocular features in the context of faceand non-ideal iris recognitionrdquo Signal Image andVideo Process-ing vol 5 no 4 pp 443ndash455 2011

[23] DMalcik andMDrahansky ldquoAnatomy of biometric passportsrdquoJournal of Biomedicine and Biotechnology vol 2012 Article ID490362 8 pages 2012

[24] V Ramanathan and H Wechsler ldquoRobust human authentica-tion using appearance and holistic anthropometric featuresrdquoPattern Recognition Letters vol 31 no 15 pp 2425ndash2435 2010

[25] Y Chen M Adjouadi C Han et al ldquoA highly accurateand computationally efficient approach for unconstrained irissegmentationrdquo Image and Vision Computing vol 28 no 2 pp261ndash269 2010

[26] T Ojala M Pietikainen and D Harwood ldquoA comparativestudy of texture measures with classification based on featuredistributionsrdquo Pattern Recognition vol 29 no 1 pp 51ndash59 1996

[27] D G Lowe ldquoDistinctive image features from scale-invariantkeypointsrdquo International Journal of Computer Vision vol 60 no2 pp 91ndash110 2004

[28] H Proenca S Filipe R Santos J Oliveira and L A AlexandreldquoThe UBIRISv2 a database of visible wavelength iris imagescaptured on-the-move and at-a-distancerdquo IEEE Transactions on

14 BioMed Research International


Table 5: Details of the publicly available test databases.

UBIRIS — developed by the Soft Computing and Image Analysis (SOCIA) Group, Department of Computer Science, University of Beira Interior, Portugal:
    v1 [30]: 1877 images, 241 subjects, 800 × 600 resolution, RGB color model.
    v2 [28]: 11102 images, 261 subjects, 400 × 300 resolution, sRGB color model.

FERET [29] — developed by the National Institute of Standards and Technology (NIST), Gaithersburg, Maryland:
    v4: 14126 images, 1191 subjects, available at 768 × 512, 384 × 256, and 192 × 128 resolutions, RGB color model.

Table 6: Change of the d′ index with change of cropping of the periocular region.

Width of periocular region (w):     100    150    200    250    300    350    400
d′ index (UBIRISv2 dataset):        1.23   1.60   2.05   2.34   2.61   2.72   2.85
d′ index (FERET dataset):           1.19   1.55   2.01   2.29   2.53   2.66   2.69
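The d′ (decidability) index in Table 6 measures how well the genuine and impostor matching-score distributions separate. As a minimal sketch (not the authors' code), assuming the standard formulation d′ = |μ_g − μ_i| / sqrt((σ_g² + σ_i²) / 2):

```python
import numpy as np

def decidability_index(genuine, impostor):
    """Decidability index d' between genuine and impostor
    matching-score samples: larger d' means better separation."""
    g = np.asarray(genuine, dtype=float)
    i = np.asarray(impostor, dtype=float)
    return abs(g.mean() - i.mean()) / np.sqrt((g.var() + i.var()) / 2.0)

# Toy scores: well-separated distributions give a large d'
d = decidability_index([0.80, 0.90, 1.00], [0.10, 0.20, 0.30])
```

Table 6 shows d′ growing with template width w, i.e., wider crops separate the two distributions better, at the cost of a larger template.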

Figure 14: Cumulative Match Characteristic (CMC) curves for different template widths (w = 200, 250, 300) of the periocular region for UBIRISv2 (x-axis: rank; y-axis: probability of identification).
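A CMC curve of this kind reports, for each rank r, the fraction of probes whose true identity appears among the top-r gallery matches. A hedged sketch of how such a curve can be computed from a probe-vs-gallery similarity matrix (illustrative only; the gallery is assumed identity-aligned so that scores[p, p] is probe p's true match):

```python
import numpy as np

def cmc_curve(scores, max_rank):
    """Cumulative Match Characteristic from a probe-vs-gallery
    similarity matrix where scores[p, p] is probe p's true match.
    Returns P(identification) at ranks 1..max_rank."""
    num_probes = scores.shape[0]
    # Rank of the true match = number of gallery entries scoring
    # strictly higher than it, plus one.
    ranks = np.array([1 + np.sum(scores[p] > scores[p, p])
                      for p in range(num_probes)])
    return np.array([np.mean(ranks <= r) for r in range(1, max_rank + 1)])

# Toy 2-probe, 2-gallery example: probe 1's true match ranks second
scores = np.array([[0.9, 0.1],
                   [0.5, 0.4]])
curve = cmc_curve(scores, max_rank=2)
```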

for it to be a candidate for recognition. Removal of those important regions will lead to rejection of the template.

The subdivision approach needs manual supervision in the process of properly labeling the different portions of the human eye. Once the enrolled templates are labeled by the expert, an optimal part of the template can be selected for recognition. The method is tested on the FERET database and yielded proper localization of the periocular region.

6. Conclusions

Recent research signifies why recognition through visual-spectrum periocular images has gained so much importance and how the present approaches work. While developing a recognition system for a large database, optimizing the template size is a crucial factor. Any redundant region in the template will increase the matching time but will not contribute to matching accuracy. Hence,

Figure 15: Cumulative Match Characteristic (CMC) curves for different template widths (w = 200, 250, 300) of the periocular region for FERET (x-axis: rank; y-axis: probability of identification).

removal of redundant regions of the template should be accomplished before the matching procedure. As the recognition time for identification depends on the database size n, a decrease of t in the 1:1 matching time actually decreases the total identification matching time by nt. As n is large (in the range of 10^9 in practical cases), nt is a significant amount of time, especially when concurrent matching is implemented in distributed biometric systems. The paper prescribes four metrics for the optimization of visual-spectrum periocular images and experimentally establishes their relevance in terms of satisfying expected recognition accuracy. These methods can be used to localize the periocular region dynamically, so that an optimized region can be selected which is best suited for recognition in terms of two contradictory objectives: (a) minimal template size and (b) maximal recognition accuracy.
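To make the n·t argument concrete, a back-of-the-envelope calculation with hypothetical numbers (both the database size and the per-match saving below are assumptions, not figures from the paper):

```python
n = 10**9   # enrolled subjects in a national-scale database (assumed)
dt = 1e-6   # seconds shaved off each 1:1 match by a smaller template (assumed)

# A per-match saving of dt multiplies into n * dt per identification query.
saving_per_query = n * dt  # seconds saved per identification attempt
```

With these numbers the saving is 1000 seconds (roughly 17 minutes) per exhaustive identification query, which is why even small per-template reductions matter at scale.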


Abbreviations

NIR: Near infrared
VS: Visual spectrum
LBP: Local Binary Pattern
SIFT: Scale Invariant Feature Transform
ROC: Receiver Operating Characteristic
CMC: Cumulative Match Characteristic
FTA: Failure to acquire
FRR: False rejection rate
FAR: False acceptance rate

References

[1] A. Sohail and P. Bhattacharya, "Detection of facial feature points using anthropometric face model," Signal Processing for Image Enhancement and Multimedia Processing, vol. 31, pp. 189–200, 2008.

[2] U. Park, R. R. Jillela, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum," IEEE Transactions on Information Forensics and Security, vol. 6, no. 1, pp. 96–106, 2011.

[3] T. Camus and R. Wildes, "Reliable and fast eye finding in close-up images," in Proceedings of the 16th International Conference on Pattern Recognition, vol. 1, pp. 389–394, 2002.

[4] H. Sung, J. Lim, J.-H. Park, and Y. Lee, "Iris recognition using collarette boundary localization," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), vol. 4, pp. 857–860, August 2004.

[5] B. Bonney, R. Ives, D. Etter, and Y. Du, "Iris pattern extraction using bit planes and standard deviations," in Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers, vol. 1, pp. 582–586, November 2004.

[6] X. Liu, K. W. Bowyer, and P. J. Flynn, "Experiments with an improved iris segmentation algorithm," in Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AUTO ID '05), pp. 118–123, October 2005.

[7] H. Proenca and L. A. Alexandre, "Iris segmentation methodology for non-cooperative recognition," IEE Proceedings Vision, Image and Signal Processing, vol. 153, no. 2, pp. 199–205, 2006.

[8] S. J. Pundlik, D. L. Woodard, and S. T. Birchfield, "Non-ideal iris segmentation using graph cuts," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), pp. 1–6, June 2008.

[9] Z. He, T. Tan, Z. Sun, and X. Qiu, "Toward accurate and fast iris segmentation for iris biometrics," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1670–1684, 2009.

[10] J. Liu, X. Fu, and H. Wang, "Iris image segmentation based on K-means cluster," in Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS '10), vol. 3, pp. 194–198, October 2010.

[11] F. Tan, Z. Li, and X. Zhu, "Iris localization algorithm based on gray distribution features," in Proceedings of the 1st IEEE International Conference on Progress in Informatics and Computing (PIC '10), vol. 2, pp. 719–722, December 2010.

[12] S. Bakshi, H. Mehrotra, and B. Majhi, "Real-time iris segmentation based on image morphology," in Proceedings of the International Conference on Communication, Computing and Security (ICCCS '11), pp. 335–338, February 2011.

[13] R. Abiantun and M. Savvides, "Tear-duct detector for identifying left versus right iris images," in Proceedings of the 37th IEEE Applied Imagery Pattern Recognition Workshop (AIPR '08), pp. 1–4, October 2008.

[14] S. Bhat and M. Savvides, "Evaluating active shape models for eye-shape classification," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '08), pp. 5228–5231, April 2008.

[15] J. Merkow, B. Jou, and M. Savvides, "An exploration of gender identification using only the periocular region," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[16] J. R. Lyle, P. E. Miller, S. J. Pundlik, and D. L. Woodard, "Soft biometric classification using periocular region features," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[17] K. Hollingsworth, K. W. Bowyer, and P. J. Flynn, "Identifying useful features for recognition in near-infrared periocular images," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[18] D. L. Woodard, S. Pundlik, P. Miller, R. Jillela, and A. Ross, "On the fusion of periocular and iris biometrics in non-ideal imagery," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 201–204, August 2010.

[19] P. E. Miller, J. R. Lyle, S. J. Pundlik, and D. L. Woodard, "Performance evaluation of local appearance based periocular recognition," in Proceedings of the 4th IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS '10), September 2010.

[20] P. E. Miller, A. W. Rawls, S. J. Pundlik, and D. L. Woodard, "Personal identification using periocular skin texture," in Proceedings of the 25th Annual ACM Symposium on Applied Computing (SAC '10), pp. 1496–1500, March 2010.

[21] J. Adams, D. L. Woodard, G. Dozier, P. Miller, K. Bryant, and G. Glenn, "Genetic-based type II feature extraction for periocular biometric recognition: less is more," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 205–208, August 2010.

[22] D. L. Woodard, S. J. Pundlik, P. E. Miller, and J. R. Lyle, "Appearance-based periocular features in the context of face and non-ideal iris recognition," Signal, Image and Video Processing, vol. 5, no. 4, pp. 443–455, 2011.

[23] D. Malcik and M. Drahansky, "Anatomy of biometric passports," Journal of Biomedicine and Biotechnology, vol. 2012, Article ID 490362, 8 pages, 2012.

[24] V. Ramanathan and H. Wechsler, "Robust human authentication using appearance and holistic anthropometric features," Pattern Recognition Letters, vol. 31, no. 15, pp. 2425–2435, 2010.

[25] Y. Chen, M. Adjouadi, C. Han, et al., "A highly accurate and computationally efficient approach for unconstrained iris segmentation," Image and Vision Computing, vol. 28, no. 2, pp. 261–269, 2010.

[26] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on feature distributions," Pattern Recognition, vol. 29, no. 1, pp. 51–59, 1996.

[27] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[28] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, "The UBIRIS.v2: a database of visible wavelength iris images captured on-the-move and at-a-distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 8, pp. 1529–1535, 2010.

[29] P. Jonathon Phillips, H. Moon, S. A. Rizvi, and P. J. Rauss, "The FERET evaluation methodology for face-recognition algorithms," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1090–1104, 2000.

[30] H. Proenca and L. A. Alexandre, "UBIRIS: a noisy iris image database," in Proceedings of the 13th International Conference on Image Analysis and Processing, vol. 3617 of Lecture Notes in Computer Science, pp. 970–977, Springer, Cagliari, Italy, 2005.

[31] A. K. Jain, P. Flynn, and A. A. Ross, Handbook of Biometrics, Springer, New York, NY, USA, 2008.
