CHAPTER ONE
INTRODUCTION
1.1 BACKGROUND TO THE STUDY
Living in the information age, individuals hold vast amounts of information that they wish to
keep private. Much of this information is protected by passwords. Although this approach
satisfies most individuals, some seek more secure methods. One approach is using
characteristics of individuals as the form of authentication, known as biometrics. Biometric
security is based on something you are, rather than something you know or have. Fingerprints
are the most common form of biometrics and have several measurable distinctive characteristics.
The biometrics industry is growing fast because of new technology and the need for more secure
authentication systems (Ling and Tamer, 2009).
Although many individuals feel that passwords are enough to protect their information, there are
several problems. People often forget passwords or, worse, passwords can be stolen and then used
by other individuals. A person might have several different passwords for different
applications. Passwords are often poorly chosen because individuals incorporate personal
information or use common dictionary words.
An alternative to passwords is using human characteristics for the purposes of identification;
this is known as biometrics. Because these features are distinctive to each individual, they can
be used as a form of identification. According to a study, as many as 80% of the public has
allowed a biometric feature to be recorded. Although there are several human characteristics
that can be measured for authentication, including the face, eye, and voice, the fingerprint is
the most commonly used. Everyone is born with fingerprints; they cannot be forgotten at home or
left in the car. Fingerprints are the oldest form of biometrics to have been used successfully.
According to Wells (2001), in the 14th century, parents in China used the fingerprints and
footprints of their children as a form of identification. Since then, fingerprints have been studied
and their characteristics have been catalogued.
Each individual fingerprint is unique and immutable. A fingerprint consists of several lines
that produce patterns, called ridges, which can be used to verify and authorize an individual.
The most common system used to classify the ridge patterns in fingerprints is known as the
Galton features. There are six classes of patterns:
Arch
Tented arch
Left loop
Right loop
Whorl
Twin loop.
Each pattern has its own distinct design that distinguishes it from the others. The individual
features that are classified are known as minutiae: irregularities in the otherwise smooth
pattern of ridges in a fingerprint. The minutiae include characteristics called the
Crossover
Core
Bifurcation
Ridge ending
Island
Delta
Pore.
The crossover is created when two different ridges cross each other. The point around which
swirls or other patterns center is known as the core. A bifurcation is the point at which one
ridge separates into two separate ridges. A ridge ending is the end point of a ridge. An island
is a small ridge that sits in the space between two other ridges and does not touch any other
ridge. A space enclosed by several surrounding ridges is known as a delta. Pores occur along
ridges at steady intervals (Johnson, 1996).
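The minutia types described above can be represented in software as a simple enumeration together with a point record. The following Python sketch is illustrative only; the names and coordinate values are assumptions, not taken from the text:

```python
from dataclasses import dataclass
from enum import Enum, auto

class MinutiaType(Enum):
    # The seven minutia characteristics listed above
    CROSSOVER = auto()
    CORE = auto()
    BIFURCATION = auto()
    RIDGE_ENDING = auto()
    ISLAND = auto()
    DELTA = auto()
    PORE = auto()

@dataclass
class Minutia:
    """A single feature point extracted from a fingerprint image."""
    x: int             # pixel column in the captured image
    y: int             # pixel row in the captured image
    kind: MinutiaType  # which irregularity this point represents

# A fingerprint template is then simply a list of such points.
template = [Minutia(120, 85, MinutiaType.BIFURCATION),
            Minutia(131, 92, MinutiaType.RIDGE_ENDING)]
print(len(template), template[0].kind.name)
```

A real system would also store the local ridge orientation at each point, but this structure is enough to make the later discussion of minutiae matching concrete.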
A device is used to capture an image of the pattern of an individual's fingerprint. There are
two main technologies used to capture the image. The first is optical, using a prism through
which a source of light is refracted; from this light, the device is able to take an accurate
fingerprint image.
The second technology used is capacitive-based semiconductors. The fingerprint is obtained by
having the subject place the finger on a sensor chip. The chip then detects capacitance changes
between the ridges and valleys between the chip and skin and uses this to construct an image
according to the variance in voltage (Kroenke and David, 2007).
1.2 STATEMENT OF THE PROBLEM
The problems associated with face-to-face attendance taking and impersonation in the
examination hall are enormous: the process is time-consuming and strenuous, requires physical
identification, and is open to bias. For these reasons, a secure biometric authentication
system is introduced to keep pace with the latest security technology, considering the rate at
which impersonation in classrooms and examination halls is growing. The existing method of
identifying students for classroom attendance and even examination writing is not reliable, as
students can collect their colleagues' identification without the lecturer or invigilator
knowing. Friends can answer roll call or sign the attendance register for their friends, and
even write their examinations for them.
1.3 AIM AND OBJECTIVES OF THE STUDY
The overall aim of the proposed system is to develop a desktop-based application that can handle
the students' attendance system with fingerprint biometric authentication, to prevent
impersonation in the Computer Science Department of Moshood Abiola Polytechnic, Abeokuta. To
achieve this, the following objectives must be met:
To examine the major biometric technologies of today, i.e. fingerprint biometric
authentication.
To identify every student with a unique identity.
To relieve the lecturer of the stress of the manual attendance collection method and confirm
each and every student's attendance at the required time.
To use the latest technology for security measures.
To develop an examination model devoid of irregularities and fair to all participants.
1.4 SIGNIFICANCE OF THE STUDY
The proposed system is significant to the institution as well as to the students of Computer
Science, Moshood Abiola Polytechnic, Abeokuta.
A. To the Institution
It will eradicate attendance impersonation.
It could also eradicate examination malpractice.
The fingerprints collected can be used as a form of identification when there is a
fraudulent act among the students.
The fingerprint of every student will be recorded in the database, and as such it can be
used as a means of identification whenever there is need for it.
B. To the Student
There will be no mix-up of student details as seen in the manual method; student record search
and sorting will be very efficient, as no two people in the world share the same biometric data.
1.5 SCOPE OF THE STUDY
The proposed system is an attendance system for students with a fingerprint biometric
authentication model for the Computer Science Department, Moshood Abiola Polytechnic,
Abeokuta. It allows student details to be added along with each person's fingerprint. It also
has the ability to calculate attendance values for the current students.
1.6 DEFINITION OF TERMS
These are terms that will be encountered in the proposed attendance system.
Fingerprint: An impression or mark made on a surface by a person's fingertip, used for
identifying individuals from the unique pattern of the finger.
Biometrics: The science of measuring physical and behavioral characteristics that
uniquely identify individuals.
Fingerprint Authentication: The automated method of verifying a match between two human
fingerprints.
Fingerprint Scanning: The process of electronically obtaining and storing human
fingerprints.
Database: A collection of information that has been organized so that it can be accessed
easily, managed, updated and retrieved by authorized users.
Response Time: An AFIS (Automated Fingerprint Identification System) may take hours to
match a candidate, while fingerprint verification systems respond within seconds or
fractions of a second.
Capture: An AFIS is designed to use the entire fingerprint, rolled from nail to nail, and
often captures all ten fingerprints. Fingerprint verification systems use only the center
of the fingerprint, capturing only a small fraction of the overall fingerprint data.
CHAPTER TWO
LITERATURE REVIEW
2.1 Brunelli (1993) used template matching for biometric recognition of the human fingerprint.
The algorithm prepares a set of two masks, representing skin and nail, for each registered
person. To identify the unknown person in an image, the algorithm first detects the skin using
template matching and then normalizes the position, scale and thickness of the finger skin and
nail. Next, for each person in the database, the algorithm places the two masks, i.e. finger
skin and nail, in their positions.
Sato et al. (1998) used neural networks instead of template matching to recognize a fingerprint.
In these networks, each output unit corresponds to a registered person and the input units
correspond to the pixels of the input image. In the recognition phase, the neural network
computes an output vector for each test image, and the unknown person in the image is classified
as the person corresponding to the output unit with the maximum value in the output vector,
provided that maximum exceeds a threshold.
Kawaguchi (2000) proposed a new algorithm to detect the fingernail of an individual in an
intensity image. They implemented the separability filter and the Hough transform to measure
the fit of a pair of fingers to the image. The algorithm then selects the pair of fingers with
the smallest cost from the five fingers.
Kirby (1990) developed the first low-dimensional characterization of fingers, meant to
differentiate various fingers from each other.
Turk (1991) used the eigenspace method instead of template matching. This method constructs an
eigenspace for each registered person using sample finger images. In the recognition phase, the
test image is projected onto the eigenspaces of all registered persons to compute the matching
errors. The unknown person in the image is classified as the person corresponding to the
eigenspace giving the smallest matching error.
Moon (2001) investigated principal component analysis using the FERET database, examining
eigenfinger performance under changing illumination, different compression algorithms, varying
numbers of eigenvectors, and changes to the similarity measure used in the classification
process.
Yang (2000) demonstrated successful results in fingerprint recognition, detection and tracking
by representing the second-order statistics of the finger image with PCA. He also implemented
several image-processing operations, such as segmentation, deskewing, zooming, rotation and
warping, to observe the eigenfinger's capability. The capability of neural networks in pattern
classification is what led to their being chosen for the finger recognition experiment.
Ahmad Fadzil (1994) also developed a biometric recognition system for scanning the human
fingerprint using a multilayer perceptron artificial neural network.
Biometrics originated from two Greek words: "bios" (life) and "metrikos" (measure). Biometrics
is defined as the identification of an individual based on physiological and behavioral
characteristics. Physiological characteristics include the face (2D/3D facial images, facial IR
thermogram), hand (fingerprint, hand geometry, palm print, hand IR thermogram), eye (iris and
retina), ear, skin, odor, dental features, and DNA, while behavioral characteristics include
voice, gait, keystroke, signature, mouse movement, and pulse. Multiple biometric modalities can
be combined in a system to improve recognition accuracy. In addition, soft biometric traits such
as gender, age, height, weight, ethnicity, and eye color can also be used to assist in
identification (Qinghai, 2010).
Generally, a biometric system is designed to solve a matching problem through live measurements
of human body features. It operates in two stages. First, a person must register a biometric
(physiological or behavioral) in a system where biometric templates will be stored. Second, the
person must provide the same biometric for new measurements. The output of the new measurements
is processed with the same algorithms as those used at registration and then compared to the
stored template. If the similarity is greater than a system-defined threshold, the verification
is successful; otherwise it is considered unsuccessful (Qinghai, 2010).
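The two-stage enrol-then-verify flow described above can be sketched in a few lines. The similarity measure and the threshold value below are illustrative assumptions, since the text does not fix a particular matching algorithm:

```python
# Sketch of the enrol/verify cycle described above. The similarity
# measure and THRESHOLD are illustrative, not from the text.
STORED_TEMPLATES = {}   # user id -> enrolled biometric template
THRESHOLD = 0.80        # system-defined similarity threshold

def similarity(a, b):
    """Toy similarity: fraction of agreeing elements of two
    equal-length feature vectors (a real system applies the same
    algorithm used at registration)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def enrol(user_id, template):
    """Stage 1: store the user's biometric template."""
    STORED_TEMPLATES[user_id] = template

def verify(user_id, live_sample):
    """Stage 2: compare a fresh measurement to the stored template;
    succeed only if similarity clears the threshold."""
    stored = STORED_TEMPLATES[user_id]
    return similarity(stored, live_sample) >= THRESHOLD

enrol("student01", [1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
print(verify("student01", [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]))  # one element differs
```

Here the live sample differs from the template in one of ten positions, so the similarity (0.9) still clears the 0.8 threshold and verification succeeds.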
Biometric technologies enable automatic personal recognition based on physiological or
behavioral characteristics (Prabhakar et al., 2003). A biometric is defined as the "automated
identification or verification of human identity through the measurement of repeatable
physiological and behavioral characteristics" (Association of Biometrics, 2004).
2.2 FEATURES OF BIOMETRIC AUTHENTICATION TO PREVENT
IMPERSONATION
The primary advantage of biometric authentication over other methods of user authentication is
that it is simple, easy to use, and inherently portable. These methods use real human
physiological or behavioral characteristics to authenticate users. Biometric characteristics
are (more or less) permanent and unchangeable: it is not easy (although in some cases not
impossible in principle) to change one's fingerprint, iris or other biometric characteristics.
Users cannot pass their biometric characteristics to other users as easily as they can their
cards or passwords. Biometric traits cannot be stolen in the way tokens, keys, cards or other
objects used for traditional user authentication can, although biometric data can be stolen
from computer systems and networks. Biometric characteristics are not secret, so the
availability of a user's fingerprint or iris pattern does not break security in the same way as
the availability of the user's password. Even the use of dead or artificial biometric
characteristics should not let an attacker in.
Most biometric techniques are based on something that cannot be lost or forgotten. This is an
advantage for users as well as for system administrators because the problems and costs
associated with lost, reissued or temporarily issued tokens/cards/passwords can be avoided, thus
saving some costs of the system management. Another advantage of biometric authentication
systems may be their speed. The authentication of a habituated user using an iris-based
identification system may take 2 (or 3) seconds while finding your key ring, locating the right
key and using it may take some 5 (or 10) seconds.
2.3 TECHNICAL DEVELOPMENT
2.3.1 Classification
An important issue when considering biometric technology is to address the distinct
classifications formally defined through application and implementation. The majority of the
literature makes a distinction between the two general categories of biometric identifiers, namely
physiological and behavioral.
i. Physiological methods include: DNA, ear, infrared thermograms, hand/finger geometry,
iris scan, odor and retinal scan.
ii. Behavioral methods include: gait, keystroke dynamics, signature and voice.
Further comparative sub-classification helps to clarify biometric application categories
conceptually and also defines the specifics of usage within an application domain. Maltoni et
al. provide the most comprehensive guidance in this regard. Whether behavioral or physical,
biometric systems have usage permutations based on the following sub-categories: verification
or identification, on-line or off-line operation, and positive or negative modes of operation.
Verification systems are often referred to as 'Am I who I claim to be?' systems. A user's
captured biometric is authenticated against the biometric template provided by the user during
prior enrolment. Verification is thus a one-to-one comparison that usually requires two pieces
of information at the point of entry: a user name or unique PIN, and the necessary biometric.
In a verification system the enrolment information for a given user can be held in a database
or on a smartcard issued to the user. (Note that verification is also referred to as
authentication.)
Identification systems are often referred to as 'Who am I?' systems. Essentially, an
individual's biometric data is presented to the system anonymously and comparison is carried
out on a one-to-many basis. In other words, the presented template is compared with all
templates in the entire database to find a possible match. The exhaustive-search nature of this
operation can create a significant computational problem when dealing with very large stores of
biometric information.
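The exhaustive one-to-many search described above amounts to a loop over every stored template, returning the best-scoring identity only if it clears a threshold. This sketch uses simple bit strings and a toy similarity measure as illustrative stand-ins for real templates:

```python
def bit_similarity(a, b):
    """Fraction of agreeing characters between two equal-length
    bit strings (a toy stand-in for a real template comparison)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def identify(live_sample, database, threshold=0.9):
    """One-to-many search: compare the anonymous sample against every
    enrolled template; return the best-scoring identity, or None if
    even the best score fails the threshold."""
    best_id, best_score = None, 0.0
    for user_id, template in database.items():
        score = bit_similarity(template, live_sample)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None

db = {"alice": "110101", "bob": "001110"}
print(identify("110100", db))  # best match is alice at 5/6, below 0.9
print(identify("110101", db))  # exact match with alice
```

Because every template is visited, the cost grows linearly with database size, which is exactly the computational-complexity concern raised above for very large stores.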
Although there are possible exceptions to the rule, on-line systems usually require fast
recognition and speedy response e.g. network logon applications. However, off-line systems
may tolerate relatively long response periods such as crime scene fingerprint processing through
a forensic fingerprint application. In the case of identification systems, search efficiency will
generally be a bigger issue if the system is to be on-line rather than off-line. Usually, every
aspect of an on-line system is fully automatic and scanning is done using a ‘live-scan’ scanner.
Whereas an off-line system may contain some manual procedures, e.g. for AFIS, a duty officer
may process a suspect by first taking inked fingerprints whilst checking for quality acquisition at
the same time. The prints may then be added to a biometric system through a cold or off-line
scanner. It is interesting to note that such a system may also require an element of semi-
automatic functionality when producing matching results. A latent fingerprint processed through
the system may produce a list of possible candidates for identification. The list could then be
analyzed by a forensic expert before a final ‘human’ decision is made. Note also that this is an
example of an application requiring storage of the raw image data of the captured biometric.
In positive mode the system establishes whether a given individual is the identity being
(implicitly or explicitly) claimed. A positive application makes sure that a given identity is
used by only one person and can operate in both verification and identification mode.
In negative mode the system establishes whether the person is who he (implicitly or explicitly)
denies being. A negative application makes sure a given individual cannot use multiple
identities within the system and operates only in identification mode. Another interesting
point about positive and negative recognition is that, although traditional systems of username
and password verification work for positive recognition, negative recognition can only be
achieved through biometrics, for example face recognition systems for airport security.
2.4 KEY TECHNOLOGIES
The fundamental computing concepts at the core of modern biometrics include image
processing, pattern recognition, statistics, basic signaling and some machine learning models
such as knowledge based systems and neural nets. This section gives the technological
background to the most common biometric identifiers as drawn from the literature.
Iris
Widely regarded as potentially the most robust of all biometric identifiers, the iris is said
to be distinct both for each person and for each eye; even identical twins have differing iris
features, and matching is extremely fast and accurate. Dr. John Daugman of Cambridge
University's Computer Laboratory developed the key algorithms for image capture, feature
extraction and matching during the early 1990s, and gives a comprehensive account of the
technical and performance aspects of his algorithms.
The key problems during feature extraction are detecting the pupil (which can vary up to 15%
from a central position in the eye) and removing noise created by eye-glasses or light
reflection on the cornea, as well as parts of the iris obscured by the eyelashes or a drooping
eyelid. This
is achieved by using edge detection to create zones of texture across the iris by differentiating
between the sclera, white of the eye, on the outer zone and the varying dilation of the pupil on
the inner zone. The isolated image is demodulated using two-dimensional Gabor wavelets to
extract phase information only. Phase information is more discriminating because phase angles
can be assigned without the dependence that amplitude has on contrast and illumination. After
masking to reduce noise, a 2048-bit (256-byte) iris code is produced.
During matching, a normalized Hamming distance, a count of bit differences between two iris
templates under their masks, is calculated to determine a match or non-match. Daugman calls
this matching principle "the failure of a test of statistical independence on iris phase
structure", with a failure signifying a match. Owing to the sheer "combinatorial complexity"
of the phase information produced by Daugman's algorithms, no practical system tested has yet
produced a false match between two different iris templates.
Further, it is claimed that under ideal conditions enrolment can take less than a second and
that, when matching, Hamming distances can be calculated quickly enough for large databases to
be searched at a rate of 40,000 templates in under a second. Iris recognition is therefore the
only biometric method that allows the same algorithms to be used for both verification and
identification.
Voice
Voice is an acceptable biometric for many and in fact is the only possible biometric for most
audio-technologies. It is important to note that there is a distinction made between voice
verification or speaker recognition, (i.e. identifying a specific speaker) and speech recognition,
(i.e. identifying what is being said). Research into speaker recognition goes back over forty
years and relies on both behavioral and physical traits. Physical traits include properties
such as the size and shape of the vocal cords, vocal tract and palate, while learned behaviors
include style of speech, voice pitch and timbre. The fact that behavioral as well as physical
traits combine in the resulting speaker-system templates, or "voice prints", leads to the
method generally being classified as a behavioral biometric (Ibid). The ubiquity of acoustic
technology such as telephony
makes speaker recognition an attractive security option. This is because it is often possible to
take advantage of existing audio hardware when deploying such systems.
Generally, speaker recognition systems must first convert captured analogue speech signals to
digital and further process them using spectral-analysis principles. Typically, Fourier
transforms are used to derive coefficients for complex audio wave functions, which in turn can
be used to isolate the cepstral feature vector representing the human voice (Ibid). For
matching, a number of techniques can be used, from machine learning (e.g. k-nearest neighbor)
and statistical models (e.g. hidden Markov models) to neural nets.
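The spectral-analysis step can be illustrated with a minimal real-cepstrum computation in NumPy. This is a deliberate simplification: production speaker-recognition systems typically use mel-frequency cepstral coefficients with framing and windowing, and the signal below is a synthetic stand-in for speech:

```python
import numpy as np

def real_cepstrum(signal):
    """Real cepstrum: inverse FFT of the log magnitude spectrum.
    The low-order coefficients summarize the spectral envelope,
    which is the kind of information a voice template is built from."""
    spectrum = np.fft.fft(signal)
    log_mag = np.log(np.abs(spectrum) + 1e-12)  # guard against log(0)
    return np.real(np.fft.ifft(log_mag))

# A toy 'voiced' frame: two sinusoids sampled at 8 kHz for 32 ms.
fs = 8000
t = np.arange(int(0.032 * fs)) / fs
signal = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)

cep = real_cepstrum(signal)
feature_vector = cep[:13]   # keeping ~13 coefficients is a common choice
print(feature_vector.shape)
```

The truncated coefficient vector is what a matcher (k-nearest neighbor, a hidden Markov model, or a neural net, as mentioned above) would consume.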
Face
Face recognition is considered one of the most non-intrusive of biometric methodologies because
we naturally use distinguishing facial characteristics to differentiate between people every day.
For image capture, standard optical scanners can be used for legacy data, i.e. still photos,
and live capture can be done with ordinary photographic equipment. Certain newer technologies
acquire
a 3D image of the face using stereo, structured light or phase-based ranging and near infrared can
be used to supplement face detection in poor lighting conditions.
Processing proceeds by applying any necessary linear transformations in order to normalize a
captured image. The next step is to detect the face within the image; this can be achieved by
running a generic algorithm to detect the face shape through facial textures. Image encoding can
be either localized or global. Local models are based on establishing the relationship between a
number of facial features, such as the distance between the eyes, or the distance between each
eye and the nose etc. The global model is template-based, such as the eigenface approach and
relies on general facial patterns for classification.
During the 1990s, Turk and Pentland popularized the eigenface method. The process works by
first deriving the standardized set of eigenfaces over a dataset of normalized face images using
matrix algebra. Each image is converted to a vector of intensity values with a length equal to the
number of pixels within the image. Next a mean “average face” is determined using all image
vectors in the dataset. Then the difference from the average is calculated for each image and
used to compute a covariance matrix. This provides correlation information across the dataset
and it is the eigenvectors of this matrix that represent the eigenfaces. However, as the
dimensionality of the image vectors grows, computing eigenvectors of the full covariance matrix
quickly becomes intractable. Principal Component Analysis (PCA) provides the mechanism to
reduce the number of eigenvectors derived from the covariance matrix from the size of the image
dimension to the size of the dataset. PCA can be used in face recognition because faces share
similar patterns in general, and PCA tells us that for N images there are only N non-trivial
eigenvectors, i.e. those with the highest correlation values.
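The derivation above maps directly onto a few lines of NumPy. This sketch uses random data as a stand-in for normalized face images, and applies the standard trick of diagonalizing the small N x N matrix rather than the full pixel-by-pixel covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
N, pixels = 8, 64 * 64             # 8 'images' of 64x64 (stand-in data)
images = rng.random((N, pixels))   # each row: one normalized face vector

mean_face = images.mean(axis=0)    # the 'average face'
A = images - mean_face             # each image's difference from the mean

# Eigenvectors of the small N x N matrix A A^T, mapped back through A^T,
# yield the N non-trivial eigenvectors (eigenfaces) of the full
# pixels x pixels covariance matrix -- the reduction PCA guarantees.
small = A @ A.T                    # N x N instead of pixels x pixels
eigvals, eigvecs = np.linalg.eigh(small)
eigenfaces = A.T @ eigvecs         # pixels x N matrix of eigenfaces
eigenfaces /= np.linalg.norm(eigenfaces, axis=0)  # unit-length columns

# Any face can now be compressed to N coefficients over the eigenfaces.
weights = (images[0] - mean_face) @ eigenfaces
print(eigenfaces.shape, weights.shape)
```

For recognition, a test image is projected the same way and compared to stored weight vectors by Euclidean distance, as the text goes on to describe.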
Any human face can be considered a combination of a standard sub-set of these eigenfaces. Each
eigenface represents a pattern of variation across facial features, e.g. symmetry, size of the
nose, etc. Therefore, a given facial image can be compressed to a list of correlation values
corresponding to each eigenface, each in the range -1 to 1. For face recognition, a test image
can be projected onto the standard set and the closest match located using Euclidean distance
measures. Hybridized approaches based on both local and global methods, called eigenfeatures,
have also been developed, and another technique, fisherfaces, is said to be less sensitive to
light variation and facial angle (Ibid). Surface texture analysis (STA) relies on the skin
features of an image, which contain the most detailed information, and Kung et al. have
proposed probabilistic neural net algorithms for localized feature extraction.
Hand
Hand geometry measures the length of the fingers and/or the width of the hand to harvest an
identifier that may be invariant and specific to a user. However, hand/finger geometry is
limited in distinctiveness, and for this reason it is only useful for small-scale, non-critical
use. Hand readers are computationally very efficient, easy to use and widely deployed. For a
given system there is an associated feature set of hand measurements. When enrolling, a user
places their hand on the platen, which may use guiding pegs to ensure that the hand is
consistently presented. The captured hand image is processed and reduced to a feature vector, a
value based on the feature set of hand measurements.
For matching, a test feature vector is compared with the claimed identity's feature vector from
the system database, usually using a distance metric. Jain et al. tested four different
distance metrics: absolute, weighted absolute, Euclidean and weighted Euclidean. The results
showed that over a feature set of 16 measures the weighted Euclidean metric performed best,
although the researchers point out that better performance could be achieved with higher-level
features, such as skin colour and the wrinkles and folds of the skin.
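The weighted Euclidean variant that performed best in Jain et al.'s comparison is a one-liner. In this sketch the feature values and per-feature weights are illustrative (weights might, for example, be inverse variances estimated at enrolment); only four of the 16 measures are shown:

```python
import math

def weighted_euclidean(test, stored, weights):
    """Weighted Euclidean distance between two hand-geometry feature
    vectors; a larger weight makes that measurement count more."""
    return math.sqrt(sum(w * (t - s) ** 2
                         for t, s, w in zip(test, stored, weights)))

stored  = [72.0, 68.5, 80.2, 55.1]   # e.g. four finger lengths in mm
test    = [72.4, 68.0, 80.5, 55.0]   # live measurement of the same hand
weights = [1.0, 2.0, 1.0, 0.5]       # illustrative per-feature weights

d = weighted_euclidean(test, stored, weights)
print(round(d, 3))
```

Setting every weight to 1 recovers the plain Euclidean metric, and replacing the square with an absolute value gives the (weighted) absolute metrics from the same comparison.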
Fingerprint
Fingerprint-based identification is the oldest biometric system in terms of successful practical
application. The invariant and immutable aspects of a fingerprint supposedly lie in the patterns
of ridges and furrows, as well as the ridge characteristics occurring at either a ridge bifurcation
or a ridge ending – the so called minutiae points. A ridge ending is, obviously, where a ridge
line terminates and a bifurcation is where a ridge line splits into two separate ridge lines. In
terms of fingerprint representation, ridge and furrow patterns are global features and minutiae are
local features.
The traditional Henry system is based on identifying the global patterns observed in
fingerprints and classifying them based on the distinct flow of each pattern. Examples of
classes are arch, tented arch, left loop, etc. These flow patterns usually form around other
features called singular points, which take the form of a loop or delta shape. Where the
singular points lie is very important in classifying and indexing fingerprints. As it turns
out, global patterns are not particularly distinct between individuals. For AFIS applications
an individual's ten fingers are classified and a signature vector is created, known as a
ten-print card. Although not unique to an
individual, the ten-print card can be used to index and create a sub-set of the AFIS database,
thus reducing the search space.
2.5 TYPES OF BIOMETRICS
DNA Matching – Chemical biometric, the identification of an individual using the analysis
of segments from DNA.
Ear – Visual biometric, the identification of an individual using the shape of the ear.
Eyes – Iris recognition – Visual biometric, the use of the features of the iris to
accomplish recognition.
Eyes – Retina recognition – Visual biometric, the use of the pattern of veins in the back
of the eye to accomplish recognition.
Face recognition – Visual biometric, the analysis of facial feature patterns for
authentication or recognition of an individual's identity. Most face recognition systems
use either eigenfaces or local feature analysis.
Fingerprint recognition – Visual biometric, the use of the ridges and valleys (minutiae)
found on the surface tips of a human finger to identify an individual.
Finger geometry recognition – Visual/spatial biometric, the use of 3D geometry of the
finger to determine identity.
Gait – Behavioral biometric, the use of an individual's walking style or gait to determine
identity.
Odor – Olfactory biometric, the use of an individual's odor to determine identity.
Signature recognition – Visual/behavioral biometric, the authentication of an individual
by the analysis of handwriting style, in particular the signature.
Typing recognition – Behavioral biometric, the use of the unique characteristics of a
person's typing to establish identity.
Vein recognition – Vein recognition is a type of biometric that can be used to identify
individuals based on the vein patterns in the human finger or palm.
Voice/speaker recognition – There are two major applications of speaker recognition.
i. Voice-speaker recognition/authentication – Auditory biometric, the use of the voice as a
method of determining the identity of a speaker for access control.
ii. Voice-speaker identification – Auditory biometric, the task of determining an unknown
speaker's identity.
2.6 FINGERPRINT AUTHENTICATION
Fingerprint authentication or fingerprint recognition refers to the automated method of verifying a
match between two human fingerprints. Fingerprints are one of many forms of biometrics used to
identify individuals and verify their identity.
2.6.1 Background
The analysis of fingerprints for matching purposes generally requires the comparison of several
features of the print pattern. These include patterns, which are aggregate characteristics of
ridges, and minutia points, which are unique features found within the patterns.
It is also necessary to know the structure and properties of human skin in order to successfully
employ some of the imaging technologies.
2.6.2 Patterns
The three basic patterns of fingerprint ridges are the arch, loop and whorl:
Arch – The ridges enter from one side of the finger, rise in the center forming an arc, and
then exit on the other side of the finger.
Loop – The ridges enter from one side of a finger, form a curve, and then exit on that same
side.
Whorl – Ridges form circularly around a central point on the finger.
2.6.3 Minutia features
The major minutia features of fingerprint ridges are the ridge ending, the bifurcation, and the short ridge
(or dot). The ridge ending is the point at which a ridge terminates. Bifurcations are points at which a ridge
splits into two ridges. Short ridges (or dots) are ridges which are significantly shorter than the average
ridge length on the fingerprint. Minutiae and patterns are very important in the analysis of fingerprints,
since no two fingers have been shown to be identical.
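The minutia features above are what automated matchers actually compare. As a naive illustrative sketch (not the algorithm of any real scanner SDK, and with class and method names invented here), each minutia can be modelled as a typed point, and two prints scored by counting minutiae that pair up within a small distance tolerance:

```java
import java.util.List;

public class MinutiaeSketch {
    enum Type { RIDGE_ENDING, BIFURCATION, DOT }

    // A minutia point: its location on the print plus its feature type.
    record Minutia(double x, double y, Type type) {}

    /** Count minutiae in `probe` that have a same-type partner in
     *  `template` within `tol` pixels (naive O(n*m) pairing). */
    static int matchCount(List<Minutia> probe, List<Minutia> template, double tol) {
        int matches = 0;
        for (Minutia p : probe) {
            for (Minutia t : template) {
                if (p.type() == t.type()
                        && Math.hypot(p.x() - t.x(), p.y() - t.y()) <= tol) {
                    matches++;
                    break; // pair each probe minutia at most once
                }
            }
        }
        return matches;
    }
}
```

A real matcher must also handle rotation, translation and skin distortion before pairing points, which is why production systems align the prints first.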
CHAPTER THREE
DESIGN METHODOLOGY
3.1 With this system in mind, and taking into account the Ashbourne definition of a modern
biometric system, it is possible to discern the material most relevant for appraisal.
Much of this material is recent, reflecting the most significant period for the
development of biometrics as a usable modern technology (c. the 1990s to the present),
with the necessary overall historical context provided in the main introduction.
Maltoni et al., Bolle et al. and Wikipedia agree on the list of general
characteristics a biometric must meet in order to provide high-level performance. These
include:
Universality, a characteristic of everyone.
Distinctiveness, any two persons should be sufficiently different.
Permanence, i.e. invariant with respect to matching, over time.
Collectability, biometric can be measured quantitatively.
Performance, achievable recognition accuracy, speed, robustness.
Acceptability, the extent to which people are willing to accept the system.
Circumvention, how easy it is to fool the system.
As a gauge of performance, these characteristics provide the context for understanding common
biometric identifiers in relation to the key aspects mentioned above. In summary, it is hoped that
the review will reflect the overall research objective, provide insight into the current level and limits
of knowledge, and identify controversies and areas for further research.
3.2 INFORMATION GATHERING PROCESS
In the course of writing this project, several tools were used during the information gathering
process so as to successfully design and implement a functional System. Some of the tools used
are listed below:
Interview: Some of the information needed for the project was gathered by conducting interviews, which
helped to establish facts about the conventional method of keeping student attendance.
Observation: As an Analyst, I made personal observations of the current method involved in keeping
student attendance.
The Web: The Internet provided much of the information needed to build a user-friendly and effective
System.
3.3 SYSTEM DESIGN
The System design includes input to the System, the process, and output.
3.3.1 Inputs to the System
The inputs to this System are from the application users which are fully discussed under
the System’s data elements.
3.3.2 Data Elements
Data elements of a System are the fields present in the System's database. The following are the
data elements of the Student clock-in System:
Full name: This is the name of the application user and it is required for identification
purpose.
User name: This is the username of the application user.
Password: Password of the application user.
Mobile number: This is the phone number of the application user required for contact
purpose.
Time-in: This is the exact time the application user clocks-in.
Time-out: This is the exact time the application user clocks-out.
Date of birth: This is the date of birth of the application user.
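Taken together, the data elements above amount to one record per application user. A minimal sketch in the project's implementation language (the class, field and method names here are illustrative, not the system's actual schema):

```java
public class UserRecord {
    // Data elements of the student clock-in system, as listed above.
    String fullName;     // required for identification
    String userName;     // login name
    String password;     // login secret (stored hashed in practice)
    String mobileNumber; // contact purposes
    String dateOfBirth;
    String timeIn;       // exact time the user clocks in
    String timeOut;      // exact time the user clocks out

    /** Registration requires every identity field to be supplied;
     *  clock-in/out times are filled later, so they are not checked. */
    boolean isRegistrationComplete() {
        for (String s : new String[]{fullName, userName, password,
                                     mobileNumber, dateOfBirth}) {
            if (s == null || s.isBlank()) return false;
        }
        return true;
    }
}
```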
3.3.3 Process of the System
The process of the system includes user registration, user clock-in, user clock-out, record update and
searching user records.
3.3.4 Output from the System
The outputs of this System are shown in Chapter 4, which is the Implementation Phase and it
also contains screen shots of the System.
3.4 MODULES OF THE BIOMETRIC SYSTEM
Any biometric system is basically made up of the following components, which are illustrated in
Figure 1 below:
Figure 1: Components of Biometric System.
1. PORTAL: Its purpose is to protect some assets. An example of a portal is the gate at an
entrance of a building. If the user has been successfully authenticated and is authorized to
access an object then access is granted.
2. CENTRAL CONTROLLING UNIT: receives the authentication request, controls the
biometric authentication process and returns the result of user authentication.
3. INPUT DEVICE: The aim of the input device is biometric data acquisition. During the
acquisition process the user's liveness and the quality of the sample may be verified.
4. FEATURE EXTRACTION MODULE: processes the biometric data. The output of the
module is a set of extracted features suitable for the matching algorithm. During the
feature extraction process the module may also evaluate quality of the input biometric
data.
5. STORAGE OF BIOMETRIC TEMPLATES: This will typically be some kind of a
database. Biometric templates can also be stored on a user-held medium (e.g., smartcard).
In that case a link between the user and her biometric template must exist (e.g., in the
form of an attribute certificate).
6. THE BIOMETRIC MATCHING ALGORITHM: compares the current biometric
features with the stored template. The desired security threshold level may be a parameter
of the matching process. In this case the result of the matching will be a yes/no answer.
Otherwise a score representing the similarity between the template and the current
biometric sample is returned. The central unit then makes the yes/no decision.
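The two matcher styles just described (a yes/no answer when the security threshold is a parameter of the matching itself, versus a raw similarity score that the central controlling unit thresholds afterwards) can be sketched as follows; the similarity formula is an assumption for illustration, not what any particular SDK computes:

```java
public class MatchDecision {
    /** Similarity as the fraction of minutiae that paired up
     *  (illustrative; real matchers use proprietary scores). */
    static double similarity(int matched, int probeSize, int templateSize) {
        return (double) matched / Math.max(probeSize, templateSize);
    }

    /** Yes/no decision at a configurable security threshold. Raising the
     *  threshold lowers false accepts but raises false rejects. */
    static boolean accept(double similarity, double threshold) {
        return similarity >= threshold;
    }
}
```

Returning only `accept(...)` corresponds to the first style; returning the `similarity(...)` score and letting the central unit call `accept(...)` corresponds to the second.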
3.4.1 Model of the Biometric Authentication System
3.4.1.1 System Architecture
Figure 2: Biometric System Architecture.
The part labeled 1 is the input of the student bio-data sent to the system.
The part labeled 2 is the scanner that captures the biometric property (the fingerprint image) for
the system.
The part labeled 3 is the part of the system that triggers the scanner to capture the fingerprint
image and send it to the system.
The part labeled 4 is the input of the student passport photograph sent to the system.
The part labeled 5 is the main controlling unit that receives the image and other
information about the user. The unit can create a new user and, above all, authenticate an
existing user.
The part labeled 6 is the back-end database where all the system information is stored.
3.4.2 Design Approach
The purpose of this phase is to use the information gathered in the analysis phase to design the System.
3.5 SYSTEM REQUIREMENTS
3.5.1 Hardware Requirements
The hardware requirements for this application are:
60GB hard disk or higher,
Monitor,
Uninterrupted power supply,
Random Access Memory size of 512MB or higher.
A biometric scanner (preferably DigitalPersona).
3.5.2 Software Requirement
The software requirements for this application are:
Windows XP, Windows Vista, Windows 7 or 8,
MySQL Server,
Java Runtime (version 5.0 and above),
NetBeans IDE,
Biometric scanner setup (DigitalPersona).
3.6 DATABASE DESIGN
A database is defined as a repository for stored data. The files stored in the System are the application
user files, and these files contain information about the application users' authentication details. MySQL
is used for the database design.
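As a hedged illustration of the MySQL design, the `admin` table implied by the appendix's query `select username,pswd from admin` (and the column order `id, fullname, username, pswd` read back by the login code) might be created as follows; the column types and sizes are assumptions:

```java
public class Schema {
    // Illustrative DDL for the admin table queried in the appendix;
    // column names follow the appendix code, types are guesses.
    static final String CREATE_ADMIN =
        "CREATE TABLE IF NOT EXISTS admin (" +
        "  id INT AUTO_INCREMENT PRIMARY KEY," +
        "  fullname VARCHAR(100) NOT NULL," +
        "  username VARCHAR(50) NOT NULL UNIQUE," +
        "  pswd VARCHAR(100) NOT NULL" +
        ")";
}
```

The DDL would be executed once at installation time, e.g. via `Statement.executeUpdate(Schema.CREATE_ADMIN)` over a JDBC connection.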
3.7 SYSTEM FLOW CHART
Below is the flow diagram of this System. It consists of an external
entity, a database, processes, outputs and stored data. The external entity in this
System is the Student. The student fills out the required information in the student registration form,
which undergoes validation before submission to the database. The clock-in and clock-out records
are also stored in the database along with the matching student information.
Fig 3.1 System Flow Chart Diagram
3.8 METHOD OF DATA COLLECTION
The proposed system is developed with Java, with MySQL as the database.
Research methodology refers to the methods or tools that I used during the research of this work.
They are as follows:
For Data Gathering: the method used in gathering data was direct (interviews and personal observation).
Technique Used: the technique used for the proposed system is the parallel method of
implementation, because parallel implementation allows the new system to run alongside the existing
one, so that in case of system failure the user is not back to square one.
Tools: The tools used are Visualbasic.Net development environment and MYSQL as the back
end.
3.9 SYSTEM DATABASE STRUCTURE
The database is the back-end storage that holds all of the information in the system; it consists of
records about every entity. An entity is something that has a distinct feature among others.
This Biometric Student Attendance System database is structured to have four
tables that fully describe every entity that exists. Figure 4 below is a diagram that describes the
database model.
Figure 4: The Database Model.
CHAPTER FOUR
IMPLEMENTATION AND EVALUATION
4.1 IMPLEMENTATION ENVIRONMENT
This project was implemented using java programming language to write codes at the backend
and also to design the front end of the application. The database used for record storage is
MYSQL. The implementation phase also involves the use of a biometric scanner which enables
users to scan their finger prints.
4.1.1 Implementation Phase
The implementation of this project is divided into two phases:
Student and Administrator registration phase: This phase allows students and
administrators to register their details to enable them to gain access to the application's
main page. After an administrator has successfully registered, he can then register a
student.
User authentication phase: This phase enables a user to access the application's main
page, provided the user has registered and supplies the correct login details and the
matching fingerprint image.
4.2.1 Implementation Screenshots
This section describes what each page of the application looks like, how the user interacts
with the System and how the software communicates within itself. The following are the
interfaces that the application user interacts with whenever the application is run.
4.2.2 Application's Login Form
This is where the application user supplies his or her login details. It enables the user to access
the main page, provided the user's login details exist in the database with the proper case.
Fig 5: Application’s login form
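The login lookup in the appendix builds its SQL by concatenating the typed username into the query string. A sketch of a safer equivalent (assuming the same `admin` table and `pswd` column that the appendix queries) binds the username through a JDBC placeholder instead, so user input cannot alter the SQL:

```java
import java.sql.*;

public class SafeLogin {
    // Same admin table/columns the appendix queries; the '?' is a
    // JDBC placeholder bound at run time, preventing SQL injection.
    static final String LOGIN_SQL = "SELECT pswd FROM admin WHERE username = ?";

    /** Returns the stored password for `username`, or null if no such user. */
    static String storedPassword(Connection con, String username) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(LOGIN_SQL)) {
            ps.setString(1, username);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}
```

In production the `pswd` column would also hold a salted hash rather than the plain password, with the comparison done against the hash of the typed value.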
4.2.3 Application's Main Page
This is the page that links the other pages in the System. It is the first page that appears after a user's
successful login. This page also displays the different times a user clocks in and out.
Fig 6: Application’s main page
4.2.4 Administrator Registration Form
This is a sub panel under the main page where records of new administrators are registered. The
process of registration is carried out by another administrator of the System.
Fig 7: Admin registration form
4.2.5 Pre-Populated Administrator Update Form
This form enables an administrator's details to be updated. Before the update, an administrator is
selected by username and the administrator's records are populated in the fields provided.
Fig 8: Pre-populated administrator update form
4.2.6 Populated Administrator Update Form
This form displays the interface of an already populated administrator form. The fields populated
with the required details can then be edited. In some cases an individual may undergo a
gender change, which is why the gender field is not disabled. This form also enables a user to
update their passport photograph.
Fig 9: Populated administrator update form
4.2.7 Administrator Finger Print Update Form
This form enables the thumbprint of an administrator to be updated, provided the scanner is
connected to the System. The administrator may decide to start using the left thumb instead of
the right, for instance after an accident that put the right thumb out of use. This page enables a
username to be selected so that the user's fingerprint image is fetched from the database and
replaced with a new image.
Fig 10: Form to update administrator finger print
4.2.8 Finger Print Scan Page
This form is used to capture an image of a user's fingerprint. It also furnishes the user with
information about the scanner: the information lets the user know when the scanner is connected,
when it is disconnected from the System, when the scanner interface is touched and when the
scanner is refreshed.
Fig 11: Interface that enables user’s finger print scanning
4.2.9 Testing
After implementation, the following tests take place:
Alpha test: performed by the software developers before deployment.
Beta test: performed by the users of the System.
CHAPTER FIVE
RECOMMENDATION AND CONCLUSION
5.1 RECOMMENDATION
The testing and evaluation of this project show that adopting a Student clock-in System will
greatly improve the keeping and management of Student records, reduce the cost of managing such
records, reduce the paperwork associated with keeping Student records, and improve data consistency. It
also ensures that Students constantly maintain punctuality. Therefore, I strongly recommend that
all institutions adopt and use a Student clock-in System to keep track of when students arrive
and leave.
5.2 CONCLUSION
In this project, we have presented a fingerprint-based attendance system. The developed system is an
embedded part of a fingerprint recognition/authentication system: it extracts the local characteristics
(minutiae) of a fingerprint, and templates are matched during both the registration and verification
processes. The developed system is very helpful in saving the valuable time of lecturers and
students, and it helps in generating reports when required. The system can record the clock-in and
clock-out times of students in a very convenient manner, using their fingerprints to prevent
buddy-punching and reduce the level of absence. It also removes most of the administrative work,
minimizes human error, eliminates time-related disputes and helps to update and maintain attendance
records.
APPENDIX
Login Code
public class Login extends javax.swing.JFrame {
/**
* Creates new form Login
*/
Connection con;
PreparedStatement ps;
ResultSet rs;
Statement st;
DefaultComboBoxModel combomodel;
static String rowData[];
static Vector<String> data2 = new Vector<String>();
public static String fullname, user, pw;
public Login() {
initComponents();
URL url = this.getClass().getClassLoader().getResource("ThInc.png");
Image im = Toolkit.getDefaultToolkit().getImage(url);
setIconImage(im);
// jTextField1.setText(user);
}
/**
* This method is called from within the constructor to initialize the form.
* WARNING: Do NOT modify this code. The content of this method is always
* regenerated by the Form Editor.
*/
@SuppressWarnings("unchecked")
// <editor-fold defaultstate="collapsed" desc="Generated Code">
private void initComponents() {
jDesktopPane1 = new javax.swing.JDesktopPane();
jTextField1 = new javax.swing.JTextField();
jLabel1 = new javax.swing.JLabel();
jLabel2 = new javax.swing.JLabel();
jTextField2 = new javax.swing.JPasswordField();
jButton1 = new javax.swing.JButton();
jButton2 = new javax.swing.JButton();
jButton3 = new javax.swing.JButton();
jLabel17 = new javax.swing.JLabel();
setDefaultCloseOperation(javax.swing.WindowConstants.EXIT_ON_CLOSE);
setTitle("TheSA Login");
setLocationByPlatform(true);
setResizable(false);
addFocusListener(new java.awt.event.FocusAdapter() {
public void focusGained(java.awt.event.FocusEvent evt) {
formFocusGained(evt);
}
});
addWindowListener(new java.awt.event.WindowAdapter() {
public void windowClosing(java.awt.event.WindowEvent evt) {
formWindowClosing(evt);
}
});
jDesktopPane1.setBackground(new java.awt.Color(255, 255, 255));
jDesktopPane1.add(jTextField1);
jTextField1.setBounds(150, 40, 260, 30);
jLabel1.setFont(new java.awt.Font("BatangChe", 0, 18)); // NOI18N
jLabel1.setText("PASSWORD:");
jDesktopPane1.add(jLabel1);
jLabel1.setBounds(20, 100, 130, 30);
jLabel2.setFont(new java.awt.Font("BatangChe", 0, 18)); // NOI18N
jLabel2.setText("USERNAME:");
jDesktopPane1.add(jLabel2);
jLabel2.setBounds(20, 40, 130, 30);
jTextField2.addKeyListener(new java.awt.event.KeyAdapter() {
public void keyPressed(java.awt.event.KeyEvent evt) {
jTextField2KeyPressed(evt);
}
});
jDesktopPane1.add(jTextField2);
jTextField2.setBounds(150, 100, 260, 30);
jButton1.setText("Clear");
jButton1.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton1ActionPerformed(evt);
}
});
jDesktopPane1.add(jButton1);
jButton1.setBounds(160, 170, 140, 30);
jButton2.setText("Exit");
jButton2.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton2ActionPerformed(evt);
}
});
jDesktopPane1.add(jButton2);
jButton2.setBounds(310, 170, 120, 30);
jButton3.setText("Log In");
jButton3.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton3ActionPerformed(evt);
}
});
jDesktopPane1.add(jButton3);
jButton3.setBounds(20, 170, 130, 30);
jLabel17.setFont(new java.awt.Font("Bodoni MT Poster Compressed", 1, 18)); // NOI18N
jLabel17.setHorizontalAlignment(javax.swing.SwingConstants.CENTER);
jLabel17.setBorder(javax.swing.BorderFactory.createTitledBorder(javax.swing.BorderFactory.createLineBorder(new java.awt.Color(0, 0, 0)), "STUDENT ATTENDANCE SYSTEM"));
jDesktopPane1.add(jLabel17);
jLabel17.setBounds(0, 10, 440, 220);
javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane());
getContentPane().setLayout(layout);
layout.setHorizontalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addComponent(jDesktopPane1, javax.swing.GroupLayout.DEFAULT_SIZE, 444,
Short.MAX_VALUE)
);
layout.setVerticalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addComponent(jDesktopPane1, javax.swing.GroupLayout.DEFAULT_SIZE, 235,
Short.MAX_VALUE)
);
pack();
}// </editor-fold>
private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {
jTextField1.setText("");
jTextField2.setText("");// TODO add your handling code here:
}
private void jButton2ActionPerformed(java.awt.event.ActionEvent evt) {
int y = JOptionPane.showConfirmDialog(this, "Are you sure", "Exit",
JOptionPane.YES_NO_OPTION);
if (y == JOptionPane.YES_OPTION) {
System.exit(0);
} else {
setDefaultCloseOperation(DO_NOTHING_ON_CLOSE);
} // TODO add your handling code here:
}
private void jButton3ActionPerformed(java.awt.event.ActionEvent evt) {
try {
setCursor(Cursor.getPredefinedCursor(Cursor.WAIT_CURSOR));
Thread.sleep(400l);
} catch (Exception e) {
System.out.println("");
}
if (jTextField1.getText().isEmpty() || jTextField2.getText().isEmpty()) {
JOptionPane.showMessageDialog(this, "UserName Or Password Field Is Empty", "ERROR",
JOptionPane.INFORMATION_MESSAGE);
} else {
try {
LoadDriver();
String SQLCommand2 = "select username,pswd from admin";
rs = st.executeQuery(SQLCommand2);
ResultSetMetaData md2 = rs.getMetaData();
int nColumns2 = md2.getColumnCount();
while (rs.next()) {
rowData = new String[nColumns2];
for (int i = 0; i < nColumns2; i++) {
rowData[i] = rs.getObject(i + 1).toString();
data2.addElement(rowData[i]);
}
}
System.out.println("ok oo<><> " + data2);// && data2.contains(jTextField2.getText())
if (data2.contains(jTextField1.getText())) {
String SQLCommand = "select * from admin WHERE username='" +
jTextField1.getText() + "'";
rs = st.executeQuery(SQLCommand);
rs.next();
fullname = rs.getString(2);
user = rs.getString(3);
pw = rs.getString(4);
if (pw.equals(jTextField2.getText())) {
System.out.println("Na ur name be dix oo " + fullname + " " + user);
JOptionPane.showMessageDialog(this, fullname + ", Welcome User...", "Welcome",
JOptionPane.INFORMATION_MESSAGE);
jButton1ActionPerformed(evt);
jTextField1.setText("");
dispose();
new Main().setVisible(true);
} else {
JOptionPane.showMessageDialog(this, "Password Not Correct", "LogIn Error",
JOptionPane.ERROR_MESSAGE);
}
} else {
JOptionPane.showMessageDialog(this, "UserName Not Correct", "LogIn Error",
JOptionPane.ERROR_MESSAGE);
}
// con.commit();
con.close();
} catch (Exception g) {
System.out.print(g.getMessage());
}
}
MAINPAGE SOURCE CODE
public static DPFPTemplate template;
public static String cardno, company;
public DPFPTemplate getTemplate() {
return template;
}
public void setTemplate(DPFPTemplate template) {
DPFPTemplate old = this.template;
this.template = template;
firePropertyChange(TEMPLATE_PROPERTY, old, template);
}
public void setTableAll() {
dataSet = new DefaultTableModel();
dataSet.addColumn("Fullname");
dataSet.addColumn("Matric Number");
dataSet.addColumn("Month");
dataSet.addColumn("TIME");
dis();
jTable2.setModel(dataSet);
}
public void setTable3() {
dataSet = new DefaultTableModel();
dataSet.addColumn("Fullname");
dataSet.addColumn("Matric Number");
dataSet.addColumn("Month");
dataSet.addColumn("TIME");
dis3();
jTable3.setModel(dataSet);
}
private void dis3() {
try {
LoadDriver();
int row = dataSet.getRowCount();
while (row > 0) {
row--;
dataSet.removeRow(row);
}
//execute query
// if (jList1.getSelectedValue().toString().equals("ALL")) {
// rs = st.executeQuery("Select surname,firstname,lastname,card,gender from member
ORDER BY card,dateOfRegistration,surname,firstname,lastname");
//
// } else {
rs = st.executeQuery("Select fullname,matricNo,month,timeOut from report where type='signout'
ORDER BY id");
// }
//get metadata
ResultSetMetaData md = rs.getMetaData();
int colcount = md.getColumnCount();
Object[] data = new Object[colcount];
//extracting data
while (rs.next()) {
for (int i = 1; i <= colcount; i++) {
data[i - 1] = rs.getString(i);
}
dataSet.addRow(data);
}
// jLabel1.setText(" " + dataSet.getRowCount());
} catch (SQLException g) {
System.out.println(g);
}
}
private void dis() {
try {
LoadDriver();
int row = dataSet.getRowCount();
while (row > 0) {
row--;
dataSet.removeRow(row);
}
//execute query
// if (jList1.getSelectedValue().toString().equals("ALL")) {
// rs = st.executeQuery("Select surname,firstname,lastname,card,gender from member
ORDER BY card,dateOfRegistration,surname,firstname,lastname");
//
// } else {
rs = st.executeQuery("Select fullname,matricNo,month,timeIn from report where type='signin'
ORDER BY id");
// }
//get metadata
ResultSetMetaData md = rs.getMetaData();
int colcount = md.getColumnCount();
Object[] data = new Object[colcount];
//extracting data
while (rs.next()) {
for (int i = 1; i <= colcount; i++) {
data[i - 1] = rs.getString(i);
}
dataSet.addRow(data);
}
//jLabel1.setText(" " + dataSet.getRowCount());
} catch (SQLException g) {
System.out.println(g);
}
}
STUDENT REGISTRATION SOURCE CODE
String sex;
if (jRadioButton1.isSelected()) {
sex = "Male";
} else {
sex = "Female";
}
// check that the required text fields are filled and a gender was chosen
// (the original tested jRadioButton getText(), which is never empty)
if (jTextField10.getText().isEmpty() || jTextField9.getText().isEmpty()
|| (!jRadioButton1.isSelected() && !jRadioButton2.isSelected())) {
JOptionPane.showMessageDialog(this, "Some Fields Are Empty...\n Recheck All Fields",
"Error", JOptionPane.ERROR_MESSAGE);
} else {
print = EnrollmentForm.print;
System.out.println("This is the finger gotten ma broda");
try {
LoadDriver();
// if (print == null || jLabel42.getText().isEmpty()) {
// JOptionPane.showMessageDialog(this, "Right thumb finger have not been
captured or Passport is Empty", "Error", JOptionPane.ERROR_MESSAGE);
if (jLabel42.getText().isEmpty()) {
JOptionPane.showMessageDialog(this, "Passport is Empty", "Error",
JOptionPane.ERROR_MESSAGE);
return;
} else {
File f = new File(jLabel42.getText());
FileInputStream in = new FileInputStream(f);
image = new byte[(int) f.length()];
in.read(image); // read the passport photograph into memory
// the fingerprint template bytes were already captured into `print` above;
// re-reading the photograph stream into it would corrupt the template
}
// Below: the question marks are IN parameter placeholders.
// if (f.length() > 60000) {
// JOptionPane.showMessageDialog(this, "Cannot capture student image.", "Passport
Error", JOptionPane.WARNING_MESSAGE);
// return;
// }
String sql = "INSERT INTO registration VALUES(?,?,?,?,?,?,?,?,?,?,?,?)";
ps = con.prepareStatement(sql);
ps.setInt(1, 0);
ps.setString(2, jTextField1.getText());
ps.setString(3, jTextField10.getText().toUpperCase() + " " +
jTextField9.getText().toUpperCase());
ps.setString(4, sex);
ps.setString(5, jComboBox1.getSelectedItem().toString());
ps.setString(6, jTextField8.getText().toUpperCase());
ps.setString(7, jTextField2.getText().toUpperCase());
ps.setString(8, jFormattedTextField1.getText());
ps.setString(9, jDateChooser1.getDate().getDate() + " " + (jDateChooser1.getDate().getMonth()
+ 1) + " " + (jDateChooser1.getDate().getYear() + 1900));
ps.setString(10, jTextArea1.getText());
ps.setBytes(11, image);
ps.setBytes(12, print);
JOptionPane.showMessageDialog(this, "Student Has Been Registered.\n" +
jTextField10.getText().toUpperCase() + " is " + rand, "SUCCESSFUL",
JOptionPane.INFORMATION_MESSAGE);
ps.executeUpdate();
// con.commit();
System.out.println("you get muth");
con.close();
data.clear();
Orga2();
// enroller.clear();
// capturer.startCapture();
// jInternalFrame1.dispose();
// picture.setIcon(null);
// JOptionPane.showMessageDialog(rootPane,
jComboBox7.getSelectedItem().toString());
jButton1ActionPerformed(evt);
// }
} catch (HeadlessException | IOException | SQLException e) {
System.err.println("I hear Errors " + e);
}
}
FINGERPRINT CAPTURE FROM BIOMETRICS
public class CaptureFormEnrolling extends JDialog {
private DPFPCapture capturer = DPFPGlobal.getCaptureFactory().createCapture();
private JLabel picture = new JLabel();
private JTextField prompt = new JTextField();
private JTextArea log = new JTextArea();
private JTextField status = new JTextField("[status line]");
Image im;
public CaptureFormEnrolling(Frame owner) {
super(owner, true);
setTitle("ThInc Fingerprint Enrollment");
URL url = this.getClass().getClassLoader().getResource("ThInc.png");
Image im = Toolkit.getDefaultToolkit().getImage(url);
setIconImage(im);
log.setLineWrap(true);
setLayout(new BorderLayout());
rootPane.setBorder(BorderFactory.createEmptyBorder(10, 10, 10, 10));
picture.setPreferredSize(new Dimension(240, 280));
picture.setBorder(BorderFactory.createLoweredBevelBorder());
prompt.setFont(UIManager.getFont("Panel.font"));
prompt.setEditable(false);
prompt.setColumns(40);
prompt.setMaximumSize(prompt.getPreferredSize());
prompt.setBorder(
BorderFactory.createCompoundBorder(
BorderFactory.createTitledBorder(BorderFactory.createEmptyBorder(0, 0, 0, 0), "Prompt:"),
BorderFactory.createLoweredBevelBorder()
));
log.setColumns(40);
log.setEditable(false);
log.setFont(UIManager.getFont("Panel.font"));
JScrollPane logpane = new JScrollPane(log);
logpane.setBorder(
BorderFactory.createCompoundBorder(
BorderFactory.createTitledBorder(BorderFactory.createEmptyBorder(0, 0, 0, 0), "Status:"),
BorderFactory.createLoweredBevelBorder()
));
status.setEditable(false);
status.setBorder(BorderFactory.createEmptyBorder(5, 5, 5, 5));
status.setFont(UIManager.getFont("Panel.font"));
JButton quit = new JButton("Exit");
// quit.setIcon(new javax.swing.ImageIcon(getClass().getResource("/images/Apps-session-
logout-icon.png")));
quit.addActionListener(new ActionListener() {
public void actionPerformed(ActionEvent e) {
setVisible(false);
}
});