FACIAL EXPRESSION RECOGNITION APP
BEB801 - SEMESTER 1
STUDENT: ALEXANDER FERNICOLA N8609373
SUPERVISOR: VINOD CHANDRAN
PURPOSE OF PROJECT
• To design an Android application, compatible with most devices running Android OS 4.4 (KitKat) or later, that can recognise the user's facial expression in a photo or a video.
• The accuracy goal is a 50-70% positive recognition score (due to a lack of resources to train the model with a large enough data set).
WHY IS THIS PROJECT NEEDED & PRIOR WORK
• Advances in facial expression recognition (FER) will lead to advances in affective computing and human-computer interaction.
• Modern studies in the field of automatic facial expression recognition emerged in 2009.
• There have been major advances in face detection and feature extraction, for example the Viola-Jones classifier and the local binary pattern (LBP) feature extraction algorithm.
BACKGROUND INFO ON MACHINE LEARNING
• Machine learning is the process of creating a learned mathematical model to predict an output.
• In the example, the (x, y) coordinates of points are the inputs and the colours are the outputs.
• The model is the line that separates the points.
• The model is the equation y = mx + b, where m is the gradient and b is the bias.
• Optimisation is training the model so that the output of the model matches the desired output.
• There are many training algorithms, such as support vector machines (SVM), neural networks and random forests.
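The idea on this slide can be sketched in a few lines of code: a perceptron-style update nudges the gradient m and bias b of the separating line whenever the model's output disagrees with the desired output. The data points and learning rate here are invented for illustration, not taken from the project.

```python
import numpy as np

# Toy data: (x, y) coordinates are the inputs, colours (0 or 1) the outputs.
points = np.array([[0.0, 2.0], [1.0, 3.0], [2.0, 4.5],   # colour 1 (above the line)
                   [0.0, 0.0], [1.0, 0.5], [2.0, 1.0]])  # colour 0 (below the line)
labels = np.array([1, 1, 1, 0, 0, 0])

# Model: the line y = mx + b; predict colour 1 if the point lies above it.
m, b = 0.0, 0.0

# Optimisation: whenever the prediction disagrees with the desired output,
# shift the line towards the misclassified point (perceptron update rule).
lr = 0.1
for _ in range(100):
    for (x, y), t in zip(points, labels):
        pred = 1 if y - (m * x + b) > 0 else 0
        err = t - pred          # +1, 0 or -1
        m -= lr * err * x       # rotate the line to reduce the error
        b -= lr * err           # shift the line to reduce the error
```

After training, the line separates the two colours, which is exactly the "learned mathematical model" the slide describes.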
PROJECT PLAN
FACE DETECTION & TRACKING
• The face is detected by the Viola-Jones classifier.
• The Viola-Jones classifier is a Haar-like feature cascade classifier that uses the AdaBoost feature selection algorithm to boost efficiency.
• The face tracker used in this project is the Chehra face tracker. 49 fiducial points (green asterisks) are placed on the face so that local texture features can be extracted from around each point.
• It is important to track the movements of these points, as this helps us detect facial expressions.
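As a sketch of the idea behind Viola-Jones (not the project's actual implementation), the classifier's speed comes from the integral image: once it is computed, the sum of any rectangle costs four look-ups, so a Haar-like feature (the difference between adjacent rectangle sums) is almost free to evaluate. The function names below are illustrative.

```python
import numpy as np

def integral_image(img):
    """Cumulative row and column sums: ii[r, c] = sum of img[:r+1, :c+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, h, w):
    """Sum of img[top:top+h, left:left+w] using four integral-image look-ups."""
    total = ii[top + h - 1, left + w - 1]
    if top > 0:
        total -= ii[top - 1, left + w - 1]
    if left > 0:
        total -= ii[top + h - 1, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

def two_rect_feature(ii, top, left, h, w):
    """Haar-like two-rectangle feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, top, left, h, half) - rect_sum(ii, top, left + half, h, half)
```

In the full detector, AdaBoost selects a small number of such features and arranges them in a cascade, so most non-face windows are rejected after only a few feature evaluations.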
RESEARCH FINDINGS
• Table II shows the accuracy of the combinations of feature extraction and selection algorithms.
• Feature selection algorithms are listed in the second column, feature extraction algorithms in the first row.
• The most accurate combination is LBP+FAP (facial action points) features with mRMR (minimal Redundancy Maximum Relevance criterion) selection.
• LBP on its own is by far the fastest feature extraction algorithm.
• (L. Zhang et al., "Discovering the Best Feature Extraction and Selection Algorithms for Spontaneous Facial Expression Recognition", 2012 IEEE International Conference on Multimedia and Expo.)
LOCAL BINARY PATTERN (LBP) FEATURE EXTRACTION
• Compares the intensity of the 8 neighbouring pixels with the intensity of the centre pixel (the threshold value).
• Pixels are compared within an m×n region around each fiducial point.
• '1' if higher, '0' if lower.
• This forms a binary pattern of '1's and '0's.
• The pattern is converted to a decimal number.
• A histogram of the decimal numbers is formed.
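The steps above can be sketched directly in code. This is a minimal illustration, not the project's implementation: it follows the common convention that neighbours equal to the centre also produce a '1', and reads the 8 bits clockwise from the top-left corner (the bit ordering is a free choice, as long as it is consistent).

```python
import numpy as np

def lbp_code(region):
    """LBP code for the centre pixel of a 3x3 region.

    Each of the 8 neighbours is compared with the centre (the threshold):
    '1' if its intensity is higher or equal, '0' if lower. The bits are
    read clockwise from the top-left and converted to a decimal number.
    """
    centre = region[1, 1]
    neighbours = [region[0, 0], region[0, 1], region[0, 2], region[1, 2],
                  region[2, 2], region[2, 1], region[2, 0], region[1, 0]]
    bits = ''.join('1' if n >= centre else '0' for n in neighbours)
    return int(bits, 2)

def lbp_histogram(img):
    """Histogram of LBP codes over all interior pixels of a grey image."""
    h, w = img.shape
    codes = [lbp_code(img[r - 1:r + 2, c - 1:c + 2])
             for r in range(1, h - 1) for c in range(1, w - 1)]
    return np.bincount(codes, minlength=256)
```

In the project, one such histogram would be computed per m×n region around each of the 49 fiducial points, and the concatenated histograms form the feature vector.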
PLAN FOR BEB802
• Research, evaluate and implement a feature selection algorithm, e.g. PCA (principal component analysis), mRMR or AdaBoost.
• Research, evaluate and train a classifier model, e.g. SVM (support vector machine) or SVDA (support vector discriminant analysis).
• Write the Android application.
• Possibly train the model with facial images from the FEEDTUM or NVIE facial expression databases to improve accuracy.
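As a preview of the classifier-training step, a linear SVM can be trained by sub-gradient descent on the hinge loss. Everything here is a stand-in: the two-dimensional feature vectors are invented toy data (real inputs would be LBP histograms), and the learning rate and regularisation constants are arbitrary.

```python
import numpy as np

# Hypothetical feature vectors for two expression classes, labelled -1 and +1.
X = np.array([[2.0, 1.0], [3.0, 1.5], [2.5, 0.5],
              [0.5, 2.0], [1.0, 3.0], [0.0, 2.5]])
y = np.array([1, 1, 1, -1, -1, -1])

# Linear SVM f(x) = w.x + b, trained by sub-gradient descent on the hinge loss.
w = np.zeros(2)
b = 0.0
lr, reg = 0.01, 0.01
for _ in range(2000):
    for xi, yi in zip(X, y):
        margin = yi * (w @ xi + b)
        if margin < 1:                    # point inside the margin: hinge loss active
            w += lr * (yi * xi - reg * w)
            b += lr * yi
        else:                             # correctly classified: only regularise
            w += lr * (-reg * w)

predictions = np.sign(X @ w + b)
```

In practice a library implementation (e.g. an off-the-shelf SVM with a kernel option) would be evaluated instead, but the objective being minimised is the same.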