EigenFaces and EigenPatches
• Useful model of variation in a region
  – Region must be fixed shape (e.g. rectangle)
• Developed for face recognition
• Generalised for
  – face location
  – object location/recognition
Overview
• Model of variation in a region
    g = ḡ + P b

(Figure: faces generated by varying each parameter b1, b2, b3, b4 in turn)
Overview of Construction
Mark face region on training set
Sample region
Normalise
Statistical Analysis
(sampled vector g → normalised vector g' → model g = ḡ + P b)
Sampling a region
• Must sample at equivalent points across region
• Place grid on image and rotate/scale as necessary
• Use interpolation to sample image at each grid node
Interpolation
• Pixel values g(i,j) are known at integer positions
  – What is a suitable value f(x,y) at non-integer positions?
• Require f(i,j) = g(i,j)
Interpolation in 1D
• Estimate continuous function, f(x), that passes through a set of points (i,g(i))
(Figure: a continuous f(x) passing through the sample points (i, g(i)))

1D Interpolation techniques

(Figures: nearest-neighbour, linear and cubic fits to the same sample points)

• Nearest neighbour:  f(x) = g(round(x))
• Linear:  f(x) = (1 - a) g(i) + a g(i+1),   where i = floor(x), a = x - i
• Cubic:  f(x) = a g(i-1) + b g(i) + c g(i+1) + d g(i+2)
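The three 1D schemes above can be sketched as follows. This is an illustrative implementation, not from the slides; the slides leave the cubic weights a, b, c, d unspecified, so the Catmull-Rom weights (a common choice) are assumed here.

```python
import math

def nearest_1d(g, x):
    # f(x) = g(round(x))
    return g[int(round(x))]

def linear_1d(g, x):
    # i = floor(x), a = x - i, f(x) = (1-a) g(i) + a g(i+1)
    i = math.floor(x)
    a = x - i
    return (1 - a) * g[i] + a * g[i + 1]

def cubic_1d(g, x):
    # f(x) = a g(i-1) + b g(i) + c g(i+1) + d g(i+2)
    # using Catmull-Rom weights (assumed; any C1 cubic kernel works)
    i = math.floor(x)
    t = x - i
    gm1, g0, g1, g2 = g[i - 1], g[i], g[i + 1], g[i + 2]
    return 0.5 * (2 * g0 + (-gm1 + g1) * t
                  + (2 * gm1 - 5 * g0 + 4 * g1 - g2) * t ** 2
                  + (-gm1 + 3 * g0 - 3 * g1 + g2) * t ** 3)

samples = [0.0, 1.0, 4.0, 9.0, 16.0]   # g(i) = i^2
print(nearest_1d(samples, 1.6))  # -> 4.0
print(linear_1d(samples, 1.5))   # -> 2.5
```

Note that all three satisfy the requirement f(i) = g(i) at the integer sample positions.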
2D Interpolation
• Extension of 1D case
• Nearest neighbour:  f(x,y) = g(round(x), round(y))
• Bilinear:  f(x,y) = (1 - a) f_y0 + a f_y1,   where

    i = floor(x), a = x - i
    j = floor(y), b = y - j

    f_y0 = (1 - b) g(i, j)   + b g(i, j+1)      (y interpolation at x = i)
    f_y1 = (1 - b) g(i+1, j) + b g(i+1, j+1)    (y interpolation at x = i+1)
Representing Regions
• Represent each region as a vector
  – Raster scan of the values in an n x m region gives an nm-element vector g
Normalisation
• Allow for global lighting variations
• Common linear approach
  – Shift and scale so that
    • Mean of elements is zero
    • Variance of elements is 1
• Alternative non-linear approach
  – Histogram equalisation
    • Transforms so there are similar numbers of each grey-scale value
    g' = (g - t 1) / s

    t  = (1/n) Σ_i g_i           (mean of the elements)
    s² = (1/n) Σ_i (g_i - t)²    (variance of the elements)
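The linear normalisation can be sketched directly (illustrative; uses NumPy for the vector arithmetic):

```python
import numpy as np

def normalise(g):
    # shift and scale so the elements of g' have zero mean and unit variance
    g = np.asarray(g, dtype=float)
    t = g.mean()                 # t   = (1/n) sum g_i
    s = g.std()                  # s^2 = (1/n) sum (g_i - t)^2
    return (g - t) / s           # g'  = (g - t*1) / s

g_norm = normalise([10.0, 20.0, 30.0, 40.0])
print(g_norm.mean(), g_norm.var())   # zero mean, unit variance (up to rounding)
```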
Review of Construction
Mark face region on training set
Sample region
Normalise
Statistical Analysis
(sampled vector g → normalised vector g' → model g = ḡ + P b)
The Fun Step
Multivariate Statistical Analysis
• Need to model the distribution of normalised vectors
  – Generate plausible new examples
  – Test if a new region is similar to the training set
  – Classify regions
Fitting a gaussian
• Mean and covariance matrix of data define a gaussian model
(Figure: scatter of training vectors with mean ḡ, plotted against axes g1, g2)
Principal Component Analysis
• Compute eigenvectors of covariance, S
• Eigenvectors: main directions of variation in the data
• Eigenvalues: variance along the corresponding eigenvector
(Figure: gaussian fit to the data, with principal axes p1, p2 scaled by √λ1, √λ2)
Eigenvector Decomposition
• If A is a square matrix then an eigenvector of A is a vector, p, such that
• Usually p is scaled to have unit length, |p| = 1
    A p = λ p,   where λ is the associated eigenvalue
Eigenvector Decomposition
• If K is an n x n covariance matrix, there exist n linearly independent eigenvectors, and all the corresponding eigenvalues are non-negative.
• We can decompose K as

    K = P D P^T

  where

    P = ( p1 | p2 | ... | pn )
    D = diag( λ1, λ2, ..., λn )
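The decomposition can be checked numerically on a small covariance matrix (an illustrative sketch using NumPy's symmetric eigensolver):

```python
import numpy as np

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])              # symmetric, positive definite
lam, P = np.linalg.eigh(K)              # eigenvalues ascending; columns of P are p_i
D = np.diag(lam)

assert np.all(lam >= 0)                 # covariance eigenvalues are non-negative
assert np.allclose(P @ D @ P.T, K)      # K = P D P^T
assert np.allclose(P.T @ P, np.eye(2))  # eigenvectors are orthonormal
```

Since the eigenvectors are orthonormal, P^T P = P P^T = I, which is used below to invert K.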
Eigenvector Decomposition
• Recall that a normal pdf has

    p(x) ∝ exp( -0.5 (x - x̄)^T K^-1 (x - x̄) )

• The inverse of the covariance matrix is

    K^-1 = P D^-1 P^T     (since K = P D P^T and P^T P = P P^T = I)

    D^-1 = diag( 1/λ1, 1/λ2, ..., 1/λn )
Fun with Eigenvectors
• The normal distribution has form

    p(x) = (2π)^(-n/2) |K|^(-1/2) exp(...)

• The determinant simplifies:

    |K| = |P D P^T| = |P| |D| |P^T| = |D| = Π_i λi
Fun with Eigenvectors
• Consider the transformation
    b = P^T (x - x̄)

(Figure: rotation of axes from (x1, x2) to the eigenvector frame (b1, b2))
Fun with Eigenvectors
• The exponent of the distribution becomes

    -0.5 (x - x̄)^T K^-1 (x - x̄)
      = -0.5 (x - x̄)^T P D^-1 P^T (x - x̄)
      = -0.5 b^T D^-1 b
      = -0.5 Σ_{i=1..n} (b_i² / λi)
      = -0.5 M

• M is the Mahalanobis distance from the mean
Normal distribution
• Thus by applying the transformation

    b = P^T (x - x̄)

• The normal distribution is simplified to

    p(x) = p(b) = k exp(-0.5 M)

  where

    M = Σ_{i=1..n} (b_i² / λi)
    k = (2π)^(-n/2) (Π_{i=1..n} λi)^(-1/2)
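As an illustrative numerical check (not from the slides), the Mahalanobis distance can be computed either directly with K^-1 or via the transformed parameters b, and the two agree:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.5, 0.5]])            # arbitrary nonsingular mixing matrix
X = rng.normal(size=(500, 3)) @ A          # correlated sample data

xbar = X.mean(axis=0)
K = np.cov(X.T, bias=True)
lam, P = np.linalg.eigh(K)

x = X[0]
M_direct = (x - xbar) @ np.linalg.inv(K) @ (x - xbar)  # (x-xbar)^T K^-1 (x-xbar)
b = P.T @ (x - xbar)                                   # b = P^T (x - xbar)
M_modes = np.sum(b**2 / lam)                           # sum_i b_i^2 / lambda_i
print(abs(M_direct - M_modes))                         # essentially zero
```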
Dimensionality Reduction
• Co-ordinates are often correlated
• Nearby points move together

    x ≈ x̄ + b1 p1

(Figure: data scattered along the direction p1 through the mean x̄)
Dimensionality Reduction
• Data lies in a subspace of reduced dimension
• However, for some t,

    x = x̄ + P b = x̄ + b1 p1 + ... + bn pn

    b_j ≈ 0  if  j > t

  (Variance of b_j is λj)
Approximation
• Each element of the data can be written

    x = x̄ + P_t b + r,   where P_t = ( p1 | ... | pt )

    b = P_t^T (x - x̄)

• Variance of the elements of the residual r:

    σr² = (1/(n-t)) Σ_{i=t+1..n} λi

• Approximation error:

    |r|² = |x - x̄|² - |b|²

• So x ≈ x̄ + P_t b
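An illustrative sketch of the t-mode approximation and the error identity above (NumPy; synthetic data, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) * np.array([3.0, 2.0, 1.0, 0.1, 0.05])
xbar = X.mean(axis=0)
lam, P = np.linalg.eigh(np.cov(X.T, bias=True))
order = np.argsort(lam)[::-1]            # sort eigenvalues descending
lam, P = lam[order], P[:, order]

t = 3
Pt = P[:, :t]                            # first t eigenvectors
x = X[0]
b = Pt.T @ (x - xbar)                    # b = P_t^T (x - xbar)
x_approx = xbar + Pt @ b                 # x ~ xbar + P_t b
r = x - x_approx                         # residual, orthogonal to the subspace

# |r|^2 = |x - xbar|^2 - |b|^2
print(np.allclose(r @ r, (x - xbar) @ (x - xbar) - b @ b))  # -> True
```

The identity holds because Pt has orthonormal columns, so the projection b and the residual r are orthogonal components of x - x̄.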
Normal PDF
    p_t(x) = k_t exp(-0.5 M_t)

    M_t = Σ_{i=1..t} (b_i² / λi) + |r|² / σr²

    k_t = (2π)^(-n/2) (Π_{i=1..t} λi)^(-1/2) (σr²)^(-0.5(n-t))

• Assuming variance λi along the directions p_i, and variance σr² along all the others
Useful Trick
• If x is of high dimension, S is huge
• If the number of samples N < dim(x), use

    D = ( x1 - x̄ | ... | xN - x̄ )

    S = (1/N) D D^T        T = (1/N) D^T D

• If λi is an eigenvalue of T with eigenvector ui,
  then λi is an eigenvalue of S with eigenvector D ui
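The trick can be verified numerically (an illustrative sketch): the eigenvectors of the large n x n matrix S are recovered from the small N x N matrix T.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 100, 10                     # dimension >> number of samples
X = rng.normal(size=(N, n))
xbar = X.mean(axis=0)
D = (X - xbar).T                   # n x N matrix of centred samples

T = D.T @ D / N                    # small N x N matrix
lam, U = np.linalg.eigh(T)
V = D @ U                          # columns D u_i are eigenvectors of S
S = D @ D.T / N                    # large n x n matrix (for checking only)

i = -1                             # largest eigenvalue
v = V[:, i] / np.linalg.norm(V[:, i])
print(np.allclose(S @ v, lam[i] * v))  # -> True
```

The algebra behind it: S (D u) = (1/N) D D^T D u = D T u = λ D u.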
Building Eigen-Models
• Given examples
• Compute mean and eigenvectors of covar.
• Model is then

    g = ḡ + P b

• P – first t eigenvectors of the covariance matrix
• b – model parameters
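The steps above can be sketched end-to-end (illustrative NumPy code; `build_eigenmodel` is a hypothetical helper name, and synthetic data stands in for the normalised samples):

```python
import numpy as np

def build_eigenmodel(G, t):
    """G: (N, n) array, one normalised sample vector g_i per row."""
    gbar = G.mean(axis=0)
    lam, P = np.linalg.eigh(np.cov(G.T, bias=True))
    order = np.argsort(lam)[::-1]            # largest variance first
    return gbar, P[:, order[:t]], lam[order[:t]]

rng = np.random.default_rng(3)
G = rng.normal(size=(50, 8))                 # stand-in for normalised regions
gbar, P, lam = build_eigenmodel(G, t=3)

b = P.T @ (G[0] - gbar)                      # parameters for one example
g_recon = gbar + P @ b                       # g = gbar + P b
print(P.shape, b.shape)                      # -> (8, 3) (3,)
```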
Eigen-Face models
• Model of variation in a region
    g = ḡ + P b

(Figure: faces generated by varying each parameter b1, b2, b3, b4 in turn)
Applications: Locating objects
• Scan window over target region
• At each position:– Sample, normalise, evaluate p(g)
• Select position with largest p(g)
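The scanning search can be sketched as below. This is illustrative only: `score()` uses a simple squared-distance stand-in for evaluating log p(g) under the eigen-model, and the window size and image are made up.

```python
import numpy as np

def scan(image, model_mean, win=4):
    # slide a win x win window; sample, normalise, score; keep the best position
    best, best_pos = -np.inf, None
    h, w = image.shape
    for r in range(h - win + 1):
        for c in range(w - win + 1):
            g = image[r:r + win, c:c + win].ravel().astype(float)
            g = (g - g.mean()) / (g.std() + 1e-9)       # normalise
            score = -np.sum((g - model_mean) ** 2)      # stand-in for log p(g)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

img = np.zeros((8, 8))
img[2:6, 3:7] = np.arange(16).reshape(4, 4)   # distinctive 4x4 patch
target = img[2:6, 3:7].ravel().astype(float)
target = (target - target.mean()) / target.std()
print(scan(img, target))  # -> (2, 3)
```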
Multi-Resolution Search
• Train models at each level of pyramid– Gaussian pyramid with step size 2– Use same points but different local models
• Start search at coarse resolution– Refine at finer resolution
Application: Object Detection
• Scan image to find points with largest p(g)
• If p(g) > p_min then the object is present
• Strictly should use a background model:
• This only works if the PDFs are good approximations – often not the case
    p_model(g) P(model) > p_background(g) P(background)
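The decision rule can be sketched with toy 1-D densities (illustrative only; the Gaussian parameters and priors are made up, and real use would plug in the eigen-model and background PDFs, compared in log space for numerical safety):

```python
import math

def log_gauss(x, mu, var):
    # log density of a 1-D Gaussian N(mu, var)
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def is_object(g, prior_model=0.5):
    # object present if p_model(g) P(model) > p_background(g) P(background)
    lp_model = log_gauss(g, mu=1.0, var=0.25) + math.log(prior_model)
    lp_bg = log_gauss(g, mu=0.0, var=1.0) + math.log(1 - prior_model)
    return lp_model > lp_bg

print(is_object(1.0), is_object(-2.0))  # -> True False
```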
Application: Face Recognition
• Eigenfaces developed for face recognition– More about this later