EE459 Neural Networks Examples of using Neural Networks Kasin Prakobwaitayakit Department of Electrical Engineering Chiangmai University


Page 1

EE459 Neural Networks

Examples of using Neural Networks

Kasin Prakobwaitayakit
Department of Electrical Engineering

Chiangmai University

Page 2

APPLICATIONS

• Two examples of real-life applications of neural networks for pattern classification:
– RBF networks for face recognition
– FF networks for handwritten digit recognition

Page 3

FACE RECOGNITION

• The problem:
– Face recognition of persons of a known group in an indoor environment.

• The approach:
– Learn face classes over a wide range of poses using an RBF network.

Page 4

Dataset

• Sussex database (University of Sussex)
– 100 images of 10 people (8-bit grayscale, resolution 384 x 287)
– For each individual, 10 images of the head in different poses, from face-on to profile
– Designed to assess the performance of face recognition techniques when pose variations occur

Page 5

Datasets (Sussex)

All ten images for classes 0-3 from the Sussex database, nose-centred and subsampled to 25x25 before preprocessing

Page 6

Approach: Face unit RBF

• A face-unit RBF neural network is trained to recognize a single person.

• Training uses examples of images of the person to be recognized as positive evidence, together with selected confusable images of other people as negative evidence.

Page 7

Network Architecture

• Input layer contains 25x25 = 625 inputs, which represent the (normalized) pixel intensities of an image.

• Hidden layer contains p+a neurons:
– p hidden pro neurons (receptors for positive evidence)
– a hidden anti neurons (receptors for negative evidence)

• Output layer contains two neurons:
– One for the particular person.
– One for all the others.

The output is discarded if the absolute difference of the two output neurons is smaller than a parameter R.
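The forward pass described above can be sketched as follows. This is a minimal illustration, not the original implementation; the function name and the toy shapes are assumptions, and only the structure (Gaussian hidden units, two linear outputs, rejection on a threshold R) comes from the slides.

```python
import numpy as np

def face_unit_forward(x, centers, spreads, weights, R=0.3):
    # Gaussian RBF activations of the hidden (pro + anti) units
    d2 = np.sum((centers - x) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * spreads ** 2))
    # Linear output layer: index 0 = this person, index 1 = all others
    out = h @ weights
    # Discard the output when the two neurons are too close to call
    if abs(out[0] - out[1]) < R:
        return None
    return 0 if out[0] > out[1] else 1
```

For a real face unit, `x` would be the flattened 25x25 image (625 values) and `centers` would hold one row per pro/anti neuron.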

Page 8

RBF Architecture for one face recognition

[Figure: RBF architecture for one face unit. Input units feed non-linear RBF units (trained unsupervised), which feed linear output units (trained supervised).]

Page 9

Hidden Layer

• Hidden nodes can be:
– Pro neurons: evidence for that person.
– Anti neurons: negative evidence.

• The number of pro neurons equals the number of positive examples in the training set. For each pro neuron there are one or two anti neurons.

• Hidden neuron model: Gaussian RBF function.

Page 10

The Parameters

• Centers:
– of a pro neuron: the corresponding positive example
– of an anti neuron: the negative example which is most similar to the corresponding pro neuron, with respect to the Euclidean distance.

• Spread: average distance of the center vector from all other centers. If t_h is the center of hidden node h and H is the total number of hidden nodes, then:

σ_h = (1/H) Σ_t ‖t_h − t_t‖₂

• Weights: determined using the pseudo-inverse method.

• An RBF network with 6 pro neurons, 12 anti neurons, and R equal to 0.3 discarded 23 percent of the images of the test set and classified correctly 96 percent of the non-discarded images.
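The spread formula and the pseudo-inverse weight computation above can be sketched in a few lines of NumPy. This is an illustrative sketch with assumed function names; only the two formulas themselves come from the slides.

```python
import numpy as np

def spreads_from_centers(centers):
    # sigma_h = (1/H) * sum_t ||t_h - t_t||_2 : the average Euclidean
    # distance of each center from all the centers
    H = len(centers)
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    return dists.sum(axis=1) / H

def output_weights(hidden_acts, targets):
    # Pseudo-inverse method: least-squares solution W of
    # hidden_acts @ W = targets, one row of hidden_acts per example
    return np.linalg.pinv(hidden_acts) @ targets
```

Solving for the output weights in closed form is possible because the output layer is linear; only the hidden layer is non-linear.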

Page 11

HANDWRITTEN DIGIT RECOGNITION

Page 12

Dataset and NN architecture

• 9,298 examples used:
– 7,291 for training
– 2,007 for testing

• Both training and test sets contain ambiguous and unclassifiable examples.

• 256 input neurons (16x16).
• 10 output neurons (each represents a numeral).

• A feedforward NN with 3 hidden layers is used.

Page 13

The Idea

– Shared weights: all neurons in a feature map share the same weights (but not the biases). In this way all neurons detect the same feature at different positions in the input image. The detected features are combined to achieve shift-invariant feature detection.

– This is combined with layers implementing subsampling, to decrease the resolution and the sensitivity to distortions.
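The two ideas above can be sketched as a toy feature map plus a 2-to-1 subsampling step. This is an illustrative simplification, not Le Cun's exact network: it uses a single shared bias (the slides note biases are per-neuron) and no padding, so the output sizes differ slightly from the real architecture.

```python
import numpy as np

def feature_map(img, kernel, bias, stride=2):
    # Every output neuron applies the SAME kernel (shared weights) to a
    # square patch ("template") at its own position; stride=2 gives the
    # 2-pixel spacing between neighbouring templates.
    k = kernel.shape[0]
    rows = (img.shape[0] - k) // stride + 1
    cols = (img.shape[1] - k) // stride + 1
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = img[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.tanh(np.sum(patch * kernel) + bias)
    return out

def subsample(fmap):
    # 2-to-1 subsampling: average non-overlapping 2x2 blocks, halving
    # the resolution (and the sensitivity to small shifts/distortions)
    r, c = fmap.shape
    return fmap.reshape(r // 2, 2, c // 2, 2).mean(axis=(1, 3))
```

Because the kernel is reused at every position, the number of free weights is independent of the image size; only the biases scale with the map.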

Page 14

NN Architecture

Page 15

Convolutional NN

– If a neuron in the feature map fires, this corresponds to a match with the template.

– Neurons of the feature map react to the same pattern at different positions in the input image.

– Neurons that are one position apart in (the matrix representation of) the feature map have templates that are two pixels apart in the input image. Thus the input image is undersampled, and some position information is eliminated.

– A similar 2-to-1 undersampling occurs as one goes from H1 to H2. The rationale is that although high resolution may be needed to detect a feature, its exact position need not be determined at equally high precision.

Page 16

A Feature Map

Page 17

A Sub-sampling Map

Page 18

Architecture

• Input layer: 256 neurons with input values in range [-1, 1].

• Hidden layer H1: consists of 12 feature maps H1.1, … , H1.12.

• Feature map:
– It consists of 8x8 neurons.
– Each neuron in the feature map has the same incoming weights, but is connected to a square at a unique position in the input image. This square is called a template.

Page 19

Architecture

• Hidden layer H2: consists of 12 sub-sampling maps H2.1, … , H2.12.

• Sub-sampling map:
– Consists of 4x4 neurons.
– Each neuron of the sub-sampling map is connected to a 5x5 square of H1.j, for each j in 8 of the 12 feature maps.
– All neurons of the sub-sampling map share the same 25 weights.

Page 20

Architecture

• Hidden layer H3:
– Consists of 30 neurons.
– H3 is completely connected to the sub-sampling layer (H2).

• Output layer: consists of 10 neurons, numbered 0, … , 9. The neuron with the highest activation value is chosen, and the recognized digit is that neuron's number.

• A Dutch master's thesis on Le Cun's shared-weights NNs:
master's thesis of D. de Ridder, "Shared weights NNs in image analyses", 1996
http://www.ph.tn.tudelft.nl/~dick

Page 21

Le Cun NN Architecture

Layer               | #Cells             | #Connections                             | #Weights
L1 Input layer      | 16 x 16 = 256      |                                          |
L2 Feature maps     | 12 x (8 x 8) = 768 | 12 x (8 x 8) x (5 x 5 + 1) = 19968       | 12 x (8 x 8) x 1 + 12 x (5 x 5) = 1068
L3 Subsampling maps | 12 x (4 x 4) = 192 | 12 x (4 x 4) x (8 x (5 x 5) + 1) = 38592 | 12 x (4 x 4) x 1 + 12 x (8 x (5 x 5)) = 2592
L4 Hidden layer     | 30                 | 30 x (12 x (4 x 4) + 1) = 5790           | 5790
L5 Output layer     | 10                 | 10 x (30 + 1) = 310                      | 310
Total               | 1256               | 64660                                    | 9760
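The counts in the table can be re-derived directly from the per-layer formulas; the snippet below is pure arithmetic and serves only as a consistency check.

```python
# Cells, connections, and weights per layer of the Le Cun NN,
# computed from the formulas in the table above.
cells = {
    "L1 input":       16 * 16,
    "L2 feature":     12 * (8 * 8),
    "L3 subsampling": 12 * (4 * 4),
    "L4 hidden":      30,
    "L5 output":      10,
}
connections = {
    "L2": 12 * (8 * 8) * (5 * 5 + 1),        # 5x5 template + bias
    "L3": 12 * (4 * 4) * (8 * (5 * 5) + 1),  # 8 maps of 5x5 + bias
    "L4": 30 * (12 * (4 * 4) + 1),           # fully connected to H2
    "L5": 10 * (30 + 1),
}
weights = {
    "L2": 12 * (8 * 8) + 12 * (5 * 5),       # per-neuron biases + shared 5x5
    "L3": 12 * (4 * 4) + 12 * (8 * 5 * 5),
    "L4": 30 * (12 * (4 * 4) + 1),           # no sharing: weights = connections
    "L5": 10 * (30 + 1),
}
print(sum(cells.values()), sum(connections.values()), sum(weights.values()))
# -> 1256 64660 9760
```

Note how weight sharing makes #Weights far smaller than #Connections in L2 and L3, but identical in the fully connected layers L4 and L5.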

Page 22

Le Cun NN Architecture

Page 23

Training

– Backpropagation was used to learn the weights.
– The hyperbolic tangent was chosen as the neuron model.
– After training, the number of misclassified patterns was 10 on the training set and 102 on the test set.
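Expressed as error rates (simple division of the counts above by the set sizes given earlier):

```python
# Misclassification counts from the slide, as error rates
train_error = 10 / 7291    # about 0.14 %
test_error = 102 / 2007    # about 5.08 %
print(f"train: {train_error:.2%}  test: {test_error:.2%}")
```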