A rank based ensemble classifier for image classification


2013 8th Iranian Conference on Machine Vision and Image Processing (MVIP)


A Rank-based Ensemble Classifier for Image Classification using Color and Texture Features

Fatemeh Ahmadi, Mohamad-Hoseyn Sigari, Mohamad-Ebrahim Shiri

Outline

◦ Introduction
◦ Proposed Method
◦ Feature extraction
◦ Ensemble classifier
◦ Final decision maker
◦ Experimental Results
◦ Conclusion

Introduction

Image classification consists of two main steps:

1. Extraction of low-level features from the input image.

2. Classification of the input image based on the extracted features.

This paper presents a color image classification method using a rank-based ensemble classifier.

Feature extraction -> Classification

Features:

Color: Color histograms are invariant to orientation and scale, and these properties make them powerful for image classification.

Texture: Texture is one of the most important characteristics of an image.

Classifiers: Nearest Neighbor, Multi-Layer Perceptron

Proposed Method

a. Feature Extraction
b. Ensemble Classifier
c. Final Decision Maker


A. Feature Extraction

1) Color Histogram: We extract color histograms in five color spaces:

◦ RGB, HSV, CMY, YCbCr, 3D RGB. The histogram is quantized into 10 bins for each color channel, therefore a feature vector of length 30 is acquired for each color space.
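The per-color-space histogram described above can be sketched as follows; this is a minimal NumPy illustration (the function name and normalization are our own, not from the paper):

```python
import numpy as np

def color_histogram(channels, bins=10):
    """Quantize each color channel into `bins` bins and concatenate
    the per-channel histograms into one feature vector."""
    feats = []
    for ch in channels:  # e.g. the three planes of an RGB image
        hist, _ = np.histogram(ch, bins=bins, range=(0, 256))
        feats.append(hist / hist.sum())  # normalize so image size does not matter
    return np.concatenate(feats)  # length = 3 channels * 10 bins = 30

# toy 4x4 three-channel "image"
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(4, 4, 3))
fv = color_histogram([img[..., c] for c in range(3)])
print(fv.shape)  # (30,)
```

Repeating this for the five color spaces yields five separate 30-dimensional feature vectors, one per simple decision maker.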

Feature extraction -> Ensemble classifier -> Final decision maker

2) Gabor Wavelet: The Gabor wavelet operates like a local edge detector.

◦ θ: determines the orientation of the wavelet.
◦ λ: specifies the wavelength of the cosine signal.
◦ ψ: is the phase of the cosine signal.
◦ σ: denotes the radius of the Gaussian function.
◦ γ: specifies the aspect ratio of the Gaussian function.


In the proposed system:
◦ Rotation angles: {0, π/4, π/2, 3π/4}
◦ Wavelengths: {2, 2√2, 4}

There are 12 different Gabor filters. After convolving the image with all Gabor filters, 12 2D coefficient matrices are obtained, denoted by Ci where i ∈ {1, …, 12}.
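A minimal sketch of such a filter bank, assuming the standard real Gabor kernel form; the kernel size, the σ = 0.56λ bandwidth relation, and the default aspect ratio are illustrative assumptions, not values from the paper:

```python
import numpy as np

def gabor_kernel(theta, lam, sigma, gamma=0.5, psi=0.0, size=15):
    """Real Gabor kernel: a rotated Gaussian envelope times a cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

# 4 orientations x 3 wavelengths -> 12 filters, as in the slides
thetas = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
lams = [2, 2 * np.sqrt(2), 4]
bank = [gabor_kernel(t, l, sigma=0.56 * l) for t in thetas for l in lams]
print(len(bank))  # 12
```

Convolving an image with each kernel in the bank produces the 12 coefficient matrices Ci.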

a) The First Feature Set:
◦ Histogram of AM, which counts the dominant edges at different widths and orientations.

b) The Second Feature Set:
◦ We compute the mean and variance of the coefficient matrices. Therefore, the length of the second texture feature set is 24 for each image.
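The second feature set is simple to sketch: two statistics from each of the 12 matrices give a 24-dimensional vector (the function name is illustrative):

```python
import numpy as np

def gabor_stats(coeff_mats):
    """Mean and variance of each 2D coefficient matrix C_i,
    concatenated into one vector of length 2 * len(coeff_mats)."""
    feats = []
    for C in coeff_mats:
        feats.extend([C.mean(), C.var()])
    return np.array(feats)

rng = np.random.default_rng(1)
mats = [rng.normal(size=(8, 8)) for _ in range(12)]  # stand-ins for the 12 C_i
stats = gabor_stats(mats)
print(stats.shape)  # (24,)
```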


B. Ensemble Classifier

We use two simple classifiers in the ensemble:

(1) Nearest Neighbor (NN)
◦ The class labels of the 3 nearest neighbors are listed as output in an ordered list.

(2) Multi-Layer Perceptron (MLP)
◦ The output of the MLP is an ordered list of the 3 classes that have the highest values in the corresponding neurons of the output layer.
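Both decision makers therefore emit a ranked top-3 label list rather than a single label. A minimal sketch, assuming Euclidean distance for NN; the toy data and function names are illustrative:

```python
import numpy as np

def nn_ranked(x, train_X, train_y, k=3):
    """Ordered list of the class labels of the k nearest training samples."""
    d = np.linalg.norm(train_X - x, axis=1)
    return [train_y[i] for i in np.argsort(d)[:k]]

def mlp_ranked(output_activations, k=3):
    """Ordered list of the k classes with the highest output-neuron values."""
    return list(np.argsort(output_activations)[::-1][:k])

train_X = np.array([[0.0, 0.0], [1.0, 1.0], [0.1, 0.2], [5.0, 5.0]])
train_y = np.array([0, 1, 0, 2])
print(nn_ranked(np.array([0.0, 0.1]), train_X, train_y))  # [0, 0, 1]
print(mlp_ranked(np.array([0.1, 0.7, 0.2])))              # [1, 2, 0]
```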


Do not learn a single classifier; instead, learn a set of classifiers and combine the predictions of the multiple classifiers. (https://www.ke.tu-darmstadt.de/lehre/archiv/ws0405/mldm/ensembles.pdf)

Supplement

Proposed system:
◦ Color feature sets (5): RGB, HSV, YCbCr, CMY, 3D RGB
◦ Texture feature sets (2): dominant edges and statistical moments of Gabor coefficients
◦ Classifiers (2): NN, MLP

(5 + 2) × 2 = 14

Therefore, our ensemble classifier contains 14 simple decision makers.


C. Final Decision Maker

The final decision maker combines the outputs of the simple decision makers and makes the final decision of the ensemble classifier.

◦ Simple majority vote: all simple decision makers have equal importance in the ensemble.
◦ Weighted majority vote: the importance of each simple classifier differs, and the votes of each classifier are weighted by a coefficient in the range (0, 1).
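A weighted vote over the ranked top-3 lists can be sketched as follows; the per-rank discounts and the example coefficients are illustrative assumptions, not the paper's actual weighting scheme:

```python
from collections import defaultdict

def weighted_vote(ranked_lists, clf_weights, rank_weights=(1.0, 0.6, 0.3)):
    """Combine ordered top-3 label lists from the simple decision makers.
    clf_weights: one coefficient in (0, 1) per decision maker.
    rank_weights: assumed discount for 1st/2nd/3rd place in each list."""
    scores = defaultdict(float)
    for ranked, w in zip(ranked_lists, clf_weights):
        for place, label in enumerate(ranked):
            scores[label] += w * rank_weights[place]
    return max(scores, key=scores.get)

votes = [[0, 1, 2], [1, 0, 2], [0, 2, 1]]  # ranked outputs of 3 decision makers
weights = [0.9, 0.5, 0.7]
print(weighted_vote(votes, weights))  # 0
```

Setting all coefficients and rank discounts to 1.0 recovers the simple majority vote as a special case.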


Experimental Results

Corel dataset: 1000 images in 10 classes (each class contains 100 images).

In each test iteration, 100 of the 1000 images are used as test data and the remaining images are used as training data.

Therefore, the test iteration is repeated 10 times.
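This evaluation protocol can be sketched as 10-fold cross-validation; the assumption that the ten 100-image test sets are disjoint is ours, since the slides do not state how the splits are drawn:

```python
import numpy as np

# 10 folds of 100 images each; every image is tested exactly once
indices = np.arange(1000)
rng = np.random.default_rng(42)
rng.shuffle(indices)
folds = indices.reshape(10, 100)

for i in range(10):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(10) if j != i])
    assert len(test_idx) == 100 and len(train_idx) == 900
    # train the 14 decision makers on train_idx, evaluate on test_idx
```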


A. Experiments on Simple Decision Makers

B. Experiments on The Ensemble Classifier using Simple Majority Vote

We compare two different conditions for the majority vote: (1) using only one label as the output of each simple decision maker; (2) using 3 labels as the output of each simple decision maker.

C. Experiments on the Ensemble Classifier using Weighted Majority Vote

Conclusion & Future Work

Rank-based ensemble classification of the extracted feature sets works very well for color image classification.

To improve the proposed system, we suggest using other features, such as shape-based features, and other classifiers, such as decision trees and Support Vector Machines (SVM).

Additionally, an adaptive method for weighting the ordered list of labels may lead to a more robust and efficient image classification system.

Thank you
