


AKSHARA P BYJU

March, 2015

ITC SUPERVISOR : Prof. Dr. Ir. A. Stein

IIRS SUPERVISOR : Dr. Anil Kumar

Non-Linear Separation of classes using a Kernel based Fuzzy c-Means (KFCM) Approach


Thesis submitted to the Faculty of Geo-information Science and Earth Observation of the University of Twente in partial fulfilment of the requirements for the degree of Master of Science in Geo-information Science and Earth Observation.

Specialization: Geoinformatics

THESIS ASSESSMENT BOARD:

Chairperson : Prof. Dr. Ir. M. G. Vosselman

External Examiner : Dr. S. K. Ghosh

ITC Supervisor : Prof. Dr. Ir. A. Stein

ITC Professor : Prof. Dr. Ir. A. Stein

IIRS Supervisor : Dr. Anil Kumar

OBSERVERS:

ITC Observer : Dr. Nicholas Hamm

IIRS Observer : Dr. S. K. Srivastav

Non-Linear Separation of classes using a Kernel based Fuzzy c-Means (KFCM) Approach

Akshara P. Byju

Enschede, the Netherlands [2015]


DISCLAIMER

This document describes work undertaken as part of a programme of study at the Faculty of Geo-information Science and Earth Observation (ITC), University of Twente, The Netherlands. All views and opinions expressed therein remain the sole responsibility of the author, and do not necessarily represent those of the institute.


Dedicated to my loving grandmother Shanta Gopinath, mother and father…


ABSTRACT

Fuzzy classification of remote sensing images allows the characterization and classification of land cover with improved robustness and accuracy. Coarser-resolution images contain mixed pixels as well as non-linearly separable data. The presence of mixed pixels and non-linear data deteriorates classification accuracy and increases computational complexity. Kernels are used for clustering and classification problems based on the similarity between any two samples; the samples are implicitly mapped to a feature space where they are linearly separable. In this research, kernel-based fuzzy clustering has been used to handle both the problem of non-linearity and that of mixed pixels. A supervised Kernel based Fuzzy c-Means (KFCM) classifier has been used to improve the performance of the FCM classification technique. Eight kernel functions are incorporated into the objective function of the FCM classifier, so that the effects of the different kernel functions can be visualized in the generated fraction images. The best single kernels are selected by optimizing the weight constant, which controls the degree of fuzziness, using entropy and mean membership difference calculations. These are combined to study the effect of composite kernels, which include both spatial and spectral properties. The Fuzzy Error Matrix (FERM) was used to assess the accuracy of the results and was studied for AWiFS, LISS-III and LISS-IV datasets from Resourcesat-1 and Resourcesat-2. The Inverse Multiquadratic kernel and the Gaussian kernel using the Euclidean norm, from Resourcesat-1 and Resourcesat-2 respectively, were found to have the highest overall fuzzy accuracy: 97.03% and 86.03% for the LISS-III dataset. Among the composite kernels, the Gaussian-Spectral kernel was found to have an overall accuracy of 59.27% for LISS-III. Classification accuracy in the case of an untrained classifier was also studied, where a decrease in average user's accuracy was observed when compared to the trained case.

Keywords: Classification, Kernels, Kernel Fuzzy clustering, Feature Space, Fuzzy Error Matrix
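As a rough illustration of the kernel substitution described in the abstract, the following is a minimal sketch of one KFCM membership update with a Gaussian kernel. It uses the standard KFCM formulation from the literature; the classifier implemented in this thesis may differ in detail, and all names and values here are illustrative.

```python
import math

# Sketch of a kernel-based fuzzy c-means (KFCM) membership update for one
# pixel. The Gaussian kernel is one of the eight kernels considered; the
# thesis's exact objective function may differ from this standard form.

def gaussian_kernel(x, v, sigma=1.0):
    """K(x, v) = exp(-||x - v||^2 / sigma^2)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, v))
    return math.exp(-sq / sigma ** 2)

def kfcm_memberships(x, centroids, m=2.0, sigma=1.0):
    """Fuzzy membership of pixel x in each class.

    For a Gaussian kernel the squared distance in the implicit feature
    space is ||phi(x) - phi(v)||^2 = 2 * (1 - K(x, v)), so the usual FCM
    membership update is applied to this kernel-induced distance.
    """
    d2 = [max(2.0 * (1.0 - gaussian_kernel(x, v, sigma)), 1e-12)
          for v in centroids]
    exp = 1.0 / (m - 1.0)  # weight constant m controls the degree of fuzziness
    return [1.0 / sum((d2[i] / d2[k]) ** exp for k in range(len(d2)))
            for i in range(len(d2))]
```

The memberships computed this way sum to one over the classes; collected per class over all pixels, they form the fraction images referred to above.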


ACKNOWLEDGEMENTS

Firstly, I would like to thank God Almighty for his abundant blessings throughout my research work. I would also like to thank my parents for always being there for me and for the encouragement and support they have given me throughout my life.

I would like to thank Prof. Dr. Alfred Stein for his suggestions and valuable remarks throughout my research work. I am honoured to have a supervisor like him. His in-depth knowledge and the lucidity of his words helped me to carry out this research successfully. I owe him my sincere gratitude for helping me throughout.

I express my sincere gratitude to my IIRS supervisor, Dr. Anil Kumar, for the valuable guidance and assistance he has rendered towards the completion of my research work. He has inspired me and helped me in all the ways the best of teachers can do for a student. I am highly obliged for his help and support.

I would also like to thank Dr. S. K. Srivastav for giving valuable suggestions and ensuring good research progress for all the students, and Mr. P. L. N. Raju, Group Head, for the suggestions and support he has given towards completing the course. I express my sincere gratitude to all the IIRS faculty for helping me complete my modules successfully.

I would also like to thank Dr. Nicholas Hamm for the support and kindness he has shown during the course. Special thanks to all my dear friends and all the IIRS members for the help and encouragement they have given throughout my course.

Akshara P Byju


TABLE OF CONTENTS

1. INTRODUCTION .................................................................................................................... 1

1.1. MOTIVATION AND PROBLEM STATEMENT ............................................................................. 3

1.2. RESEARCH OBJECTIVE ...................................................................................................... 3

1.3. RESEARCH QUESTIONS ...................................................................................................................... 4

1.4. INNOVATION AIMED AT ................................................................................................................... 4

1.5. THESIS STRUCTURE .............................................................................................................................. 4

2. LITERATURE REVIEW ......................................................................................................... 5

2.1. LAND COVER CLASSIFICATION METHODS ............................................................................. 5

2.2. FUZZY C-MEANS (FCM) ........................................................................................................................ 6

2.3. KERNELS ................................................................................................................................................... 7

2.4. ACCURACY ASSESSMENT ................................................................................................................... 9

3. CLASSIFICATION APPROACHES, STUDY AREA AND METHODOLOGY.................. 11

3.1. CLASSIFICATION APPROACHES AND ACCURACY ASSESSMENT ................................. 11

3.1.1. CLUSTERING ............................................................................................................................... 12

3.1.2. THE FUZZY c-MEANS (FCM) CLASSIFIER ....................................................................... 12

3.1.3. KERNELS ....................................................................................................................................... 15

3.1.4. KERNEL BASED FUZZY C-MEANS (KFCM) CLASSIFIER .......................................... 18

3.1.5. ACCURACY ASSESSMENT ...................................................................................................... 19

3.1.5.1. FUZZY ERROR MATRIX (FERM) ..................................................................................... 20

3.1.5.2. SUB-PIXEL CONFUSION UNCERTAINTY MATRIX ................................................ 20

3.1.5.3. ROOT MEAN SQUARE ERROR (RMSE)......................................................................... 21

3.1.5.4. ENTROPY MEASURE ........................................................................................................... 22

3.1.5.5. MEAN MEMBERSHIP DIFFERENCE METHOD ........................................................ 23

3.2. STUDY AREA AND MATERIALS USED ....................................................................................... 24

3.2.1. STUDY AREA ................................................................................................................................ 24

3.2.2. MATERIALS USED ..................................................................................................................... 25

3.2.3. DATASET PREPROCESSING.................................................................................................. 27

3.2.4. REFERENCE DATASET GENERATION ........................................................................... 27

3.3. METHODOLOGY ................................................................................................................................ 28

3.3.1. GEO-REFERENCING ................................................................................................................ 28

3.3.2. PREPARATION OF REFERENCE DATASET ................................................................... 29

3.3.3. SUB-PIXEL CLASSIFICATION ALGORITHMS ................................................................. 29

3.3.3.1. FUZZY C-MEANS (FCM): ..................................................................................................... 29

3.3.3.2. KERNEL BASED FUZZY c-MEANS (KFCM): ............................................................... 29

3.3.3.3. FCM WITH COMPOSITE KERNELS .................................................................................. 30

3.3.4. ACCURACY ASSESSMENT ...................................................................................................... 30

4. RESULTS ................................................................................................................................ 31

4.1. PARAMETER ESTIMATION ............................................................................................................. 31

4.2. RESULTS OF SUPERVISED FCM CLASSIFIER ........................................................................... 35

4.3. RESULTS OF FCM CLASSIFIER USING SINGLE KERNELS ................................................. 37

4.4. RESULTS OF FCM CLASSIFIER USING COMPOSITE KERNELS ....................................... 44


4.5. ACCURACY ASSESSMENT RESULTS ............................................................................................. 48

4.6. UNTRAINED CLASSES ....................................................................................................................... 49

5. DISCUSSION .......................................................................................................................... 52

6. CONCLUSIONS AND RECOMMENDATIONS ................................................................ 55

6.1. CONCLUSIONS ...................................................................................................................................... 55

6.2. ANSWERS TO RESEARCH QUESTIONS ...................................................................................... 56

6.3. RECOMMENDATIONS ....................................................................................................................... 58

REFERENCES ............................................................................................................................... 59

APPENDIX A ................................................................................................................................. 64

APPENDIX B ................................................................................................................................. 71

APPENDIX C ................................................................................................................................. 99


LIST OF FIGURES

FIGURE 2-1: TWO CLUSTERS IN INPUT SPACE DENOTED IN DIFFERENT SHAPE SHOWING THE NON-LINEARLY AND LINEARLY SEPARABLE CASE .......... 7

FIGURE 3-1: CLUSTERING .......... 12

FIGURE 3-2: MAPPING OF KERNELS TO A HIGHER DIMENSIONAL SPACE .......... 15

FIGURE 3-3: AN IMAGE WITH SIX CLASSES IDENTIFIED ALONG WITH THE GENERATED FRACTIONAL IMAGES .......... 23

FIGURE 3-4: GEOGRAPHICAL LOCATION OF STUDY AREA .......... 26

FIGURE 3-5: LISS-IV (RESOURCESAT-2) IMAGE OF SITARGANJ'S TEHSIL WITH CLASSES (A) AGRICULTURAL FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATIONS (D) DRY AGRICULTURAL FIELD (E) WATER .......... 26

FIGURE 3-6: METHODOLOGY ADOPTED .......... 28

FIGURE 4-1: VARIATION IN ENTROPY WITH RESPECT TO WEIGHT CONSTANT 𝑚 FOR GAUSSIAN KERNEL USING EUCLIDEAN NORM (RESOURCESAT-1 AWIFS) .......... 32

FIGURE 4-2: VARIATION IN MEAN MEMBERSHIP DIFFERENCE WITH RESPECT TO WEIGHT CONSTANT 𝑚 FOR GAUSSIAN KERNEL USING EUCLIDEAN NORM (RESOURCESAT-1 AWIFS) .......... 33

FIGURE 4-3: ESTIMATION OF WEIGHT GIVEN TO EACH KERNEL (𝜆) USING (A) ENTROPY AND (B) MEAN MEMBERSHIP DIFFERENCE PLOT FOR GAUSSIAN-SPECTRAL KERNEL FROM AWIFS (RESOURCESAT-1) .......... 34

FIGURE 4-4: MISCLASSIFIED OUTPUTS FOR GAUSSIAN-SPECTRAL RESOURCESAT-1 AWIFS FOR 𝑚=1.04 AND 𝜆=0.80 FOR (A) AGRICULTURAL FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATIONS (D) DRY AGRICULTURAL FIELD WITHOUT CROP (E) MOIST AGRICULTURAL FIELD WITHOUT CROP (F) WATER .......... 34

FIGURE 4-5: FRACTIONAL IMAGES GENERATED FOR OPTIMIZED 𝑚 VALUES FOR FCM CLASSIFIER FOR (1) LISS-IV, (2) LISS-III AND (3) AWIFS (RESOURCESAT-1) IMAGES WITH IDENTIFIED CLASSES (A) AGRICULTURAL FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATION (D) DRY AGRICULTURAL FIELD WITHOUT CROP (E) MOIST AGRICULTURAL FIELD WITHOUT CROP AND (F) WATER .......... 36

FIGURE 4-6: FRACTIONAL IMAGES GENERATED FOR OPTIMIZED 𝑚 VALUES FOR FCM CLASSIFIER OF (1) LISS-IV, (2) LISS-III AND (3) AWIFS (RESOURCESAT-2) IMAGES WITH IDENTIFIED CLASSES (A) AGRICULTURAL FIELD WITH CROP (B) EUCALYPTUS PLANTATION (C) FALLOW LAND (D) SAL FOREST (E) WATER .......... 37

FIGURE 4-7: GENERATED FRACTIONAL IMAGES FOR OPTIMIZED 𝑚 VALUES FOR RESOURCESAT-1 LISS-IV FOR (I) LINEAR (II) POLYNOMIAL (III) SIGMOID (IV) GAUSSIAN KERNEL USING EUCLIDEAN NORM (V) RADIAL BASIS (VI) KMOD (VII) INVERSE MULTIQUADRATIC AND (VIII) SPECTRAL ANGLE KERNELS FOR CLASSES IDENTIFIED AS (A) AGRICULTURAL FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATIONS (D) DRY AGRICULTURAL FIELD WITHOUT CROP (E) MOIST AGRICULTURAL FIELD WITHOUT CROP AND (F) WATER .......... 40

FIGURE 4-8: GENERATED FRACTIONAL IMAGES FOR OPTIMIZED 𝑚 VALUES FOR RESOURCESAT-2 LISS-IV FOR (I) LINEAR (II) POLYNOMIAL (III) SIGMOID (IV) GAUSSIAN KERNEL USING EUCLIDEAN NORM (V) RADIAL BASIS (VI) KMOD (VII) INVERSE MULTIQUADRATIC AND (VIII) SPECTRAL ANGLE KERNELS FOR CLASSES IDENTIFIED AS (A) AGRICULTURAL FIELD WITH CROP (B) EUCALYPTUS PLANTATION (C) FALLOW LAND (D) SAL FOREST (E) WATER .......... 42

FIGURE 4-9: GENERATED FRACTIONAL IMAGES FOR OPTIMIZED 𝑚 VALUES OF RESOURCESAT-1 LISS-IV FOR (I) GAUSSIAN-SPECTRAL (II) IM-SPECTRAL (III) GAUSSIAN-LINEAR (IV) IM-LINEAR (V) LINEAR-SPECTRAL FOR CLASSES IDENTIFIED AS (A) AGRICULTURAL FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATION (D) DRY AGRICULTURAL FIELD WITHOUT CROP (E) MOIST AGRICULTURAL FIELD WITHOUT CROP (F) WATER .......... 47

FIGURE 4-10: GRAPHICAL REPRESENTATION OF AVERAGE USER'S ACCURACY FOR UNTRAINED AND TRAINED CASE FOR IM AND FCM RESOURCESAT-1 (A) AWIFS (B) LISS-III AT OPTIMIZED 𝑚 FOR RESOURCESAT-1 .......... 51

FIGURE 6-1: NON-LINEARITY IN DIFFERENT CLASSES AS 2D SCATTERPLOT FOR RESOURCESAT-1 LISS-IV IMAGE IN (A) BAND 1-BAND 2 (B) BAND 2-BAND 3 (C) BAND 1-BAND 3 FOR CLASSES IDENTIFIED .......... 56

FIGURE A-1: GENERATED FRACTIONAL IMAGES FOR BEST SINGLE KERNELS FROM LISS-III (RESOURCESAT-1) FOR (I) LINEAR (II) INVERSE MULTIQUADRATIC (III) SPECTRAL ANGLE KERNEL FOR CLASSES IDENTIFIED AS (A) AGRICULTURE FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATION (D) DRY AGRICULTURE FIELD WITH CROP (E) MOIST AGRICULTURE FIELD WITH CROP (F) WATER .......... 64

FIGURE A-2: GENERATED FRACTIONAL IMAGES FOR BEST SINGLE KERNELS FROM LISS-III (RESOURCESAT-2) FOR (I) LINEAR (II) GAUSSIAN KERNEL USING EUCLIDEAN NORM (III) SPECTRAL ANGLE KERNEL FOR CLASSES IDENTIFIED AS (A) AGRICULTURE FIELD WITH CROP (B) EUCALYPTUS PLANTATION (C) FALLOW LAND (D) SAL FOREST (E) WATER .......... 65

FIGURE A-3: GENERATED FRACTIONAL IMAGES FOR BEST SINGLE KERNELS FROM AWIFS (RESOURCESAT-1) FOR (I) LINEAR (II) INVERSE MULTIQUADRATIC (III) SPECTRAL ANGLE KERNEL FOR CLASSES IDENTIFIED AS (A) AGRICULTURE FIELD WITH CROP (B) SAL FOREST (C) EUCALYPTUS PLANTATION (D) DRY AGRICULTURE FIELD WITH CROP (E) MOIST AGRICULTURE FIELD WITH CROP (F) WATER .......... 66

FIGURE A-4: GENERATED FRACTIONAL IMAGES FOR BEST SINGLE KERNELS FROM AWIFS (RESOURCESAT-2) FOR (I) LINEAR (II) GAUSSIAN KERNEL USING EUCLIDEAN NORM (III) SPECTRAL ANGLE KERNEL FOR CLASSES IDENTIFIED AS (A) AGRICULTURE FIELD WITH CROP (B) EUCALYPTUS PLANTATION (C) FALLOW LAND (D) SAL FOREST (E) WATER .......... 67

FIGURE A-5: VARIATION IN ENTROPY (𝐸) AND MEAN MEMBERSHIP DIFFERENCE AGAINST THE WEIGHT CONSTANT (𝑚) FOR RESOURCESAT-1 AWIFS FOR (I) FCM (II) LINEAR (III) POLYNOMIAL (IV) SIGMOID (V) GAUSSIAN KERNEL WITH EUCLIDEAN NORM (VI) RADIAL BASIS (VII) KMOD (VIII) IM (IX) SPECTRAL ANGLE .......... 68

FIGURE A-6: VARIATION IN ENTROPY (𝐸) AND MEAN MEMBERSHIP DIFFERENCE AGAINST THE WEIGHT CONSTANT (𝑚) FOR GAUSSIAN-SPECTRAL ANGLE KERNEL FOR (A) RESOURCESAT-2 AWIFS (B) RESOURCESAT-1 LISS-III AND (C) RESOURCESAT-2 LISS-III .......... 70

FIGURE A-7: VARIATION IN ENTROPY (𝐸) AND MEAN MEMBERSHIP DIFFERENCE AGAINST THE WEIGHT CONSTANT (𝑚) FOR IM-SPECTRAL ANGLE KERNEL FOR (A) RESOURCESAT-2 AWIFS (B) RESOURCESAT-1 LISS-III AND (C) RESOURCESAT-2 LISS-III .......... 71


LIST OF TABLES

TABLE 3-1: RESOURCESAT-1 AND RESOURCESAT-2 SENSOR SPECIFICATION .......... 25

TABLE 4-1: CLASSES IDENTIFIED AT SITARGANJ'S TEHSIL IN AWIFS, LISS-III AND LISS-IV SENSORS FOR RESOURCESAT-1 AND RESOURCESAT-2 .......... 32

TABLE 4-2: ESTIMATED OPTIMIZED 𝑚 VALUES FOR FCM CLASSIFIER ALONG WITH THE CALCULATED MEAN MEMBERSHIP DIFFERENCE (𝜆) AND ENTROPY .......... 35

TABLE 4-3: OPTIMIZED 𝑚 VALUES FOR LOCAL, GLOBAL AND SPECTRAL ANGLE KERNELS FOR AWIFS, LISS-III AND LISS-IV IMAGES (RESOURCESAT-1) ALONG WITH THE CALCULATED MEAN MEMBERSHIP DIFFERENCE (Δ) AND ENTROPY (E) .......... 39

TABLE 4-4: OPTIMIZED 𝑚 VALUES FOR LOCAL, GLOBAL AND SPECTRAL ANGLE KERNELS FOR AWIFS, LISS-III AND LISS-IV IMAGES (RESOURCESAT-2) ALONG WITH THE CALCULATED MEAN MEMBERSHIP DIFFERENCE (Δ) AND ENTROPY (E) .......... 39

TABLE 4-5: MAXIMUM MEAN MEMBERSHIP DIFFERENCE VALUES ESTIMATED FOR OPTIMIZED VALUES OF 𝑚 (RESOURCESAT-1 AWIFS) .......... 44

TABLE 4-6: OPTIMIZED 𝑚 VALUES FOR COMPOSITE KERNELS FOR AWIFS, LISS-III AND LISS-IV IMAGES (RESOURCESAT-1) ALONG WITH THE CALCULATED MEAN MEMBERSHIP DIFFERENCE (Δ), ENTROPY (E) AND WEIGHT GIVEN TO EACH KERNEL (𝜆) .......... 45

TABLE 4-7: OPTIMIZED 𝑚 VALUES FOR COMPOSITE KERNELS FOR AWIFS, LISS-III AND LISS-IV IMAGES (RESOURCESAT-1) ALONG WITH THE CALCULATED MEAN MEMBERSHIP DIFFERENCE (Δ), ENTROPY (E) AND WEIGHT GIVEN TO EACH KERNEL (𝜆) .......... 46

TABLE 4-8: ACCURACY ASSESSMENT RESULTS FOR FCM, BEST SINGLE KERNEL AND BEST COMPOSITE KERNELS .......... 48

TABLE 4-9: COMPARISON OF ACCURACY ASSESSMENT IN TRAINED AS WELL AS UNTRAINED CASE FOR IM KERNEL AND FCM FOR AWIFS WITH LISS-III IMAGE (RESOURCESAT-1) .......... 49

TABLE 4-10: COMPARISON OF ACCURACY ASSESSMENT IN TRAINED AS WELL AS UNTRAINED CASE FOR IM KERNEL AND FCM FOR LISS-III WITH LISS-IV IMAGE (RESOURCESAT-1) .......... 50

TABLE 4-11: COMPARISON OF ACCURACY ASSESSMENT IN TRAINED AS WELL AS UNTRAINED CASE FOR GAUSSIAN KERNEL USING EUCLIDEAN NORM AND FCM FOR AWIFS WITH LISS-III IMAGE (RESOURCESAT-2) .......... 50

TABLE 4-12: COMPARISON OF ACCURACY ASSESSMENT IN TRAINED AS WELL AS UNTRAINED CASE FOR GAUSSIAN KERNEL USING EUCLIDEAN NORM AND FCM FOR LISS-III WITH LISS-IV IMAGE (RESOURCESAT-2) .......... 51

TABLE B-1: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 72

TABLE B-2: ACCURACY ASSESSMENT RESULTS FOR INVERSE MULTIQUADRATIC KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 72

TABLE B-3: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 73

TABLE B-4: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 73

TABLE B-5: ACCURACY ASSESSMENT RESULTS FOR IM-SPECTRAL KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 74

TABLE B-6: ACCURACY ASSESSMENT RESULTS FOR IM-SPECTRAL KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 74

TABLE B-7: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 75

TABLE B-8: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 75

TABLE B-9: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 76

TABLE B-10: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 76

TABLE B-11: ACCURACY ASSESSMENT RESULTS FOR IM-SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 77

TABLE B-12: ACCURACY ASSESSMENT RESULTS FOR LINEAR-SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 77

TABLE B-13: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 78

TABLE B-14: ACCURACY ASSESSMENT RESULTS FOR IM KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 78

TABLE B-15: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 79

TABLE B-16: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 79

TABLE B-17: ACCURACY ASSESSMENT RESULTS FOR IM-SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 80

TABLE B-18: ACCURACY ASSESSMENT RESULTS FOR LINEAR-SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 80

TABLE B-19: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 81

TABLE B-20: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 81

TABLE B-21: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 82

TABLE B-22: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 82

TABLE B-23: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL-GAUSSIAN KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 83

TABLE B-24: ACCURACY ASSESSMENT RESULTS FOR LINEAR-SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 83

TABLE B-25: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 84

TABLE B-26: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 84

TABLE B-27: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 85

TABLE B-28: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 85

TABLE B-29: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN-SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 86

TABLE B-30: ACCURACY ASSESSMENT RESULTS FOR LINEAR-SPECTRAL ANGLE KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 86

TABLE B-31: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 87

TABLE B-32: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 87

TABLE B-33: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 88

TABLE B-34: ACCURACY ASSESSMENT RESULTS FOR SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 88

TABLE B-35: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN-SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 89

TABLE B-36: ACCURACY ASSESSMENT RESULTS FOR LINEAR-SPECTRAL ANGLE KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 89

TABLE B-37: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 90

TABLE B-38: ACCURACY ASSESSMENT RESULTS FOR IM KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 90

TABLE B-39: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-III (RESOURCESAT-1) REFERENCE DATA .......... 91

TABLE B-40: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 91

TABLE B-41: ACCURACY ASSESSMENT RESULTS FOR IM KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 92

TABLE B-42: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 92

TABLE B-43: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 93

TABLE B-44: ACCURACY ASSESSMENT RESULTS FOR IM KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 93

TABLE B-45: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED LISS-III (RESOURCESAT-1) AGAINST LISS-IV (RESOURCESAT-1) REFERENCE DATA .......... 94

TABLE B-46: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 94

TABLE B-47: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 95

TABLE B-48: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-III (RESOURCESAT-2) REFERENCE DATA .......... 95

TABLE B-49: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 96

TABLE B-50: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 96

TABLE B-51: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED AWIFS (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 97

TABLE B-52: ACCURACY ASSESSMENT RESULTS FOR FCM CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 97

TABLE B-53: ACCURACY ASSESSMENT RESULTS FOR GAUSSIAN KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 98

TABLE B-54: ACCURACY ASSESSMENT RESULTS FOR LINEAR KERNEL CLASSIFIED LISS-III (RESOURCESAT-2) AGAINST LISS-IV (RESOURCESAT-2) REFERENCE DATA .......... 98


NON LINEAR SEPARATION OF CLASSES USING A KERNEL BASED FUZZY C-MEANS APPROACH


1. INTRODUCTION

Remote Sensing techniques have been widely used to obtain useful information for the detection and discrimination of Earth surface cover. Digital images acquired by various sensors are used for a wide range of applications, such as disaster management, natural resource monitoring, urban planning, land use/land cover (LULC) mapping and many others. For regional or global level LULC mapping, these digital images have become an effective source of information. Interpreting raw digital images acquired from various sensors by human interpretation alone, however, results in lower quantitative accuracy. Higher accuracy can be achieved by processing a digital image with computers (Richards and Jia, 2005).

Lillesand and Kiefer (1979) described digital image classification as a quantitative technique to classify image data into various categories. The results of image classification are summarized in the form of thematic maps by assigning class labels to each pixel in the image. These thematic maps are in turn used for mapping surface cover information, e.g. for conservation and development purposes. Supervised and unsupervised image classification are the two broad categories of classification procedures (Campbell, 1996).

In unsupervised classification a sample point is assigned to a cluster based on the similarity in spectral values of a pixel. Over time, the spectral properties of a class change, and at times the procedure identifies samples that do not correspond to that particular class; these are the major limitations of this technique (Campbell, 1996). In supervised classification the analyst has control over assigning informational classes based on field data. Several statistical classification algorithms have been developed, such as the k-means classifier, the minimum distance to mean classifier and the maximum likelihood classifier (Tso and Mather, 2000). All these classifiers share the single objective of improving classification accuracy.

Traditional classification techniques allocate each pixel to a single land cover class, resulting in a hard (or 'crisp') partitioning (Zhang and Foody, 1998). Hard classification techniques assume that a single pixel in the image accounts for a uniform land cover class on the ground corresponding to the pixel size. In reality, however, it is rarely the case that such a pixel on the ground corresponds to a single and uniform land use class. Regional or global level studies use coarse resolution remote sensing images that are dominated by mixed pixels, where several land cover types or information classes are contained in a single pixel. Conventional image classification techniques assign these mixed pixels to a single class, thus introducing error into the classified image and reducing classification accuracy. The main reasons for the presence of mixed pixels are the following (Zhang and Foody, 1998; Chawla, 2010):

- A coarse spatial resolution of a sensor results in several classes being included in a particular pixel. This produces a composite spectral response which may differ from each of its component classes.

- With time, land cover classes degrade from one class to another; for example, water can change to moist grassland, or agricultural crops are harvested with the seasonal change. As a result these land cover classes are mixed.

- The value of a pixel recorded by the sensor may differ for highly similar entities and be similar for different entities, since pixel values change with the interaction of the electromagnetic waves with the atmosphere or objects.
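The first cause, the composite spectral response of a mixed pixel, can be sketched with a toy linear-mixing example; the per-band reflectances and class fractions below are invented for illustration only.

```python
import numpy as np

# Hypothetical per-band reflectances of two pure classes (invented values).
water = np.array([0.05, 0.03, 0.02])
vegetation = np.array([0.04, 0.10, 0.40])

# Sub-pixel area fractions of each class inside one coarse pixel.
fractions = {"water": 0.3, "vegetation": 0.7}

# Linear mixing: the pixel's composite response is the area-weighted sum,
# and it differs from the response of either component class.
mixed = fractions["water"] * water + fractions["vegetation"] * vegetation
print(mixed)
```

Under this simple model, the recorded response lies between the pure-class responses and matches neither, which is what makes a one-pixel-one-class assignment error-prone.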

The presence of these mixed pixels reduces classification accuracy in large proportion. Bezdek et al. (1984) introduced Fuzzy c-Means (FCM), building on the idea of fuzzy sets put forward by Zadeh (1965), to address the mixed pixel problem. Zadeh's idea was to assign a particular sample or pixel to more than one cluster with the help of a membership grade varying between 0 and 1; a grade close to 1 indicates a high possibility that the sample belongs to that particular cluster, and vice versa (Bezdek et al., 1984). Fuzzy or soft classification techniques increase the accuracy of classification results for coarser resolution images. FCM is an alternative to the c-means clustering algorithm for pattern recognition; it is a flexible approach as it assigns sample points to more than one cluster, but it performs well only for spherical clusters (Suganya and Shanthi, 2012).
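The membership-grade idea can be made concrete with a toy example; the class names and grade values below are invented:

```python
# Hypothetical fuzzy membership grades of one pixel over three clusters.
memberships = {"water": 0.70, "grassland": 0.25, "forest": 0.05}

# Grades lie in [0, 1]; under FCM they also sum to one across clusters.
assert all(0.0 <= g <= 1.0 for g in memberships.values())
assert abs(sum(memberships.values()) - 1.0) < 1e-9

# A grade close to 1 indicates a high possibility that the pixel
# belongs to that cluster.
dominant = max(memberships, key=memberships.get)
print(dominant)
```

Hardening this pixel would keep only "water" and discard the 30% of sub-pixel information carried by the other grades.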

Krishnapuram and Keller (1996) introduced the Possibilistic c-Means (PCM) classifier, an improvement over FCM in that PCM is more robust to noise. Linearly separable classes are the simplest case in image classification; in pattern analysis, however, certain samples appear non-linear in nature due to redundancy in spectral values. A recent development was to use kernel methods within FCM to implement a non-linear version of the algorithm: the Kernel based Fuzzy c-Means (KFCM) classifier was developed in order to classify non-linear data. In KFCM, sample data that appear non-linear in the input space are mapped to a higher dimensional feature space where the sample points are considered to be linearly separable (Yang et al., 2007); in the original input space these computations would become complex and computationally expensive. Mercer kernel based clustering introduced the kernel functions of Support Vector Machine classification of non-linear data to calculate the number of clusters within the data and to perform clustering in the feature space (Girolami, 2002).
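The mapping idea can be illustrated with a degree-2 homogeneous polynomial kernel, a textbook case where the feature map can be written out explicitly; this particular kernel form is chosen for illustration and is not taken from the thesis:

```python
import numpy as np

def poly2_kernel(x, y):
    """Homogeneous polynomial kernel of degree 2: K(x, y) = (x . y)**2."""
    return float(np.dot(x, y)) ** 2

def phi(x):
    """Explicit degree-2 feature map for a 2-D input vector."""
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel evaluates the inner product in the higher dimensional feature
# space without ever constructing phi(x) explicitly (the "kernel trick").
assert np.isclose(poly2_kernel(x, y), np.dot(phi(x), phi(y)))
print(poly2_kernel(x, y))
```

This is why kernel methods avoid the expensive computations in the mapped space: only kernel evaluations in the input space are ever needed.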

Because of the ability of KFCM methods to cluster more shapes in the input dataset, their classification accuracies are much higher compared to FCM (Yang et al., 2007). Different types of kernels, such as positive definite kernels and stationary kernels, are discussed in Ben-Hur et al. (2001). A total of eight kernel functions were considered in this study, categorized as local, global or spectral kernels. The four local kernels considered are: the Gaussian kernel using the Euclidean norm, the radial basis kernel, the inverse multiquadratic kernel and the kernel with moderate decrease with distance (KMOD). The three global kernels are: the linear kernel, the polynomial kernel and the sigmoid kernel (Kumar, 2007).
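A few of the single kernels named above can be sketched as plain functions. The parameter values below are illustrative defaults, not the values tuned in this study, and the radial basis and KMOD kernels are omitted:

```python
import numpy as np

def gaussian(x, y, sigma=1.0):
    """Local kernel: Gaussian with the Euclidean norm."""
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

def inverse_multiquadratic(x, y, c=1.0):
    """Local kernel: inverse multiquadratic."""
    return float(1.0 / np.sqrt(np.sum((x - y) ** 2) + c ** 2))

def linear(x, y):
    """Global kernel: plain inner product."""
    return float(np.dot(x, y))

def polynomial(x, y, p=2):
    """Global kernel: inhomogeneous polynomial of degree p."""
    return (float(np.dot(x, y)) + 1.0) ** p

def sigmoid(x, y, alpha=0.5, beta=0.0):
    """Global kernel: hyperbolic tangent (sigmoid)."""
    return float(np.tanh(alpha * np.dot(x, y) + beta))

x, y = np.array([0.2, 0.4]), np.array([0.1, 0.5])
print(gaussian(x, y), linear(x, y), polynomial(x, y))
```

Local kernels respond to the distance between the two samples, whereas global kernels respond to their inner product, which is the distinction the categorization above rests on.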


Two single kernels can be combined to form a composite kernel, which gives a better classification than a single kernel (Camps-Valls et al., 2006). Different combinations of single kernels can be adopted to inherit both the spectral and spatial properties of the single kernels (Camps-Valls et al., 2006). Kernel based clustering is more robust to noise and outliers and also tolerates unequal sized clusters, which is a major drawback of the FCM algorithm (Zhang and Chen, 2003).
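A weighted-summation composite can be sketched as follows; the two component kernels and the weight lam are illustrative choices, relying on the fact that a convex combination of valid Mercer kernels is itself a valid kernel:

```python
import numpy as np

def gaussian(x, y, sigma=1.0):
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

def linear(x, y):
    return float(np.dot(x, y))

def composite(x, y, lam=0.6):
    """Weighted summation of two single kernels, with 0 <= lam <= 1.
    A convex combination of valid (Mercer) kernels is itself valid."""
    return lam * gaussian(x, y) + (1.0 - lam) * linear(x, y)

x, y = np.array([0.3, 0.7]), np.array([0.4, 0.6])
print(composite(x, y))
```

Varying lam trades off the influence of the two component kernels, which is why the weight given to each kernel affects the overall accuracy of the mixture.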

Many methods have been put forward to assess the accuracy of soft classified outputs (Binaghi et al., 1999; Congalton, 1991; Zhang and Foody, 1998). The traditional error matrix cannot be used because it follows a one-pixel-one-class approach. Binaghi et al. (1999) introduced the Fuzzy Error Matrix (FERM) to assess the accuracy of soft classified results; even though it is appealing, it is not considered a standard method. In the current research work, the classified results of the single kernels and the composite kernels are compared based on the minimum entropy and maximum mean membership difference criteria, and the best among them is chosen.

1.1. MOTIVATION AND PROBLEM STATEMENT

Coarse resolution remote sensing images are used for mapping purposes at the regional or global level. These images may contain mixed pixels as well as non-linearity in the data, resulting in an incorrectly classified image. Soft classification methods have been found superior to hard classification in the presence of mixed pixels. Fuzzy classifiers are able to handle mixed pixels, whereas kernels are used to handle non-linear data. In the light of the properties of both FCM and kernels, the current research was proposed to study the behaviour of different kernels using a Kernel based Fuzzy c-Means (KFCM) classifier. A comparative approach was taken to analyse the performance of single and composite kernels, where the combination of the best single kernels is taken as the composite kernel.

1.2. RESEARCH OBJECTIVE:

The main objective of this research work is to optimally separate non-linear classes using a Kernel based Fuzzy c-Means approach. The specific objectives are:

- To develop an objective function for the Kernel based Fuzzy c-Means (KFCM) classifier to handle non-linear class separation.

- To select the best single or composite kernel to be used within the KFCM classifier.

- To evaluate the performance of this classifier in the case of untrained classes.

- To study the best kernel model with the best possible parameter.

Finally, the soft outputs of the KFCM classifier were studied using an image to image accuracy assessment approach.


1.3. RESEARCH QUESTIONS

The following research questions are formulated from the research objectives:

1. How can non-linearity within a class boundary in feature space be handled effectively using KFCM?

2. How can mixed pixels be handled using KFCM?

3. How can the performance of single/composite kernels be evaluated using KFCM?

4. To what degree is the FCM classification algorithm capable of handling non-linear feature vectors of different classes for classification?

5. What is the effect of using composite kernels in KFCM as compared to single kernels?

1.4. INNOVATION AIMED AT

The Kernel based Fuzzy c-Means approach is studied with eight single kernels, and the best among them are selected to form the best composite kernel. In this research work, a comparative analysis is made between FCM and KFCM performance by optimizing the values of the different parameters used in these algorithms.

1.5. THESIS STRUCTURE

The thesis presents the work done for this research in six chapters. The first chapter gives a brief introduction to this research, the objectives to be accomplished and the research questions formulated from the research objectives. The second chapter describes previous work related to this research. The third chapter explains the concepts and formulas used, along with the study area and the methodology adopted. The fourth chapter presents the results obtained from the classifier so developed. The fifth chapter discusses the results obtained along with the accuracy assessment results. Finally, the sixth chapter concludes the research with the answers to the research questions and the possibilities for further research work.


2. LITERATURE REVIEW

2.1. LAND COVER CLASSIFICATION METHODS:

The literature on Land Use/Land Cover (LULC) information is exceedingly broad. Multi-spectral image classification techniques are used for information extraction in various environmental studies. Image classification approaches can be categorized as supervised or unsupervised, crisp or fuzzy, and parametric or non-parametric (Lu and Weng, 2007). As the spectral properties of information classes change over time and some spectral properties do not correspond to information classes, unsupervised classification is not considered as advantageous as the supervised approach (Campbell, 1996).

The k-means algorithm and fuzzy clustering constituted the unsupervised classification methods (Tso and Mather, 2000). Supervised classification algorithms such as the Maximum Likelihood (ML) classifier and the Minimum Distance to mean classifier were introduced, taking a one-pixel-one-class approach (Choodarathnakara et al., 2012a). In the parallelepiped method, a parallelepiped-like subspace is defined for each class. Even though this method is easy to implement, errors occur in two cases: 1) when a pixel lies in more than one parallelepiped, and 2) when a pixel lies outside all parallelepipeds (Tso and Mather, 2000). The ML and the Minimum Distance to mean classifiers are based on the evaluation of various spectral response patterns when classifying an unknown pixel. The Minimum Distance to mean classifier is one of the simplest classification approaches, but it is insensitive to different degrees of variance in the spectral response data (Lillesand and Kiefer, 1979). The ML classifier is a statistical method that quantitatively evaluates both the covariance and the correlation of spectral response patterns when an unknown pixel is classified; its major drawback is the large number of computations required to classify each pixel (Lillesand and Kiefer, 1979). The ML classifier also performs poorly in the presence of mixed pixels because of the difficulty of differentiating between features with similar spectra (e.g. forest and grassland) (Tan et al., 2011).

For regional or global level studies, coarse resolution images that contain mixed pixels are widely used. Conventional 'crisp' classification algorithms are incapable of mapping sub-pixel level information (Settle and Drake, 1993). Statistical or traditional image classifiers such as the ML classifier do not take the presence of mixed pixels into account, resulting in low classification accuracy (Kavzoglu and Reis, 2008). A fuzzy classifier addresses this problem by allowing a pixel to be assigned to more than one land cover class. Advanced soft image classification techniques such as Artificial Neural Networks (ANN), Genetic Algorithms (GA) and decision tree classifiers are trending research areas. A comparative study between the ML classifier and an Artificial Neural Network (ANN) by Kavzoglu and Reis (2008) showed that the ANN provides better classification accuracy than the ML classifier. Due to spectral similarity and


superposition of the spectral regions of several classes, the ML algorithm, which relies on statistical estimates, wrongly identified many pixels in the resulting image.

2.2. FUZZY c-MEANS (FCM)

The introduction of fuzzy logic gave way to the Fuzzy c-Means (FCM) clustering technique. FCM permits a sample data point to belong to several clusters. Zadeh (1965) introduced fuzzy sets, where each sample data point is assigned to a cluster based on a membership grade (degree of sharing) ranging between zero and unity. Fuzzy logic is an effective method in image classification, and collateral data can also be classified well (Choodarathnakara et al., 2012b). In FCM, a sample data point is assigned to a cluster based on high intra-cluster resemblance (Bezdek et al., 1984). The FCM algorithm used by Bezdek et al. (1984) is unsupervised in nature. When information about the classes of interest is known a priori, supervised image classification techniques are most widely used (Campbell, 1996).

Wang (1990) introduced fuzzy supervised classification of remote sensing images with higher classification accuracy. Supervised FCM classification was used for the estimation and mapping of sub-pixel land cover composition (Foody, 2000; Atkinson et al., 1997). In FCM, the proportions of the land cover types are reflected in the fuzzy membership values (Fisher and Pathirana, 1990). Earlier fuzzy approaches dealt with fuzziness only in the class allocation stage but not in the testing or training stages. A classification approach which accommodates fuzziness in the allocation, training and testing stages is considered fully-fuzzy, whereas a fuzzy approach which includes fuzziness only at the allocation stage is termed partially-fuzzy (Zhang and Foody, 1998). Zhang and Foody (2002) showed an improvement (from 6.6% to 5.0%) when a fully-fuzzy supervised classification was used rather than a partially-fuzzy classification. FCM generates memberships that represent a degree of sharing but not a degree of typicality (Krishnapuram and Keller, 1996), and it shows poor performance in the presence of noise and outliers.

FCM is one of the most popular techniques used in the field of medical image segmentation. Based upon the concept of data compression, an improved FCM (IFCM) was introduced in which the dimensionality of the input data was reduced with a change in the cluster and membership value criterion (Hemanth et al., 2009). Vinushree et al. (2014) pointed out that FCM is effective only in clustering crisp, spherical and non-overlapping data. Suganya and Shanthi (2012) concluded that the algorithm performs well in the case of spherical clusters, is sensitive to noise and expects a low degree of membership for outliers. The data in an image exhibit different patterns that may or may not be clearly visible. Pattern analysis refers to a class of machine learning algorithms that classify data based on the properties of different patterns. Linearly separable classes are the simplest case that can appear in the pattern of a dataset (Isaacs et al., 2007). If the data appear to be non-linearly separable, the classification will be computationally intricate in the


original input space. Many kernel based methods for separating these non-linear data were introduced in recent years (Girolami, 2002; Camps-Valls and Bruzzone, 2009). These methods map the input data to a higher dimensional space where the data turn out to be linearly separable (Awan and Sap, 2005). Kernel based algorithms were mostly used in Support Vector Machines (SVM), a statistical learning approach which uses kernels for remote sensing classification (Pal, 2009). Kernel methods are used in a wide range of applications: Huang and Zhu (2006) proposed a non-linear feature extraction algorithm for speech recognition, and kernels have also shown their importance in classification, face recognition, speech recognition and many other fields. KFCM was introduced for classifying non-linearly separable data and also deals with the drawbacks of fuzzy clustering. Ravindraiah and Tejaswini (2013) studied the hierarchical evolution of different types of fuzzy clustering techniques for image segmentation.

2.3. KERNELS

Kernels are machine learning methods for pattern analysis, introduced for SVM clustering, which can generate cluster boundaries of arbitrary shape (Ben-Hur et al., 2001). A kernel function maps sample data from the initial sample space into a higher dimensional space where the sample data are linearly separable, and it allows interpreting the data in feature space. When transforming data to a higher dimension, it should be ensured that the non-linear transformation does not introduce structure into the inherent data (Girolami, 2002). Kernel methods have been adopted for unsupervised learning, where they suit hyper-spherical or hyper-ellipsoidal clusters. From a statistical perspective there are different classes of kernels: positive-definite kernels, stationary kernels, locally stationary kernels, non-stationary kernels and reducible kernels (Genton, 2001).

Zhang and Chen (2002) introduced fuzzy clustering using kernel methods where both spherical and

overlapping datasets were used to evaluate the performance of KFCM and FCM. Spectral and kernel clustering were found to have a unifying theory, where the adjacency between patterns in spectral methods is analogous to the kernel function (Filippone et al., 2008).

Figure 2-1: Two clusters in input space, denoted by different shapes, showing the non-linearly and the linearly separable case.

KFCM can be divided into two

categories: 1) a prototype of the FCM algorithm resides in feature space and is implicitly mapped to the kernel space by means of a kernel function, or 2) prototypes are directly constructed in the kernel space, which allows more freedom for prototypes in the feature space (Huang et al., 2012). Graves and Pedrycz (2007) evaluated the performance of kernel based fuzzy clustering and concluded that the performance with kernels was better but required fine tuning of the parameters. These methods are well suited for clustering ring-shaped datasets and similar structures such as square datasets.

Kim et al. (2001) used Kernel Principal Component Analysis (KPCA) with a polynomial kernel for texture classification and showed that kernel PCA gave an overall good performance. Many other uses of kernels in the field of image classification were also introduced (Camps-Valls et al., 2004). Camps-Valls and Bruzzone (2005) used linear, polynomial and radial basis function kernels for hyperspectral image classification; the polynomial kernel showed an overall good performance and was robust to common levels of noise. Bhatt and Mishra (2013, 2014) used local kernels, i.e. the KMOD and the inverse multiquadratic kernel, as well as global kernels, i.e. the linear, polynomial and sigmoid kernels, to classify water and vegetation. Huang et al. (2011) introduced a weighting matrix into the Radial Basis Function (RBF) kernel to weigh the training samples according to their information significance.

Composite kernels sum up the spectral and textural information in the input image into the classified output, and they gave excellent performance results compared to a single kernel (Camps-Valls et al., 2006). Kernels can be combined based on the stacked approach, direct summation, weighted summation and the cross-information kernel. Different types of kernels were used for multi-temporal classification of remote sensing images and have been used for change detection, tackling real world problems such as urban monitoring (Camps-Valls et al., 2008); composite kernels gave the best results in the case of urban monitoring. The overall accuracy of various mixtures of kernel functions varies with the weight given to each kernel in SVM (Kumar et al., 2006), and the best combination of kernels changes with the datasets used. KFCM algorithms have been used to achieve optimization of clustering and classification (KOCC) simultaneously.

Even though KFCM outperforms FCM, the clustering also depends on the densities and shapes of the datasets used (Tsai and Lin, 2011). The computational load of KFCM is very high if the total number of data points is large, especially if these methods are used for image segmentation. KFCM can partition datasets only up to quadratic functions, and higher polynomial functions are still a research area. The Kernel with Moderate Decrease of spatial distance (KMOD) preserves the whole data closeness information while still penalizing the far neighbourhood in the case of sparse data (Ayat et al., 2001). Such kernels are more reliable than the others; KMOD gives the best results in separating patterns when compared to the RBF or polynomial kernels.


2.4. ACCURACY ASSESSMENT

Accuracy assessment of remote sensing data provides end users with a measure of confidence in the quality of the product. Many approaches have been introduced to assess classification accuracy. Congalton (1991) introduced the error matrix (also called the confusion matrix or contingency table), a square array of numbers set out in rows and columns which expresses the number of sample units assigned to a particular category relative to the actual category as verified on the ground. The error matrix not only represents accuracy in tabular form but also yields the overall accuracy and the user's and producer's accuracies (Congalton, 1991). The confusion matrix simply tells how well the classifier can classify the training area and nothing more (Lillesand and Kiefer, 1979).

The Kappa statistic is considered a fundamental measure of accuracy (Smits et al., 1999). This statistic serves as an indicator of the extent to which the percentage correct values of an error matrix are due to true agreement rather than chance agreement (Lillesand and Kiefer, 1979). These measures, however, are used to assess hard classification results; due to the presence of sub-pixel class boundaries they hardly represent the actual quality of the classified image. The need for accuracy assessment of sub-pixel classified images is shown in Latifovic and Olthof (2004). No standard assessment technique is available for assessing the accuracy of soft classification outputs (Harikumar, 2014). If no soft reference dataset is available, the output of a fuzzy classification can be hardened, which may lead to data loss (Binaghi et al., 1999; Silvan-Cardenas and Wang, 2008; Okeke and Karnieli, 2006; Harikumar, 2014).
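The hard-classification measures discussed above (overall accuracy, producer's and user's accuracy, and kappa) can be computed from an error matrix as follows; the matrix entries are invented for illustration:

```python
import numpy as np

# A small hypothetical error matrix: rows = classified class, columns =
# reference class, entries = numbers of sample units (invented).
cm = np.array([[50,  3,  2],
               [ 5, 40,  5],
               [ 2,  4, 39]])

total = cm.sum()
overall_accuracy = np.trace(cm) / total      # diagonal agreement
producers = np.diag(cm) / cm.sum(axis=0)     # per class, reference totals
users = np.diag(cm) / cm.sum(axis=1)         # per class, classified totals

# Kappa: agreement beyond what chance alone would produce.
chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
kappa = (overall_accuracy - chance) / (1.0 - chance)
print(overall_accuracy, kappa)
```

Because every sample unit counts fully in exactly one cell, this construction cannot represent a pixel shared between classes, which is the limitation the soft-assessment methods below address.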

Silvan-Cardenas and Wang (2008) discussed the basic operators used for sub-pixel classification, where the MIN operator gives the maximum sub-pixel overlap among classes, the PROD operator measures the expected class overlap between the reference and assessed sub-pixel partitions, and the LEAST operator measures the minimum possible sub-pixel overlap. If the assessed data match the reference data perfectly, the error matrix should be diagonal, which was not the case for these operators. Thus, to satisfy the property of diagonalization, composite operators such as MIN-PROD and MIN-LEAST (Pontius and Cheuk, 2006) were introduced. Based on the traditional error matrix, Binaghi et al. (1999) introduced the Fuzzy Error Matrix (FERM) for the evaluation of soft classifiers, but even this method cannot be considered a standard one. FERM provides a more accurate measure of Overall Accuracy (OA) with multi-membership grades, which proved more useful than a conventional OA based on hardened values (Binaghi et al., 1999). Silvan-Cardenas and Wang (2008) proposed a sub-pixel confusion-uncertainty matrix (SCM) for the confusion created in sub-pixel area allocation, which reports the confusion intervals in the form of a centre value plus-minus a maximum error to account for sub-pixel uncertainty.
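A minimal FERM sketch using the MIN operator, with invented soft reference and classified fractions for four pixels and two classes (illustrative only, not the thesis implementation):

```python
import numpy as np

# Invented soft fractions for 4 pixels over 2 classes (rows = pixels).
reference = np.array([[0.8, 0.2],
                      [0.5, 0.5],
                      [0.1, 0.9],
                      [0.6, 0.4]])
classified = np.array([[0.7, 0.3],
                       [0.4, 0.6],
                       [0.2, 0.8],
                       [0.6, 0.4]])

c = reference.shape[1]
ferm = np.zeros((c, c))
for i in range(c):            # classified class i
    for j in range(c):        # reference class j
        # MIN operator: maximum possible sub-pixel overlap of the grades.
        ferm[i, j] = np.minimum(classified[:, i], reference[:, j]).sum()

# Overall accuracy: diagonal agreement over the total reference membership.
overall = np.trace(ferm) / reference.sum()
print(ferm, overall)
```

Unlike the hard error matrix, each cell here accumulates fractional overlaps, so partial agreement at the sub-pixel level contributes to the accuracy figure instead of being discarded by hardening.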

The Mean Relative Error (MRE), Root Mean Square Error (RMSE) and Correlation Coefficient (CC) criteria depend on the actual and desired outputs of the classifier, and hence they are more dependent on the error in the results. Dehghan and Ghassemian (2006) proposed an entropy measure which depends on the


actual outputs of the classifier and is sensitive to uncertainty. When the ground data are fuzzy, the interpretation of entropy values is difficult; in such cases the cross-entropy value helps (Foody, 1995). With fuzzy classification outputs and fuzzy ground data, the cross-entropy results indicate the closeness in land cover composition.
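The entropy measure can be sketched as follows for a single pixel's membership vector; the grades are invented, and the logarithm base (base 2 here) is a presentational choice:

```python
import numpy as np

def entropy(memberships, eps=1e-12):
    """Shannon entropy (base 2) of a pixel's membership vector.
    Low entropy = a confident, crisp-like output; high = uncertain."""
    mu = np.asarray(memberships, dtype=float)
    return float(-(mu * np.log2(mu + eps)).sum())

confident = [0.95, 0.04, 0.01]
uncertain = [0.34, 0.33, 0.33]
print(entropy(confident), entropy(uncertain))
```

Because it is computed from the classifier's actual outputs alone, the measure needs no reference data, which is what makes it attractive when no soft reference dataset is available.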

Yun-song and Yu-feng (2010) compared the accuracy of the KFCM and FCM algorithms using an error matrix, where the classification accuracy of KFCM was 3% higher than that of FCM. In the presence of mixed pixels the FERM gives a better result than the traditional error matrix. Hence, in this research work the accuracy of the generated classified outputs has been assessed using the FERM for all the kernels, and the performance of KFCM and FCM is compared so as to determine which classifier gives the better result.


3. CLASSIFICATION APPROACHES, STUDY AREA AND METHODOLOGY

The first section discusses the various concepts and approaches used in this research work, along with the different kernels used. The second section describes the study area and the various sensors, and explains in detail the processing steps for the datasets used. The third section explains the methodology adopted to carry out this research work.

3.1. CLASSIFICATION APPROACHES AND ACCURACY ASSESSMENT

The list of symbols used in this section is as follows:

X = {x_1, x_2, …, x_n} : set of n sample points
x_i : spectral response of a pixel
Y : subset of the set X
c : number of clusters
N : number of pixels
m : weighting component
U : membership matrix of size (c × n)
μ(x) : membership grade of sample point x
μ_ij : membership value of the pixel in the ith row and jth column
V = {v_1, v_2, …, v_c} : set of cluster centres
A : weight matrix
I : identity matrix
‖·‖²_A : squared A-norm
d_ij : distance norm between a sample point and a cluster centre
K(·,·) : kernel function


3.1.1. CLUSTERING

Clustering refers to the grouping of pixels that are spectrally similar in multispectral space (Richards and Jia, 2005). Clustering partitions the data into different clusters based on similar properties (Figure 3-1). Different algorithms for clustering have been introduced, such as single pass clustering and hierarchical clustering. Clustering can also be divided into 'hard' and 'soft' clustering (Richards and Jia, 2005). In hard clustering each pixel in the input image is assigned to a single cluster, whereas in fuzzy clustering each pixel is assigned to more than one cluster with a membership grade for each class, thus showing the degree of belongingness of a particular class in a pixel (Zadeh, 1965).

Figure 3-1: Clustering

We now consider the Fuzzy c-Means (FCM) classifier, a widely used soft clustering technique introduced by Bezdek et al. (1984). FCM operates by assigning sample data to different clusters using a membership grade that varies between 0 and 1 (Bezdek et al., 1984).

3.1.2. THE FUZZY c-MEANS (FCM) CLASSIFIER

A fuzzy set is characterized by a membership function that associates each sample data point with a value in the interval [0, 1] symbolizing its membership grade. Let Y represent a set (class) in X (a space of points); then the fuzzy set Y is represented as in equation (3.1) (Camps-Valls and Bruzzone, 2009),

Y = { (x, μ(x)) | x ∈ X } (3.1)

Here μ(x) represents the membership grade and x represents a sample object in X (Zadeh, 1965). Each sample data point has a membership value between zero and one. A membership value close to one represents a high degree of similarity between the sample point and the cluster (Bezdek et al., 1984).

Fuzzy clustering is an alternative to unsupervised classification using k-means. In fuzzy clustering, each pixel may belong to two or more clusters and has a membership value for each cluster. FCM is one of the most widely accepted iterative unsupervised fuzzy clustering algorithms that allows a sample data point to belong to more than one cluster. The FCM algorithm partitions the dataset X = {x_1, x_2, …, x_n} into c fuzzy subsets subject to a few constraints. A fuzzy c-partition of X can be represented by a (c × n) matrix U, where each entry μ_ij represents the class membership of a pixel (Tso and Mather, 2000). The matrix U satisfies the two constraints given in equations (3.2a) and (3.2b) (Tso and Mather, 2000):

μ_ij ∈ [0, 1] (3.2a)

and

Σ_{j=1}^{c} μ_ij = 1 for all i (3.2b)
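Constraints (3.2a) and (3.2b) can be checked numerically for a small illustrative membership matrix (the grade values are invented):

```python
import numpy as np

# An illustrative membership matrix U of size (c x n): c = 2 clusters
# (rows) and n = 4 pixels (columns).
U = np.array([[0.90, 0.40, 0.25, 0.70],
              [0.10, 0.60, 0.75, 0.30]])

# Constraint (3.2a): every membership grade lies in [0, 1].
assert ((U >= 0.0) & (U <= 1.0)).all()

# Constraint (3.2b): for each pixel, the grades over all clusters sum to one.
assert np.allclose(U.sum(axis=0), 1.0)
print(U.sum(axis=0))
```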

The clustering criterion used in FCM is attained by minimizing the least squares error objective function given in equation (3.3) (Tso and Mather, 2000):

J_FCM(U, V) = Σ_{i=1}^{N} Σ_{j=1}^{c} (μ_ij)^m ‖x_i − v_j‖²_A , 1 < m < ∞ (3.3)

where m is the membership weighting component which controls the degree of fuzziness, V = {v_1, v_2, …, v_c} represents the vector of cluster centres (mean feature vectors from the training sites), x_i represents the spectral response of a pixel (feature vector), c is the number of cluster centres and N represents the number of pixels. ‖x_i − v_j‖²_A is the squared distance norm (d_ij²) between the measured value and the cluster centre, given in equation (3.4) (Kumar, 2007):

d_ij² = ‖x_i − v_j‖²_A = (x_i − v_j)^T A (x_i − v_j) (3.4)

where 𝐴 is the weight matrix. In Equation (3.5b) and (3.5c) represents the distance calculated for cluster 𝑗.

Several norms are applicable in equation (3.4). Amongst these, three norms are widely used: the Euclidean norm, the diagonal norm and the Mahalanobis norm (Bezdek et al., 1984). The formulation of each norm is given in equations (3.5a), (3.5b) and (3.5c) (Bezdek et al., 1984),

𝐴 = 𝐼 Euclidean Norm (3.5a)

𝐴 = 𝐷𝑗⁻¹ Diagonal Norm (3.5b)


where 𝐼 is the identity matrix and 𝐷𝑗 is the diagonal matrix whose diagonal elements are the eigenvalues of the variance-covariance matrix 𝐶𝑗 given in equation (3.6) (Bezdek et al., 1984),

𝐶𝑗 = ∑_{𝑖=1}^{𝑁} (𝑥𝑖 − 𝑣𝑗)(𝑥𝑖 − 𝑣𝑗)ᵀ (3.6)

where

𝑣𝑗 = (1/𝑁) ∑_{𝑖=1}^{𝑁} 𝑥𝑖 (3.7)

If 𝐴 = 𝐼, the objective function 𝐽𝐹𝐶𝑀 identifies hyper-spherical clusters; for any other norm the clusters identified are hyper-ellipsoidal. A drawback of any fixed norm is that it imposes a preferred cluster shape on the data, even when clusters of that shape are not present in the input dataset. Each class has a corresponding row in the membership matrix, so the values of each row must be updated at every iteration. The class membership 𝜇𝑖𝑗 is updated by equation (3.8) (Tso and Mather, 2000),

𝜇𝑖𝑗 = 1 / ∑_{𝑘=1}^{𝑐} (𝑑𝑖𝑗² / 𝑑𝑖𝑘²)^{1/(𝑚−1)} (3.8)

and the cluster centers are obtained by equation (3.9)(Tso & Mather, 2000),

𝑣𝑗 = ∑_{𝑖=1}^{𝑁} 𝜇𝑖𝑗^𝑚 𝑥𝑖 / ∑_{𝑖=1}^{𝑁} 𝜇𝑖𝑗^𝑚 (3.9)

Class membership values designate the proportions of different classes to a particular pixel. The FCM

algorithm (unsupervised) is summarized in steps 1 to step 4 (Tso and Mather, 2000),

1. Initialize the membership matrix 𝑈(0) = [𝜇𝑖𝑗].

2. Compute the cluster center using Equation (3.9).

3. Update the membership matrix using Equation (3.8).

4. Repeat steps 2 and 3 until (‖𝑈𝑛𝑒𝑤 − 𝑈𝑜𝑙𝑑‖ < 𝜀).

where 𝜀 represents the error tolerance, usually set to 0.001.

𝐴 = 𝐶𝑗⁻¹ Mahalanobis Norm (3.5c)
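The FCM iteration in steps 1 to 4 can be sketched in a few lines of NumPy. This is a minimal illustration of equations (3.8) and (3.9) with the Euclidean norm (𝐴 = 𝐼), not the exact implementation used in this work; the function name and default parameter values are chosen for illustration.

```python
import numpy as np

def fcm(X, c, m=2.0, eps=1e-3, max_iter=100, seed=0):
    """Unsupervised FCM: X is an (N, bands) array; returns memberships U (N, c)
    and cluster centers V (c, bands)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # step 1: random U(0), rows sum to 1 (eq. 3.2b)
    for _ in range(max_iter):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]   # step 2: cluster centers, eq. (3.9)
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
        d2 = np.fmax(d2, 1e-12)                    # guard against zero distances
        U_new = d2 ** (-1.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)  # step 3: membership update, eq. (3.8)
        if np.abs(U_new - U).max() < eps:          # step 4: convergence test
            return U_new, V
        U = U_new
    return U, V

# two well separated groups of points: memberships approach 0 and 1
X = np.vstack([np.zeros((20, 2)), np.full((20, 2), 10.0)])
U, V = fcm(X, c=2)
```

The membership update uses the algebraically equivalent form 𝜇𝑖𝑗 ∝ (𝑑𝑖𝑗²)^{−1/(𝑚−1)} followed by row normalization, which avoids the explicit double ratio of equation (3.8).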


Weighting Component 𝒎: The value of 𝑚 controls the degree of fuzziness and is also known as the fuzzifier. As 𝑚 increases from one towards infinity, FCM changes from a crisp classifier to an entirely fuzzy classifier. Cannon et al. (1986) proposed that the value of 𝑚 ranges between 1.3 and 1.8. Generally, the optimized value of 𝑚 lies between 1.5 and 2.0. Zimmermann (2001) suggested taking 𝑚 equal to 2, although there is no theoretical justification for this choice.

Number of cluster centers 𝒄: When the number of information classes is not known in advance, the number of cluster centers has to be estimated. Kim et al. (2009) proposed a cluster validity index method which determines the optimal number of clusters for fuzzy partitions.

3.1.3. KERNELS

Kernels are used in machine learning for data analysis, in particular in SVM classifiers. The kernel concept is based on an optimal linear separating hyperplane fitted between training samples in a higher dimensional feature space (Camps-Valls and Bruzzone, 2009). All samples that belong to the same class fall on the same side of the hyperplane. Boser et al. (1992) concluded that maximizing the margin between the class boundary and the training samples is a better strategy than optimizing cost functions such as the mean squared error. When classes are not linearly separable, the training samples are mapped to a higher dimensional space in which they are considered to be linearly separable (Figure 3-2).

Figure 3-2: Mapping of kernels to a higher dimensional space

For illustration of a kernel mapping, consider sample data from two non-empty sets 𝑋 and 𝑇 as in equation (3.10) (Camps-Valls and Bruzzone, 2009),

(𝑥1, 𝑡1), (𝑥2, 𝑡2), … , (𝑥𝑛, 𝑡𝑛) ∈ 𝑋 × 𝑇 (3.10)



where 𝑥𝑖 represents input data from a set 𝑋 and 𝑡𝑖 ∈ 𝑇 represents the target elements. The original samples in 𝑋 are mapped into a higher dimensional feature space 𝐹 as in equation (3.11) (Camps-Valls and Bruzzone, 2009),

𝜑 ∶ 𝑋 → 𝐹, 𝑥 → 𝜑(𝑥) (3.11)

Suppose we take any two samples 𝑥, 𝑥𝑖 in the input space; then,

𝐾(𝑥, 𝑥𝑖) = ⟨𝜑(𝑥), 𝜑(𝑥𝑖)⟩𝐹 (3.12)

The function 𝐾 is called a kernel and ⟨. , .⟩ is the inner product between 𝜑(𝑥) and 𝜑(𝑥𝑖). The mapping 𝜑 is referred to as the feature map and the dot product space 𝐹 as the feature space (Camps-Valls and Bruzzone, 2009). A kernel function considerably reduces the computational cost, because inner products in 𝐹 are evaluated without explicitly computing the mapping 𝜑.

Mercer’s condition for kernels states that the kernel must be symmetric and positive semi-definite, i.e. for any finite set of samples 𝑥1 … 𝑥𝑛 and real coefficients 𝑐1 … 𝑐𝑛:

∑_{𝑖=1}^{𝑛} ∑_{𝑗=1}^{𝑛} 𝑐𝑖 𝑐𝑗 𝐾(𝑥𝑖, 𝑥𝑗) ≥ 0 (3.13)

Every function 𝐾(𝑥, 𝑥𝑖) which satisfies Mercer’s condition is called an eligible kernel (Kumar, 2007).

Different types of kernels are used in machine learning algorithms. In this research work mainly three types of kernels are considered: local kernels, global kernels and the spectral angle kernel, discussed below.

1. Local Kernels: Local kernels are based on the evaluation of the quadratic distance between any two training samples: only data that are close to, or in the proximity of, each other have an influence on the kernel value (Kumar, 2007). All kernels based on a distance function are local kernels. A few examples of local kernels are given in equations (3.14) to (3.18) (Kumar, 2007):

a) Gaussian kernel with the Euclidean norm:

𝐾(𝑥, 𝑥𝑖) = exp(−0.5 (𝑥 − 𝑥𝑖)ᵀ 𝐴⁻¹ (𝑥 − 𝑥𝑖)) (Mohamed and Farag, 2004) (3.14)

where 𝐴 is a weight matrix and is given by:

Euclidean Norm 𝐴 = 𝐼 (3.15a)

b) Radial basis kernel:

𝐾(𝑥, 𝑥𝑖) = exp(−‖𝑥 − 𝑥𝑖‖²) (3.16)

c) Kernel with moderate decreasing (KMOD):

𝐾(𝑥, 𝑥𝑖) = exp(1 / (1 + ‖𝑥 − 𝑥𝑖‖²)) − 1 (3.17)


d) Inverse multiquadratic kernel

𝐾(𝑥, 𝑥𝑖) = 1 / √(‖𝑥 − 𝑥𝑖‖² + 1) (3.18)

2. Global kernels: Samples that are far away from each other still have an influence on the kernel value (Kumar, 2007). All kernels based on the dot product are global. The global kernels considered for this study are given in equations (3.19) to (3.21):

a) Linear kernel: the simplest kernel, based on the dot product.

𝐾(𝑥, 𝑥𝑖) = 𝑥. 𝑥𝑖 (3.19)

b) Polynomial kernel: This kernel computes the inner product of all monomials up to degree p.

𝐾(𝑥, 𝑥𝑖) = (𝑥. 𝑥𝑖 + 1)^𝑝 (3.20)

c) Sigmoid kernel:

𝐾(𝑥, 𝑥𝑖) = tanh ( 𝑥. 𝑥𝑖 + 1) (3.21)

3. Spectral Angle Kernel: To suit the hyperspectral point of view, other criteria that take the spectral signature into account are considered. The spectral angle (SA) 𝛼(𝑥, 𝑥𝑖) is defined to measure the spectral difference between 𝑥 and 𝑥𝑖 while being robust to differences in overall energy (e.g. illumination, shadows), as given in equation (3.22) (Kumar, 2007; Mercier and Lennon, 2003),

𝛼(𝑥, 𝑥𝑖) = arccos(𝑥 . 𝑥𝑖 / (‖𝑥‖ ‖𝑥𝑖‖)) (3.22)
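For concreteness, the eight kernels above can be written as follows. This is an illustrative NumPy sketch, in which 𝐴 = 𝐼 is assumed as the default in the Gaussian kernel so that it coincides with the Euclidean-norm case.

```python
import numpy as np

def gaussian(x, xi, A_inv=None):
    """Gaussian kernel with weight matrix A, eq. (3.14); A = I gives the Euclidean norm."""
    d = np.asarray(x, float) - np.asarray(xi, float)
    A_inv = np.eye(d.size) if A_inv is None else A_inv
    return float(np.exp(-0.5 * d @ A_inv @ d))

def radial_basis(x, xi):                            # eq. (3.16)
    return float(np.exp(-np.sum((np.asarray(x) - np.asarray(xi)) ** 2)))

def kmod(x, xi):                                    # eq. (3.17)
    return float(np.exp(1.0 / (1.0 + np.sum((np.asarray(x) - np.asarray(xi)) ** 2))) - 1.0)

def inverse_multiquadratic(x, xi):                  # eq. (3.18)
    return float(1.0 / np.sqrt(np.sum((np.asarray(x) - np.asarray(xi)) ** 2) + 1.0))

def linear(x, xi):                                  # eq. (3.19)
    return float(np.dot(x, xi))

def polynomial(x, xi, p=2):                         # eq. (3.20)
    return float((np.dot(x, xi) + 1.0) ** p)

def sigmoid(x, xi):                                 # eq. (3.21)
    return float(np.tanh(np.dot(x, xi) + 1.0))

def spectral_angle(x, xi):
    """Spectral angle alpha, eq. (3.22): invariant to scaling of either spectrum."""
    cos_a = np.dot(x, xi) / (np.linalg.norm(x) * np.linalg.norm(xi))
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

Note that scaling one spectrum, e.g. doubling its overall brightness, leaves the spectral angle unchanged, which is the robustness to illumination mentioned above.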

Composite Kernels: A mixture of kernels can be used to combine dual characteristics, i.e. the characteristics of the dot product or the Euclidean distance with those of the spectral angle (Kumar, 2007; Mercier and Lennon, 2003). Single Mercer kernels can be combined, including spatial and spectral properties, into a new family of kernels termed composite kernels. This family of kernels (Camps-Valls et al., 2006):

can enhance the classification accuracy when compared to the traditional single kernels

can make the classification more flexible by considering both the spectral and spatial

properties.

can increase the computational efficiency.

There are different methods for combining two kernels, such as the stacked approach, the direct summation kernel, the weighted summation kernel and the cross-information kernel (Camps-Valls et al., 2006). In this research work the weighted summation method has been adopted for constructing composite kernels, which can be expressed as in equation (3.23) (Kumar, 2007);

𝐾(𝑥, 𝑥𝑖) = 𝜆𝐾𝑎(𝑥, 𝑥𝑖) + (1 − 𝜆)𝐾𝑏(𝑥, 𝑥𝑖) (3.23)

where 𝐾𝑎(𝑥, 𝑥𝑖) and 𝐾𝑏(𝑥, 𝑥𝑖) can be any two local, global or spectral kernels and 𝜆 is a positive real-valued free parameter (0 < 𝜆 < 1) giving the weight of each kernel. When using composite kernels, fine tuning of 𝜆 is necessary along with the degree of fuzziness. Since 𝐾𝑎(𝑥, 𝑥𝑖) and 𝐾𝑏(𝑥, 𝑥𝑖) both satisfy Mercer’s condition, their linear combination is also an eligible kernel. In this study the best single kernel from the local and from the global category is combined with the spectral kernel. The performance of a combination of two global kernels, and of a local with a global kernel, has also been considered.
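The weighted summation of equation (3.23) is straightforward to implement. Below is a small sketch in Python in which the two component kernels are arbitrary callables; the radial basis and linear kernels used here are purely illustrative choices, not the specific pair selected in this study.

```python
import numpy as np

def composite(Ka, Kb, lam):
    """Weighted summation kernel, eq. (3.23): K = lam*Ka + (1-lam)*Kb, 0 < lam < 1."""
    if not 0.0 < lam < 1.0:
        raise ValueError("lambda must lie strictly between 0 and 1")
    return lambda x, xi: lam * Ka(x, xi) + (1.0 - lam) * Kb(x, xi)

# illustrative components: a local (radial basis) and a global (linear) kernel
rbf = lambda x, xi: float(np.exp(-np.sum((np.asarray(x) - np.asarray(xi)) ** 2)))
lin = lambda x, xi: float(np.dot(x, xi))
K = composite(rbf, lin, lam=0.6)
```

Because the combination is linear with positive weights, the resulting function inherits Mercer eligibility from its components, as stated above.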

3.1.4. KERNEL BASED FUZZY C-MEANS (KFCM) CLASSIFIER

The FCM classifier assigns sample data points to multiple clusters, thus overcoming the drawback of hard classifiers. FCM is effective for spherical, non-overlapping data clusters. For non-spherical, overlapping data clusters the Kernel based Fuzzy c-Means (KFCM) classifier was introduced. The idea of KFCM is to map the input data into a high dimensional feature space and to perform FCM clustering in this space. Literature has reported that KFCM performs better than FCM while reducing the computational complexity (Jain and Srivastava, 2013; Kaur et al., 2012).

The FCM classifier minimizes the objective function in equation (3.3). Let 𝜑 be the implicit map of equation (3.12), taking the samples 𝑥 into the feature space ℋ. KFCM is based on the minimization of the objective function in equation (3.24) (Yang et al., 2007),

𝐽𝐾𝐹𝐶𝑀(𝑈, 𝑉) = ∑_{𝑖=1}^{𝑁} ∑_{𝑗=1}^{𝑐} (𝜇𝑖𝑗)^𝑚 ‖𝜑(𝑥𝑖) − 𝜑(𝑣𝑗)‖² , 1 < 𝑚 < ∞ (3.24)

where,

‖𝜑(𝑥𝑖) − 𝜑(𝑣𝑗)‖² = (𝜑(𝑥𝑖) − 𝜑(𝑣𝑗))ᵀ (𝜑(𝑥𝑖) − 𝜑(𝑣𝑗))
= 𝜑(𝑥𝑖)ᵀ𝜑(𝑥𝑖) − 2 𝜑(𝑥𝑖)ᵀ𝜑(𝑣𝑗) + 𝜑(𝑣𝑗)ᵀ𝜑(𝑣𝑗)
= 𝐾(𝑥𝑖, 𝑥𝑖) + 𝐾(𝑣𝑗, 𝑣𝑗) − 2 𝐾(𝑥𝑖, 𝑣𝑗) (3.25)

If 𝐾(𝑥, 𝑥) = 1, equation (3.25) reduces to equation (3.26),

‖𝜑(𝑥𝑖) − 𝜑(𝑣𝑗)‖² = 2 − 2 𝐾(𝑥𝑖, 𝑣𝑗) = 2 (1 − 𝐾(𝑥𝑖, 𝑣𝑗)) (3.26)

Substituting equation (3.26) in equation (3.24), we get,


𝐽𝐾𝐹𝐶𝑀(𝑈, 𝑉) = 2 ∑_{𝑖=1}^{𝑁} ∑_{𝑗=1}^{𝑐} (𝜇𝑖𝑗)^𝑚 (1 − 𝐾(𝑥𝑖, 𝑣𝑗)) , 1 < 𝑚 < ∞ (3.27)

and the class membership matrix is updated by equation (3.28),

𝜇𝑖𝑗 = 1 / ∑_{𝑘=1}^{𝑐} ((1 − 𝐾(𝑥𝑖, 𝑣𝑗)) / (1 − 𝐾(𝑥𝑖, 𝑣𝑘)))^{1/(𝑚−1)} (3.28)

We then obtain the cluster center using the equation mentioned in (3.29).

𝑣𝑗 = ∑_{𝑖=1}^{𝑁} 𝜇𝑖𝑗^𝑚 𝐾(𝑥𝑖, 𝑣𝑗) 𝑥𝑖 / ∑_{𝑖=1}^{𝑁} 𝜇𝑖𝑗^𝑚 𝐾(𝑥𝑖, 𝑣𝑗) (3.29)

Here, the function 𝐾(𝑥𝑖, 𝑣𝑗) can be replaced by any of the eight kernel functions discussed in Section 3.1.3.

The KFCM classifier is carried out in the following steps 1 to 5 (Yang et al., 2007):

1. Choose the number of cluster centers and determine the termination criteria.

2. Choose a kernel function 𝐾(. , . ) and determine its parameters.

3. Initialize the cluster center 𝑣𝑗 and calculate the membership matrix.

4. Update the cluster center 𝑣𝑗 using equation (3.29) and recalculate the membership matrix by equation (3.28).

5. If ‖𝑈𝑛𝑒𝑤 − 𝑈𝑜𝑙𝑑‖ < 𝜀 then stop, otherwise go to Step 4.
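Steps 1 to 5 can be sketched as follows. This is an illustrative NumPy implementation using the radial basis kernel, for which 𝐾(𝑥, 𝑥) = 1 so that equations (3.26) to (3.29) apply; the deterministic center initialization is a simplification chosen for the sketch, not the scheme used in this work.

```python
import numpy as np

def rbf_matrix(X, V, gamma=1.0):
    """Kernel matrix K(x_i, v_j) for the radial basis kernel (K(x, x) = 1)."""
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def kfcm(X, c, m=2.0, eps=1e-3, max_iter=100):
    # steps 1-3: cluster count, kernel and termination criterion are fixed;
    # centers are initialized from evenly spaced samples (a simple deterministic choice)
    V = X[np.linspace(0, len(X) - 1, c).astype(int)].copy()
    U = np.full((len(X), c), 1.0 / c)
    for _ in range(max_iter):
        K = rbf_matrix(X, V)
        dist = np.fmax(1.0 - K, 1e-12)             # 1 - K(x_i, v_j), eq. (3.26)
        U_new = dist ** (-1.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)  # membership update, eq. (3.28)
        W = (U_new ** m) * K                       # weights mu_ij^m K(x_i, v_j)
        V = (W.T @ X) / W.sum(axis=0)[:, None]     # step 4: cluster centers, eq. (3.29)
        if np.abs(U_new - U).max() < eps:          # step 5: termination test
            return U_new, V
        U = U_new
    return U, V

# two compact groups: each point gets a high membership in its own cluster
X = np.vstack([np.zeros((20, 2)), np.full((20, 2), 5.0)]) + 0.01 * np.arange(40)[:, None]
U, V = kfcm(X, c=2)
```

Compared with the FCM sketch, the squared distance 𝑑𝑖𝑗² is simply replaced by 1 − 𝐾(𝑥𝑖, 𝑣𝑗), and the center update acquires the extra kernel weighting.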

3.1.5. ACCURACY ASSESSMENT

Accuracy assessment is important in order to assess the quality of classified outputs and to compare

different classification algorithms (Okeke and Karnieli, 2006). One way to represent the accuracy of the

classification results is the error matrix also termed the confusion matrix or the contingency table. The

error matrix gives the agreement of accuracy assessment between the classified and reference data along

with the misclassified results. Based on the error matrix several statistical measures have been introduced

such as the Kappa coefficient, user’s accuracy and producer’s accuracy that are all used for summarizing

information about the accuracy assessment. The error matrix can only be used for hard classification, i.e. when a pixel represents a single class; it cannot be applied when a pixel covers more than one class (Silvan-Cardenas and Wang, 2008). To assess the accuracy of a soft classification, other methods were introduced (Binaghi et al., 1999; Congalton, 1991; Pontius Jr and Cheuk, 2006), of which the Fuzzy Error Matrix (FERM) was the most appealing approach. This section describes the methods introduced to assess the accuracy of soft classified results.


3.1.5.1. FUZZY ERROR MATRIX (FERM)

An error matrix is a square array of numbers set out in rows and columns, where the rows represent the sample elements of the classified data and the columns the sample elements of the reference data. In an error matrix the diagonal elements give the number of correctly classified pixels and the off-diagonal elements the misclassifications. In FERM, both the classified and the reference data are treated as fuzzy sets with membership grades in the interval [0, 1]. The fuzzy set operator ‘min’ is used in building the matrix, providing the maximum sub-pixel overlap between the classified and the reference image, as in equation (3.30) (Binaghi et al., 1999):

𝜇_{𝐶𝑚∩𝑅𝑛}(𝑥) = min(𝜇_{𝐶𝑚}(𝑥), 𝜇_{𝑅𝑛}(𝑥)) (3.30)

Here 𝑅𝑛 represents the set of reference data assigned to class 𝑛, 𝐶𝑚 the set of classified data assigned to class 𝑚, and 𝜇 the membership grade of the class within a pixel. The overall accuracy is the simplest summary value for accuracy assessment. For a conventional error matrix it is the sum of the diagonal elements divided by the total number of sample elements, whereas for FERM it is the sum of the diagonal elements divided by the total membership grade in the reference data, as given in equation (3.31) (Kumar, 2007).

𝑂𝐴_{𝐹𝐸𝑅𝑀} = ∑_{𝑖=1}^{𝑐} 𝑀(𝑖, 𝑖) / ∑_{𝑗=1}^{𝑐} 𝑅𝑗 (3.31)

where 𝑂𝐴 represents the overall accuracy, 𝑀(𝑖, 𝑗) the membership of the 𝑖th class in the soft classified output and the 𝑗th class in the soft reference data, 𝑐 the number of classes and 𝑅𝑗 the sum of the membership grades of class 𝑗 in the soft reference data.
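The FERM of equations (3.30) and (3.31) can be computed as follows; a sketch in which the classified and reference data are assumed to be (pixels × classes) membership arrays.

```python
import numpy as np

def ferm(classified, reference):
    """Fuzzy error matrix and overall accuracy.
    Cell (i, j) accumulates min(mu_Ci(x), mu_Rj(x)) over all pixels x, eq. (3.30);
    the overall accuracy divides the diagonal sum by the total reference grade, eq. (3.31)."""
    classified = np.asarray(classified, float)
    reference = np.asarray(reference, float)
    c = classified.shape[1]
    M = np.zeros((c, c))
    for i in range(c):
        for j in range(c):
            M[i, j] = np.minimum(classified[:, i], reference[:, j]).sum()
    overall = np.trace(M) / reference.sum()
    return M, overall

# a soft classification that matches the reference exactly gives an overall accuracy of 1.0
ref = np.array([[0.7, 0.3], [0.2, 0.8]])
M, oa = ferm(ref, ref)
```

Note that the off-diagonal cells are generally non-zero even for a perfect match, reflecting the maximum-overlap nature of the ‘min’ operator discussed in the next section.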

3.1.5.2. SUB-PIXEL CONFUSION UNCERTAINTY MATRIX

It is difficult to determine the actual overlap among classes from land-cover fractions alone; this is usually termed the sub-pixel area allocation problem (Silvan-Cardenas and Wang, 2008). The minimum and maximum overlap between any two classes depends on the spatial distribution of these classes within a pixel. The problem has a unique solution, and the sub-pixel confusion can be determined uniquely, when at most one class is either overestimated or underestimated at each pixel. When there is no unique solution, the confusion can only be represented by confusion intervals: the Sub-pixel Confusion-Uncertainty Matrix (SCM) then contains confusion intervals in the form of a


center value plus or minus the maximum error. The confusion matrix of a soft classification satisfies (a) the diagonalization property, i.e. the matrix is diagonal if the assessed data match the classified data, and (b) the marginal sums property, i.e. the marginal sums match the total grades from the classified as well as the assessed data (Silvan-Cardenas and Wang, 2008).

For assessing the pixel-class relationship in sub-pixel classifications, various operators have been defined. The MIN operator gives the maximum possible overlap between the classified and the assessed data; it may overestimate the actual sub-pixel agreement and disagreement, resulting in larger marginal sums. The Similarity Index (SI) is a variant of the MIN operator and gives a normalized sub-pixel overlap. The PROD operator gives the expected overlap between the assessed and reference sub-pixel partitions. The LEAST operator gives the minimum possible sub-pixel overlap between two classes (Silvan-Cardenas and Wang, 2008).

The basic operators, however, cannot satisfy the diagonalization property, and hence the composite operators MIN-PROD, MIN-MIN and MIN-LEAST were put forth. The MIN-MIN operator assigns the diagonal elements first, followed by the off-diagonal elements. The MIN-LEAST operator uses the MIN operator for the diagonal elements and the LEAST operator for the off-diagonal elements. The MIN-PROD operator uses the MIN operator for the diagonal elements and a normalized PROD operator for the off-diagonal elements. MIN-MIN and MIN-LEAST were introduced to provide the minimum and maximum sub-pixel overlap; MIN-PROD is used when at most one class is either underestimated or overestimated (Silvan-Cardenas and Wang, 2008).

3.1.5.3. ROOT MEAN SQUARE ERROR (RMSE)

The Root Mean Square Error (RMSE) is the square root of the mean squared difference between the membership values of the classified and reference images. It is calculated by equation (3.32) (Dehghan and Ghassemian, 2006),

𝑅𝑀𝑆𝐸 = √( (1/𝑁) ∑_{𝑗=1}^{𝑐} ∑_{𝑖=1}^{𝑁} (𝜇𝑖𝑗 − 𝜇𝑖𝑗′)² ) (3.32)

where 𝜇𝑖𝑗 represents the membership value of a pixel in the classified image, 𝜇𝑖𝑗′ the membership value of the pixel in the reference image, 𝑐 the total number of classes and 𝑁 the number of pixels in the image. A lower RMSE value represents a lower uncertainty and vice versa. The RMSE can be calculated in two ways: 1) over the complete image, the global RMSE, given by equation (3.32); and 2) per class fractional image, the per-class RMSE, given by equation (3.33) (Chawla, 2010),

𝑅𝑀𝑆𝐸𝑗 = √( (1/𝑁) ∑_{𝑖=1}^{𝑁} (𝜇𝑖𝑗 − 𝜇𝑖𝑗′)² ) (3.33)
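Both variants can be computed directly from the membership arrays; a sketch assuming (pixels × classes) arrays, following equations (3.32) and (3.33).

```python
import numpy as np

def global_rmse(mu, mu_ref):
    """Global RMSE over all classes and pixels, eq. (3.32)."""
    mu, mu_ref = np.asarray(mu, float), np.asarray(mu_ref, float)
    return float(np.sqrt(((mu - mu_ref) ** 2).sum() / mu.shape[0]))

def per_class_rmse(mu, mu_ref, j):
    """RMSE for the fractional image of a single class j, eq. (3.33)."""
    mu, mu_ref = np.asarray(mu, float), np.asarray(mu_ref, float)
    return float(np.sqrt(((mu[:, j] - mu_ref[:, j]) ** 2).sum() / mu.shape[0]))
```

The only difference between the two is whether the squared differences are accumulated over all fractional images or over a single one.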


3.1.5.4. ENTROPY MEASURE

Dehghan and Ghassemian (2006) introduced the entropy measure to assess the quality of a classification. The Root Mean Square Error (RMSE), Mean Relative Error (MRE) and Correlation Coefficient (CC) depend on both the actual and the desired outputs of the classifier and hence on the error, whereas the entropy measure depends only on the actual outputs of the classifier and is thus less sensitive to error variations. This measure expresses the accuracy of the classification result as a single number per pixel. The entropy measure is expressed in equation (3.34) (Dehghan and Ghassemian, 2006),

𝐸𝑛𝑡𝑟𝑜𝑝𝑦, 𝐸 = − ∑_{𝑖=1}^{𝑁} ∑_{𝑗=1}^{𝑐} 𝜇𝑖𝑗 log₂(𝜇𝑖𝑗) (3.34)

where 𝑁 represents the number of pixels in the image, 𝑐 the number of classes and 𝜇𝑖𝑗 the membership value assigned to the 𝑖th pixel for class 𝑗.

Fuzzy classifiers generate soft classified outputs in the form of fractional images, which show the membership values calculated for each class of a particular dataset (Harikumar, 2014). For five classes, five fractional images are generated. In the fractional image of a particular class, the membership values for that class will be high and those for all the other classes low.

For calculating the entropy of a particular class:

1. The mean of a few training samples is calculated for the class under consideration where it appears to be homogeneous.

2. Using equation (3.34), entropy values are calculated from the membership values of that sample in all fractional images.

If there are for example three classes and the membership values from the testing sites of the fractional images are 0.8, 0.3 and 0.2 for the three classes, then using equation (3.34) the entropy value is:

𝐸 = −(0.8 × log₂ 0.8) − (0.3 × log₂ 0.3) − (0.2 × log₂ 0.2) = 1.243
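The computation can be sketched as a small Python function for equation (3.34), applied to the membership values of one sample across the fractional images; zero memberships are skipped, since 𝑥 log₂ 𝑥 → 0 as 𝑥 → 0.

```python
import numpy as np

def entropy(mu):
    """Entropy of a set of membership values, eq. (3.34)."""
    mu = np.asarray(mu, dtype=float)
    mu = mu[mu > 0]                        # x*log2(x) -> 0 as x -> 0
    return float(-(mu * np.log2(mu)).sum())

# entropy of the three-class example above
e = entropy([0.8, 0.3, 0.2])
```

A pixel assigned entirely to one class, e.g. memberships [1.0, 0.0, 0.0], has zero entropy, the lowest possible uncertainty.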

A high entropy value represents a high uncertainty and vice versa. In this work, the entropy value is used to optimize the parameters 𝑚 and 𝜆. For both FCM and KFCM, fractional images were generated for each class for all values of 𝑚 from 1.1 to 2.0; for larger values of 𝑚 the generated fractional images were no longer meaningful. A low entropy indicates a good quality classified image. In a fractional image, low entropy is obtained when the membership value


of the class under consideration (the favourable class) is high and the membership values for all other classes (unfavourable classes) are very low. In the case of composite kernels, the uncertainty is calculated to optimize 𝜆 as well as 𝑚.

3.1.5.5. MEAN MEMBERSHIP DIFFERENCE METHOD

The entropy measure alone cannot be used for the optimization of the various parameters used in this research work, as this may result in misclassification in the generated outputs. Thus, the mean membership difference method is adopted; it also helps to match the fuzziness in the image to the fuzziness on the ground. In this method 𝑚 is optimized by calculating the difference between the membership value of the class of interest and the average of the membership values of the other classes; the calculated value should be maximal, tending to 1.000. The method can be explained with an example.

For example, suppose the analyst identified six classes (classes 1 to 6) for a dataset, so that six fractional images are generated. The membership values in a fractional image will be high where the class is present and low in other regions (Harikumar, 2014). Suppose the class of interest is class 1, as shown in Figure 3-3.

This method can be summarized in the following steps:

1. Consider the fraction image generated for the class of interest (class 1, water).

2. Select seven to eight pixels from homogeneous areas of the class under consideration and of every other class.

3. Calculate the mean membership of these pixels for all classes (class 1 to class 6) from the testing site of each class.

4. Calculate the difference between the membership value of the class under consideration and the membership values of all the other classes in the same fraction image, e.g. Δ12 = class 1 − class 2, Δ13 = class 1 − class 3, Δ14 = class 1 − class 4, Δ15 = class 1 − class 5, Δ16 = class 1 − class 6.

5. Calculate the mean of the differences from step 4, (Δ12 + Δ13 + Δ14 + Δ15 + Δ16)/5.

Figure 3-3: An image with six classes identified along with the generated fractional images


Following the above steps, in the fraction image of class 1 the membership values of pixels of that class will be high, ideally equal or close to 1.000 in homogeneous areas, while the membership values for all other classes will ideally be zero or practically approaching zero. The mean of the membership value differences should therefore be at its highest, tending to 1.000. This procedure is repeated for each fraction image generated for the given parameters.
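Steps 3 to 5 can be sketched as follows. The input is assumed to be the mean membership, in the fraction image of the class of interest, of the testing samples of every class; the numbers in the example are purely illustrative.

```python
import numpy as np

def mean_membership_difference(means, class_of_interest):
    """Mean of the differences between the class of interest and every other class."""
    means = np.asarray(means, dtype=float)
    others = np.delete(means, class_of_interest)
    return float((means[class_of_interest] - others).mean())   # steps 4 and 5

# e.g. a nearly homogeneous water class against five other classes
score = mean_membership_difference([0.95, 0.02, 0.01, 0.03, 0.02, 0.01], 0)
```

For a perfectly pure class of interest (membership 1.000, all others 0.000) the score reaches its maximum of 1.000, which is the optimization target described above.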

The class of interest is selected based on homogeneity: the more homogeneous a class, the higher the probability that its membership grade tends to 1.000, and vice versa. Thus, for the optimization of the weighting component 𝑚, the selection of a homogeneous class was necessary. In this research work the water class is used to check the mean membership difference, as it was identified as more homogeneous than the other classes. The values of 𝑚 and 𝜆 are optimized by seeking minimum entropy and maximum mean membership difference.

3.2. STUDY AREA AND MATERIALS USED

This section identifies the study area, gives an explanation for selecting this particular study area and

describes the materials used. The specifications of each sensor and the pre-processing stages of datasets

have been included. Steps for generating soft LISS-IV reference data for the validation of AWiFS and

LISS-III images have also been included.

3.2.1. STUDY AREA

Selection of a study area is important for evaluating the efficiency and performance of the adopted methodology. The study area considered for this research work was Sitarganj’s Tehsil, Udham Singh Nagar district, Uttarakhand state, India (Singha, 2013). The area extends from 28°53´N to 28°56´N latitude and 79°34´E to 79°36´E longitude (Singha, 2013). Sitarganj’s Tehsil was chosen as it contains six land cover classes: agricultural fields with crop, dry and moist agricultural fields without crop, Sal and Eucalyptus forests, and two water reservoirs, the Baigul (Sukhi) and Dhora reservoirs. The reasons for selecting this study area include:

Presence of mixed pixels, which occur where one land cover class grades into another (e.g. water to grassland), helps to assess the capability of the Kernel based Fuzzy c-Means (KFCM) classifier.

Data from the AWiFS, LISS-III and LISS-IV sensors of Resourcesat-1 and Resourcesat-2 were available for the same date, allowing image-to-image accuracy assessment.

A field visit for the study area was conducted in November, 2009.

Page 43: Non-Linear Separation of classes using a Kernel based

NON LINEAR SEPARATION OF CLASSES USING A KERNEL BASED FUZZY C-MEANS APPROACH

Page | 25

Final results of KFCM can be compared with the results of the Fuzzy c-Means (FCM) classifier (Singha, 2013).

3.2.2. MATERIALS USED

Appropriate use of Remote Sensing (RS) data, which differ in spectral, spatial and temporal properties, depends on the suitability of the algorithms used. In this study, AWiFS (Advanced Wide Field Sensor), LISS-III (Linear Imaging Self-Scanning System-III) and LISS-IV (Linear Imaging Self-Scanning System-IV) images from both Resourcesat-1 (IRS-P6, Indian Remote Sensing Satellite) and Resourcesat-2 were used. Resourcesat-1 (IRS-P6) was launched in 2003 for natural resource management, with a 5-24 day repeat cycle. The AWiFS, LISS-III and LISS-IV images were acquired at the same time: the Resourcesat-1 dataset on 15 October 2007 and the Resourcesat-2 dataset on 23 November 2011 (Chawla, 2010). The soft classified outputs from the finer resolution LISS-IV image were used for the validation of the soft outputs of LISS-III and AWiFS. The specifications of the satellite data used are shown in Table 3-1.

Table 3-1: Resourcesat-1 and Resourcesat-2 sensor specifications

AWiFS (Resourcesat-1 / Resourcesat-2): spatial resolution 56 m / 56 m; radiometric resolution 10 bit / 12 bit; swath 740 km / 740 km; spectral bands (µm) 0.52-0.59, 0.62-0.68, 0.77-0.86 and 1.55-1.70 for both.

LISS-III (Resourcesat-1 / Resourcesat-2): spatial resolution 23.5 m / 23.5 m; radiometric resolution 7 bit / 10 bit; swath 141 km / 141 km; spectral bands (µm) 0.52-0.59, 0.62-0.68, 0.77-0.86 and 1.55-1.70 for both.

LISS-IV (Resourcesat-1 / Resourcesat-2): spatial resolution 5.8 m / 5.8 m; radiometric resolution 7 bit / 10 bit; swath 23.9 km (Max mode) and 70.3 km (Pan mode) / 70 km (Max mode) and 70 km (Mono mode); spectral bands (µm) 0.52-0.59, 0.62-0.68 and 0.77-0.86 for both.


Figure 3-4: Geographical Location of Study Area

Figure 3-5: LISS-IV (Resourcesat-2) image of Sitarganj’s Tehsil with classes (a) Agricultural field with crop (b) Sal Forest (c) Eucalyptus Plantations (d) Dry Agricultural field (e) Water


3.2.3. DATASET PREPROCESSING

Geo-rectification is necessary when accurate area, distance and direction measurements are to be made from imagery, and when images are overlaid so as to have pixel-to-pixel correspondence. Here, the LISS-IV image is used as the reference image for the rectification of the LISS-III and AWiFS datasets. The first step was the image-to-map rectification of the LISS-IV image with the digital Survey of India (SOI) toposheet numbered 53P/9. The LISS-IV image was geo-registered in the UTM projection, Zone 44, with the Everest spheroid and datum. The LISS-III and AWiFS images were then geo-registered to the geometrically corrected LISS-IV image. Since outputs from the finer resolution LISS-IV image were used as reference data for the evaluation of the coarser resolution AWiFS and LISS-III images, resampling was necessary for accuracy assessment. All three images were therefore resampled such that their pixel sizes were in the ratio 12:4:1 (AWiFS : LISS-III : LISS-IV). This pixel size ratio was maintained to have full pixel correspondence for the FERM accuracy assessment approach: finer resolution pixels were aggregated to form a coarser resolution pixel, i.e. 4 × 4 = 16 LISS-IV pixels were combined to form one coarser resolution LISS-III pixel. The aggregated LISS-IV pixels were used for the accuracy assessment of AWiFS and LISS-III. The nearest neighbour resampling technique was used because it is easy and fast, and retains the original data file values (Chawla, 2010).

3.2.4. REFERENCE DATASET GENERATION

For this research, the classified outputs of the finer resolution LISS-IV image were used as the reference dataset: the LISS-III and LISS-IV outputs served as reference data for AWiFS, and the LISS-IV outputs as reference data for LISS-III. Soft reference data were not acquired on the ground for the following reasons (Chawla, 2010):

It is not possible to locate sub-pixel classes on the ground.

Some areas were inaccessible, so obtaining ground data in soft mode was difficult.

In this research the classified outputs were generated as fraction images for each class under consideration; hence the fraction images of the finer resolution LISS-IV image were used as the reference for the accuracy assessment of the AWiFS and LISS-III fractional images. To avoid errors between the test and reference datasets, images of the same acquisition date were used. Kloditz et al. (1998) proposed a multi-resolution method to estimate the classification accuracy of a low resolution image using a high resolution image, where every high resolution pixel within a defined area contributes to the corresponding low resolution pixel; it has been shown that there is no loss of information in the lower resolution images and that the pattern is preserved. This multi-resolution technique was used to relate the finer resolution reference dataset to the coarser images, as their resolutions differ and cannot be compared in a direct accuracy assessment.


3.3. METHODOLOGY

The main objective of this research work was to develop a Kernel based Fuzzy c-Means classifier. This chapter explains in detail the steps adopted to achieve the objectives mentioned in section 1.2.

The flowchart of the adopted methodology is shown in Figure 3-6.

3.3.1. GEO-REFERENCING

Initially, the AWiFS, LISS-III and LISS-IV images of Resourcesat-1 and Resourcesat-2 were geometrically rectified and geo-registered. Using Survey of India (SOI) toposheets, the finer resolution LISS-IV images were geo-registered first, followed by the geo-registration of AWiFS and LISS-III. The process of geometric correction and geo-registration of the datasets is explained in detail in 4-3.

Figure 3-6: Methodology adopted. The AWiFS, LISS-III and LISS-IV images are pre-processed (geo-registration) and classified with three supervised soft classification approaches: Fuzzy c-Means (FCM), Kernel based Fuzzy c-Means (KFCM) and FCM with a combination of the best kernels, followed by image-to-image accuracy assessment. The proposed kernels are the local kernels (Gaussian kernel using the Euclidean norm, Radial Basis kernel, KMOD kernel and Inverse Multiquadratic kernel), the global kernels (Linear, Polynomial and Sigmoid kernels) and the Spectral Angle kernel.

3.3.2. PREPARATION OF REFERENCE DATASET

The KFCM classifier generates soft classified outputs, so a soft reference dataset had to be generated for accuracy assessment. The soft outputs were in the form of fraction images generated for each class under consideration. The results of the LISS-IV image were used as the reference dataset for both AWiFS and LISS-III. A detailed explanation of the reference dataset generation is given in 4-4.

3.3.3. SUB-PIXEL CLASSIFICATION ALGORITHMS

The supervised KFCM classifier was adopted to generate the sub-pixel classification outputs. The three approaches considered for this study, i.e. Fuzzy c-Means (FCM), FCM with single kernels (KFCM) and FCM with composite kernels, are explained in detail in the following sections.

3.3.3.1. FUZZY C-MEANS (FCM):

Different algorithms are known for fuzzy based clustering. The outputs of these sub-pixel classification algorithms were obtained in the form of fraction images for each class under consideration. The weight component 𝑚 controls the degree of fuzziness and was optimized based on the maximum mean membership difference between favourable and unfavourable classes and the minimum entropy. Of the three norms introduced by Bezdek et al. (1984), only the Euclidean norm was considered, as the Diagonal and Mahalanobis norms are sensitive to noise and thus reduce the classification accuracy (Kumar, 2007). This approach was adopted for a comparative analysis between the results of simple FCM and the KFCM approach.
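The standard FCM membership update with the Euclidean norm can be sketched as follows; this is a minimal illustration of Bezdek's update rule, not the thesis implementation, and the toy data points are hypothetical.

```python
import numpy as np

def fcm_memberships(X, V, m=2.0, eps=1e-12):
    """FCM membership matrix for data X (n x d) and class means V (c x d).

    u[k, i] = 1 / sum_j (d(x_k, v_i) / d(x_k, v_j)) ** (2 / (m - 1)),
    the standard Bezdek update with the Euclidean norm; eps guards
    against division by zero when a pixel coincides with a mean.
    """
    # Squared Euclidean distances between every pixel and every mean.
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2) + eps
    inv = d2 ** (-1.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

X = np.array([[0.0, 0.0], [10.0, 10.0], [5.0, 5.0]])
V = np.array([[0.0, 0.0], [10.0, 10.0]])
U = fcm_memberships(X, V, m=2.0)
# Memberships of each pixel sum to 1; the mid-point pixel is split 50/50.
```

Larger values of 𝑚 push all memberships toward 1/c, which is exactly the increase in fuzziness that the optimization of 𝑚 has to control.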

3.3.3.2. KERNEL BASED FUZZY c-MEANS (KFCM):

Three categories of kernels were mainly considered: local kernels, global kernels and the spectral angle kernel. In this study, four local kernels were used: the Gaussian kernel using the Euclidean norm, the Radial Basis kernel, the Kernel with Moderate Decreasing (KMOD) and the Inverse Multiquadratic kernel. Three global kernels were used: the Linear, Polynomial and Sigmoid kernels. In total, eight single kernels were studied within the FCM approach. After implementing the eight single kernels, the next step was to optimize the weight component 𝑚 using the mean membership difference between favourable and unfavourable classes and the entropy method. The best single kernel in each of the global and local categories was selected based on the maximum mean membership difference between favourable and unfavourable classes and the minimum entropy.
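Illustrative forms of some of the single kernels named above, together with the kernel-induced distance that replaces the Euclidean norm in the FCM update, might look as follows; the bandwidth, degree and slope parameters are placeholder assumptions, not the thesis's tuned settings, and the spectral angle kernel is written in one common exponential form.

```python
import numpy as np

# Illustrative single kernels; parameter values are assumptions.
def gaussian(a, b, sigma=1.0):            # local kernel, Euclidean norm
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

def linear(a, b):                         # global kernel
    return np.dot(a, b)

def polynomial(a, b, p=2):                # global kernel
    return (np.dot(a, b) + 1.0) ** p

def sigmoid(a, b, alpha=0.01, c=0.0):     # global kernel
    return np.tanh(alpha * np.dot(a, b) + c)

def spectral_angle(a, b):                 # spectral angle kernel (one common form)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.exp(-np.arccos(np.clip(cos, -1.0, 1.0)))

def kernel_distance2(a, b, K):
    """Squared distance in the kernel-induced feature space:
    d^2(a, b) = K(a, a) - 2 K(a, b) + K(b, b)."""
    return K(a, a) - 2.0 * K(a, b) + K(b, b)

x = np.array([0.2, 0.4])
v = np.array([0.2, 0.4])
# Identical vectors are at zero kernel distance under the Gaussian kernel.
```

Substituting kernel_distance2 for the squared Euclidean distance in the FCM membership update yields the KFCM update, which is what allows a non-linear separation of classes in the original feature space.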


3.3.3.3. FCM WITH COMPOSITE KERNELS

The composite kernels were obtained from the best single kernels. In a composite kernel, a weight factor 𝜆, varied from 0.1 to 0.9, is given to each kernel. For the composite kernels, optimization of both 𝑚 and 𝜆 was necessary; this was done considering the maximum mean membership difference between favourable and unfavourable classes and the minimum entropy, from which the best composite kernel was concluded. Untrained case outputs were also obtained by not training the KFCM classifier with the signature data of a class; in this study, agricultural field with crop was considered as the untrained class.
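A composite kernel of the kind described above is a convex combination of two single kernels; the sketch below, with hypothetical component kernels, shows the weighting with 𝜆 and 1 − 𝜆.

```python
import numpy as np

def composite(Ka, Kb, lam):
    """Weighted mixture of two kernels: lam * Ka + (1 - lam) * Kb."""
    if not 0.0 <= lam <= 1.0:
        raise ValueError("lam must lie in [0, 1]")
    return lambda a, b: lam * Ka(a, b) + (1.0 - lam) * Kb(a, b)

# Hypothetical component kernels for illustration.
gaussian = lambda a, b: np.exp(-np.sum((a - b) ** 2) / 2.0)
lin = lambda a, b: np.dot(a, b)

K = composite(gaussian, lin, lam=0.8)
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(K(a, b))  # 0.8 * exp(-1) + 0.2 * 0 ≈ 0.2943
```

Because the mixture of two positive semi-definite kernels with non-negative weights is again a valid kernel, the composite can be dropped into the same kernel-distance machinery as the single kernels.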

As the approach used was fuzzy, the classified outputs were generated in the form of fractional images. Fractional images are the pictorial representation of the membership values generated for a particular class (Harikumar, 2014). The number of fractional images generated equals the number of classes considered. After the generation of the fraction images, the entropy measure and the mean membership difference values were analysed to select the best kernels. The selection of training samples was important for all three classification approaches. Hence, the mean of the membership values of the collected samples was calculated for each class. These mean values were used to find the difference between a favourable and an unfavourable class.

3.3.4. ACCURACY ASSESSMENT

Accuracy assessment is important for assessing the quality of the classified outputs. Image-to-image accuracy assessment was done with LISS-IV as the reference dataset for both AWiFS and LISS-III. The Fuzzy Error Matrix (FERM) was used here to generate the overall accuracy. The overall classification accuracy of the KFCM classifier was compared with that of the FCM classifier. The accuracy in the untrained case was also evaluated.
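The FERM computation can be sketched as follows, after Binaghi et al.'s formulation in which each matrix cell accumulates the minimum of the reference and classified memberships; the two-pixel, two-class example and the exact normalization of the overall accuracy are illustrative assumptions.

```python
import numpy as np

def fuzzy_error_matrix(ref, cls):
    """FERM: cell (i, j) holds the summed minimum of the reference
    membership to class i and the classified membership to class j
    over all pixels.  ref and cls are (n_pixels, n_classes) arrays."""
    c = ref.shape[1]
    M = np.empty((c, c))
    for i in range(c):
        for j in range(c):
            M[i, j] = np.minimum(ref[:, i], cls[:, j]).sum()
    return M

def fuzzy_overall_accuracy(ref, cls):
    """Diagonal agreement divided by the total reference membership."""
    M = fuzzy_error_matrix(ref, cls)
    return M.trace() / ref.sum()

ref = np.array([[0.7, 0.3], [0.2, 0.8]])
cls = np.array([[0.6, 0.4], [0.1, 0.9]])
print(round(fuzzy_overall_accuracy(ref, cls), 3))  # prints 0.9
```

The min operator makes FERM a direct soft generalization of the crisp error matrix: with 0/1 memberships it reduces to ordinary pixel counting.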


4. RESULTS

4.1. PARAMETER ESTIMATION

To ensure the best classified outputs from the algorithms used in this research work, it was necessary to estimate optimal values for the weight constant 𝑚 and the weight 𝜆 given to each kernel, for FCM, FCM using a single kernel and FCM using a composite kernel. Outputs from these classifiers were obtained as fractional images because the classification approach was fuzzy. It was also necessary to optimize the parameters 𝑚 and 𝜆 to match the fuzziness in the image to the fuzziness on the ground. Optimization of both parameters was based on the entropy measure and the mean membership difference (uncertainty) calculation discussed in sections 3.5.4 and 3.5.5 respectively. 𝑚 was optimized for both the single and the composite kernel case. The next two subsections explain the two cases.
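The two optimization criteria, the entropy measure and the mean membership difference, can be sketched as follows; the exact normalizations used in sections 3.5.4 and 3.5.5 may differ, so treat these as illustrative definitions with hypothetical names.

```python
import numpy as np

def entropy(U, eps=1e-12):
    """Average Shannon entropy of a membership matrix U (n x c);
    lower values mean less classification uncertainty."""
    Uc = np.clip(U, eps, 1.0)
    return float(-(Uc * np.log(Uc)).sum(axis=1).mean())

def mean_membership_difference(U, favourable):
    """Mean membership to the favourable class minus the highest mean
    membership among the remaining (unfavourable) classes."""
    means = U.mean(axis=0)
    others = np.delete(means, favourable)
    return float(means[favourable] - others.max())

U_crisp = np.array([[1.0, 0.0], [1.0, 0.0]])
U_fuzzy = np.array([[0.5, 0.5], [0.5, 0.5]])
# Crisp memberships: near-zero entropy and the maximum difference of 1.0.
```

The two measures are complementary: entropy penalizes memberships spread over many classes, while the membership difference rewards a clear gap between the favourable class and its closest competitor.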

Why use both the uncertainty and the entropy method to optimize 𝒎?

Figure 4-1 shows the entropy values calculated for the Gaussian kernel. It is observed that as 𝑚 varies from 4.0 to 10.0 the entropy values reach a saturation point, while an increase in entropy can be seen in the range between 1.0 and 4.0. It was difficult to find the optimal 𝑚 just by considering the low entropy values, and thus the mean membership difference method was also considered. In Figure 4-2 we can see that the mean membership difference reaches its maximum of 1.0 for 𝑚 values in the range between 1.0 and 4.0. According to the criteria for parameter optimization (minimum entropy and maximum mean membership difference), it can be concluded that the optimized 𝑚 lies within the range 1.0 to 4.0. As the value of 𝑚 increases, a decrease in the mean membership difference is seen, because fuzziness increases with 𝑚. Of all the classes identified in Resourcesat-1 and Resourcesat-2, the water class is the most homogeneous. From Figure 4-1 it is observed that the water class has the least entropy measure compared to the other classes; similarly, from Figure 4-2 it is observed that the water class reaches the maximum membership value of 1.0 and remains constant for lower values of 𝑚. Thus, the water class fraction image generated by the classifier has been considered for the optimization of the various parameters used in this research work. The parameter 𝑚 was optimized for all the kernels (Table 4-3; Table 4-4). The entropy and mean membership differences generated for FCM and all the single kernels for Resourcesat-1 AWiFS are given in Figure A-5 (Appendix A).
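The selection of 𝑚 by these criteria amounts to scanning candidate values and keeping the one with the highest mean membership difference and the lowest entropy; a toy sketch with synthetic, well-separated data (not the thesis datasets) is shown below, and the combined score is an illustrative assumption.

```python
import numpy as np

def memberships(X, V, m):
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2) + 1e-12
    inv = d2 ** (-1.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def entropy(U):
    Uc = np.clip(U, 1e-12, 1.0)
    return -(Uc * np.log(Uc)).sum(axis=1).mean()

# Toy data: two well-separated classes ("water" versus the rest).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
V = np.array([[0.0, 0.0], [5.0, 5.0]])

best_m, best_score = None, None
for m in np.arange(1.1, 4.01, 0.1):   # scan the 1.0-4.0 range
    U = memberships(X, V, m)
    delta = U[:20, 0].mean() - U[:20, 1].mean()  # favourable - unfavourable
    score = delta - entropy(U)        # prefer high delta, low entropy
    if best_score is None or score > best_score:
        best_m, best_score = m, score

print(round(best_m, 1))  # the smallest m wins on this well-separated toy set
```

On real imagery the two criteria need not agree at a single point, which is why the thesis inspects both curves (and, for composite kernels, the fractional images themselves) rather than maximizing one scalar score.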


Table 4-1: Classes identified at Sitarganj Tehsil in AWiFS, LISS-III and LISS-IV sensors for Resourcesat-1 and Resourcesat-2.

Resourcesat-1 Resourcesat-2

(a) Agricultural field with crop (a) Crop

(b) Sal Forest (b) Eucalyptus Plantation

(c) Eucalyptus plantations (c) Fallow Land

(d) Dry Agricultural field without crop (d) Sal Forest

(e) Moist Agricultural field without crop (e) Water

(f) Water

Figure 4-1: Variation in Entropy with respect to weight constant 𝑚 for Gaussian kernel using Euclidean

norm (Resourcesat-1 AWiFS)


Optimization of weight factor (𝝀) for composite kernels

As discussed in section 3.2.2, the composite kernel requires a weight factor which gives weight 𝜆 to kernel 𝐾𝑎 and 1 − 𝜆 to kernel 𝐾𝑏. In the case of composite kernels it was necessary to optimize both parameters, 𝜆 as well as 𝑚. For this, 𝜆 values were considered within the range 0.90 to 0.99; when the weight given to kernel 𝐾𝑎 is lower, misclassified outputs are generated, as shown in Figure 4-4. Misclassification occurs because, among the single kernels, if 𝐾𝑎 performs better than 𝐾𝑏, then a high weight to 𝐾𝑏 in the composite case may result in a kernel with a lower performance. Figure 4-3 shows the entropy and mean membership difference graphs generated for the Gaussian-Spectral kernel. From Figure 4-3 it can be seen that Δ reaches 1.000 for 𝜆 = 0.8 and 𝑚 = 1.04, but when the fractional images are interpreted we find misclassification. Figure 4-4 shows misclassification for all the classes except water. Thus, optimization of the parameters 𝑚 and 𝜆 for composite kernels was also based on interpreting the generated fractional images. As the value of 𝑚 decreases there is a steep increase in the mean membership difference, and with an increase in 𝑚 and in 𝜆 the entropy value decreases. It can be understood from Figure 4-4 that a lower weight given to the Gaussian kernel gives misclassified outputs. It is observed that agricultural field with crop is misclassified as moist agriculture, and sal forest as agricultural field with crop (Figure 4-4). The fraction image generated for eucalyptus plantations does not show high

membership values. If a higher weight was given to the Gaussian kernel, lower entropy values and maximum mean membership differences were observed. The optimized 𝑚 and 𝜆 values for the composite kernels are given in section 4.4.

Figure 4-2: Variation in mean membership difference with respect to weight constant 𝑚 for Gaussian kernel using Euclidean norm (Resourcesat-1 AWiFS)

Figure 4-4: Misclassified outputs for Gaussian-Spectral Resourcesat-1 AWiFS for 𝑚=1.04 and 𝜆=0.80 for

(a) Agricultural field with crop (b) Sal Forest (c) Eucalyptus plantations (d) Dry agricultural field without

crop (e) Moist agricultural field without crop (f) Water

Figure 4-3: Estimation of weight given to each kernel (𝜆) using (a) entropy and (b) mean membership

difference plot for Gaussian-Spectral kernel from AWiFS (Resourcesat-1)

(Annotations in Figure 4-4 mark the misclassification of sal forest as agriculture field with crop, of agriculture field with crop as moist agriculture, and of moist agriculture as sal plantations; the eucalyptus plantation patches are also indicated.)


4.2. RESULTS OF SUPERVISED FCM CLASSIFIER

To compare the performance of the FCM classifier with the KFCM classifier it was necessary to generate fractional images for the FCM classifier. 𝑚 was optimized for the FCM classifier using the entropy and mean membership difference methods. The optimized 𝑚 values for all six images are shown in Table 4-2. The maximum mean membership differences obtained for Resourcesat-1 and Resourcesat-2 were 1.0 and 0.80 respectively; the value for Resourcesat-2 is slightly lower than for Resourcesat-1 due to its higher radiometric resolution. The generated fraction images for FCM on Resourcesat-1 and -2 are shown in Figure 4-5 and Figure 4-6.

Table 4-2: Estimated optimized 𝑚 values for the FCM classifier along with the calculated mean membership difference (Δ) and entropy

                 AWiFS                     LISS-III                  LISS-IV
               Δ     Entropy     𝑚       Δ     Entropy     𝑚       Δ     Entropy     𝑚
RESOURCESAT-1  1.00  8.20e-009   1.35    1.00  3.54e-008   1.39    1.00  5.94e-005   1.34
RESOURCESAT-2  0.80  2.57e-136   1.02    0.80  2.29e-139   1.02    0.80  3.04e-136   1.01

Interpreting the fractional images, it can be seen that Resourcesat-2 classifies the land cover classes much better than Resourcesat-1, owing to its higher radiometric resolution. The optimized 𝑚 value for Resourcesat-2 was 1.01 for the LISS-IV imagery; as 𝑚 tends to 1, fuzziness decreases. In FCM, higher values of entropy are obtained as 𝑚 varies from 3.0 to 10.0, and higher values of entropy indicate higher uncertainty (Figure A-5 (i), Appendix A). Similarly, we can see that the mean membership difference approaches 1.0 for smaller values of 𝑚 (Figure A-5 (i), Appendix A). Considering eucalyptus plantation, among the datasets used the Resourcesat-2 LISS-III image has the least entropy, that is, the least uncertainty. Even then, merging of classes was found in the fraction images of the three classes agriculture field with crop, sal forest and eucalyptus plantation; the fraction image for the eucalyptus plantation highlights all three vegetation classes. The mean membership differences calculated for all the classes had a maximum value of 1.000.



Figure 4-5: Fractional images generated for optimized m values for FCM classifier for (1) LISS-IV, (2)

LISS-III and (3) AWiFS (Resourcesat-1) images with identified classes (a) Agricultural field with crop (b)

Sal Forest (c) Eucalyptus Plantation (d) Dry agricultural field without crop (e) Moist agricultural field

without crop and (f) Water


4.3. RESULTS OF FCM CLASSIFIER USING SINGLE KERNELS

Using the entropy method and the mean membership difference method, the value of 𝑚 was optimized for the KFCM classifier. The optimized 𝑚 values, along with their estimated entropy and mean membership difference measures, are given in Table 4-3 and Table 4-4 for the Resourcesat-1 and Resourcesat-2 datasets. The local kernels and the spectral angle kernel have a lower entropy than the global kernels and reach the maximum mean membership difference (Δ = 1.0). It was observed that for Resourcesat-1 the Inverse Multiquadratic (IM) kernel has the lowest entropy and the maximum mean membership difference (Δ) of 1.0. As three classes in the dataset fall under the category of vegetation, their mean feature vectors are almost the same; this may be the reason why the IM and SA kernels misclassify agriculture field as eucalyptus, sal, or a combination of the three. Nevertheless, IM was concluded to be the best single kernel for Resourcesat-1.

For Resourcesat-2, the Gaussian kernel with the Euclidean norm gave the best results; the fraction images generated for the Gaussian kernel with Resourcesat-2 show no misclassification.

Figure 4-6: Fractional images generated for optimized 𝑚 values for FCM classifier of (1) LISS-IV, (2) LISS-III and (3) AWiFS (Resourcesat-2) images with identified classes (a) Agricultural field with crop (b) Eucalyptus Plantation (c) Fallow Land (d) Sal Forest (e) Water

The maximum mean membership difference, however, was equal to 0.80, which shows less fuzziness as compared to IM

(Resourcesat-1). The radiometric resolution of Resourcesat-2 is higher than that of Resourcesat-1, and as a result the maximum mean membership differences of Resourcesat-2 for the three images were equal to 0.80. Overall, the Gaussian kernel has the lowest entropy; the corresponding fraction images generated are given in Figure 4-7.

Considering the fraction images of the linear, polynomial and sigmoid kernels, the three vegetation classes (agricultural field with crop, sal forest and eucalyptus plantation) do not highlight their corresponding feature classes; instead all three classes are merged with high membership values. For the class water these global kernels also merge in the patches of moist agricultural field without crop, and the entropy measure for these kernels was higher. The entropy measures and fractional images generated by the local kernels and the spectral angle kernel convey a much better classified output than those of the global kernels. Even then, the fractional images generated for the classes agricultural field with crop, sal and eucalyptus are merged in a few cases, because of the similarity in their spectral values.

Entropy measure of the classified outputs: The uncertainty of the classified results can be assessed using the entropy values. In this study, the classified outputs were generated in the form of fractional images. For Resourcesat-1, the lowest entropy was obtained for the Inverse Multiquadratic kernel, which comes under the category of local kernels. The highest entropy values were observed for the global kernels, i.e., uncertainty is greater in their case. This is clear from the fraction images generated by the global kernels: the fraction images generated by all three global kernels do not highlight their feature classes in the vegetation category; agriculture field with crop, sal forest and eucalyptus plantations are all merged in their fractional images, as these kernels are not able to differentiate between the spectral values of these classes. Neither the local kernels nor the spectral angle kernel shows this misclassification, which indicates a poor performance of the global kernels. For Resourcesat-2, the Gaussian kernel with the Euclidean norm has the overall lowest entropy value.


Table 4-3: Optimized 𝑚 values for local, global and spectral angle kernels for AWiFS, LISS-III and LISS-IV images (Resourcesat-1) along with the calculated Mean Membership Difference (Δ) and Entropy (E)

RESOURCESAT-1
                              AWiFS                      LISS-III                   LISS-IV
Kernels                     Δ     E          𝑚         Δ     E          𝑚         Δ     E          𝑚
GLOBAL
  Linear                    0.79  9.98e-004  1.01      0.79  4.17e-005  1.01      0.60  2.129      1.01
  Polynomial                0.76  0.2496     1.01      0.79  0.0115     1.01      0.40  2.145      1.01
  Sigmoid                   0.79  2.43e-003  1.01      0.78  4.51e-005  1.01      0.49  2.316      1.01
LOCAL
  Gaussian (Euclidean)      1.0   2.56e-012  1.27      1.0   4.41e-009  1.36      1.0   1.74e-004  1.36
  Radial Basis              1.0   3.26e-012  1.27      1.0   5.14e-009  1.36      1.0   1.84e-004  1.36
  KMOD                      1.0   2.65e-013  1.24      1.0   6.31e-009  1.35      1.0   1.58e-004  1.35
  Inverse Multiquadratic    1.0   9.29e-019  1.01      1.0   7.68e-054  1.01      1.0   1.09e-010  1.01
Spectral Angle              1.0   2.87e-011  1.14      1.0   2.25e-007  1.25      1.0   0.0055     1.17

Table 4-4: Optimized 𝑚 values for local, global and spectral angle kernels for AWiFS, LISS-III and LISS-IV images (Resourcesat-2) along with the calculated Mean Membership Difference (Δ) and Entropy (E)

RESOURCESAT-2
                              AWiFS                      LISS-III                   LISS-IV
Kernels                     Δ     E          𝑚         Δ     E          𝑚         Δ     E          𝑚
GLOBAL
  Linear                    0.79  8.94e-003  1.01      0.7   0.2556     1.01      0.75  0.3047     1.01
  Polynomial                0.43  0.4384     1.01      0.63  1.176      1.01      0.52  1.418      1.01
  Sigmoid                   0.79  0.01781    1.01      0.75  0.3727     1.01      0.74  0.3844     1.01
LOCAL
  Gaussian (Euclidean)      0.80  5.04e-138  1.02      0.80  1.05e-152  1.02      0.80  1.46e-153  1.02
  Radial Basis              0.80  1.05e-137  1.02      0.80  2.58e-151  1.02      0.80  4.52e-153  1.02
  KMOD                      0.80  2.34e-131  1.02      0.80  2.93e-146  1.02      0.80  7.82e-147  1.02
  Inverse Multiquadratic    0.79  1.38e-004  1.01      0.79  7.88e-018  1.02      0.80  4.21e-014  1.01
Spectral Angle              0.80  9.76e-125  1.01      0.80  9.72e-104  1.01      0.80  1.59e-126  1.01




Figure 4-7: Generated fractional images for optimized 𝑚 values for Resourcesat-1 LISS-IV for (i) Linear

(ii) Polynomial (iii) Sigmoid (iv) Gaussian kernel using Euclidean norm (v) Radial Basis (vi) KMOD (vii)

Inverse Multiquadratic and (viii) Spectral Angle kernels for classes identified as (a) Agricultural field with

crop (b) Sal forest (c) Eucalyptus plantations (d) Dry agricultural field without crop (e) Moist agricultural

field without crop and (f) Water





Figure 4-8: Generated fractional images for optimized 𝑚 values for Resourcesat-2 LISS-IV for (i) Linear (ii) Polynomial (iii) Sigmoid (iv) Gaussian kernel using Euclidean norm (v) Radial Basis (vi) KMOD (vii) Inverse Multiquadratic and (viii) Spectral Angle kernels for classes identified as (a) Agricultural field with crop (b) Eucalyptus Plantation (c) Fallow Land (d) Sal Forest (e) Water

Table 4-5 shows the maximum mean membership difference values for the optimized 𝑚 values of each kernel for the Resourcesat-1 AWiFS dataset. The global kernels give the lowest values compared to the local kernels and the spectral angle kernel. The water class, being more homogeneous than the other classes, gives high mean membership difference values even for the global kernels; the values calculated for the other classes, however, are much lower. The fractional images generated for the local kernels and the spectral angle kernel highlight features of their corresponding feature class: for instance, for sal forest, patches were better visible for the local kernels than for the global and spectral angle kernels. Even though the spectral angle kernel has the highest mean membership difference, it can be seen from the fraction images that different classes merge with other classes.



Table 4-5: Maximum mean membership difference values estimated for optimized values of 𝑚 (Resourcesat-1 AWiFS)

Kernels               Agricultural   Sal     Eucalyptus   Dry Agricultural   Moist Agricultural   Water
                      field with     Forest  Plantations  field without      field without
                      crop                                crop               crop
Linear                0.342          0.984   0.233        0.975              0.647                1.000
Polynomial            0.341          0.976   0.232        0.973              0.645                1.000
Sigmoid               0.323          0.728   0.228        0.384              0.393                0.999
Gaussian (Euclidean)  1.000          1.000   1.000        1.000              1.000                1.000
Radial Basis          1.000          1.000   1.000        1.000              1.000                1.000
IM                    1.000          1.000   1.000        1.000              1.000                1.000
KMOD                  1.000          1.000   1.000        1.000              1.000                1.000
SA                    1.000          1.000   1.000        1.000              1.000                1.000

Note: Highlighted values for kernels denote the acceptable values.

4.4. RESULTS OF FCM CLASSIFIER USING COMPOSITE KERNELS

Composite kernels were tested to incorporate the spatial properties of the global/local kernels and the spectral properties of the spectral angle kernel. For this study, five combinations of composite kernels were examined. For Resourcesat-1 and -2, the IM kernel and the Gaussian kernel with the Euclidean norm were the best single local kernels; to mix in the spectral properties, these kernels were combined with the spectral angle kernel. A combination of a local kernel and a global kernel was also considered. Even though the linear kernel did not give good results, it was combined with the spectral angle kernel to check for an improvement in performance. Table 4-6 and Table 4-7 show the five combinations of composite kernels and their optimized 𝑚 and 𝜆 values, along with the calculated entropy and mean membership difference, for Resourcesat-1 and Resourcesat-2 respectively.

Among the different combinations of composite kernels, the lowest entropy value was obtained for Resourcesat-1 with the IM-Spectral kernel, which is a combination of a local kernel and the spectral angle kernel. When a combination of a local kernel and the spectral angle kernel (a local-spectral kernel) was compared with a combination of a global kernel and the spectral angle kernel (a global-spectral kernel), the former performed better. There was little difference between the entropy values of the single linear kernel and the


composite linear-spectral kernel. The fractional images generated for the combinations of different kernels for Resourcesat-1 are shown in Figure 4-9. It can be seen that for the Gaussian-Spectral kernel there is misclassification between the sal forest and eucalyptus plantations. Visually, the fractional images generated by the linear-spectral kernel do not highlight the classes considered, thus indicating misclassification.

Considering the entropy values, the best composite kernels are: the Resourcesat-1 IM-Spectral kernel, the Resourcesat-2 Gaussian-Spectral kernel, the Resourcesat-1 IM-Linear kernel, and the Resourcesat-1 and -2 Linear-Spectral kernels. The accuracy assessment results for the selected kernels are given in the next section.

Table 4-6: Optimized 𝑚 values for composite kernels for AWiFS, LISS-III and LISS-IV images (Resourcesat-1) along with the calculated Mean Membership Difference (Δ), Entropy (E) and weight given to each kernel (𝜆)

RESOURCESAT-1
                      AWiFS                           LISS-III                        LISS-IV
Kernels             Δ     E          𝜆     𝑚        Δ     E          𝜆     𝑚        Δ     E          𝜆     𝑚
Gaussian-Spectral   1.00  0.92e-005  0.91  1.30     1.00  1.06e-003  0.94  1.30     1.00  3.62e-004  0.90  1.34
IM-Spectral         0.99  2.29e-018  0.99  1.01     1.00  1.91e-051  0.99  1.01     0.99  1.91e-010  0.99  1.01
Gaussian-Linear     1.00  1.68e-243  0.90  1.26     1.00  1.52e-007  0.92  1.27     1.00  3.11e-007  0.90  1.24
IM-Linear           1.00  3.11e-019  0.99  1.01     1.00  2.61e-054  0.91  1.01     1.00  1.03e-010  0.95  1.01
Linear-Spectral     0.78  0.7818     0.99  1.01     0.79  2.19e-004  0.99  1.01     0.99  2.488      0.99  1.01


Table 4-7: Optimized 𝑚 values for composite kernels for AWiFS, LISS-III and LISS-IV images (Resourcesat-2) along with the calculated Mean Membership Difference (Δ), Entropy (E) and weight given to each kernel (𝜆)

RESOURCESAT-2
                      AWiFS                           LISS-III                        LISS-IV
Kernels             Δ     E          𝜆     𝑚        Δ     E          𝜆     𝑚        Δ     E          𝜆     𝑚
Gaussian-Spectral   0.80  1.86e-006  0.95  1.30     0.80  2.65e-007  0.91  1.27     0.80  8.00e-011  0.95  1.21
IM-Spectral         0.79  2.99e-004  0.99  1.01     0.79  1.69e-017  0.99  1.01     0.79  8.64e-017  0.99  1.01
Gaussian-Linear     0.80  3.13e-194  0.90  1.30     0.80  2.45e-008  0.91  1.24     0.80  5.46e-012  0.96  1.19
IM-Linear           0.79  8.94e-005  0.94  1.01     0.79  4.97e-018  0.91  1.01     0.79  2.55e-014  0.90  1.01
Linear-Spectral     0.79  0.029      0.95  1.01     0.74  0.4939     0.99  1.01     0.71  0.540      0.99  1.01



Figure 4-9: Generated fractional images for optimized 𝑚 values of Resourcesat-1 LISS-IV for (i) Gaussian-Spectral (ii) IM-Spectral (iii) Gaussian-Linear (iv) IM-Linear (v) Linear-Spectral for classes identified as (a) Agricultural field with crop (b) Sal Forest (c) Eucalyptus Plantation (d) Dry agricultural field without crop (e) Moist agricultural field without crop (f) Water


4.5. ACCURACY ASSESSMENT RESULTS

In order to assess the accuracy of the classified outputs generated by FCM and by FCM using single or composite kernels, an image-to-image accuracy assessment approach was selected. The high resolution LISS-IV image was used for the reference dataset generation. Section 3.5 explains various accuracy assessment methods; among them, FERM was used in this research study because soft outputs were generated. The assessed fuzzy overall accuracy measures for the selected best kernels, FCM and the composite kernels are shown in Table 4-8. Accuracy was assessed for the AWiFS image with LISS-III and LISS-IV as reference images, and for the LISS-III image with the LISS-IV image as the reference dataset. The accuracy measure obtained for the FCM classifier helps to compare its performance with the KFCM classifier.

Table 4-8: Accuracy assessment results for FCM, best single kernels and best composite kernels

CLASSIFIER          AWiFS v/s LISS-III    AWiFS v/s LISS-IV    LISS-III v/s LISS-IV
                    R1 (%)   R2 (%)       R1 (%)   R2 (%)      R1 (%)   R2 (%)
FCM                 84.43    82.57        77.95    80.21       82.35    84.07
IM kernel           97.30    -            96.97    -           97.63    -
Gaussian kernel     -        78.01        -        76.42       -        83.03
SA kernel           68.40    55.14        65.22    54.24       67.36    76.07
Linear kernel       84.42    76.27        67.49    51.09       70.63    53.72
IM-Spectral         45.80    -            29.37    -           55.22    -
Gaussian-Spectral   -        61.56        -        29.37       -        59.27
Linear-Spectral     83.93    65.10        56.30    46.73       57.55    66.53

The accuracy measure obtained for the FCM classifier helps to compare its performance with the KFCM classifier. Using the entropy and mean membership difference methods, the best single kernels fall under the category of local kernels. To combine global with local properties, a combination of a local and a global kernel was considered; similarly, to mix global properties with spectral properties, a spectral-global kernel was considered and its accuracy assessed. When interpreting the fractional images generated using the linear kernel, a higher misclassification rate is observed for the vegetation classes. Of the two best single kernels, the IM kernel has higher fuzzy accuracies, equal to 97.30% and 96.97% for the Resourcesat-1 AWiFS dataset with LISS-III and LISS-IV as reference respectively. These are higher than those of the FCM classifier, equal to 84.43% and 77.95% for Resourcesat-1 AWiFS respectively. The accuracy for the LISS-III dataset was higher than for the AWiFS dataset due to an


increase in spatial resolution. The KFCM classifier thus has a higher accuracy than the FCM classifier. Using the Gaussian kernel, overall accuracies of 76.42% and 83.03% were obtained for the Resourcesat-2 AWiFS and LISS-III images respectively; these are slightly lower than the accuracies obtained with the FCM classifier. The composite kernels show the lowest overall accuracy compared to the single kernels and FCM. Among the composite kernels considered, the Gaussian-Spectral kernel showed the highest overall accuracy, 29.37% and 59.27% for the Resourcesat-2 AWiFS and LISS-III images respectively (Table B-1 to Table B-36, Appendix B).

4.6. UNTRAINED CLASSES

Classes that are ignored by the analyst during the training stage of a classifier are referred to as untrained classes. Untrained classes can show high membership values for spectrally different classes and thus decrease the classification accuracy (Foody, 2000). In this research work, the mean class values of agricultural field with crop were withheld from the KFCM classifier for both the Resourcesat-1 and Resourcesat-2 datasets. Tables 4-9 to 4-12 compare the fuzzy user's accuracy of the best single kernels, Inverse Multiquadratic and Gaussian, for both the trained and untrained case. The detailed accuracy assessment measures are given in Appendix B.

Table 4-9: Comparison of accuracy assessment in trained as well as untrained case for IM kernel and FCM for AWiFS with LISS-III image (Resourcesat-1)

Fuzzy User's Accuracy                  Inverse Multiquadratic   FCM
                                       Trained    Untrained     Trained    Untrained
Sal Forest                             99.26      51.16         85.60      49.38
Eucalyptus Plantation                  97.84      67.44         88.43      74.13
Dry Agricultural Field Without Crop    97.20      45.05         73.11      31.48
Moist Agricultural Field Without Crop  93.97      25.56         65.23      43.27
Water                                  96.31      88.63         95.30      97.62
Average User's Accuracy                97.31      55.57         82.85      59.18


Table 4-10: Comparison of accuracy assessment in trained as well as untrained case for IM kernel and FCM for LISS-III with LISS-IV image (Resourcesat-1)

Fuzzy User's Accuracy                  Inverse Multiquadratic   FCM
                                       Trained    Untrained     Trained    Untrained
Sal Forest                             98.90      61.61         84.66      65.29
Eucalyptus Plantation                  98.87      81.32         91.24      83.63
Dry Agricultural Field Without Crop    96.87      24.12         73.02      65.29
Moist Agricultural Field Without Crop  97.11      30.96         69.35      55.14
Water                                  97.54      94.43         94.69      94.06
Average User's Accuracy                97.68      58.49         82.35      68.95

The Inverse Multiquadratic (IM) kernel was identified earlier as the best single kernel for Resourcesat-1, and the Gaussian kernel with Euclidean norm for Resourcesat-2. When the average user's accuracy of the untrained case is compared to the trained classifier, a decrease is observed. For Resourcesat-1 AWiFS, the average user's accuracy decreased by 41.74% and 23.67% for the IM kernel and simple FCM respectively. For Resourcesat-2, there is an 18.94% and 21.14% decrease in the average user's accuracy for the Gaussian kernel using Euclidean norm and FCM respectively, for the AWiFS image assessed with the LISS-III image (Table 4-11). For Resourcesat-1 AWiFS using the Spectral Angle (SA) kernel there is a decrease of 11.74% in the untrained case. A more detailed accuracy assessment for the untrained case is given in Appendix B (Appendix B.7 to B.10). Figure 4-10 shows a graphical comparison between the trained and untrained values.

Table 4-11: Comparison of accuracy assessment in trained as well as untrained case for Gaussian kernel using Euclidean norm and FCM for AWiFS with LISS-III image (Resourcesat-2)

Fuzzy User's Accuracy      Gaussian kernel (Euclidean norm)   FCM
                           Trained    Untrained               Trained    Untrained
Eucalyptus Plantation      90.66      70.08                   84.62      70.07
Fallow Land                61.97      50.99                   73.51      58.20
Sal Plantations            86.62      29.60                   86.26      24.59
Water                      97.88      95.82                   95.61      96.09
Average User's Accuracy    80.56      61.62                   83.38      62.24


Figure 4-10: Graphical representation of average user's accuracy for the trained and untrained case for the IM kernel and FCM at optimized 𝑚, Resourcesat-1: (a) AWiFS, (b) LISS-III.

Table 4-12: Comparison of accuracy assessment in trained as well as untrained case for Gaussian kernel using Euclidean norm and FCM for LISS-III with LISS-IV image (Resourcesat-2)

Fuzzy User's Accuracy      Gaussian kernel (Euclidean norm)   FCM
                           Trained    Untrained               Trained    Untrained
Eucalyptus Plantation      84.16      76.56                   84.88      75.90
Fallow Land                85.39      77.30                   78.83      67.69
Sal Plantations            78.10      29.51                   79.14      21.93
Water                      94.90      97.05                   97.10      96.63
Average User's Accuracy    82.96      70.10                   84.21      65.54



5. DISCUSSION

The present chapter discusses the various results obtained from the three classification approaches used. In this study, different single as well as composite kernels were incorporated into the FCM objective function to handle non-linearity in the data. The main objective of this research was to optimally separate non-linear classes using the KFCM approach.
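The substitution at the heart of KFCM can be stated compactly: the Euclidean distance in the FCM objective is replaced by the distance induced by a kernel in the high-dimensional feature space. A minimal sketch of that induced distance follows; the Gaussian kernel and parameter names are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # local (RBF) kernel with Euclidean norm
    return np.exp(-np.linalg.norm(a - b) ** 2 / (2.0 * sigma ** 2))

def kernel_distance_sq(a, b, k):
    """Squared distance between a and b in the kernel-induced feature
    space: ||phi(a) - phi(b)||^2 = k(a, a) - 2 k(a, b) + k(b, b).
    Using this in place of ||x - v||^2 in the FCM objective yields KFCM."""
    return k(a, a) - 2.0 * k(a, b) + k(b, b)
```

For kernels with k(x, x) = 1, such as the Gaussian, this reduces to 2(1 − k(a, b)).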

Classification problems can be addressed by various classifiers, each of which may suit a specific dataset. The spectral characteristics of different class labels can differ in their geometric structure across bands. Classes that can be separated using a linear decision boundary are the simplest case. Non-linear data structures can exist due to variation in the spectral values: a high variation observed in the spectral values of one band may be lower in another band, which leads to non-linearity in the data. Figure 6-1 shows the presence of non-linearity in the datasets used in this study. Non-linearity was observed in the band 1-band 2 scatterplot for the classes agricultural field with crop, sal plantations, eucalyptus plantation and water. It is clear that these classes cannot be separated using a linear decision boundary.

The choice of the most pertinent kernel depends on the problem under study. Pal (2009) used five kernels, namely the linear, polynomial, sigmoid, radial basis and linear spline kernels, for image classification. Considering the problem of non-linearity, eight kernels (Table 4-3) were considered for this study. Three types of single kernels exhibiting dissimilar properties were considered, and these were integrated to give composite kernels. Camps-Valls et al. (2006) described different methods to combine single kernels for hyperspectral image classification; among them, the weighted summation method has been adopted in this study.
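In the weighted summation scheme, a valid composite kernel is formed as a convex combination of two Mercer kernels, K(x, y) = λK1(x, y) + (1 − λ)K2(x, y) with λ in [0, 1]. A small sketch under those assumptions; the particular kernel choices here are illustrative.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # local kernel with Euclidean norm
    return np.exp(-np.linalg.norm(a - b) ** 2 / (2.0 * sigma ** 2))

def linear_kernel(a, b):
    # global kernel: plain inner product
    return float(np.dot(a, b))

def composite_kernel(a, b, k1, k2, lam=0.5):
    """Weighted summation of two Mercer kernels (Camps-Valls et al., 2006);
    lam weights the first kernel, 1 - lam the second."""
    return lam * k1(a, b) + (1.0 - lam) * k2(a, b)
```

Since a convex combination of positive semi-definite kernels is again positive semi-definite, the composite remains a valid kernel for any λ in [0, 1].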

The initial focus in this research work was on optimizing the different parameters of the classifiers. Setting optimum parameter values is important for their successful performance, and these values may change with the dataset used. Optimal values of 𝑚 were obtained based on the minimum entropy and maximum mean membership difference criteria. For FCM, FCM using single kernels and FCM using composite kernels, the entropy becomes very high for values of 𝑚 above 4.0, while the maximum mean membership difference was obtained for values of 𝑚 between 1.01 and 2.0. Based on this interpretation, from Table 4-2, 𝑚 was optimized at 1.35, 1.39 and 1.34 for the Resourcesat-1 AWiFS, LISS-III and LISS-IV images respectively (Appendix A.5). Optimized values of 𝑚 for the single as well as composite kernels for all six images are given in Table 4-3, Table 4-4 and Table 4-6.
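The two selection criteria can be written down directly. The sketch below uses one common form of the membership entropy (averaged over pixels, base-2 logarithm assumed) and defines the mean membership difference as the mean gap between the two largest memberships per pixel; the exact normalizations used in the thesis may differ.

```python
import numpy as np

def membership_entropy(u, eps=1e-12):
    """Average per-pixel entropy of a (pixels x classes) membership
    matrix; lower values indicate less classification uncertainty."""
    u = np.clip(u, eps, 1.0)
    return float(-(u * np.log2(u)).sum(axis=1).mean())

def mean_membership_difference(u):
    """Mean gap between the largest and second-largest membership of
    each pixel; values near 1 indicate near-crisp assignments."""
    top2 = np.sort(u, axis=1)[:, -2:]
    return float((top2[:, 1] - top2[:, 0]).mean())
```

A near-crisp classification gives entropy close to 0 and a mean membership difference close to 1; a completely ambiguous one gives the opposite, which is why 𝑚 is chosen to minimize the former and maximize the latter.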

FCM resulted in an overall accuracy of 77.95% and 82.35% for the AWiFS and LISS-III images of Resourcesat-1. Past studies have shown FCM to give an overall accuracy of 80.89% and 81.83% for the AWiFS images of Resourcesat-1 and Resourcesat-2 respectively (Singha, 2013). Among the single kernels,


the IM kernel produced the highest overall classification accuracy, 96.97% and 97.63% (Table 4-8), for the Resourcesat-1 AWiFS and LISS-III images. When comparing the classification accuracy of KFCM with FCM, there was an overall increase of about 21% and 15% for the AWiFS and LISS-III datasets respectively. There was an overall decrease in average user's accuracy from 97.31% to 55.57% for the IM kernel on Resourcesat-1 AWiFS in the untrained case (Table 4-9). The composite kernels resulted in the lowest classification accuracy when compared to FCM and KFCM. From Table 4-8, the Gaussian-Spectral kernel has the highest overall accuracy among the composite kernels, 29.37% and 59.27% for the Resourcesat-2 AWiFS and LISS-III images respectively. Using the weighted summation combination approach (Camps-Valls et al., 2006), the composite kernels gave the lowest overall accuracy; this depends on the performance of the single kernels taken in the combination, since a weak single kernel lowers the accuracy of the composite.

For local kernels, the kernel value is high only when a sample point lies close to points of the same cluster; their influence decays with distance. This suits the dataset used in this research work, which contains large homogeneous areas, and explains why the performance of the local kernels is better than that of the global category. Global kernels, by contrast, allow sample points of the same class that lie far apart to still influence the kernel values; a sample point may be distant yet belong to a sub-cluster of another class. In more heterogeneous areas this behaviour is harmful, and thus the performance of the global kernels was the lowest among the categories considered. In agricultural fields with crop, sowing or harvesting is done at different times, which produces variation within and between agricultural fields. The sal and eucalyptus plantations are also heterogeneous, due to small grasslands within the forest patches as well as variation among the sal or eucalyptus trees themselves. This could be why a few kernels give poor results for the vegetation classes.
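The contrast described above can be made concrete: a local kernel's value decays as two points move apart, whereas a global kernel's value depends on the inner product and can stay large for distant points. A toy illustration under those definitions; the points and parameters are arbitrary.

```python
import numpy as np

def inverse_multiquadric(a, b, c=1.0):
    # local kernel: value decays as the points move apart
    return 1.0 / np.sqrt(np.linalg.norm(a - b) ** 2 + c ** 2)

def linear(a, b):
    # global kernel: depends on the inner product, not on proximity
    return float(np.dot(a, b))

near, far, x = np.array([1.0, 1.0]), np.array([5.0, 5.0]), np.array([1.1, 0.9])
# the local kernel assigns a much higher value to the nearby point...
assert inverse_multiquadric(x, near) > inverse_multiquadric(x, far)
# ...while the global kernel can still score the distant point highly
assert linear(x, far) > linear(x, near)
```

This is the mechanism behind the observed results: in heterogeneous vegetation, distant samples of other classes still contribute to a global kernel's value, blurring the class boundaries.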

When the fractional images generated by the linear, polynomial and sigmoid kernels were interpreted, low membership values were found, and all the vegetation classes showed similar membership values in their corresponding fractional images. From Figure 4-7, it can be observed that in the water fraction image the moist agricultural field without crop class shows high membership values. This shows that global kernels cannot, like the local kernels, separate classes with only small variations in spectral values, which is why the fractional images for agricultural field with crop, sal forest and eucalyptus plantation show similar membership values for the global kernels. It can thus be concluded that the performance of a particular kernel depends on how well it can differentiate small changes in spectral behaviour.

Classification was also tested for untrained classes, where the classifier was not trained on one of the classes (in this work, agricultural field with crop was not used for training). There is an overall decrease in the average user's accuracy in the untrained case as compared to the trained case (Table 4-9). Figure 4-10 shows the graphical representation of the trained and untrained case for FCM and the IM kernel. Untrained agricultural field with crop pixels have been merged into the sal or eucalyptus classes, due to which the average user's accuracy is reduced (Table B.37 to Table B.54, Appendix B.7 to B.10).


Considering the overall classification accuracy, it can be concluded that KFCM with the IM kernel performs better than the FCM classifier. The KFCM classifier also reduces the mixed pixel problem because of its fuzzy nature. To conclude that KFCM performs better than FCM in general, the classification must be performed on images of varying resolutions; in this study, all the kernels were tested on both medium and coarser resolution images. The behaviour of the different kernels may also differ with the datasets used, so these kernels with fuzzy classifiers should still be tested on a large number of different datasets.


6. CONCLUSIONS AND RECOMMENDATIONS

6.1. CONCLUSIONS

The resolution of remote sensing images plays a significant role in the occurrence of mixed pixels. The presence of mixed pixels is a problem that may result in inaccurate classification results. Sub-pixel classification methods such as FCM and Artificial Neural Networks (ANN) are a solution to these uncertain situations. In addition, classes that appear non-linear may be difficult to separate from each other using a straight line or a hyperplane, and attempting to do so may reduce classification accuracy. Thus, to address the problems of non-linearity and mixed pixels, a kernel based fuzzy approach has been tested in this study.

The main objective of this research work was to optimally separate non-linear classes using the KFCM approach. From the comparative evaluation of the various sub-pixel classifiers used, the KFCM classifier with the IM kernel achieved the highest overall classification accuracy. It was also observed that the optimal values of the different parameters, the weighting constant 𝑚 and the weight 𝜆 given to each kernel, played a significant role in the performance of the KFCM based classifier.

To assess the accuracy of soft classification, a choice of methods is available; among them, the Fuzzy Error Matrix (FERM) was recommended. A decrease in accuracy values was seen when the coarser resolution AWiFS image was assessed with the finer resolution LISS-IV image, due to the increase in spatial resolution of the reference image. This shows that information extracted from a finer resolution image is closer to its ground truth. It was also observed that a change in the accuracy assessment and mean membership values resulted from the increased radiometric resolution of Resourcesat-2 in comparison to Resourcesat-1.

Among the various single kernels used, the IM kernel and the Gaussian kernel with Euclidean norm have the highest overall performance. The IM kernel has the highest overall accuracy, 97.30% for Resourcesat-1 AWiFS, compared to the others (Table 4-8). Among the composite kernels, the Gaussian-Spectral kernel has about 61.56% overall accuracy, which is lower than that of the single kernels. Composite kernel performance depends on the performance of the single kernels used to frame the composite: if a good single kernel with low entropy is combined with a kernel having high entropy, the resulting composite kernel will perform worse. Other methods, such as the stacked approach, direct summation and cross-information methods, are recommended for combining two single kernels into composite kernels (Camps-Valls et al., 2006). In this study, the effect on the accuracy assessment results of dropping agricultural field with crop as an untrained class was also examined.

To conclude, the KFCM classifier performed better than the FCM classifier. The presence of non-linear data and mixed pixels is a cause of lower classification accuracy for conventional classifiers, but with the kernel based fuzzy approach tested here it need no longer be considered a problem.


6.2. ANSWERS TO RESEARCH QUESTIONS

A-1 How can non-linearity within class boundaries be effectively handled using KFCM?

Answer: Samples in a dataset that cannot be separated using a straight line or a hyperplane are said to be non-linear. The Resourcesat-1 LISS-IV image used in this research work contains non-linear data, as is clear from the 2D scatterplots in Figure 6-1, taking two bands at a time. Considering Figure 6-1(b), the samples taken for agricultural field with crop, sal forest and eucalyptus plantation appear non-linear and cannot be separated using a hyperplane.

Figure 6-1: Non-linearity in different classes, shown as 2D scatterplots for the Resourcesat-1 LISS-IV image in (a) band 1-band 2, (b) band 2-band 3, (c) band 1-band 3, for the identified classes: agricultural field with crop, sal forest, eucalyptus plantations, dry agricultural field without crop, moist agricultural field without crop, and water.

Thus, these non-linear samples are mapped to a higher dimensional space using kernel functions, where they become linearly separable, removing the non-linearity present in the input space. Even though the linear separation in the higher dimensional space cannot be visualized, it can be demonstrated by comparing the classification accuracies of FCM and KFCM.
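The mapping idea can be illustrated with a toy example (not the thesis data): two classes on concentric circles are not linearly separable in the 2-D input space, but an explicit feature map that appends the squared norm, the kind of coordinate an RBF or polynomial kernel supplies implicitly, makes them separable by a single threshold.

```python
import numpy as np

# Two classes on concentric circles: not separable by a line in 2-D input space.
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 50)
inner = np.c_[np.cos(angles), np.sin(angles)]          # radius 1
outer = 3.0 * np.c_[np.cos(angles), np.sin(angles)]    # radius 3

def phi(x):
    # explicit feature map (x1, x2) -> (x1, x2, x1^2 + x2^2); the squared-norm
    # coordinate is what a kernel method would supply implicitly
    return np.c_[x, (x ** 2).sum(axis=1)]

# In the lifted 3-D space a single threshold on the new coordinate separates
# the classes, i.e. the mapped data are linearly separable.
assert phi(inner)[:, 2].max() < phi(outer)[:, 2].min()
```

Kernel methods such as KFCM achieve the same effect without ever computing phi explicitly, evaluating only inner products in the lifted space.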

B-1 How can mixed pixels be handled using KFCM?

Answer: Mixed pixels occur when more than one land cover class falls within a single pixel. The FCM algorithm handles the occurrence of mixed pixels by estimating the membership values for each land cover


class within a pixel, and thus increases the classification accuracy. As a fuzzy approach is used in this study, KFCM handles mixed pixels in the same way as FCM does.
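The membership estimation referred to here is the standard FCM update of Bezdek et al. (1984), u_ic = 1 / Σ_j (d_ic/d_jc)^(2/(m−1)). A minimal sketch under that formula; the array shapes and names are illustrative.

```python
import numpy as np

def fcm_memberships(pixels, centers, m=2.0, eps=1e-12):
    """Standard FCM membership update (Bezdek et al., 1984):
    u_ic = 1 / sum_j (d_ic / d_jc)^(2 / (m - 1)).
    pixels: (n_pixels, n_bands); centers: (n_classes, n_bands).
    Each row of the result sums to 1, so a mixed pixel receives
    fractional membership in several classes."""
    # pairwise Euclidean distances, shape (n_pixels, n_classes)
    d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2) + eps
    # ratio[n, i, j] = d[n, i] / d[n, j]
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)
```

A pixel lying midway between two class means receives a membership of 0.5 in each class, which is exactly the fractional-image representation of a mixed pixel; KFCM replaces the Euclidean distances with kernel-induced ones but keeps this fuzzy allocation.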

C-1 How to evaluate the performance of single/composite kernels in KFCM?

Answer: The uncertainty in the different single or composite kernels can be measured using the entropy values calculated for both cases. The accuracy can be improved by optimizing the value of 𝑚, which matches the fuzziness on the ground with the fuzziness in the image; optimization of 𝑚 was done by selecting the value with minimum entropy and maximum mean membership difference. As the approach used was fuzzy, the classified outputs were fractional images. The performance of a single or composite kernel can then be evaluated using an image-to-image accuracy assessment technique, where a higher resolution image is used to evaluate the performance of a coarser resolution image; the Fuzzy Error Matrix (FERM) was used to assess the classification accuracy.

D-1 To which degree is the FCM classification algorithm capable of handling non-linear feature vectors of different classes for classification?

Answer: The FCM classifier shows a reduction in accuracy of about 21% and 15% for the AWiFS and LISS-III datasets when compared with KFCM. This decrease in overall accuracy shows the limitation of the FCM algorithm in handling non-linear feature vectors in the input space. This holds for the trained classifier; for the untrained case, there is additionally a decrease in average user's accuracy compared with the corresponding trained case.

E-1 What will be the effect of using composite kernels as compared to single kernels?

Answer: Composite kernels are used to incorporate spectral properties together with the local-proximity or global-proximity properties of local or global kernels in a classified image. In this research work, a local as well as a global kernel was combined with a spectral kernel, and combinations of local and global kernels were also studied. However, it was found that the composite kernels have reduced accuracy compared to FCM as well as to FCM using a single kernel.


6.3. RECOMMENDATIONS

For every research project, it is highly important to assess the quality of the end product. Various researchers have introduced many algorithms for effective classification of data. Even though the KFCM classifier addresses the problem of non-linearity, it does not solve the problem of overlap between different classes, so the classifier used in this research work still has limitations. The KFCM classification technique could be improved with the following points under consideration:

- The Possibilistic c-Means (PCM) algorithm has been shown to deal with noise and outliers (Krishnapuram and Keller, 1996). A Kernel based Possibilistic c-Means (KPCM) algorithm could therefore be studied to improve on the KFCM classification.

- For the composite kernels, the weighted summation method has been used. Other methods, such as the stacked approach and the direct summation kernel (Camps-Valls et al., 2006), could be used to study the behaviour of composite kernels.

- The classified results could be improved by further optimizing the weighting constant 𝑚. In this research work, the mean membership difference value has been taken as 1.000 (the ideal case), which may not hold when the fuzziness in the image is matched to the fuzziness on the ground (a value less than 1.000).

- Unsupervised Kernel based Fuzzy c-Means (KFCM) clustering could be carried out, in which the mean feature vectors are not initialized from the signature data of the various classes.


REFERENCES

Atkinson, P. M., Cutler, M. E. J., & Lewis, H. (1997). Mapping sub-pixel proportional land cover with AVHRR imagery. International Journal of Remote Sensing, 18(4), 917–935.

Awan, A. M., & Sap, M. N. M. (2005). Clustering spatial data using a kernel-based algorithm. In Proceedings of the Annual Research Seminar (pp. 306–310).

Ayat, N. E., Cheriet, M., Remaki, L., & Suen, C. Y. (2001). KMOD-A New Support Vector Machine Kernel With Moderate Decreasing. IEEE, 1215–1219.

Ben-Hur, A., Horn, D., Siegelmann, H. T., & Vapnik, V. (2001). Support Vector Clustering. Journal of Machine Learning Research, 2, 125–137.

Bezdek, J. C., Ehrlich, R., & Full, W. (1984). FCM: The Fuzzy c-Means Clustering Algorithm. Computers & Geosciences, 10, 191–203.

Bhatt, S. R., & Mishra, P. K. (2013). Study of Local Kernel with Fuzzy C Mean Algorithm. International Journal of Advanced Research in Computer Science & Software Engineering, 3(12), 636–639.

Bhatt, S. R., & Mishra, P. K. (2014). Analysis of Global Kernels Using Fuzzy C Means Algorithm. International Journal of Advanced Research in Computer Science and Software Engineering, 4(6), 79–82.

Binaghi, E., Brivio, P. A., Ghezzi, P., & Rampini, A. (1999). A fuzzy set-based accuracy assessment of soft classification. Pattern Recognition Letters, 20, 935–948.

Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A Training Algorithm for Optimal Margin Classifiers. In 5th Annual ACM workshop on COLT (pp. 144–152).

Campbell, J. B. (1996). Introduction to Remote Sensing (pp. 337–349).

Camps-Valls, G., & Bruzzone, L. (2005). Kernel-Based Methods for Hyperspectral Image Classification. IEEE Transactions on Geoscience and Remote Sensing, 43(6), 1351–1362.

Camps-Valls, G., & Bruzzone, L. (2009). Kernel Methods for Remote Sensing Data Analysis (pp. 25–45).

Camps-Valls, G., Gómez-Chova, L., Calpe-Maravilla, J., Martín-Guerrero, J. D., Soria-Olivas, E., Alonso-Chordá, L., et al. (2004). Robust Support Vector Method for Hyperspectral Data Classification and Knowledge Discovery. IEEE Transactions on Geoscience and Remote Sensing, 42(7), 1530–1542.

Camps-Valls, G., Gómez-Chova, L., Muñoz-Marí, J., Vila-Francés, J., & Calpe-Maravilla, J. (2006). Composite Kernels for Hyperspectral Image Classification. IEEE Geoscience and Remote Sensing Letters, 3(1), 93–97.

Camps-Valls, G., Gómez-Chova, L., Muñoz-Marí, J., Rojo-Álvarez, J. L., & Martínez-Ramón, M. (2008). Kernel-Based Framework for Multitemporal and Multisource Remote Sensing Data Classification and Change Detection. IEEE Transactions on Geoscience and Remote Sensing, 46(6), 1822–1835.


Cannon, R. L., Dave, J. V., Bezdek, J. C., & Trivedi, M. M. (1986). Segmentation of a Thematic Mapper Image Using the Fuzzy c-Means Clustering Algorithm. IEEE Transactions on Geoscience and Remote Sensing, 24(3), 400–408.

Chawla, S. (2010). Possibilistic c-Means - Spatial Contextual Information based sub-pixel classification approach for multi-spectral data.

Choodarathnakara, A. L., Kumar, D. T. A., Koliwad, D. S., & Patil, D. C. G. (2012a). Mixed Pixels : A Challenge in Remote Sensing Data Classification for Improving Performance. International Journal of Advanced Research in Computer Engineering & Technology, 1(9), 261–271.

Choodarathnakara, A. L., Kumar, D. T. A., Koliwad, D. S., & Patil, D. C. G. (2012b). Soft Classification Techniques for RS Data. IJCSET, 2(11), 1468–1471.

Congalton, R. G. (1991). A Review of Assessing the Accuracy of Classifications of Remotely Sensed Data. Remote Sensing of Environment, 37, 35–46.

Dehghan, H., & Ghassemian, H. (2006). Measurement of uncertainty by the entropy: application to the classification of MSS data. International Journal of Remote Sensing, 27(18), 4005–4014.

Filippone, M., Camastra, F., Masulli, F., & Rovetta, S. (2008). A survey of kernel and spectral methods for clustering. Pattern Recognition, 41, 176–190.

Fisher, P. F., & Pathirana, S. (1990). The Evaluation of Fuzzy Membership of Land Cover Classes in the Suburban Zone. Remote Sensing of Environment, 34, 121–132.

Foody, G. M. (1995). Cross-entropy for the evaluation of the accuracy of a fuzzy land cover classification with fuzzy ground reference data. ISPRS Journal of Photogrammetry and Remote Sensing, 50(5), 2–12.

Foody, G. M. (2000). Estimation of sub-pixel land cover composition in the presence of untrained classes. Computers & GeoSciences, 26, 469–478.

Genton, M. G. (2001). Classes of Kernels for Machine Learning : A Statistics Perspective. Journal of Machine Learning Research, 2, 299–312.

Girolami, M. (2002). Mercer Kernel Based Clustering in Feature Space. IEEE Transactions on Neural Networks, 13(3), 780–784.

Graves, D., & Pedrycz, W. (2007). Performance of kernel-based fuzzy clustering. Electronics Letters, 43(25).

Harikumar, A. (2014). The effects of discontinuity adaptive MRF models on the Noise classifier.

Hemanth, D. J., Selvathi, D., & Anitha, J. (2009). Effective Fuzzy Clustering Algorithm for Abnormal MR Brain Image Segmentation. In IEEE International Advanced Computing Conference (pp. 609–614).

Huang, H., Chuang, Y., & Chen, C. (2011). Multiple Kernel Fuzzy Clustering. IEEE Transactions on Fuzzy Systems, 1–15.

Huang, H., Chuang, Y., & Chen, C. (2012). Multiple Kernel Fuzzy Clustering. IEEE Transactions on Fuzzy Systems, 20(1), 120–134.

Huang, H., & Zhu, J. (2006). Kernel based Non-linear Feature Extraction Methods for Speech Recognition. In Proceedings of the Sixth International Conference on Intelligent Systems and Applications.


Isaacs, J. C., Foo, S. Y., & Meyer-baese, A. (2007). Novel Kernels and Kernel PCA for Pattern Recognition. In Proceedings of 2007 IEEE Symposium on Computer Intelligence in Robotics and Automation (pp. 438–443).

Jain, C., & Srivastava, G. (2013). Designing a Classifier with KFCM Algorithm to Achieve Optimization of Clustering and Classification Simultaneously. International Journal of Emerging Technology and Advanced Engineering, 3(9), 131–140.

Pontius Jr., R. G., & Cheuk, M. L. (2006). A generalized cross-tabulation matrix to compare soft-classified maps at multiple resolutions. International Journal of Geographical Information Science, 20(1), 1–30.

Kaur, P., Gupta, P., & Sharma, P. (2012). Review and Comparison of Kernel Based Fuzzy Image Segmentation Techniques. I.J. Intelligent Systems and Applications, 7, 50–60.

Kavzoglu, T., & Reis, S. (2008). Performance Analysis of Maximum Likelihood and Artificial Neural Network Classifiers for Training Sets with Mixed Pixels. GIScience & Remote Sensing, 45(3), 330–342.

Kim, D., Lee, K. H., & Lee, D. (2004). On cluster validity index for estimation of the optimal number of fuzzy clusters. Pattern Recognition, 37(10), 2009–2025.

Kim, K. I., Park, S. H., & Kim, H. J. (2001). Kernel Principal Component Analysis for Texture Classification. IEEE Signal Processing Letters, 8(2), 39–41.

Kloditz, C., Boxtel, A. Van, Carfagna, E., & Deursen, W. Van. (1998). Estimating the Accuracy of Coarse Scale Classification Using High Scale Information. Photogrammetric Engineering & Remote Sensing, 64(2), 127–133.

Krishnapuram, R., & Keller, J. M. (1996). The Possibilistic C-Means Algorithm: Insights and Recommendations. IEEE Transactions on Fuzzy Systems, 4(3), 385–393.

Kumar, A. (2007). Investigation in Sub-pixel classification approaches for Land Use and Land Cover Mapping. Unpublished PhD thesis, IIT Roorkee.

Kumar, A., Ghosh, S. K., & Dadhwal, V. K. (2006). A comparison of the performance of fuzzy algorithm versus statistical algorithm based sub-pixel classifier for remote sensing data. In International Society for Photogrammetry and Remote Sensing (pp. 1–5).

Latifovic, R., & Olthof, I. (2004). Accuracy assessment using sub-pixel fractional error matrices of global land cover products derived from satellite data. Remote Sensing of Environment, 90, 153–165.

Lillesand, T. M., & Kiefer, R. W. (1979). Remote Sensing and Image Interpretation (pp. 465–670).

Lu, D., & Weng, Q. (2007). A survey of image classification methods and techniques for improving classification performance. International Journal of Remote Sensing, 28(5), 823–870.

Mercier, G., & Lennon, M. (2003). Support Vector Machines for Hyperspectral Image Classification with Spectral-based kernels. In IGARSS (pp. 288–290).

Mohamed, R. M., & Farag, A. A. (2004). Mean Field Theory for Density Estimation Using Support Vector Machines. Computer Vision and Image Processing Laboratory, University of Louisville , Louisville, KY, 40292.


Okeke, F., & Karnieli, A. (2006). Methods for fuzzy classification and accuracy assessment of historical aerial photographs for vegetation change analyses. Part I: Algorithm development. International Journal of Remote Sensing, 27(1–2), 153–176.

Pal, M. (2009). Kernel Methods in Remote Sensing: A Review. ISH Journal of Hydraulic Engineering, 15, 194–215. doi:10.1080/09715010.2009.10514975

Ravindraiah, R., & Tejaswini, K. (2013). A Survey of Image Segmentation Algorithms Based On Fuzzy Clustering. International Journal of Computer Science and Mobile Computing, 2(7), 200–206.

Richards, J. A., & Jia, X. (2005). Remote Sensing Digital Image Analysis (pp. 249–263).

Settle, J. J., & Drake, N. A. (1993). Linear mixing and the estimation of ground cover proportions. International Journal of Remote Sensing, 14(6), 1159–1177.

Silvan-Cardenas, J. L., & Wang, L. (2008). Sub-pixel confusion-uncertainty matrix for assessing soft classifications. Remote Sensing of Environment, 112, 1081–1095.

Singha, M. (2013). Study the effect of discontinuity adaptive MRF models in fuzzy based classifier.

Smits, P. C., Dellepiane, S. G., & Schowengerdt, R. A. (1999). Quality assessment of image classification algorithms for land-cover mapping: A review and a proposal for a cost-based approach. International Journal of Remote Sensing, 20(8), 1461–1486.

Suganya, R., & Shanthi, R. (2012). Fuzzy C-Means Algorithm: A Review. International Journal of Scientific and Research Publications, 2(11), 1–3.

Tan, K. C., Lim, H. S., & Jafri, M. Z. M. (2011). Comparison of Neural Network and Maximum Likelihood Classifiers for Land Cover Classification Using Landsat Multispectral Data. In IEEE Conference on Open Systems (pp. 241–244).

Tsai, D., & Lin, C. (2011). Fuzzy C-means based clustering for linearly and nonlinearly separable data. Pattern Recognition, 44, 1750–1760.

Tso, B., & Mather, P. M. (2000). Classification of Remotely Sensed Data (pp. 54–61).

Vinushree, N., Hemalatha, B., & Kaliappan, V. (2014). Efficient Kernel-Based Fuzzy C-Means Clustering For Pest Detection and Classification. In World Congress on Computing and Communication Technologies (pp. 179–181).

Wang, F. (1990). Fuzzy Supervised Classification of Remote Sensing Images. IEEE Transactions on GeoScience and Remote Sensing, 28(2), 194–201.

Yang, A., Jiang, L., & Zhou, Y. (2007). A KFCM-based Fuzzy Classifier. In Fourth International Conference on Fuzzy Systems and Knowledge Discovery.

Yun-song, S., & Yu-feng, S. (2010). Remote sensing image classification and recognition based on KFCM. In 5th International Conference on Computer Science and Education (ICCSE), 2010 (pp. 1062–1065).

Zadeh, L. A. (1965). Fuzzy Sets. Information and Control, 8, 338–353.


Zhang, D., & Chen, S. (2002). Fuzzy Clustering Using Kernel Method. In International Conference on Control and Automation (pp. 123–127).

Zhang, D., & Chen, S. (2003). Clustering incomplete data using kernel-based fuzzy c-means algorithm. Neural Processing Letters, 18, 155–162.

Zhang, J., & Foody, G. M. (2002). Fully-fuzzy supervised classification of sub-urban land cover from remotely sensed imagery: Statistical and artificial neural network approaches. International Journal of Remote Sensing, 22(5), 615–628.

Zhang, J., & Foody, G. M. (1998). A fuzzy classification of sub-urban land cover from remotely sensed imagery. International Journal of Remote Sensing, 19(14), 2721–2738.

Zimmermann, H. J. (2001). Fuzzy Set Theory- and Its Applications (pp. 277–323).


APPENDIX A

A.1. Generated fraction images for the best single kernels

Figure A-1: Generated fractional images for best single kernels from LISS-III (Resourcesat-1) for (i) linear, (ii) Inverse Multiquadratic and (iii) spectral angle kernels, for classes identified as (a) Agriculture field with crop, (b) Sal forest, (c) Eucalyptus plantation, (d) Dry agriculture field with crop, (e) Moist agriculture field with crop and (f) Water.


Figure A-2: Generated fractional images for best single kernels from LISS-III (Resourcesat-2) for (i) linear, (ii) Gaussian kernel using Euclidean norm and (iii) spectral angle kernels, for classes identified as (a) Agriculture field with crop, (b) Eucalyptus plantation, (c) Fallow land, (d) Sal forest and (e) Water.


Figure A-3: Generated fractional images for best single kernels from AWiFS (Resourcesat-1) for (i) linear, (ii) Inverse Multiquadratic and (iii) spectral angle kernels, for classes identified as (a) Agriculture field with crop, (b) Sal forest, (c) Eucalyptus plantation, (d) Dry agriculture field with crop, (e) Moist agriculture field with crop and (f) Water.


Figure A-4: Generated fractional images for best single kernels from AWiFS (Resourcesat-2) for (i) linear, (ii) Gaussian kernel using Euclidean norm and (iii) spectral angle kernels, for classes identified as (a) Agriculture field with crop, (b) Eucalyptus plantation, (c) Fallow land, (d) Sal forest and (e) Water.


A.5. Variation in Entropy (E) and Mean Membership Difference against the weight constant (m) for FCM and FCM using single kernels.

Figure A-5: Variation in Entropy (E) and Mean Membership Difference against the weight constant (m) for Resourcesat-1 AWiFS for (i) FCM, (ii) linear, (iii) polynomial, (iv) sigmoid, (v) Gaussian kernel with Euclidean norm, (vi) radial basis, (vii) KMOD, (viii) IM and (ix) spectral angle kernels.
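The entropy E plotted against the weight constant m in these figures measures how crisp the classifier's membership values are: it falls toward zero as memberships approach 0/1 and rises as they spread out. A minimal sketch of one common formulation (the average Shannon entropy of the membership matrix; the normalisation used in the thesis may differ):

```python
import numpy as np

def membership_entropy(U, eps=1e-12):
    """Average entropy of a fuzzy membership matrix U (N pixels x c classes).

    Lower values mean crisper (more confident) memberships. This is a
    generic formulation; the exact normalisation in the thesis may differ.
    """
    U = np.clip(np.asarray(U, dtype=float), eps, 1.0)
    return float(-(U * np.log(U)).sum(axis=1).mean())

# Crisp memberships give near-zero entropy; uniform memberships give log(c).
crisp = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
uniform = np.full((2, 3), 1.0 / 3.0)
print(membership_entropy(crisp), membership_entropy(uniform))
```

Increasing m in FCM/KFCM spreads the memberships toward the uniform case, which is why entropy grows with m in these plots.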


A.6. Variation in Entropy (E) and Mean Membership Difference against the weight constant (m) for composite kernels.

Figure A-6: Variation in Entropy (E) and Mean Membership Difference against the weight constant (m) for the Gaussian-spectral angle kernel for (a) Resourcesat-2 AWiFS, (b) Resourcesat-1 LISS-III and (c) Resourcesat-2 LISS-III.


Figure A-7: Variation in Entropy (E) and Mean Membership Difference against the weight constant (m) for the IM-spectral angle kernel for (a) Resourcesat-2 AWiFS, (b) Resourcesat-1 LISS-III and (c) Resourcesat-2 LISS-III.
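The composite kernels examined in Figures A-6 and A-7 mix a conventional kernel with the spectral angle kernel. A minimal sketch of such a convex combination, assuming a cosine form for the spectral angle kernel and an illustrative weight lam (the exact kernel definitions and weighting used in the thesis may differ):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel with the Euclidean norm.
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(np.exp(-(d @ d) / (2.0 * sigma ** 2)))

def spectral_angle_kernel(x, y):
    # Similarity based on the spectral angle between two pixel vectors;
    # the cosine form here is an assumption for illustration.
    x, y = np.asarray(x, float), np.asarray(y, float)
    cos_t = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.clip(cos_t, -1.0, 1.0))

def composite_kernel(x, y, lam=0.5, sigma=1.0):
    # Convex combination of a spectral-magnitude kernel and a
    # spectral-shape (angle) kernel, weighted by lam.
    return lam * gaussian_kernel(x, y, sigma) + (1.0 - lam) * spectral_angle_kernel(x, y)

x = np.array([0.2, 0.4, 0.6])
print(composite_kernel(x, x))  # close to 1 for identical pixels
```

Because the angle term ignores overall brightness while the Gaussian term retains it, tuning lam trades spectral-shape similarity against spectral-magnitude similarity.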


APPENDIX B

B.1. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-1 with reference dataset as LISS-IV imagery of Resourcesat-1, with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 73.89 73.96 ± 3.01

Sal Forest 80.28 80.40 ± 5.20

Eucalyptus Plantation 90.84 90.86 ± 1.44

Dry Agricultural Field Without Crop 51.19 59.26 ± 16.32

Moist Agricultural Field Without Crop 39.43 51.90 ± 4.84

Water 97.08 97.18 ± 0.37

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 84.64 84.87 ± 4.40

Sal Forest 86.38 86.33 ± 5.20

Eucalyptus Plantation 66.01 66.52 ± 3.04

Dry Agricultural Field Without Crop 74.35 74.78 ± 10.54

Moist Agricultural Field Without Crop 90.65 90.98 ± 3.99

Water 81.45 80.62 ± 7.95

Overall Accuracy 77.95 78.08 ± 4.00

Kappa Coefficient - 0.72 ± 0.05

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 98.56 98.53 ± 0.35

Sal Forest 99.53 99.53 ± 0.02

Eucalyptus Plantation 99.19 99.18 ± 0.07

Dry Agricultural Field Without Crop 95.53 95.07 ± 3.36

Moist Agricultural Field Without Crop 91.45 91.90 ± 3.08

Water 97.83 97.77 ± 0.54

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 97.63 97.60 ± 1.41

Sal Forest 94.32 94.26 ± 2.27

Eucalyptus Plantation 94.38 94.33 ± 2.66

Dry Agricultural Field Without Crop 99.47 99.45 ± 0.01

Moist Agricultural Field Without Crop 99.61 99.61 ± 0.005

Water 96.79 96.77 ± 0.61

Overall Accuracy 96.97 96.92 ± 1.30

Kappa Coefficient - 0.96 ± 0.01

Table B-1: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-2: Accuracy Assessment results for Inverse Multiquadratic kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.
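The FERM columns in these tables come from a fuzzy error matrix, which generalises the crisp confusion matrix by accumulating the overlap between reference and classified membership grades (the SCM column additionally carries an uncertainty interval). A minimal sketch of a min-operator fuzzy error matrix and the derived accuracies; the conventions of the tool used in the thesis (row/column orientation, normalisation) may differ:

```python
import numpy as np

def ferm_accuracies(R, C):
    """Fuzzy user's/producer's/overall accuracy from a fuzzy error matrix.

    R, C: reference and classified membership matrices (N pixels x c classes),
    each row summing to 1. Uses the common min-operator formulation; the
    exact conventions of the thesis tool may differ.
    """
    R = np.asarray(R, dtype=float)
    C = np.asarray(C, dtype=float)
    # M[i, j] = sum over pixels of min(reference grade for class i,
    #                                  classified grade for class j)
    M = np.minimum(R[:, :, None], C[:, None, :]).sum(axis=0)
    diag = np.diag(M)
    producers = diag / R.sum(axis=0)  # agreement per reference-class total
    users = diag / C.sum(axis=0)      # agreement per classified-class total
    overall = diag.sum() / R.sum()
    return users, producers, overall

# Perfect agreement: every accuracy is 1 (up to floating-point rounding).
R = np.array([[0.7, 0.3], [0.2, 0.8]])
users, producers, overall = ferm_accuracies(R, R)
```

Multiplying these fractions by 100 gives percentage figures of the kind reported in the tables.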


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 99.81 99.78 ± 0.08

Sal Forest 95.45 95.47 ± 0.13

Eucalyptus Plantation 59.58 61.84 ± 12.07

Dry Agricultural Field Without Crop 16.18 50.33 ± 43.42

Moist Agricultural Field Without Crop 3.03 43.59 ± 42.20

Water 19.13 19.17 ± 0.52

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 53.44 53.54 ± 2.10

Sal Forest 74.05 74.09 ± 1.07

Eucalyptus Plantation 51.99 51.92 ± 0.60

Dry Agricultural Field Without Crop 0.68 0.67 ± 0.04

Moist Agricultural Field Without Crop 0.13 0.13 ± 0.003

Water 98.59 98.58 ± 0.12

Overall Accuracy 67.49

Kappa Coefficient

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 53.09 53.53 ± 3.04

Sal Forest 43.41 44.54 ± 10.28

Eucalyptus Plantation 83.18 83.41 ± 1.12

Dry Agricultural Field Without Crop 33.99 34.24 ± 1.80

Moist Agricultural Field Without Crop 61.89 62.38 ± 5.41

Water 97.69 98.09 ± 2.93

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 71.55 71.57 ± 2.14

Sal Forest 27.78 27.78 ± 0.42

Eucalyptus Plantation 64.02 64.24 ± 0.12

Dry Agricultural Field Without Crop 82.52 82.96 ± 6.71

Moist Agricultural Field Without Crop 59.86 60.44 ± 8.74

Water 80.97 80.45 ± 6.71

Overall Accuracy 65.22 65.30 ± 2.93

Kappa Coefficient - 0.52 ± 0.04

Table B-3: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-4: Accuracy Assessment results for spectral angle kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 20.59 21.00 ± 2.82

Sal Forest 91.55 91.71 ± 0.05

Eucalyptus Plantation 87.56 87.95 ± 0.55

Dry Agricultural Field Without Crop 37.58 39.04 ± 13.33

Moist Agricultural Field Without Crop 12.70 12.78 ± 0.38

Water 97.76 99.00 ± 0.03

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 12.59 12.63 ± 0.41

Sal Forest 10.52 10.66 ± 1.11

Eucalyptus Plantation 15.85 15.88 ± 0.60

Dry Agricultural Field Without Crop 49.24 49.39 ± 0.15

Moist Agricultural Field Without Crop 96.01 97.83 ± 0.001

Water 74.32 74.43 ± 1.36

Overall Accuracy 29.37 29.50 ± 1.23

Kappa Coefficient - 0.19 ± 0.018

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 99.77 99.84 ± 0.07

Sal Forest 95.23 95.46 ± 0.03

Eucalyptus Plantation 63.56 67.41 ± 15.66

Dry Agricultural Field Without Crop 100 100 ± 0.0

Moist Agricultural Field Without Crop 100 100 ± 0.0

Water 9.88 9.95 ± 0.16

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 43.30 43.35 ± 1.32

Sal Forest 63.20 63.21 ± 0.72

Eucalyptus Plantation 37.41 37.42 ± 0.05

Dry Agricultural Field Without Crop 0.10 0.09 ± 0.002

Moist Agricultural Field Without Crop 0.11 0.11 ± 0.001

Water 98.88 98.84 ± 0.04

Overall Accuracy 56.30 56.31 ± 0.04

Kappa Coefficient - 0.37 ± 0.01

Table B-5: Accuracy Assessment results for IM-Spectral kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-6: Accuracy Assessment results for IM-Spectral kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.


B.2. Accuracy Assessment of classified outputs for AWiFS (Resourcesat-1) with reference dataset as LISS-III (Resourcesat-1), with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 89.40 89.41 ± 2.10

Sal Forest 65.23 85.58 ± 3.44

Eucalyptus Plantation 88.43 88.48 ± 1.61

Dry Agricultural Field Without Crop 73.11 72.96 ± 13.55

Moist Agricultural Field Without Crop 85.60 65.87 ± 3.94

Water 95.30 95.39 ± 1.12

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 83.66 83.82 ± 3.13

Sal Forest 89.35 89.26 ± 2.32

Eucalyptus Plantation 79.74 79.91 ± 2.37

Dry Agricultural Field Without Crop 76.33 76.41 ± 8.01

Moist Agricultural Field Without Crop 89.35 88.79 ± 3.86

Water 88.14 87.66 ± 3.19

Overall Accuracy 84.43 84.49 ± 3.01

Kappa Coefficient - 0.80 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 99.30 99.30 ± 0.02

Sal Forest 99.26 99.26 ± 0.01

Eucalyptus Plantation 97.84 97.88 ± 0.42

Dry Agricultural Field Without Crop 97.20 96.99 ± 1.78

Moist Agricultural Field Without Crop 93.97 94.32 ± 1.90

Water 96.31 96.27 ± 0.50

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 95.72 95.68 ± 1.87

Sal Forest 96.27 96.30 ± 1.76

Eucalyptus Plantation 97.15 97.13 ± 0.88

Dry Agricultural Field Without Crop 99.04 99.03 ± 0.02

Moist Agricultural Field Without Crop 99.10 99.11 ± 0.01

Water 96.66 96.65 ± 0.13

Overall Accuracy (in %) 97.30 97.29 ± 0.13

Kappa Coefficient - 0.96 ± 0.009

Table B-7: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.

Table B-8: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 95.87 95.21 ± 0.32

Sal Forest 87.64 87.59 ± 0.33

Eucalyptus Plantation 84.35 83.64 ± 1.93

Dry Agricultural Field Without Crop 0.28 0.84 ± 0.58

Moist Agricultural Field Without Crop 1.35 2.17 ± 1.02

Water 75.67 75.38 ± 0.64

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 68.72 69.48 ± 1.09

Sal Forest 90.35 90.33 ± 0.24

Eucalyptus Plantation 71.97 72.71 ± 2.93

Dry Agricultural Field Without Crop 6.81 5.00 ± 4.08

Moist Agricultural Field Without Crop 1.87 4.46 ± 0.25

Water 87.20 86.22 ± 0.24

Overall Accuracy 84.42 84.38 ± 0.59

Kappa Coefficient - 0.75 ± 0.009

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 78.43 78.38 ± 1.09

Sal Forest 86.39 86.24 ± 1.45

Eucalyptus Plantation 68.53 68.63 ± 0.80

Dry Agricultural Field Without Crop 51.44 53.69 ± 8.07

Moist Agricultural Field Without Crop 45.03 50.57 ± 19.97

Water 99.25 99.27 ± 0.01

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 65.29 65.73 ± 5.28

Sal Forest 26.29 26.77 ± 3.62

Eucalyptus Plantation 79.30 79.71 ± 5.59

Dry Agricultural Field Without Crop 76.24 76.30 ± 1.74

Moist Agricultural Field Without Crop 68.53 68.72 ± 2.79

Water 86.79 86.78 ± 2.27

Overall Accuracy 68.40 68.75 ± 4.78

Kappa Coefficient - 0.59 ± 0.06

Table B-9: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.

Table B-10: Accuracy Assessment results for spectral angle kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 76.41 76.43 ± 0.45

Sal Forest 66.85 66.91 ± 1.23

Eucalyptus Plantation 74.76 74.95 ± 3.61

Dry Agricultural Field Without Crop 48.82 48.81 ± 7.66

Moist Agricultural Field Without Crop 29.25 29.31 ± 0.39

Water 94.56 98.88 ± 0.01

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 18.50 18.53 ± 0.35

Sal Forest 13.64 13.96 ± 2.02

Eucalyptus Plantation 36.82 36.86 ± 0.46

Dry Agricultural Field Without Crop 30.93 31.04 ± 0.17

Moist Agricultural Field Without Crop 97.35 97.35 ± 0.003

Water 85.62 86.92 ± 0.24

Overall Accuracy 45.80 45.93 ± 0.82

Kappa Coefficient - 0.31 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 96.76 96.77 ± 0.13

Sal Forest 90.88 90.85 ± 0.16

Eucalyptus Plantation 85.99 86.11 ± 2.77

Dry Agricultural Field Without Crop 0.01 0.0 ± 0.0

Moist Agricultural Field Without Crop 94.44 50.24 ± 30.70

Water 73.19 73.15 ± 0.26

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 66.33 66.28 ± 0.75

Sal Forest 85.06 85.05 ± 0.17

Eucalyptus Plantation 71.17 71.07 ± 1.59

Dry Agricultural Field Without Crop 0.01 0.0 ± 0.0

Moist Agricultural Field Without Crop 4.80 3.32 ± 1.32

Water 92.36 92.33 ± 0.04

Overall Accuracy 83.93 83.86 ± 0.29

Kappa Coefficient - 0.74 ± 0.004

Table B-11: Accuracy Assessment results for IM-spectral angle kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.

Table B-12: Accuracy Assessment results for linear-spectral angle kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data.


B.3. Accuracy Assessment of classified outputs for LISS-III imagery of Resourcesat-1 with reference dataset as LISS-IV imagery of Resourcesat-1, with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 72.54 72.68 ± 2.79

Sal Forest 84.66 84.61 ± 3.72

Eucalyptus Plantation 91.24 91.23 ± 0.75

Dry Agricultural Field Without Crop 73.04 73.62 ± 9.11

Moist Agricultural Field Without Crop 69.35 69.66 ± 4.32

Water 94.69 94.78 ± 0.99

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 91.57 83.51 ± 1.22

Sal Forest 83.92 83.93 ± 0.97

Eucalyptus Plantation 74.08 74.41 ± 2.36

Dry Agricultural Field Without Crop 77.95 77.59 ± 7.66

Moist Agricultural Field Without Crop 87.32 87.78 ± 4.17

Water 88.72 87.77 ± 7.32

Overall Accuracy 82.35 82.40 ± 2.76

Kappa Coefficient - 0.77 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 96.82 96.74 ± 0.84

Sal Forest 98.90 98.89 ± 0.09

Eucalyptus Plantation 98.87 98.86 ± 0.05

Dry Agricultural Field Without Crop 96.87 96.75 ± 1.12

Moist Agricultural Field Without Crop 97.11 97.01 ± 0.35

Water 97.54 97.51 ± 0.14

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 99.09 99.06 ± 0.19

Sal Forest 96.67 96.62 ± 1.18

Eucalyptus Plantation 95.44 95.61 ± 1.24

Dry Agricultural Field Without Crop 99.32 99.30 ± 0.03

Moist Agricultural Field Without Crop 99.20 99.18 ± 0.03

Water 96.37 96.29 ± 0.70

Overall Accuracy 97.63 97.63 ± 0.58

Kappa Coefficient - 0.97 ± 0.007

Table B-13: Accuracy Assessment results for FCM classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-14: Accuracy Assessment results for IM kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 94.15 93.87 ± 0.41

Sal Forest 97.56 97.57 ± 0.51

Eucalyptus Plantation 51.09 51.10 ± 3.45

Dry Agricultural Field Without Crop 87.49 79.25 ± 4.49

Moist Agricultural Field Without Crop 87.44 79.13 ± 4.30

Water 21.61 21.62 ± 0.25

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 70.65 70.56 ± 1.06

Sal Forest 72.49 72.48 ± 0.45

Eucalyptus Plantation 52.79 52.60 ± 0.09

Dry Agricultural Field Without Crop 18.15 17.90 ± 0.84

Moist Agricultural Field Without Crop 10.77 10.66 ± 0.21

Water 99.30 98.91 ± 0.23

Overall Accuracy 70.63 70.53 ± 0.62

Kappa Coefficient - 0.55 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 52.66 52.91 ± 3.09

Sal Forest 35.87 36.34 ± 4.63

Eucalyptus Plantation 91.03 91.30 ± 1.81

Dry Agricultural Field Without Crop 40.28 40.90 ± 2.51

Moist Agricultural Field Without Crop 84.74 85.07 ± 1.49

Water 97.71 98.91 ± 0.05

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 81.21 81.30 ± 0.95

Sal Forest 66.52 66.96 ± 0.03

Eucalyptus Plantation 57.68 57.90 ± 1.52

Dry Agricultural Field Without Crop 40.28 85.59 ± 8.48

Moist Agricultural Field Without Crop 84.74 48.69 ± 10.11

Water 97.71 92.82 ± 5.17

Overall Accuracy 67.36 67.49 ± 3.17

Kappa Coefficient - 0.57 ± 0.04

Table B-15: Accuracy Assessment results for linear kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-16: Accuracy Assessment results for spectral angle kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 33.68 34.01 ± 1.18

Sal Forest 88.52 88.58 ± 0.41

Eucalyptus Plantation 89.09 89.13 ± 0.62

Dry Agricultural Field Without Crop 34.99 35.48 ± 6.20

Moist Agricultural Field Without Crop 31.59 32.18 ± 2.42

Water 98.41 99.19 ± 0.40

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 85.56 85.60 ± 1.48

Sal Forest 28.13 28.19 ± 2.52

Eucalyptus Plantation 43.18 43.33 ± 1.38

Dry Agricultural Field Without Crop 79.37 79.47 ± 2.41

Moist Agricultural Field Without Crop 86.04 86.04 ± 0.74

Water 83.37 89.40 ± 3.47

Overall Accuracy 55.22 55.72 ± 2.11

Kappa Coefficient - 0.44 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Agricultural Field With Crop 97.00 97.51 ± 0.20

Sal Forest 96.84 97.12 ± 0.05

Eucalyptus Plantation 53.96 55.21 ± 5.99

Dry Agricultural Field Without Crop 100.0 64.28 ± 35.71

Moist Agricultural Field Without Crop 72.50 50.15 ± 9.02

Water 14.36 14.55 ± 0.55

Fuzzy Producer’s Accuracy

Agricultural Field With Crop 63.99 64.09 ± 1.04

Sal Forest 72.02 72.12 ± 0.51

Eucalyptus Plantation 37.58 37.63 ± 0.12

Dry Agricultural Field Without Crop 0.04 0.04 ± 0.001

Moist Agricultural Field Without Crop 0.09 0.09 ± 0.001

Water 99.83 99.78 ± 0.02

Overall Accuracy 66.53 66.60 ± 0.62

Kappa Coefficient - 0.49 ± 0.01

Table B-17: Accuracy Assessment results for IM-spectral angle kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.

Table B-18: Accuracy Assessment results for linear-spectral angle kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data.


B.4. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-2 with reference dataset as LISS-III imagery of Resourcesat-2, with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 76.90 77.05 ± 2.64

Eucalyptus Plantation 84.62 84.69 ± 1.04

Fallow Land 73.51 73.69 ± 2.60

Sal Plantations 86.26 86.29 ± 1.16

Water 95.61 95.62 ± 1.41

Fuzzy Producer’s Accuracy

Crop 84.07 84.10 ± 1.19

Eucalyptus Plantation 77.04 77.16 ± 2.13

Fallow Land 82.91 83.04 ± 1.45

Sal Plantations 84.08 84.13 ± 2.22

Water 87.81 87.78 ± 1.92

Overall Accuracy 82.57 82.64 ± 1.83

Kappa Coefficient - 0.78 ± 0.02

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 65.66 66.04 ± 7.05

Eucalyptus Plantation 90.66 90.68 ± 0.79

Fallow Land 61.97 62.71 ± 4.79

Sal Plantations 86.62 86.63 ± 1.56

Water 97.88 97.92 ± 0.36

Fuzzy Producer’s Accuracy

Crop 85.29 85.31 ± 0.31

Eucalyptus Plantation 64.83 65.22 ± 4.43

Fallow Land 87.85 88.02 ± 1.20

Sal Plantations 78.04 78.38 ± 5.01

Water 81.42 81.77 ± 6.30

Overall Accuracy 78.01 78.23 ± 3.77

Kappa Coefficient - 0.72 ± 0.04

Table B-19: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.

Table B-20: Accuracy Assessment results for Gaussian kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 87.59 87.63 ± 2.73

Eucalyptus Plantation 55.99 36.28 ± 3.66

Fallow Land 6.50 6.65 ± 0.98

Sal Plantations 76.93 76.90 ± 0.36

Water 89.05 89.25 ± 1.14

Fuzzy Producer’s Accuracy

Crop 78.01 78.03 ± 1.06

Eucalyptus Plantation 82.73 82.72 ± 0.71

Fallow Land 19.33 20.30 ± 6.14

Sal Plantations 82.77 82.89 ± 8.12

Water 67.17 67.36 ± 2.47

Overall Accuracy 76.27 76.33 ± 2.25

Kappa Coefficient - 0.68 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 5.48 5.48 ± 8.74

Eucalyptus Plantation 77.15 77.15 ± 0.008

Fallow Land 70.05 70.05 ± 4.41

Sal Plantations 90.26 90.26 ± 0.09

Water 90.02 90.02 ± 9.67

Fuzzy Producer’s Accuracy

Crop 54.40 54.40 ± 9.38

Eucalyptus Plantation 35.73 95.73 ± 0.001

Fallow Land 75.96 75.96 ± 0.013

Sal Plantations 28.21 28.21 ± 0.009

Water 94.33 94.33 ± 5.68

Overall Accuracy 55.14 55.14 ± 0.005

Kappa Coefficient - 0.42 ± 7.48

Table B-21: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.

Table B-22: Accuracy Assessment results for spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 47.43 47.48 ± 1.46

Eucalyptus Plantation 57.04 57.05 ± 0.28

Fallow Land 57.23 57.31 ± 1.74

Sal Plantations 71.06 71.06 ± 0.35

Water 95.67 95.71 ± 0.38

Fuzzy Producer’s Accuracy

Crop 65.72 65.72 ± 0.24

Eucalyptus Plantation 40.30 40.37 ± 1.48

Fallow Land 83.30 83.31 ± 0.34

Sal Plantations 40.41 40.48 ± 1.55

Water 95.43 95.42 ± 0.29

Overall Accuracy 61.56 61.59 ± 1.19

Kappa Coefficient - 0.51 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 91.30 91.31 ± 0.74

Eucalyptus Plantation 56.94 57.47 ± 4.39

Fallow Land 16.58 16.89 ± 3.12

Sal Plantations 59.24 59.28 ± 0.55

Water 86.16 86.48 ± 1.93

Fuzzy Producer’s Accuracy

Crop 52.19 52.28 ± 1.82

Eucalyptus Plantation 79.43 79.46 ± 1.50

Fallow Land 83.15 81.79 ± 9.72

Sal Plantations 92.25 92.27 ± 3.20

Water 42.88 43.04 ± 1.65

Overall Accuracy 65.10 65.18 ± 2.28

Kappa Coefficient - 0.54 ± 0.03

Table B-23: Accuracy Assessment results for Gaussian-spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.

Table B-24: Accuracy Assessment results for linear-spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data.


B.5. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-2 with reference dataset as LISS-IV imagery of Resourcesat-2, with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 73.44 73.59 ± 2.88

Eucalyptus Plantation 86.23 86.28 ± 0.54

Fallow Land 68.19 68.31 ± 1.75

Sal Plantations 79.36 79.35 ± 2.05

Water 96.01 95.98 ± 0.56

Fuzzy Producer’s Accuracy

Crop 78.97 79.01 ± 0.76

Eucalyptus Plantation 67.43 67.57 ± 1.86

Fallow Land 85.90 85.95 ± 1.88

Sal Plantations 86.99 86.98 ± 1.37

Water 89.94 89.81 ± 2.39

Overall Accuracy 80.21 80.25 ± 1.67

Kappa Coefficient - 0.75 ± 0.02

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 60.73 61.26 ± 6.19

Eucalyptus Plantation 84.86 84.95 ± 1.01

Fallow Land 67.69 68.24 ± 4.62

Sal Plantations 80.47 80.48 ± 3.23

Water 97.55 97.56 ± 0.20

Fuzzy Producer’s Accuracy

Crop 85.43 85.46 ± 0.76

Eucalyptus Plantation 58.28 58.67 ± 4.01

Fallow Land 90.34 90.39 ± 1.72

Sal Plantations 82.46 82.66 ± 3.76

Water 76.70 77.17 ± 6.70

Overall Accuracy 76.42 76.63 ± 3.77

Kappa Coefficient - 0.70 ± 0.04

Table B-25: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.

Table B-26: Accuracy Assessment results for Gaussian kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 44.95 45.51 ± 4.47

Eucalyptus Plantation 90.37 90.42 ± 1.94

Fallow Land 15.66 15.71 ± 0.43

Sal Plantations 16.65 16.75 ± 1.48

Water 92.22 92.31 ± 0.08

Fuzzy Producer’s Accuracy

Crop 97.30 97.31 ± 0.10

Eucalyptus Plantation 31.91 31.98 ± 1.35

Fallow Land 26.97 37.44 ± 21.44

Sal Plantations 98.73 98.74 ± 0.29

Water 60.76 61.88 ± 6.37

Overall Accuracy 51.09 51.30 ± 3.20

Kappa Coefficient - 0.38 ± 0.04

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 18.24 18.24 ± 0.003

Eucalyptus Plantation 68.59 68.60 ± 0.012

Fallow Land 63.51 63.52 ± 9.10

Sal Plantations 68.18 68.18 ± 9.10

Water 96.56 96.56 ± 3.20

Fuzzy Producer’s Accuracy

Crop 69.42 69.42 ± 3.73

Eucalyptus Plantation 38.05 38.05 ± 0.001

Fallow Land 71.78 71.78 ± 0.01

Sal Plantations 24.16 24.16 ± 0.01

Water 89.93 89.93 ± 0.01

Overall Accuracy 54.24 54.24 ± 0.006

Kappa Coefficient - 0.40 ± 9.01e-05

Table B-27: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.

Table B-28: Accuracy Assessment results for spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 37.22 37.32 ± 1.30

Eucalyptus Plantation 69.36 69.41 ± 0.72

Fallow Land 61.92 61.93 ± 1.26

Sal Plantations 56.03 56.06 ± 2.75

Water 93.92 93.97 ± 0.14

Fuzzy Producer’s Accuracy

Crop 64.89 64.93 ± 0.47

Eucalyptus Plantation 34.41 34.46 ± 0.92

Fallow Land 86.49 86.42 ± 1.71

Sal Plantations 49.31 49.38 ± 1.46

Water 86.03 86.24 ± 3.37

Overall Accuracy 59.27 59.32 ± 1.43

Kappa Coefficient - 0.49 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 67.20 68.07 ± 10.38

Eucalyptus Plantation 84.62 84.65 ± 0.93

Fallow Land 28.81 28.81 ± 1.22

Sal Plantations 14.00 14.13 ± 0.91

Water 88.81 88.90 ± 0.67

Fuzzy Producer’s Accuracy

Crop 82.69 82.72 ± 0.75

Eucalyptus Plantation 29.34 29.45 ± 1.46

Fallow Land 59.93 66.42 ± 23.23

Sal Plantations 98.54 98.54 ± 0.37

Water 45.71 46.67 ± 4.73

Overall Accuracy 46.73 47.04 ± 2.94

Kappa Coefficient - 0.35 ± 0.04

Table B-29: Accuracy Assessment results for Gaussian-spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.

Table B-30: Accuracy Assessment results for linear-spectral angle kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data.


B.6. Accuracy Assessment of classified outputs for LISS-III imagery of Resourcesat-2 with reference dataset as LISS-IV imagery of Resourcesat-2, with all classes trained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 81.11 81.17 ± 2.18

Eucalyptus Plantation 84.88 84.95 ± 1.05

Fallow Land 78.83 78.97 ± 0.75

Sal Plantations 79.14 79.20 ± 2.40

Water 97.10 97.02 ± 1.00

Fuzzy Producer’s Accuracy

Crop 80.46 80.55 ± 0.87

Eucalyptus Plantation 78.59 78.67 ± 1.22

Fallow Land 83.92 84.07 ± 4.09

Sal Plantations 86.68 86.68 ± 0.58

Water 93.99 94.14 ± 1.97

Overall Accuracy 84.07 84.15 ± 1.54

Kappa Coefficient - 0.79 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 72.27 72.46 ± 3.63

Eucalyptus Plantation 84.16 84.21 ± 1.36

Fallow Land 85.39 85.50 ± 0.94

Sal Plantations 78.10 78.25 ± 3.58

Water 94.90 94.86 ± 0.92

Fuzzy Producer’s Accuracy

Crop 83.10 83.11 ± 0.98

Eucalyptus Plantation 77.23 77.33 ± 1.95

Fallow Land 77.14 77.32 ± 4.21

Sal Plantations 87.29 87.25 ± 0.54

Water 92.05 92.23 ± 3.29

Overall Accuracy 83.03 83.10 ± 2.20

Kappa Coefficient - 0.78 ± 0.02

Table B-31: Accuracy Assessment results for FCM classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-32: Accuracy Assessment results for Gaussian kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 41.70 41.92 ± 3.02

Eucalyptus Plantation 96.61 96.64 ± 1.17

Fallow Land 88.69 88.54 ± 1.17

Sal Plantations 18.38 18.51 ± 1.52

Water 81.65 81.72 ± 0.32

Fuzzy Producer’s Accuracy

Crop 97.24 97.24 ± 0.19

Eucalyptus Plantation 23.10 23.14 ± 0.84

Fallow Land 32.30 38.29 ± 16.28

Sal Plantations 98.71 98.73 ± 0.13

Water 84.75 85.35 ± 5.25

Overall Accuracy 53.22 53.89 ± 2.57

Kappa Coefficient - 0.42 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 23.07 23.07 ± 4.60

Eucalyptus Plantation 77.26 77.26 ± 4.94

Fallow Land 71.11 71.11 ± 1.01

Sal Plantations 66.35 66.35 ± 1.74

Water 99.89 99.89 ± 2.84

Fuzzy Producer’s Accuracy

Crop 4.94 4.94 ± 2.63

Eucalyptus Plantation 80.21 80.21 ± 1.34

Fallow Land 90.60 90.60 ± 1.56

Sal Plantations 35.55 65.35 ± 3.41

Water 96.51 96.50 ± 4.33

Overall Accuracy 67.56 76.07 ± 1.30

Kappa Coefficient - 0.65 ± 1.98

Table B-33: Accuracy Assessment results for linear kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-34: Accuracy Assessment results for spectral angle kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 37.22 37.32 ± 1.30

Eucalyptus Plantation 69.36 69.41 ± 0.72

Fallow Land 61.92 61.93 ± 1.26

Sal Plantations 56.03 56.06 ± 2.75

Water 93.92 93.97 ± 0.14

Fuzzy Producer’s Accuracy

Crop 64.89 64.93 ± 0.47

Eucalyptus Plantation 34.41 34.46 ± 0.92

Fallow Land 86.49 86.42 ± 1.71

Sal Plantations 49.31 49.38 ± 1.46

Water 86.03 86.24 ± 3.37

Overall Accuracy 59.27 59.32 ± 1.43

Kappa Coefficient - 0.49 ± 0.01

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Crop 44.58 44.96 ± 4.06

Eucalyptus Plantation 97.70 97.73 ± 0.42

Fallow Land 92.31 92.46 ± 0.85

Sal Plantations 21.55 21.78 ± 2.23

Water 91.94 91.96 ± 0.64

Fuzzy Producer’s Accuracy

Crop 99.21 99.18 ± 0.13

Eucalyptus Plantation 24.23 24.35 ± 1.35

Fallow Land 45.64 54.49 ± 23.45

Sal Plantations 99.69 99.69 ± 0.001

Water 86.01 86.53 ± 6.25

Overall Accuracy 57.55 57.78 ± 3.49

Kappa Coefficient - 0.47 ± 0.04

Table B-35: Accuracy Assessment results for Gaussian-spectral angle kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-36: Accuracy Assessment results for linear-spectral angle kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


B.7. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-1 with reference dataset as LISS-III imagery of Resourcesat-1, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 49.38 75.31 ± 3.07

Eucalyptus Plantation 74.13 90.17 ± 1.07

Dry Agricultural Field Without Crop 31.48 44.24 ± 4.42

Moist Agricultural Field Without Crop 43.27 61.04 ± 2.03

Water 97.62 97.72 ± 0.004

Average Fuzzy User’s Accuracy 59.18 -

Fuzzy Producer’s Accuracy

Sal Forest 84.36 84.40 ± 0.75

Eucalyptus Plantation 73.95 74.01 ± 1.82

Dry Agricultural Field Without Crop 56.18 56.91 ± 7.41

Moist Agricultural Field Without Crop 82.38 82.56 ± 3.68

Water 97.65 97.63 ± 0.33

Average Fuzzy Producer’s Accuracy 78.90 -

Overall Accuracy 80.54 80.60 ± 1.84

Kappa Coefficient - 0.72 ± 0.026

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 51.16 86.45 ± 3.08

Eucalyptus Plantation 67.44 92.26 ± 2.04

Dry Agricultural Field Without Crop 45.05 63.14 ± 8.48

Moist Agricultural Field Without Crop 25.56 50.46 ± 0.87

Water 88.63 99.22 ± 0.03

Average Fuzzy User’s Accuracy 55.57 -

Fuzzy Producer’s Accuracy

Sal Forest 52.68 53.56 ± 6.73

Eucalyptus Plantations 40.59 40.61 ± 0.90

Dry Agricultural Field Without Crop 33.25 33.32 ± 0.27

Moist Agricultural Field Without Crop 96.37 96.37 ± 0.004

Water 91.64 99.22 ± 0.03

Average Fuzzy Producers Accuracy 62.91 -

Overall Accuracy 66.27 66.32 ± 1.38

Kappa Coefficient - 0.52 ± 0.02

Table B-37: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data

Table B-38: Accuracy Assessment results for IM kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data


B.8. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-1 with reference dataset as LISS-IV imagery of Resourcesat-1, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 73.30 93.01 ± 0.44

Eucalyptus Plantation 75.96 93.27 ± 4.29

Dry Agricultural Field Without Crop 0.78 1.13 ± 0.47

Moist Agricultural Field Without Crop 5.96 11.78 ± 7.16

Water 75.44 87.29 ± 0.29

Average Fuzzy User’s Accuracy 46.29 -

Fuzzy Producer’s Accuracy

Sal Forest 93.11 93.09 ± 0.37

Eucalyptus Plantations 78.08 78.13 ± 2.24

Dry Agricultural Field Without Crop 19.74 14.92 ± 3.85

Moist Agricultural Field Without Crop 20.99 19.47 ± 0.96

Water 88.52 93.09 ± 0.37

Average Fuzzy Producers Accuracy 60.09 -

Overall Accuracy 90.34 90.35 ± 0.87

Kappa Coefficient - 0.82 ± 0.017

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 45.57 64.40 ± 4.47

Eucalyptus Plantation 81.39 91.52 ± 1.15

Dry Agricultural Field Without Crop 41.60 47.04 ± 1.93

Moist Agricultural Field Without Crop 36.92 47.58 ± 3.15

Water 98.04 98.61 ± 8.93

Average Fuzzy User’s Accuracy 60.70 -

Fuzzy Producer’s Accuracy

Sal Forest 89.90 89.94 ± 0.41

Eucalyptus Plantations 67.02 67.15 ± 2.83

Dry Agricultural Field Without Crop 38.16 40.05 ± 8.37

Moist Agricultural Field Without Crop 84.48 84.70 ± 4.01

Water 85.33 85.47 ± 2.60

Average Fuzzy Producers Accuracy 72.98 -

Overall Accuracy 75.01 75.14 ± 2.83

Kappa Coefficient - 0.63 ± 0.04

Table B-39: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-1) against LISS-III (Resourcesat-1) reference data

Table B-40: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 92.51 94.60 ± 0.66

Eucalyptus Plantation 80.38 82.43 ± 0.75

Dry Agricultural Field Without Crop 24.58 30.21 ± 8.10

Moist Agricultural Field Without Crop 13.16 15.83 ± 0.20

Water 99.06 99.67 ± 0.02

Average Fuzzy User’s Accuracy 61.94 -

Fuzzy Producer’s Accuracy

Sal Forest 24.17 24.28 ± 1.10

Eucalyptus Plantations 21.31 21.32 ± 0.27

Dry Agricultural Field Without Crop 42.33 42.47 ± 0.19

Moist Agricultural Field Without Crop 95.54 95.54 ± 0.003

Water 76.60 76.68 ± 1.32

Average Fuzzy Producers Accuracy 51.99 -

Overall Accuracy 37.27 37.32 ± 0.67

Kappa Coefficient - 0.23 ± 0.0108

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 82.72 99.66 ± 0.10

Eucalyptus Plantation 50.27 89.43 ± 10.56

Dry Agricultural Field Without Crop 4.73 9.65 ± 4.97

Moist Agricultural Field Without Crop 3.52 16.49 ± 13.15

Water 12.91 19.43 ± 0.28

Average Fuzzy User’s Accuracy 30.83

Fuzzy Producer’s Accuracy

Sal Forest 74.45 74.48 ± 0.88

Eucalyptus Plantations 54.27 54.27 ± 0.13

Dry Agricultural Field Without Crop 1.24 1.23 ± 0.13

Moist Agricultural Field Without Crop 0.43 0.43 ± 0.02

Water 99.29 99.30 ± 0.016

Average Fuzzy Producers Accuracy 45.94

Overall Accuracy 72.38 72.37 ± 0.90

Kappa Coefficient - 0.38 ± 0.028

Table B-41: Accuracy Assessment results for IM kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data

Table B-42: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data


B.9. Accuracy Assessment of classified outputs for LISS-III imagery of Resourcesat-1 with reference dataset as LISS-IV imagery of Resourcesat-1, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 46.65 71.03 ± 3.53

Eucalyptus Plantation 83.63 91.60 ± 0.76

Dry Agricultural Field Without Crop 65.29 69.81 ± 3.34

Moist Agricultural Field Without Crop 55.14 66.67 ± 3.97

Water 94.06 95.41 ± 0.97

Average Fuzzy User’s Accuracy 68.95 -

Fuzzy Producer’s Accuracy

Sal Forest 87.07 87.10 ± 0.16

Eucalyptus Plantations 82.30 82.39 ± 2.06

Dry Agricultural Field Without Crop 70.64 71.83 ± 7.26

Moist Agricultural Field Without Crop 76.16 76.53 ± 4.42

Water 94.47 94.49 ± 0.54

Average Fuzzy Producers Accuracy 82.13 -

Overall Accuracy 83.99 84.09 ± 1.97

Kappa Coefficient - 0.75 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 61.61 80.58 ± 3.29

Eucalyptus Plantation 81.32 87.18 ± 0.27

Dry Agricultural Field Without Crop 24.12 40.81 ± 12.82

Moist Agricultural Field Without Crop 30.96 39.46 ± 2.67

Water 94.43 96.65 ± 0.79

Average Fuzzy User’s Accuracy 58.49 -

Fuzzy Producer’s Accuracy

Sal Forest 72.21 72.22 ± 2.05

Eucalyptus Plantations 65.36 65.94 ± 3.60

Dry Agricultural Field Without Crop 82.64 82.70 ± 1.37

Moist Agricultural Field Without Crop 90.88 90.91 ± 0.60

Water 81.90 85.50 ± 2.06

Average Fuzzy Producers Accuracy 78.58 -

Overall Accuracy 71.78 72.47 ± 2.88

Kappa Coefficient - 0.60 ± 0.04

Table B-43: Accuracy Assessment results for FCM classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data

Table B-44: Accuracy Assessment results for IM kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data


B.10. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-2 with reference dataset as LISS-III imagery of Resourcesat-2, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Sal Forest 81.71 99.34 ± 0.08

Eucalyptus Plantation 38.58 88.80 ± 11.19

Dry Agricultural Field Without Crop 73.98 91.68 ± 8.31

Moist Agricultural Field Without Crop 76.52 92.60 ± 7.39

Water 16.14 23.20 ± 0.51

Average Fuzzy User’s Accuracy 57.39 -

Fuzzy Producer’s Accuracy

Sal Forest 81.14 81.13 ± 1.07

Eucalyptus Plantations 58.09 58.13 ± 0.03

Dry Agricultural Field Without Crop 17.34 17.29 ± 1.09

Moist Agricultural Field Without Crop 11.69 11.68 ± 0.32

Water 98.09 97.98 ± 0.42

Average Fuzzy Producers Accuracy 53.27 -

Overall Accuracy 77.91 77.89 ± 1.01

Kappa Coefficient - 0.47 ± 0.03

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 70.07 79.52 ± 0.13

Fallow Land 58.20 74.67 ± 0.15

Sal Plantations 24.59 37.62 ± 0.19

Water 96.09 96.55 ± 1.53

Average Fuzzy User’s Accuracy 62.24 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 56.04 56.04 ± 0.15

Fallow Land 83.40 83.41 ± 0.33

Sal Plantations 69.66 69.66 ± 1.23

Water 95.97 95.97 ± 2.31

Average Fuzzy Producers Accuracy 76.27 -

Overall Accuracy 73.48 73.49 ± 0.15

Kappa Coefficient - 0.64 ± 0.002

Table B-45: Accuracy Assessment results for linear kernel classified LISS-III (Resourcesat-1) against LISS-IV (Resourcesat-1) reference data

Table B-46: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 70.08 77.64 ± 0.13

Fallow Land 50.99 60.50 ± 0.25

Sal Plantations 29.60 39.19 ± 0.22

Water 95.82 96.08 ± 0.09

Average Fuzzy User’s Accuracy 61.62

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 38.83 38.83 ± 0.14

Fallow Land 85.26 85.26 ± 0.28

Sal Plantations 79.64 79.64 ± 8.26e-004

Water 93.91 93.91 ± 0.41

Average Fuzzy Producer's Accuracy 74.41 -

Overall Accuracy 65.82 65.82 ± 0.22

Kappa Coefficient - 0.54 ± 0.003

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 38.22 60.34 ± 3.86

Fallow Land 5.88 6.65 ± 0.60

Sal Plantations 55.00 91.80 ± 2.81

Water 84.07 89.94 ± 0.44

Average Fuzzy User’s Accuracy 45.79 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 91.46 91.46 ± 0.14

Fallow Land 22.97 25.06 ± 7.21

Sal Plantations 90.55 90.58 ± 1.51

Water 59.46 59.77 ± 3.86

Average Fuzzy Producers Accuracy 66.12 -

Overall Accuracy 77.64 77.77 ± 2.83

Kappa Coefficient - 0.67 ± 0.04

Table B-47: Accuracy Assessment results for Gaussian kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data

Table B-48: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-2) against LISS-III (Resourcesat-2) reference data


B.11. Accuracy Assessment of classified outputs for AWiFS imagery of Resourcesat-2 with reference dataset as LISS-IV imagery of Resourcesat-2, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 66.36 77.45 ± 0.09

Fallow Land 48.07 61.25 ± 0.15

Sal Plantations 22.25 35.48 ± 0.20

Water 95.65 95.81 ± 1.53e-015

Average Fuzzy User’s Accuracy 58.08 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 50.46 50.47 ± 0.15

Fallow Land 89.33 89.34 ± 0.28

Sal Plantations 59.28 59.29 ± 0.003

Water 92.43 92.43 ± 0.06

Average Fuzzy Producers Accuracy 72.88 -

Overall Accuracy 66.93 66.93 ± 0.15

Kappa Coefficient - 0.54 ± 0.002

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 65.90 71.40 ± 0.18

Fallow Land 51.59 61.73 ± 0.13

Sal Plantations 25.01 33.49 ± 0.16

Water 93.31 96.32 ± 7.01e-016

Average Fuzzy User’s Accuracy 58.95

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 33.30 33.30 ± 0.09

Fallow Land 92.57 92.56 ± 0.32

Sal Plantations 69.88 69.88 ± 0.02

Water 81.52 81.52 ± 0.23

Average Fuzzy Producer's Accuracy 69.31 -

Overall Accuracy 61.39 61.39 ± 0.16

Kappa Coefficient - 0.48 ± 0.002

Table B-49: Accuracy Assessment results for FCM classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-50: Accuracy Assessment results for Gaussian kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


B.12. Accuracy Assessment of classified outputs for LISS-III imagery of Resourcesat-2 with reference dataset as LISS-IV imagery of Resourcesat-2, with one class untrained.

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 82.98 84.75 ± 2.58

Fallow Land 15.27 16.12 ± 0.32

Sal Plantations 10.92 14.90 ± 0.42

Water 88.87 90.35 ± 0.20

Average Fuzzy User’s Accuracy 49.51 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 50.72 50.72 ± 0.07

Fallow Land 31.12 32.97 ± 7.67

Sal Plantations 97.72 97.75 ± 0.44

Water 59.23 59.39 ± 2.69

Average Fuzzy Producers Accuracy 59.70

Overall Accuracy 55.49 55.54 ± 1.28

Kappa Coefficient - 0.36 ± 0.02

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 75.90 79.84 ± 0.06

Fallow Land 67.69 81.63 ± 0.30

Sal Plantations 21.93 46.84 ± 0.43

Water 96.63 96.73 ± 1.92e-016

Average Fuzzy User’s Accuracy 65.54 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 72.98 72.99 ± 0.31

Fallow Land 83.84 83.84 ± 0.21

Sal Plantations 61.02 61.02 ± 2.07

Water 96.46 96.46 ± 2.83e-018

Average Fuzzy Producers Accuracy 78.58 -

Overall Accuracy 78.79 78.79 ± 0.20

Kappa Coefficient - 0.69 ± 0.0029

Table B-51: Accuracy Assessment results for linear kernel classified AWiFS (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-52: Accuracy Assessment results for FCM classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 76.56 80.93 ± 0.14

Fallow Land 77.30 88.73 ± 0.001

Sal Plantations 29.51 49.20 ± 0.20

Water 97.05 98.06 ± 3.07e-016

Average Fuzzy User’s Accuracy 74.10

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 76.64 76.64 ± 5.01e-004

Fallow Land 83.61 83.62 ± 0.55

Sal Plantations 65.57 65.57 ± 1.73

Water 94.27 94.27 ± 0.05

Average Fuzzy Producers Accuracy 80.02

Overall Accuracy 80.41 80.41 ± 0.11

Kappa Coefficient - 0.72 ± 0.001

Accuracy Assessment Method FERM (in %) SCM (in %)

Fuzzy User’s Accuracy

Eucalyptus Plantation 98.62 98.69 ± 0.49

Fallow Land 86.02 89.41 ± 4.21

Sal Plantations 10.70 14.55 ± 0.16

Water 89.34 90.72 ± 1.07

Average Fuzzy User’s Accuracy 71.17 -

Fuzzy Producer’s Accuracy

Eucalyptus Plantation 40.39 40.42 ± 0.41

Fallow Land 49.66 50.12 ± 4.43

Sal Plantations 99.52 99.51 ± 0.16

Water 91.75 91.81 ± 0.48

Average Fuzzy Producers Accuracy 70.33

Overall Accuracy 59.82 59.86 ± 0.62

Kappa Coefficient - 0.44 ± 0.009

Table B-53: Accuracy Assessment results for Gaussian kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data

Table B-54: Accuracy Assessment results for linear kernel classified LISS-III (Resourcesat-2) against LISS-IV (Resourcesat-2) reference data


APPENDIX C

IMPLEMENTATION OF KERNEL BASED FUZZY C-MEANS (KFCM) CLASSIFIER IN MATLAB

cont=1;
while(cont==1)
    % INPUT IMAGE (TIFF FORMAT)
    awifs_read=imread('awifs_0606909.tif');

    % GET THE DIMENSIONS OF THE IMAGE
    dim=size(awifs_read);
    fprintf('\nThe dimensions of the given image = '); disp(dim);

    % SAVE DIMENSIONS AND BANDS
    M=dim(1); N=dim(2); bands=dim(3);

    % MEAN CLASS VALUES
    % Class 1 -> AGRICULTURE FIELD WITH CROP
    % Class 2 -> SAL FOREST
    % Class 3 -> EUCALYPTUS PLANTATIONS
    % Class 4 -> DRY AGRICULTURAL FIELD WITHOUT CROP
    % Class 5 -> MOIST AGRICULTURAL FIELD WITHOUT CROP
    % Class 6 -> WATER
    MeanClassVal=[ 98.0000  92.058  92.0   125.0  105.25 85.416;
                   64.00    60.117  61.50  130.0   89.50 54.66;
                  243.333  240.76  198.91  232.00 145.25 59.41;
                  172.33   161.117 147.83  345.0  221.25 46.416];
    fprintf('The Mean Class Values are:'); fprintf('\n');
    disp(MeanClassVal);

    % INITIALIZE THE NUMBER OF CLASSES (SUPERVISED)
    Ncl=6;
    fprintf('\nThe number of classes are : '); disp(Ncl);

    % INITIALIZE THE VARIABLES AND MATRICES
    j=1; x=1; y=1; k=1; prod=0; prod2=0; a=1; b=1; den=1; num=1;
    meu=zeros(M,N,Ncl);
    agri=zeros(M,N,1); sal=zeros(M,N,1); euc=zeros(M,N,1);
    dry=zeros(M,N,1); moist=zeros(M,N,1); water=zeros(M,N,1);
    dmat=zeros(M,N,Ncl);    % d-matrix
    smat=zeros(M,N,Ncl);    % scaled matrix
    d2mat=zeros(M,N,Ncl);
    s2mat=zeros(M,N,Ncl);
    final_meu=zeros(M,N,Ncl);

    fprintf('\n GLOBAL KERNELS :');
    fprintf('\n\t1.Linear Kernel');
    fprintf('\n\t2.Polynomial Kernel');
    fprintf('\n\t3.Sigmoid Kernel');
    fprintf('\n LOCAL KERNELS :');
    fprintf('\n\t4.Gaussian Kernel Using Euclidean Norm');
    fprintf('\n\t5.Kernel with Moderate Decreasing (KMOD)');
    fprintf('\n\t6.Radial Basis Kernel');
    fprintf('\n\t7.Inverse Multiquadratic');
    fprintf('\n8.SPECTRAL KERNEL');
    ch=input('\nEnter your choice : ');


    % INPUT WEIGHTAGE CONSTANT (FUZZINESS FACTOR)
    m=input('\n Enter the value of m : ');

    switch(ch)
    % LINEAR KERNEL
    case 1
        fprintf('\n IMPLEMENTING LINEAR KERNEL');
        % IMPLEMENT LINEAR KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                prod=prod+(double(awifs_read(x,y,k))*MeanClassVal(k,j));
              end
              dmat(x,y,j)=prod;
            end
          end
        end
        % FUNCTIONS TO FIND MAXIMUM AND MINIMUM OF dmat
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        % SCALING OF MATRIX VALUES BETWEEN 0 AND 1
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=(1/(1-(smat(x,y,j)))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+(1/(1-(smat(x,y,a)))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        % MINIMUM AND MAXIMUM OF MEMBERSHIP MATRIX
        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\n');
        fprintf('\nCalculated fuzzy membership matrix for linear kernel');
        % END OF LINEAR KERNEL

    % POLYNOMIAL KERNEL
    case 2
        % GET VALUE OF p
        p=input('\nEnter the value of p : ');
        fprintf('\n');


        % IMPLEMENT KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                prod=prod+(double(awifs_read(x,y,k))*MeanClassVal(k,j));
              end
              dmat(x,y,j)=((prod+1)^p);
            end
          end
        end
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=((1/(1-smat(x,y,j)))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+((1/(1-smat(x,y,a)))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\nCalculated the membership matrix for polynomial kernel');
        % END OF POLYNOMIAL KERNEL

    % SIGMOID KERNEL
    case 3
        % IMPLEMENT KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                prod=prod+(double(awifs_read(x,y,k))*MeanClassVal(k,j));
              end
              dmat(x,y,j)=prod+1;
            end
          end
        end


        % MAXIMUM AND MINIMUM
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=((1/(1-tanh(smat(x,y,j))))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+((1/(1-tanh(smat(x,y,a))))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        % MAXIMUM AND MINIMUM VALUES
        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\n');
        fprintf('\nCalculated the membership matrix for Sigmoid Kernel');
        % END OF SIGMOID KERNEL

    % GAUSSIAN KERNEL USING EUCLIDEAN NORM
    case 4
        fprintf('\nImplementing Gaussian Kernel using Euclidean Norm');
        I=eye([bands,bands]);   % Identity matrix
        A=zeros(bands,1);
        % IMPLEMENT GAUSSIAN KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                A(k,1)=(double(awifs_read(x,y,k))-MeanClassVal(k,j));
              end
              C=mtimes(A.',inv(I));
              prod=mtimes(C,A);
              dmat(x,y,j)=(-0.5*prod);
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);


        % SCALING OF MATRIX BETWEEN 0 AND 1
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % GAUSSIAN KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              d2mat(x,y,j)=exp(smat(x,y,j));
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(d2mat,Ncl,M,N);
        min_fn(d2mat,Ncl,M,N);
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              s2mat(x,y,j)=(0.0000000000000001) + (((d2mat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=(1/(1-(s2mat(x,y,j)))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+(1/(1-(s2mat(x,y,a)))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\n');
        fprintf('\nCalculated the membership matrix for Gaussian Kernel');
        % END OF GAUSSIAN KERNEL

    % KMOD KERNEL FUNCTION
    case 5
        % IMPLEMENTATION OF KMOD KERNEL
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,


              prod=0;
              for k=1:bands,
                prod=prod+((double(awifs_read(x,y,k))-MeanClassVal(k,j))^2);
              end
              dmat(x,y,j)=prod;
            end
          end
        end
        % MAXIMUM AND MINIMUM BETWEEN 0 AND 1
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=0.0000000000000001 + ((dmat(x,y,j)-min(j))*(0.999999999999999-0.000000000000001))/(max(j)-min(j));
            end
          end
        end
        % KMOD FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              d2mat(x,y,j)=exp(1/(1+smat(x,y,j)))-1;
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(d2mat,Ncl,M,N);
        min_fn(d2mat,Ncl,M,N);
        fprintf('\n After the second time scaling');
        fprintf('\n The maximum values for each class are : \n'); disp(max);
        fprintf('\n The minimum values for each class are : \n'); disp(min);
        fprintf('\n');
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              s2mat(x,y,j)=0.00001 + ((d2mat(x,y,j)-min(j))*(0.99999-0.00001))/(max(j)-min(j));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=((1/(1-s2mat(x,y,j)))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+((1/(1-s2mat(x,y,a)))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end


          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\n');
        fprintf('\nCalculated the membership matrix for KMOD Kernel');
        % END OF KMOD KERNEL

    % RADIAL BASIS KERNEL
    case 6
        fprintf('\nImplementing Radial Basis Kernel');
        % IMPLEMENT RADIAL BASIS FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                prod=prod+((double(awifs_read(x,y,k))-MeanClassVal(k,j))^2);
              end
              dmat(x,y,j)=prod;
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        fprintf('\n The maximum values for each class are : \n'); disp(max);
        fprintf('\n The minimum values for each class are : \n'); disp(min);
        fprintf('\n');
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=((1/(1-exp(-1*(smat(x,y,j)))))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+((1/(1-exp(-1*(smat(x,y,a)))))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        % MAXIMUM AND MINIMUM MEMBERSHIP MATRIX


        max_fn(meu,Ncl,M,N);
        min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n'); disp(max);
        fprintf('\n The minimum membership values for each class are : \n'); disp(min);
        fprintf('\n');
        fprintf('\nCalculated the membership matrix for Radial Basis Kernel');
        % END OF RADIAL BASIS KERNEL

    % INVERSE MULTIQUADRATIC KERNEL
    case 7
        fprintf('\nImplementing IM Kernel');
        % IM KERNEL FUNCTION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0;
              for k=1:bands,
                prod=prod+((double(awifs_read(x,y,k))-MeanClassVal(k,j))^2);
              end
              dmat(x,y,j)=(1/sqrt(prod+1));
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(dmat,Ncl,M,N);
        min_fn(dmat,Ncl,M,N);
        fprintf('\n The maximum values for each class are : \n'); disp(max);
        fprintf('\n The minimum values for each class are : \n'); disp(min);
        fprintf('\n');
        % SCALING OF MATRIX
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              smat(x,y,j)=(0.0000000000000001) + (((dmat(x,y,j)-min(j))/(max(j)-min(j)))*(0.999999999999999-0.000000000000001));
            end
          end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl,
          for x=1:M,
            for y=2:N,
              prod=0; prod2=0; den=0;
              num=((1/(1-smat(x,y,j)))^(1/(m-1)));
              for a=1:Ncl,
                prod2=0;
                den=den+((1/(1-smat(x,y,a)))^(1/(m-1)));
              end
              meu(x,y,j)=num/den;
            end
          end
        end
        % MAXIMUM AND MINIMUM
        max_fn(meu,Ncl,M,N);


        max_val=max_fn(meu,Ncl,M,N);
        min_val=min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n');
        disp(max_val);
        fprintf('\n The minimum membership values for each class are : \n');
        disp(min_val);
        fprintf('\n');
        fprintf('\nCalculated the membership matrix for Inverse Multiquadratic Kernel');
        % END OF IM KERNEL

    % SPECTRAL ANGLE KERNEL
    case 8
        fprintf('\nImplementing Spectral Angle Kernel');
        % IMPLEMENT SPECTRAL ANGLE KERNEL (cosine between pixel and class mean)
        for j=1:Ncl
            for x=1:M
                for y=2:N
                    dotp=0; sum_mean=0; sum_bands=0;
                    for k=1:bands
                        dotp=dotp+(double(awifs_read(x,y,k))*MeanClassVal(k,j));
                        sum_mean=sum_mean+(double(awifs_read(x,y,k))^2);
                        sum_bands=sum_bands+(MeanClassVal(k,j)^2);
                    end
                    dmat(x,y,j)=(dotp/(sqrt(sum_mean)*sqrt(sum_bands)));
                end
            end
        end
        % MAXIMUM AND MINIMUM
        max_val=max_fn(dmat,Ncl,M,N);
        min_val=min_fn(dmat,Ncl,M,N);
        % SCALING OF MATRIX INTO (10^-115, 1-10^-115) BEFORE TAKING THE ARC COSINE
        for j=1:Ncl
            for x=1:M
                for y=2:N
                    smat(x,y,j)=(10^(-115)) + (((dmat(x,y,j)-min_val(j))*((1-10^(-115))-(10^(-115))))/(max_val(j)-min_val(j)));
                end
            end
        end
        % CONVERT THE SCALED COSINES TO SPECTRAL ANGLES IN DEGREES
        for j=1:Ncl
            for x=1:M
                for y=2:N
                    dmat(x,y,j)=acosd(smat(x,y,j));
                end
            end
        end
        % MAXIMUM AND MINIMUM
        max_val=max_fn(dmat,Ncl,M,N);
        min_val=min_fn(dmat,Ncl,M,N);
        fprintf('\n The maximum values for each class are : \n');
        disp(max_val);


        fprintf('\n The minimum values for each class are : \n');
        disp(min_val);
        fprintf('\n');
        % SCALING OF THE ANGLES INTO [0, 90]
        for j=1:Ncl
            for x=1:M
                for y=2:N
                    smat(x,y,j)=(((dmat(x,y,j)-min_val(j))*(90-0))/(max_val(j)-min_val(j)));
                end
            end
        end
        % MEMBERSHIP MATRIX CALCULATION
        for j=1:Ncl
            for x=1:M
                for y=2:N
                    den=0;
                    num=((1/(1-smat(x,y,j)))^(1/(m-1)));
                    for a=1:Ncl
                        den=den+((1/(1-smat(x,y,a)))^(1/(m-1)));
                    end
                    meu(x,y,j)=real(num/den);
                end
            end
        end
        % MAXIMUM AND MINIMUM OF MEMBERSHIP MATRIX
        max_val=max_fn(meu,Ncl,M,N);
        min_val=min_fn(meu,Ncl,M,N);
        fprintf('\n The maximum membership values for each class are : \n');
        disp(max_val);
        fprintf('\n The minimum membership values for each class are : \n');
        disp(min_val);
        fprintf('\n');
        % END OF SPECTRAL ANGLE KERNEL

    otherwise
        fprintf('Wrong Choice !');
end

% DISPLAY FRACTIONAL IMAGES
for i=1:Ncl
    figure, imshow(meu(:,:,i));
end

% CONVERT EACH FRACTION IMAGE TO TIFF FORMAT
agri=meu(:,:,1);   imwrite(agri,'agri.TIFF');
sal=meu(:,:,2);    imwrite(sal,'sal.TIFF');
euc=meu(:,:,3);    imwrite(euc,'euc.TIFF');
dry=meu(:,:,4);    imwrite(dry,'dry.TIFF');
moist=meu(:,:,5);  imwrite(moist,'moist.TIFF');
water=meu(:,:,6);  imwrite(water,'water.TIFF');

cont=input('\nDo you want to check the output of more kernels? (Press 1 for Yes, 2 for No): ');
if cont==2
    break;
end
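The spectral angle kernel above forms the band-wise cosine between a pixel vector and a class mean and then applies acosd to obtain the angle in degrees. A minimal pure-Python sketch of that computation follows; the function name and the sample vectors are illustrative, not part of the thesis code:

```python
import math

def spectral_angle_deg(pixel, mean):
    # cosine of the angle between the two band vectors
    dot = sum(p * q for p, q in zip(pixel, mean))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_m = math.sqrt(sum(q * q for q in mean))
    # MATLAB's acosd is the arc cosine in degrees
    return math.degrees(math.acos(dot / (norm_p * norm_m)))

angle = spectral_angle_deg([1.0, 0.0], [0.0, 1.0])  # orthogonal vectors: 90 degrees
```

Identical spectral shapes give an angle of 0, orthogonal ones give 90 degrees, which is why the code then rescales the angles into [0, 90] before the membership update.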


end

% FUNCTION TO FIND THE MAXIMUM VALUE PER CLASS
function mx = max_fn(tmat,Ncl,M,N)
    mx=zeros(1,Ncl);
    for j=1:Ncl
        val=tmat(1,2,j);            % first filled element (column 1 is never assigned)
        for x=1:M
            for y=2:N
                if tmat(x,y,j)>val  % keep the largest value seen so far
                    val=tmat(x,y,j);
                end
            end
        end
        mx(j)=val;
    end
end

% FUNCTION TO FIND THE MINIMUM VALUE PER CLASS
function mn = min_fn(tmat,Ncl,M,N)
    mn=zeros(1,Ncl);
    for j=1:Ncl
        val=tmat(1,2,j);            % first filled element (column 1 is never assigned)
        for x=1:M
            for y=2:N
                if tmat(x,y,j)<val  % keep the smallest value seen so far
                    val=tmat(x,y,j);
                end
            end
        end
        mn(j)=val;
    end
end
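The scaling step that precedes each membership update, using the per-class extremes returned by max_fn and min_fn, is in essence min-max normalisation into an open interval so that terms like 1/(1-s) stay finite. A small Python sketch with illustrative bounds (the function name and interval endpoints are not part of the thesis code):

```python
def minmax_scale(values, lo=1e-15, hi=1.0 - 1e-15):
    # rescale values into (lo, hi) so later terms such as 1/(1 - s) stay finite
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

s = minmax_scale([2.0, 5.0, 8.0])
```

The smallest input maps to lo, the largest to hi, and intermediate values keep their relative position.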