CE-636 Soft Classification



Soft Classification Methods


    Mixed Pixel Problem

    Depends upon the spatial resolution of the sensor.

(Figure: a grid of sensor pixels labelled Pure or Mixed, illustrating how mixed pixels arise.)


Both supervised and unsupervised approaches may be applied to perform hard or soft classification.

Hard classification allocates each pixel of a remote sensing image to a single class.

This carries an inherent assumption that all the pixels in the remote sensing imagery are pure.

However, images are often dominated by mixed pixels: they do not represent one particular land cover but contain two or more Land Cover (LC) classes in a single pixel.


The coarser the spatial resolution, the higher the chance of mixed pixels occurring.

Although the chance of two or more classes contributing to a mixed pixel is high at coarse spatial resolution, the number of such pixels is small. On the other hand, with improved spatial resolution the number of classes within a pixel is reduced, but the number of mixed pixels increases.

Furthermore, at improved spatial resolution, masking due to shadow also results in a loss of information.


The presence of mixed pixels creates a problem in image classification: a mixed pixel displays a composite spectral response that may be dissimilar to the spectral response of each of its component LC classes, and therefore the pixel may not be allocated to any of its component LC classes.

Conventional image classification techniques may thus lose much of the information present in a pixel. These techniques therefore tend to over- or under-estimate the actual areal extents of the LC classes on the ground, thereby degrading the classification accuracy of an image contaminated by mixed pixels.


    Resolution and spectral mixing


CLASSIFICATION AND TARGET IDENTIFICATION

Spectral analysis methods usually compare pixel spectra with a reference spectrum (often called a target). Target spectra can be derived from a variety of sources, including spectral libraries, regions of interest within a spectral image, or individual pixels within a spectral image.

Whole Pixel Methods

Whole pixel analysis methods attempt to determine whether one or more target materials are abundant within each pixel in a multispectral or hyperspectral image, on the basis of the spectral similarity between the pixel and target spectra.


Whole pixel tools include standard supervised classifiers, such as Minimum Distance or Maximum Likelihood, as well as tools developed specifically for hyperspectral imagery, such as:

Spectral Angle Mapper

Spectral Feature Fitting


1. Spectral Angle Mapper (SAM)

Consider a scatter plot of pixel values from two bands of a spectral image; in such a plot, pixel spectra and target spectra plot as points.

The Spectral Angle Mapper (Yuhas et al., 1992) computes a spectral angle between each pixel spectrum and each target spectrum.
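A minimal sketch of the SAM measure, assuming NumPy is available (the function and argument names are illustrative, not from the slides):

```python
import numpy as np

def spectral_angle(pixel_spectrum, target_spectrum):
    """Spectral angle (in radians) between a pixel spectrum and a target spectrum.

    Smaller angles indicate greater spectral similarity; because only the
    directions of the two vectors matter, the measure is insensitive to
    overall brightness (illumination) differences.
    """
    pixel = np.asarray(pixel_spectrum, dtype=float)
    target = np.asarray(target_spectrum, dtype=float)
    cos_angle = np.dot(pixel, target) / (np.linalg.norm(pixel) * np.linalg.norm(target))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))
```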


2. Spectral Feature Fitting

In Spectral Feature Fitting, the user specifies a range of wavelengths within which a unique absorption feature exists for the chosen target. The pixel spectra are then compared to the target spectrum using two measurements:

1. The depth of the feature in the pixel is compared to the depth of the feature in the target, and

2. The shape of the feature in the pixel is compared to the shape of the feature in the target (using a least-squares technique).
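A minimal sketch of these two comparisons, assuming both spectra have already been continuum-removed over the chosen wavelength range (NumPy-based; the names are illustrative, not from the slides):

```python
import numpy as np

def feature_fit(pixel_cr, target_cr):
    """Compare an absorption feature in a pixel to the same feature in a target.

    Both inputs are continuum-removed reflectances over the user-chosen
    wavelength range, so values near 1 mean no absorption.  Returns the
    least-squares scale (relative feature depth) and the RMS error of the
    scaled target feature against the pixel feature (shape match).
    """
    pixel_depth = 1.0 - np.asarray(pixel_cr, dtype=float)
    target_depth = 1.0 - np.asarray(target_cr, dtype=float)
    scale = np.dot(target_depth, pixel_depth) / np.dot(target_depth, target_depth)
    rmse = np.sqrt(np.mean((pixel_depth - scale * target_depth) ** 2))
    return scale, rmse
```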


3. Complete Linear Spectral Unmixing

It is also known as spectral mixture modeling or spectral mixture analysis.

The set of spectrally unique surface materials existing within a scene is often referred to as the spectral end members.

The reflectance spectrum of any pixel is the result of a linear combination of the spectra of all end members inside that pixel. Unmixing simply solves a set of n linear equations for each pixel, where n is the number of bands in the image.
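A minimal sketch of unconstrained least-squares unmixing for a single pixel, assuming the end-member spectra are already known (NumPy-based; names are illustrative). A constrained solver enforcing non-negative, sum-to-one fractions would normally be preferred:

```python
import numpy as np

def unmix(pixel_spectrum, endmember_spectra):
    """Estimate end-member fractions for one pixel by linear unmixing.

    endmember_spectra: (n_bands, n_endmembers) matrix whose columns are the
    end-member spectra.  Solves pixel ~= endmember_spectra @ fractions in the
    least-squares sense, then crudely clips and renormalises the fractions.
    """
    fractions, *_ = np.linalg.lstsq(endmember_spectra, pixel_spectrum, rcond=None)
    fractions = np.clip(fractions, 0.0, None)   # crude non-negativity
    return fractions / fractions.sum()
```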


Matched Filtering

Often called a 'partial unmixing'.

There is no need to find the spectra of all end members in the scene to get an accurate analysis.

It was originally developed to compute abundances of targets that are relatively rare in the scene.

Matched Filtering 'filters' the input image for good matches to the chosen target spectrum by maximizing the response of the target spectrum within the data and suppressing the response of everything else.
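One common formulation of the matched filter models the background with the scene mean and covariance; the sketch below follows that formulation and is not tied to any particular software package (NumPy-based; names are illustrative):

```python
import numpy as np

def matched_filter_scores(image_pixels, target_spectrum):
    """Matched-filter score of every pixel against one target spectrum.

    image_pixels: (n_pixels, n_bands) array.  Scores near 1 indicate a strong
    match to the target; scores near 0 indicate background.
    """
    mean = image_pixels.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(image_pixels, rowvar=False))
    d_target = target_spectrum - mean
    # Filter vector: maximises the target response, suppresses the background.
    w = cov_inv @ d_target / (d_target @ cov_inv @ d_target)
    return (image_pixels - mean) @ w
```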


Soft Classification

Each pixel may represent multiple and partial class memberships.

It is an alternative to hard classification because of its ability to deal with mixed pixels.

A membership function allocates to each pixel a real value between 0 and 1, i.e. a membership grade.

Sub-pixel scale information is typically represented in the output of a soft classification by the strength of membership a pixel displays to each class.

This is used to reflect the relative proportions of the classes in the area represented by the pixel.


Soft classifiers

The most common soft classifiers are:

Maximum likelihood classification

Fuzzy set theory based approaches: Fuzzy c-means, Possibilistic c-means, and Noise clustering

Artificial neural networks

Decision trees


These techniques can be applied to resolve a pixel into its various LC class components, thus generating soft class outputs.

In soft classification the output is not a single classified image; instead, a number of images are obtained as the classified output. Each pixel in each of these images (generally referred to as fraction images) depicts the proportion of an individual LC class.

However, these proportions do not actually represent the spatial distribution of the LC classes on the ground.


Maximum Likelihood Classifier (MLC):

MLC is one of the most widely used hard classifiers.

In a standard MLC, each pixel is allocated to the class with which it has the highest posterior probability of class membership.

MLC has been adapted for the derivation of sub-pixel information. This is possible because a by-product of a conventional MLC is the posterior probability of each class for each pixel.


The posterior probability of each class provides a relative measure of class membership and can therefore be used as an indicator of sub-pixel proportions.

Many authors use the term fuzzy MLC to discriminate it from the (hard) MLC.

Conceptually, there is not a direct link between the proportional coverage of a class and its posterior probability. In fact, posterior probabilities are an indicator of the uncertainty in making a particular class allocation. However, many authors have found that, in practice, useful sub-pixel information can be derived from this approach.


For each pixel, the likelihood of the ith LC class can be computed from the class statistics (the class mean vector $\mu_i$ and variance-covariance matrix $\Sigma_i$) as:

$p_i = -\tfrac{1}{2}\ln\left|\Sigma_i\right| - \tfrac{1}{2}\,(X - \mu_i)^{t}\,\Sigma_i^{-1}\,(X - \mu_i)$

X is allocated to LC class $c_m$ if and only if $p_m > p_i$ for all $i \neq m$,

where

X is the vector of DN values of the unclassified pixel, and

$p_i$ is the likelihood of the ith LC class ($i = 1, \dots, c$).


The a posteriori probability $p_{a_i}$ of a pixel belonging to the ith LC class can be given by:

$p_{a_i} = \dfrac{p_i}{\sum_{j=1}^{c} p_j}$

These a posteriori probabilities represent the soft classification output. For example, suppose the a posteriori probabilities of class membership for a pixel containing three LC classes (soil, water and vegetation) are obtained as 0.75, 0.03 and 0.22 respectively. The MLC in its hard form will assign the pixel to soil, its probability of occurrence being maximum in that pixel. On the other hand, a softened output will show the probabilities of each of the LC classes considered in the pixel.
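A minimal sketch of this softened MLC output, assuming Gaussian class models with equal priors and the availability of NumPy and SciPy (names are illustrative):

```python
import numpy as np
from scipy.stats import multivariate_normal

def softened_mlc(pixel, class_means, class_covs):
    """A posteriori class probabilities for one pixel (softened MLC output).

    pixel: vector of DN values of the unclassified pixel.
    class_means / class_covs: per-class mean vectors and covariance matrices
    estimated from training data.  Equal prior probabilities are assumed.
    Returns a vector of posterior probabilities that sums to one.
    """
    likelihoods = np.array([
        multivariate_normal.pdf(pixel, mean=m, cov=c)
        for m, c in zip(class_means, class_covs)
    ])
    return likelihoods / likelihoods.sum()
```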


Fuzzy c-Means (FCM):

It is an iterative clustering method that may be employed to partition the pixels of a satellite image into different class membership values. Each pixel in the satellite image is related to every information class by a function known as the membership function. The value of the membership function, known simply as the membership, varies between 0 and 1. A membership value close to 1 implies that the pixel is highly representative of that particular information class, while a membership value close to 0 implies that the pixel has little or no similarity with the information class.

The net effect of such a function is to produce a fuzzy c-partition of the given data (or satellite image in the case of remote sensing).


The objective function for FCM can be given by:

$J_{fcm}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (\mu_{ki})^{m} \, D(x_k, v_i)$

where $D(x_k, v_i) = d^{2}(x_k, v_i) = \lVert x_k - v_i \rVert_A^{2} = (x_k - v_i)^{T} A \,(x_k - v_i)$,

subject to the constraints

$\sum_{i=1}^{c} \mu_{ki} = 1$ for all $k$,

$0 < \sum_{k=1}^{N} \mu_{ki} < N$ for all $i$,

$\mu_{ki} \ge 0$ for all $k, i$,

where

$U$ is the $N \times c$ membership matrix,

$V = (v_1, \dots, v_c)$ is the collection of vectors of the information class centers $v_i$,

$\mu_{ki}$ is the class membership value of pixel $k$ in class $i$,

$d_{ki}$ is the distance in feature space between $x_k$ and $v_i$, and

$x_k$ is the vector (feature vector) denoting the spectral response of pixel $k$.


$v_i$ is the vector (prototype vector) denoting the information class center of class $i$,

$c$ and $N$ are the total number of information classes and pixels respectively,

$A$ is a weight matrix, and

$m$ is a weighting exponent (or fuzzifier), $1 < m < \infty$.

From the objective function of the FCM, the membership value can be calculated as:

$\mu_{ki} = \left[ \sum_{j=1}^{c} \left( \dfrac{D(x_k, v_i)}{D(x_k, v_j)} \right)^{1/(m-1)} \right]^{-1}$

provided $D(x_k, v_j) \neq 0$ for all $j = 1, \dots, c$; here $\mu_{ki}$ is the realization value of the class membership.

The center of an information class can be computed as:

$v_i = \dfrac{\sum_{k=1}^{N} (\mu_{ki})^{m} \, x_k}{\sum_{k=1}^{N} (\mu_{ki})^{m}}$
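A minimal sketch of the FCM iteration using the two update equations above, with the weight matrix A taken as the identity (i.e. plain Euclidean distance); NumPy-based, with illustrative names:

```python
import numpy as np

def fuzzy_c_means(pixels, n_classes, m=2.0, n_iter=100, seed=0):
    """Plain FCM on an (N, n_bands) array of pixel feature vectors.

    Returns (memberships, centers): memberships is (N, c) and each row sums
    to one, i.e. the soft classification output for every pixel.
    """
    rng = np.random.default_rng(seed)
    u = rng.random((len(pixels), n_classes))
    u /= u.sum(axis=1, keepdims=True)                 # enforce sum-to-one
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ pixels) / um.sum(axis=0)[:, None]
        # Squared distances D(x_k, v_i) between every pixel and every center.
        d2 = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        d2 = np.fmax(d2, 1e-12)                       # avoid division by zero
        inv = d2 ** (-1.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)      # membership update
    return u, centers
```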


Possibilistic c-Means (PCM):

The objective function for PCM can be given by:

$J_{pcm}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (\mu_{ki})^{m} \, D(x_k, v_i) + \sum_{i=1}^{c} \eta_i \sum_{k=1}^{N} (1 - \mu_{ki})^{m}$

The specificity of the new (second) term is that it emphasizes (or assigns a high membership value to) representative feature points and de-emphasizes (or assigns a low membership value to) unrepresentative feature points present in the data.

Subject to the constraints

$\max_i \mu_{ki} > 0$ for all $k$,

$0 < \sum_{k=1}^{N} \mu_{ki} < N$ for all $i$,

$\mu_{ki} \ge 0$ for all $k, i$.


From the objective function of the PCM, the membership value can be calculated as:

$\mu_{ki} = \dfrac{1}{1 + \left( \dfrac{D(x_k, v_i)}{\eta_i} \right)^{1/(m-1)}}$

with

$\eta_i = K \, \dfrac{\sum_{k=1}^{N} (\mu_{ki})^{m} \, D(x_k, v_i)}{\sum_{k=1}^{N} (\mu_{ki})^{m}}$

where $\eta_i$ is known as the bandwidth parameter and $K$ is a constant (typically taken as 1).
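A minimal sketch of the two PCM expressions above; the squared distances and current memberships are assumed to come from elsewhere (for example from an FCM run used for initialisation), and all names are illustrative:

```python
import numpy as np

def pcm_memberships(d2, eta, m=2.0):
    """Possibilistic memberships from squared distances and bandwidths.

    d2:  (N, c) squared distances D(x_k, v_i) between pixels and class centers.
    eta: (c,) array of bandwidth parameters, one per class.
    Unlike FCM, the rows are not forced to sum to one: each value is an
    absolute degree of typicality of the pixel for that class.
    """
    return 1.0 / (1.0 + (d2 / eta[None, :]) ** (1.0 / (m - 1.0)))

def pcm_bandwidths(d2, u, m=2.0, K=1.0):
    """Bandwidth parameter eta_i for every class from current memberships."""
    um = u ** m
    return K * (um * d2).sum(axis=0) / um.sum(axis=0)
```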


Noise Clustering (NC):

In FCM, noisy points (i.e. outliers) are grouped with the information classes, since the memberships of every point are constrained to sum to one.

In NC, noise points (or outliers) can be segregated from the core information classes (or clusters), so that they do not degrade the quality of the clustering analysis.

The main concept of the NC algorithm is the introduction of a single noise information class (class c + 1) that will contain all noise data points.

The objective function for NC can be given by:


$J_{nc}(U, V) = \sum_{i=1}^{c} \sum_{k=1}^{N} (\mu_{ki})^{m} \, D(x_k, v_i) + \sum_{k=1}^{N} \delta^{2} \,(\mu_{k,c+1})^{m}$

where $\mu_{k,c+1} = 1 - \sum_{i=1}^{c} \mu_{ki}$ is the membership of pixel $k$ in the noise class and $\delta$ is the resolution parameter.

The membership values are then given by:

$\mu_{ki} = \dfrac{1}{\sum_{j=1}^{c} \left( \dfrac{D(x_k, v_i)}{D(x_k, v_j)} \right)^{1/(m-1)} + \left( \dfrac{D(x_k, v_i)}{\delta^{2}} \right)^{1/(m-1)}}, \quad i = 1, \dots, c$

$\mu_{k,c+1} = \dfrac{1}{\sum_{j=1}^{c} \left( \dfrac{\delta^{2}}{D(x_k, v_j)} \right)^{1/(m-1)} + 1}$

The performance of the NC classifier is dependent on the resolution parameter ($\delta$).

An optimized value of the resolution parameter is required.
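A minimal sketch of the NC membership computation above, assuming the class centers (and hence the squared distances to them) and the resolution parameter are already available; names are illustrative:

```python
import numpy as np

def nc_memberships(d2, delta, m=2.0):
    """Noise-clustering memberships from squared distances.

    d2:    (N, c) squared distances D(x_k, v_i) to the c information classes.
    delta: resolution parameter (a constant "distance" to the noise class).
    Returns an (N, c + 1) array whose last column is the noise-class
    membership; each row sums to one.
    """
    d2 = np.fmax(d2, 1e-12)
    power = 1.0 / (m - 1.0)
    inv = d2 ** (-power)                    # terms for the c real classes
    inv_noise = (delta ** 2) ** (-power)    # term for the noise class
    denom = inv.sum(axis=1, keepdims=True) + inv_noise
    return np.hstack([inv / denom, inv_noise / denom])
```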


Artificial Neural Network (ANN):

An ANN is a form of artificial intelligence that imitates some functions of the human brain.

An ANN consists of a series of layers, each containing a set of processing units (i.e. neurones).

All neurones on a given layer are linked by weighted connections to all neurones on the previous and subsequent layers.

During the training phase, the ANN learns about the regularities present in the training data and, based on these regularities, constructs rules that can be extended to the unknown data.


Advantages of ANN

It is a non-parametric classifier, i.e. it does not require any assumption about the statistical distribution of the data.

A high computation rate is achieved through massive parallelism, resulting from a dense arrangement of interconnections (weights) and simple processors (neurones), which permits real-time processing of very large datasets.

Disadvantages of ANN

ANNs are semantically poor: it is difficult to gain any understanding of how a result was achieved.

The training of an ANN can be computationally demanding and slow.

ANNs are perceived to be difficult to apply successfully: it is difficult to select the type of network architecture, the initial values of parameters such as learning rate and momentum, the number of iterations required to train the network, and the choice of initial weights.
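Purely as an illustration of using per-class network outputs as soft memberships, a small scikit-learn sketch on synthetic data (scikit-learn and every name below are assumptions, not something the slides prescribe):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Tiny synthetic example: two-band "spectra" for three LC classes.
rng = np.random.default_rng(0)
train_pixels = np.vstack([rng.normal(loc, 0.05, size=(50, 2))
                          for loc in ([0.2, 0.8], [0.6, 0.4], [0.9, 0.1])])
train_labels = np.repeat(["water", "vegetation", "soil"], 50)

ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(train_pixels, train_labels)

# The per-class output probabilities serve as the soft classification:
# one membership-like value per LC class for every pixel, summing to one.
mixed_pixel = [[0.4, 0.6]]
print(dict(zip(ann.classes_, ann.predict_proba(mixed_pixel)[0].round(3))))
```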



Decision Trees (DT):

Can be used as both a hard or a soft classifier.

Advantages:

Ability to handle non-parametric training data, i.e. DTs are not based on any assumption about the training data distribution.

DTs can reveal nonlinear and hierarchical relationships between input variables and use these to predict class membership.

DTs yield a set of rules which are easy to interpret and suitable for deriving a physical understanding of the classification process.

Good computational efficiency.

DTs, unlike ANNs, do not need extensive design and training.

Disadvantage:

The use of hyperplane decision boundaries parallel to the feature axes may restrict their use to cases in which classes are clearly distinguishable.
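Purely as an illustration of the hard and soft outputs of a decision tree, a small scikit-learn sketch on synthetic, overlapping data (scikit-learn and all names below are assumptions, not from the slides):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Tiny synthetic example: overlapping two-band "spectra" for two LC classes.
rng = np.random.default_rng(1)
pixels = np.vstack([rng.normal(0.4, 0.15, size=(200, 2)),
                    rng.normal(0.6, 0.15, size=(200, 2))])
labels = np.repeat(["water", "vegetation"], 200)

# A shallow tree keeps some leaves impure, so the class proportions of the
# training samples falling into each leaf provide a soft (fractional) output.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(pixels, labels)

print(tree.predict([[0.5, 0.5]]))        # hard output: a single class label
print(tree.predict_proba([[0.5, 0.5]]))  # soft output: leaf class proportions
```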


Super-resolution Mapping:

Although soft classification is informative and meaningful, it fails to account for the actual spatial distribution of class proportions within the pixel.

Super-resolution mapping (or sub-pixel mapping) is a step forward.

Super-resolution mapping considers the spatial distribution within and between pixels in order to produce maps at a sub-pixel scale.


Several approaches to super-resolution mapping have been developed:

Markov random fields

Hopfield neural networks

Linear optimization

Pixel-swapping solution (based on geostatistics)