
Page 1: pattern ppt

PATTERN CLASSIFICATION AND RECOGNITION USING

FUZZY CELLULAR AUTOMATA

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
C.V. RAMAN COLLEGE OF ENGINEERING

BHUBANESWAR

Biswajit Panda
0701227457

7CS2A

Page 2: pattern ppt

INTRODUCTION TO CELLULAR AUTOMATON

First proposed by John von Neumann in the 1950s; substantially developed by Stephen Wolfram in the 1980s
Automata theory applied to cellular structures
A set of rules is applied to the present state of the cells to generate the next state
The production rules are applied repeatedly for a finite number of steps to produce a specific pattern
Each cell has:
1. two states, ON (1) or OFF (0)
2. two neighbours
3. a rule to determine its next state
(a minimal code sketch of these ingredients is given below)
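The sketch below is illustrative only (not part of the original slides): a one-dimensional, two-state CA with null boundaries, in which a rule table maps each (left, self, right) neighbourhood to the cell's next state and is applied repeatedly.

```python
# Illustrative sketch: a one-dimensional, two-state cellular automaton
# with null (zero) boundaries.

def rule_table(rule_number):
    """Expand a rule number (0-255) into a lookup table mapping each
    (left, self, right) neighbourhood to the cell's next state."""
    return {
        (l, c, r): (rule_number >> (l * 4 + c * 2 + r)) & 1
        for l in (0, 1) for c in (0, 1) for r in (0, 1)
    }

def step(cells, table):
    """Apply the production rule once to every cell."""
    padded = [0] + cells + [0]                  # null boundary: edges see 0
    return [table[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]

table = rule_table(90)            # rule 90, used here only as an example
state = [0, 0, 0, 1, 0, 0, 0]     # present state of the cells
for _ in range(3):                # apply the rule repeatedly
    state = step(state, table)
    print(state)
```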

Page 3: pattern ppt

CONTRIBUTION OF CELLULAR AUTOMATA

• Linear Boolean CA (MACA) – CA with only XOR logic: analysis, synthesis, and application of MACA in pattern recognition, data mining, image compression, and fault diagnosis of electronic circuits

• Non-linear Boolean CA (GMACA) – CA with all possible logic: analysis, synthesis, and application of non-linear CA in pattern recognition

• Fuzzy CA (FMACA) – CA with fuzzy logic: analysis, synthesis, and application of fuzzy CA in pattern recognition

Page 4: pattern ppt

BASICS OF CELLULAR AUTOMATA

The present state of the cells is represented by P(t). The next state of a cell depends on:
1. a production rule
2. the present state of its neighbouring cells
3. its own present state

If the next state is represented by P(t + 1), then

P(t + 1) = T · P(t)

After d steps,

P(t + d) = T^d · P(t)

where T is an n×n matrix known as the Dependency Matrix (DM).

Page 5: pattern ppt

DEPENDENCY MATRIX

EXAMPLE: A 4-cell null boundary hybrid FCA with rule vector <238, 254, 238, 252>, i.e. <(qi + qi+1), (qi−1 + qi + qi+1), (qi + qi+1), (qi−1 + qi)> applied from left to right, may be represented as

P(t + 1) = T · P(t)

where T corresponds to the Dependency Matrix.
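The matrix itself is not reproduced in this extract, but it follows directly from the rule vector above: row i of T has a 1 in column j whenever the rule for cell i depends on cell j. The sketch below is illustrative only and assumes that the '+' in the rules is the bounded fuzzy OR, min(1, a + b), one common choice for FCA.

```python
# Dependency Matrix derived from the rule vector <238, 254, 238, 252>:
#   cell 1: (q1 + q2), cell 2: (q1 + q2 + q3), cell 3: (q3 + q4), cell 4: (q3 + q4)
T = [
    [1, 1, 0, 0],   # rule 238: q1 + q2
    [1, 1, 1, 0],   # rule 254: q1 + q2 + q3
    [0, 0, 1, 1],   # rule 238: q3 + q4
    [0, 0, 1, 1],   # rule 252: q3 + q4
]

def next_state(T, p):
    """P(t+1) = T . P(t), with each sum saturated at 1 (assumed fuzzy OR)."""
    return [min(1.0, sum(t_ij * q_j for t_ij, q_j in zip(row, p)))
            for row in T]

p = [0.25, 0.0, 0.5, 0.75]        # present state P(t)
print(next_state(T, p))           # next state P(t + 1)
```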

 

Page 6: pattern ppt

PRODUCTION RULES

For 3 consecutive cells there are 2^3 = 8 neighbourhood combinations, and hence 2^8 = 256 possible next-state functions (rules) for a cell.
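As a concrete check (worked out here, not shown on the slide): rule 238 in binary is 11101110, so the neighbourhoods 111, 110, 101, 100, 011, 010, 001, 000 map to the next states 1, 1, 1, 0, 1, 1, 1, 0 respectively, which is exactly the table of (qi + qi+1) used in the rule vector on the previous slide.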

Page 7: pattern ppt

EVOLUTION OF FUZZY LOGIC

The main characteristic of the work from the eighties was an analytical approach to CA.

Comparatively few results came from the synthesis approach.

The simulation results of such models should be close to the real behaviour being modelled.

For this reason fuzzy logic was introduced; the generalised structure is called a fuzzy cellular automaton (FCA).

Page 8: pattern ppt

FUZZY MULTIPLE ATTRACTOR CA (FMACA)

Page 9: pattern ppt

FMACA

Depth of Attractor Basin: The depth d of an FMACA is defined as the number of clock cycles required to reach the attractor state from any non-reachable state in the state transition diagram of the FMACA. If d is the depth of an FMACA with dependency matrix T, then

T^(d+1) = T^d

FMACA is a special class of FCA that can efficiently model an associative memory for pattern classification/recognition problems.
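One way to see the depth in practice (an illustrative sketch, not the procedure from the referenced papers): iterate the FCA from a state until the state stops changing and count the clock cycles; the maximum of this count over starting states is the depth d.

```python
# Illustrative sketch: count clock cycles until an attractor is reached,
# i.e. until a further application of the rule no longer changes the state.
# `step` is any single-step FCA update, e.g. the bounded-OR next_state(T, p)
# sketched on the Dependency Matrix slide (an assumption, not the paper's API).

def steps_to_attractor(step, p, max_steps=100):
    for d in range(max_steps):
        q = step(p)
        if q == p:          # attractor reached
            return d
        p = q
    raise RuntimeError("no attractor reached within max_steps")

# depth = max(steps_to_attractor(lambda s: next_state(T, s), p0) for p0 in samples)
```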

Page 10: pattern ppt

PATTERN RECOGNITION PROBLEM

Pattern Recognition – the study of how machines can learn to distinguish patterns of interest.
Conventional Approach – compares the input pattern with each of the stored (learnt) patterns.

[Figure: the letters A, B, C, ..., Z stored as example patterns in two fonts, Comic Sans MS and Bookman Old Style]

Page 11: pattern ppt

THE PROBLEM:

A   A   B
Grid-by-Grid Comparison:

0 0 1 0      0 1 1 0
0 0 1 0      0 1 1 0
0 1 1 1      0 1 1 0
1 0 0 1      1 0 0 1
1 0 0 1      1 0 0 1

No. of mismatches = 3
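The grid-by-grid comparison above is simply a count of the positions where the two binary grids differ; a minimal sketch using the two grids shown:

```python
# Count the cells where the stored pattern and the input pattern disagree.
stored = [[0, 0, 1, 0],
          [0, 0, 1, 0],
          [0, 1, 1, 1],
          [1, 0, 0, 1],
          [1, 0, 0, 1]]

candidate = [[0, 1, 1, 0],
             [0, 1, 1, 0],
             [0, 1, 1, 0],
             [1, 0, 0, 1],
             [1, 0, 0, 1]]

mismatches = sum(a != b
                 for row_s, row_c in zip(stored, candidate)
                 for a, b in zip(row_s, row_c))
print(mismatches)   # 3
```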

Page 12: pattern ppt

The time to recognize a pattern is proportional to the number of stored patterns (too costly as the number of stored patterns grows).

SOLUTION: FCA-based associative memory for pattern recognition

Page 13: pattern ppt

FMACA BASED PATTERN CLASSIFIER

An n-cell FMACA with k attractor basins can be viewed as a natural classifier.

Steps for pattern classification and recognition:

1. Following the respective production rule, repeat the rule application for d steps (where d is the depth of the attractor basin).
2. After d steps an attractor pattern is reached. The attractor represents the class, so the pattern is recognized as belonging to that class.
3. The attractor yields the address of the memory location that stores the class information.
4. All patterns that evolve to the same attractor constitute a class, whose information is stored at the address pointed to by the attractor.
(a simplified code sketch of these steps is given below)
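The sketch below is illustrative only: `next_state` stands for any single FCA update (e.g. the bounded-OR sketch shown earlier) and `class_memory` is a hypothetical mapping from attractor to the stored class information.

```python
# Illustrative sketch of FMACA-based classification.
def classify(pattern, next_state, depth, class_memory):
    state = list(pattern)
    for _ in range(depth):              # step 1: apply the rule for d steps
        state = next_state(state)
    attractor = tuple(state)            # step 2: the attractor that is reached
    return class_memory[attractor]      # steps 3-4: attractor addresses the class
```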

Page 14: pattern ppt

EXAMPLE OF FMACA CLASSIFICATION:

Suppose we have to classify five patterns into their respective classes. Let the five patterns be
<0.50, 0.00, 0.00>, <1.00, 0.25, 0.00>, <0.00, 0.50, 0.25>, <1.00, 0.25, 0.25>, <1.00, 0.00, 0.75>.
In the figure there are five attractor basins, namely <0.00, 0.00, 0.00>, <0.25, 0.25, 0.00>, <0.75, 0.75, 0.00>, <0.50, 0.50, 0.00>, <1.00, 1.00, 0.00>. After following the corresponding rule for d steps we get:

Attribute 1   Attribute 2   Attribute 3   Attractor
0.50          0.00          0.00          <0.00, 0.00, 0.00>
1.00          0.25          0.00          <0.25, 0.25, 0.00>
0.00          0.50          0.25          <0.75, 0.75, 0.00>
1.00          0.25          0.25          <0.50, 0.50, 0.00>
1.00          0.00          0.75          <0.75, 0.75, 0.00>

Page 15: pattern ppt

LINK TO MEMORY

Page 16: pattern ppt

FMACA BASED TREE STRUCTURED CLASSIFIER

The aim is to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution that is easier to interpret.

Page 17: pattern ppt

FMACA BASED TREE STRUCTURED CLASSIFIER

1. Each node of the tree is either a leaf node or an intermediate node.
2. A leaf node represents an attractor basin of the FMACA designated as a specific class.
3. The basin of the attractor covers the elements belonging to that specific class.
4. An intermediate node represents an instance where another FMACA is designed.

Page 18: pattern ppt

Algorithm for FMACA tree building:
Suppose we want to design an FMACA-based tree structured classifier to classify a training set S = {S1, ..., Si, ..., SK} into K classes, where Si is the set of elements of class i.

1. An FMACA with k attractor basins is generated.
2. The elements of the training set S get distributed into the k attractor basins (nodes).
3. Let S′ be the set of elements in an attractor basin. If S′ belongs to one particular class (i.e. all patterns of S′ covered by that attractor basin/node belong to only one class), then mark it as a leaf node and label that attractor basin/node with that class.
4. Otherwise, if S′ belongs to more than one class, repeat steps 1 to 4 on S′.
5. Stop. Output a tree that acts as the pattern classifier (a simplified sketch of this recursion is given below). The smaller the output tree, the lower the memory requirement and the running time.
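The sketch below is illustrative only: `generate_fmaca` and `basin_of` are placeholders for the FMACA synthesis and basin-identification procedures described in the referenced papers, not real APIs.

```python
# Simplified sketch of the FMACA tree-building recursion.
from collections import defaultdict

def build_tree(samples, k, generate_fmaca, basin_of):
    """samples: list of (pattern, class_label) pairs; k: number of attractor basins."""
    fmaca = generate_fmaca(k)                          # step 1
    basins = defaultdict(list)
    for pattern, label in samples:                     # step 2: distribute S
        basins[basin_of(fmaca, pattern)].append((pattern, label))

    children = {}
    for basin, subset in basins.items():
        labels = {label for _, label in subset}
        if len(labels) == 1:                           # step 3: leaf node
            children[basin] = ("leaf", labels.pop())
        else:                                          # step 4: recurse on S'
            children[basin] = ("node", build_tree(subset, k,
                                                  generate_fmaca, basin_of))
    return (fmaca, children)                           # step 5: the tree
```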

Page 19: pattern ppt

TIME COMPLEXITY

The characteristic equation of an FMACA is P(t + 1) = T · P(t).

For d steps, P(t + d) = T^d · P(t).

Identifying an attractor basin this way, via the dependency matrix, has time complexity O(n^3).

To identify an attractor basin of an FMACA in O(n) time, the Dependency Vector (DV) is used.

Page 20: pattern ppt

DEPENDENCY VECTOR

k: number of attractor basins
K: number of fuzzy states

Where k = K^m and m = 1, 2, 3, ..., n, there exist m dependency relations (DRs) among all the vectors of each attractor basin.

The dependency vector (DV) represents each individual dependency relation satisfied by all the vectors of an attractor basin. For K = 3 and n = 3, the vectors of any attractor basin can be conceived as a system of equations in three variables (x1, x2, x3);

Page 21: pattern ppt

DEPENDENCY VECTOR

then if x2 and x3 are the dependent variables and x1 the independent variable, the DV is <011>. Thus, in an n-dimensional vector space with K fuzzy states, an FMACA having k attractor basins can be characterized by m DVs if k = K^m, where m = 1, 2, 3, ..., n.

Page 22: pattern ppt

For an n-cell, k-attractor-basin FMACA, the DV has the form <00...011...110...00>, containing 1's from the i-th position (the First Cell Position, FCP) to the j-th position (the Last Cell Position, LCP). The region from the FCP to the LCP is called the Dependent Region (DR); the k-th bit of the DV is 1 if position k lies inside the DR and 0 otherwise.

Let q_m denote a fuzzy state, where q_m = m/(K−1).

The Pivot Cell (PC) of an attractor of a basin lies between the first and last cells of the DR. The pivot cell represents an attractor basin uniquely.

Page 23: pattern ppt

IDENTIFICATION OF ATTRACTOR BASINS

Employing the Dependency Vector (DV), the time complexity of identifying an attractor basin can be reduced to O(n).

Page 24: pattern ppt

CONCLUSION

An efficient and cost-effective pattern classification algorithm was introduced.

The time complexity of the pattern recognition algorithm was reduced from O(n^3) to O(n).

Pattern classification is an important research area used in several systems such as database systems, machine learning, intelligent information systems, statistics, and expert systems.

Page 25: pattern ppt

REFERENCES

1. Stephen Wolfram, "A New Kind of Science", ISBN 1-57955-008-8, 2002.
2. Heather Betel and Paola Flocchini, "On the Relationship Between Boolean and Fuzzy Cellular Automata".
3. M. Mraz, N. Zimic, I. Lapanja and I. Bajec, "Fuzzy Cellular Automata: From Theory to Application".
4. Pradipta Maji and P. Pal Chaudhuri, "Basins of Attraction of Cellular Automata Based Associative Memory and its Rule Space".
5. Pradipta Maji, Chandrama Shaw, Niloy Ganguly, Biplab K. Sikdar and P. Pal Chaudhuri, "Theory and Application of Cellular Automata for Pattern Classification".
6. Pradipta Maji and P. Pal Chaudhuri, "Fuzzy Cellular Automata for Modeling Pattern Classifier".
7. Pradipta Maji and P. Pal Chaudhuri, "Fuzzy Cellular Automata Based Associative Memory for Pattern Recognition".

Page 26: pattern ppt

THANK YOU