Page 1

Seminar, University of Lancaster, September 2005

Evolving Connectionist Systems: Methods and Applications

Nik Kasabov

nkasabov@aut.ac.nz
Knowledge Engineering and Discovery Research Institute, KEDRI

www.kedri.info, AUT, NZ

Page 2

N.Kasabov, 2005

Content

1. Adaptation and interaction in biological and artificial systems

2. Evolving Connectionist Systems - ECOS
3. ECOS for Bioinformatics
4. ECOS for Neuroinformatics
5. ECOS for Medical Decision Support
6. Computational Neurogenetic Modelling (CNGM)
7. ECOS for adaptive signal processing, speech and image
8. ECOS for adaptive mobile robots
9. ECOS for adaptive financial time-series prediction
10. Conclusions and future research
11. References

Page 3

• Evolving process – a process that is unfolding, developing, revealing, changing over time in a continuous way

• Evolving processes in living organisms can be viewed at five levels, each performing different information processing operations.

• Dynamic models may need to take into account all 5 levels together and not just one of them (e.g. neural networks, evolutionary computation)

• Developing and applying such models to solving complex problems in bioinformatics, brain study and engineering.

5. Evolutionary development. Function examples: genome evolution, creation of new individuals and species
______________________________________________
4. Brain level. Function examples: cognition, speech and language
______________________________________________
3. Neural network ensemble level. Function examples: sound perception, signal processing
______________________________________________
2. Single cell (e.g. neuron) level. Function examples: neuron activation
______________________________________________
1. Molecular level. Function examples: DNA transcription into RNA, RNA translation into proteins

1. Adaptation and interaction in biological and artificial systems

Page 4

Intelligent systems for adaptive modelling and knowledge discovery

[Diagram: data from the Environment – sound, image, video, numbers, DNA, brain signals, stock market, environmental and social information, sensors, other sources – flows through pre-processing, feature extraction and labelling into modelling, learning and knowledge discovery, producing information and human knowledge, with adaptation and control feeding back to the environment]

• Modelling complex processes is a difficult task

• Most existing techniques may not be appropriate to model complex, dynamic processes.

• A variety of methods needs to be developed and applied to a number of challenging real-world applications

Page 5

NeuCom: A Neuro Computing Environment for Intelligent Decision Support Systems

• 60 new and old methods for data mining, data modelling and knowledge discovery; prognostic systems; web-based decision support systems.

• Free copy of a student version: www.theneucom.com

• Applications: Bioinformatics, Neuroinformatics, Robotics, Signal processing, speech and image, …

Page 6

2. Evolving COnnectionist Systems - ECOS

• ECOS are modular connectionist-based systems that evolve their structure and functionality in a continuous, self-organised, on-line, adaptive, interactive way from incoming information; they can process both data and knowledge in a supervised and/or unsupervised way.

• N.Kasabov, Evolving connectionist systems: methods and applications in bio-informatics, brain study and intelligent machines, Springer Verlag, 2002

• First publications – Iizuka, 98 and ICONIP’98

• ‘Throw the “chemicals” and let the system grow, is that what you are talking about, Nik?’ – Walter Freeman, UC Berkeley, a comment at the Iizuka 1998 conference

• ‘This is a powerful method! Why don’t you apply it to challenging real-world problems from the Life sciences?’ – Prof. R. Hecht-Nielsen, UC San Diego, a comment at the Iizuka 1998 conference

• Early examples of ECOS:
• RAN (J. Platt) – an evolving RBF NN
• RAN with a long-term memory (Abe et al.)
• Incremental FuzzyARTMAP
• Growing neural gas, etc.

Environment

ECOS

Page 7

Evolving Intelligence (EI)

An information system that develops its structure and functionality in a continuous, self-organised, adaptive, interactive way from incoming information, possibly from many sources, and performs intelligent tasks typical for the human brain (e.g. adaptive pattern recognition, decision making, concept formation, languages,….).

EI is characterised by:

• Adaptation in an on-line, incremental, life-long mode
• Fast learning from large amounts of data, e.g. 'one-pass' training
• Open structure, extendable, adjustable
• Memory-based (add, retrieve and delete information)
• Active interaction with other systems and with the environment
• Adequate representation of space and time at their different scales
• Knowledge-based: rules; self-improvement

Page 8

Five levels of functional information processing in ECOS

• 1. “Genetic level” – each neuron in the system has a set of parameters – genes that are subject to adaptation through both learning and evolution

• 2. Neuronal level

• 3. Ensembles of neurons – evolving NNM

• 4. The whole ECOS

• 5. Populations of ECOS and their development over generations

Different learning algorithms apply at each level, capturing different meaning (aspect) of the whole process.

Page 9

ECOS are based on unsupervised or supervised clustering and local, knowledge-based modelling

An evolving clustering process using ECM with consecutive examples x1 to x9 in a 2D space (Kasabov and Song, 2002)

[Figure: four snapshots (a)–(d) of the evolving clustering process as examples x1 … x9 arrive, showing cluster centres Ccj, clusters Cj and cluster radii Ruj being created and updated. Legend: xi – sample; Ccj – cluster centre; Cj – cluster; Ruj – cluster radius]
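The clustering process in the figure can be sketched as follows. This is a minimal, illustrative reading of an ECM-style evolving clustering loop; the distance threshold `dthr`, the centre-update rule and the radius rule are simplifying assumptions, not the published algorithm:

```python
import numpy as np

def ecm(samples, dthr=0.5):
    """Minimal sketch of an evolving clustering process in the spirit of ECM.

    Each cluster j keeps a centre Cc_j and a radius Ru_j. A new sample either
    falls inside an existing cluster, enlarges the closest one (if the updated
    radius stays below dthr), or starts a new cluster.
    """
    centres, radii = [], []
    for x in np.atleast_2d(samples):
        if not centres:                      # first sample -> first cluster
            centres.append(x.astype(float)); radii.append(0.0)
            continue
        d = [np.linalg.norm(x - c) for c in centres]
        j = int(np.argmin(d))
        if d[j] <= radii[j]:                 # inside an existing cluster
            continue
        new_r = 0.5 * (d[j] + radii[j])
        if new_r <= dthr:                    # enlarge cluster j towards x
            radii[j] = new_r
            centres[j] += (x - centres[j]) * (1 - radii[j] / d[j])
        else:                                # too far: create a new cluster
            centres.append(x.astype(float)); radii.append(0.0)
    return centres, radii
```

One pass over the data is enough: the number of clusters is not fixed in advance, which is the point of the evolving scheme.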

Page 10

Evolving Fuzzy Neural Networks (EFuNNs)

• Learning is based on clustering in the input space and a function estimation for this cluster

• Prototype rules represent the clusters and the functions associated with them

• Different types of rules: e.g. – Zadeh-Mamdani, or Takagi-Sugeno

• The system grows and shrinks in a continuous way

• Feed-forward and feedback connections (not shown)

• Fuzzy concepts may be used

• Not limited in number and types of inputs, outputs, nodes, connections

• On-line / off-line training

• IEEE Tr SMC, 2001, N.Kasabov

• ECF – evolving classifier function – a partial case of EFuNN with no output membership functions

[EFuNN architecture: inputs – not fixed, fuzzified or not; a rule (case) node layer – growing and shrinking; outputs – not fixed, defuzzified]

Page 11

Local, incremental, cluster-based learning in EFuNN

• Incremental (possibly on-line) supervised clustering
• First layer of connections: W1(rj(t+1)) = W1(rj(t)) + lj · D(W1(rj(t)), xf)
• Second layer: W2(rj(t+1)) = W2(rj(t)) + lj · (A2 − yf) · A1(rj(t))

where:
– rj is the jth rule node (hidden node);
– D is a distance measure – fuzzy or (normalised) Euclidean;
– A2 = f2(W2 · A1) is the activation vector of the fuzzy output neurons in the EFuNN structure when x is presented;
– A1(rj(t)) = f1(D(W1(rj(t)), xf)) is the activation of the rule node rj(t); a simple linear function can be used for f1 and f2, e.g. A1(rj(t)) = 1 − D(W1(rj(t)), xf);
– lj is the current learning rate of the rule node rj, calculated for example as lj = 1 / Nex(rj), where Nex(rj) is the number of examples associated with rule node rj.

• Example: Classification of gene expression data.
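As a rough illustration, the two layer updates above can be written for a single rule node as below. The mean-absolute fuzzy distance, the linear activations and the helper name `efunn_update` are simplifying assumptions, not the exact published formulation:

```python
import numpy as np

# A rough, illustrative reading of the EFuNN update equations for a single
# rule node rj on one fuzzified example (xf, yf).

def efunn_update(W1, W2, xf, yf, n_ex):
    """One incremental step; n_ex is Nex(rj), the node's example count."""
    lj = 1.0 / n_ex                     # learning rate lj = 1 / Nex(rj)
    D = np.abs(W1 - xf).mean()          # normalised fuzzy distance D(W1, xf)
    A1 = 1.0 - D                        # rule-node activation, A1 = 1 - D
    A2 = W2 * A1                        # output activation (linear f2 sketch)
    W1 = W1 + lj * (xf - W1)            # move input weights towards xf
    W2 = W2 + lj * (A2 - yf) * A1      # second-layer error-driven update
    return W1, W2
```

The learning rate shrinks as more examples are assigned to the node, so early examples shape a node strongly and later ones only refine it.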

Page 12

Dynamic Evolving Neuro-Fuzzy System DENFIS

• Modeling, prediction and knowledge discovery from dynamic time series

• Publication: Kasabov, N., and Song, Q., DENFIS: Dynamic Evolving Neural-Fuzzy Inference System and its Application for Time Series Prediction, IEEE Transactions on Fuzzy Systems, 2002, April

Page 13

Local, incremental learning of cluster-based fuzzy rules in DENFIS

• Input vector: x = [x1,x2, … , xq]

• Result of inference:

y = Σi=1..m [ωi · fi(x1, x2, …, xq)] / Σi=1..m ωi

• A partial case is using linear regression functions:

y = β0 + β1 x1 + β2 x2 + … + βq xq.

• Fuzzy rules: IF x is in cluster Cj THEN yj = fj (x)

• Incremental learning of the function coefficients through least square error
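A minimal sketch of this weighted inference, with inverse-distance activation weights ωi standing in for the fuzzy memberships (an assumption; `denfis_infer`, `centres` and `betas` are illustrative names, and each local model fi is linear as in the partial case above):

```python
import numpy as np

def denfis_infer(x, centres, betas):
    """y = sum_i w_i f_i(x) / sum_i w_i, with linear f_i(x) = b0 + b . x."""
    x = np.asarray(x, float)
    # Inverse-distance activation of each local model (illustrative choice)
    w = np.array([1.0 / (1e-9 + np.linalg.norm(x - np.asarray(c, float)))
                  for c in centres])
    # Each local model: beta[0] is the intercept, beta[1:] the coefficients
    f = np.array([b[0] + float(np.dot(np.asarray(b, float)[1:], x))
                  for b in betas])
    return float(np.dot(w, f) / w.sum())
```

With a single cluster this reduces to one linear regression; with many clusters the output blends the local models according to how close x is to each cluster centre.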

Page 14

Learning and Inference in DENFIS

Page 15

Evolving Multiple ECOS Models Through Evolutionary Computation

Evolutionary computation. Terminology:

• Gene

• Chromosome

• Population

• Crossover

• Mutation

• Fitness function

• Selection

Page 16

Genetic Algorithms

Page 17

GA and ECOS

• Many individual ECOS are evolved simultaneously on the same data through a GA method

• A chromosome represents each individual ECOS's parameters

• Individuals are evaluated and the best one is selected for a further development

• Mutation
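The loop described above might look roughly like this. The two-gene chromosome, the selection scheme and the `fitness` placeholder are illustrative assumptions; a real run would train and evaluate an ECOS inside `fitness`:

```python
import random

# A hedged sketch of the GA loop: each chromosome encodes ECOS parameters
# (here just two illustrative genes, e.g. a distance threshold and a
# learning rate); `fitness` scores a chromosome, higher is better.

def evolve(fitness, pop_size=20, generations=30, mut=0.1):
    pop = [(random.uniform(0.1, 1.0), random.uniform(0.01, 0.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # evaluate and rank
        parents = pop[:pop_size // 2]                # selection (elitist)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                     # one-point crossover
            child = tuple(g + random.gauss(0, mut) for g in child)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)                     # best individual
```

Because the parents are carried over unchanged, the best chromosome found so far is never lost between generations.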

Page 18

Feature and Parameter Optimisation of Local ECOS Models Through GA

• Nature optimises its “parameters” through evolution

• Replication of individual ECOS systems and selection of:

- The best one

- The best m averaged

Page 19

A framework of evolving connectionist machines (ICONIP,1998)

[Framework diagram: Inputs and New Inputs from the Environment (Critique) pass through a Presentation & Representation Part and a Feature Selection Part into several evolving neural network modules (NNM), then through a Decision Part and an Action Part to the Results; a Higher-Level Decision Module, self-analysis / rule extraction, and adaptation operate over the whole machine]

Page 20

ECOS have the following characteristics

• Evolving structure
• Rule node creation based on similarity between data examples
• Pruning of nodes
• Regular node aggregation
• Supervised and unsupervised learning
• Local learning
• On-line learning
• Life-long learning
• Sleep learning
• Fuzzy rule extraction
• Time-space preserved and can be traced back

Page 21

Machine learning algorithms for ECOS

Machine Learning

Supervised Learning

Unsupervised Learning

Batch Learning

Incremental Learning

Transductive Learning

Inductive Learning

Reinforcement Learning

Competitive Learning

Ensemble Learning

Model-based Tree /Forest Learning

Evolving Learning

Other Categories

Page 22

● – a new data vector; ○ – a sample from D; ∆ – a sample from M

• A transductive model is created on a subset of data neighbouring each new input vector. The new vector is situated at the centre of such a subset (here illustrated with two of them – x1 and x2) and is surrounded by a fixed number of nearest data samples selected from the training data D and generated from an existing model M (Vapnik)

• The principle: “What is good for my neighbours will be good for me”

• GA parameter optimisation – poster 1065, IJCNN05 – Mon, 7-11pm (N.Mohan, NK)

Inductive (Local) versus Transductive (“Personalised”) Modelling
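A minimal sketch of the transductive principle: for each new vector, a model is built only on its K nearest training samples. Here a distance-weighted average of the neighbours' outputs stands in for the local model (an assumption; published WKNN/transductive methods differ in detail):

```python
import numpy as np

def transductive_predict(x_new, X, y, k=8):
    """Predict for x_new from a local model built on its K nearest samples."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    d = np.linalg.norm(X - np.asarray(x_new, float), axis=1)
    idx = np.argsort(d)[:k]              # the K nearest neighbours of x_new
    w = 1.0 / (d[idx] + 1e-9)            # closer neighbours weigh more
    return float(np.dot(w, y[idx]) / w.sum())
```

Nothing is trained globally: every new vector gets its own "personalised" model from its neighbourhood, which is exactly the contrast with the inductive models above.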

Page 23

Incremental Classifier and Feature Selection Learning

• When new data is added incrementally, new features may become important, e.g. different frequencies, different Principal Components

• Incremental PCA (Hall and Martin, 1998)

• Incremental, chunk PCA and ECOS modification

• Examples:
• S. Ozawa, S.Too, S.Abe, S. Pang and N. Kasabov, Incremental Learning of Feature Space and Classifier for Online Face Recognition, Neural Networks, August 2005

• S. Pang, S. Ozawa and N. Kasabov, Incremental Linear Discriminant Analysis for Classification of Data Streams, IEEE Trans. SMC-B, vol. 35, No. 4, 2005
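A simplified stand-in for chunk-wise incremental PCA: the running mean and covariance are updated as each chunk arrives, and the current principal components are re-read from the covariance. This is an illustration of the idea, not the Hall & Martin eigenspace-update algorithm cited above:

```python
import numpy as np

class ChunkPCA:
    """Incremental PCA sketch: per-sample running mean/covariance updates."""

    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.cov = np.zeros((dim, dim))

    def partial_fit(self, chunk):
        for x in np.atleast_2d(np.asarray(chunk, float)):
            self.n += 1
            delta = x - self.mean            # deviation from the old mean
            self.mean += delta / self.n      # update the running mean
            self.cov += np.outer(delta, x - self.mean)  # Welford-style update

    def components(self, k):
        vals, vecs = np.linalg.eigh(self.cov / max(self.n - 1, 1))
        order = np.argsort(vals)[::-1]       # largest variance first
        return vecs[:, order[:k]]
```

The classifier can then be retrained (or evolved) in the updated feature space after each chunk, which is the pattern the incremental-PCA-plus-ECOS combination follows.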

Page 24

3. ECOS for Bioinformatics

• DNA / RNA sequence analysis
• Gene expression profiling for diagnosis and prognosis
• microRNA structure analysis and prediction
• Protein structure analysis and prediction
• Gene regulatory network modelling and discovery

Page 25

Gene Expression Data Analysis and Disease Profiling

• DNA analysis – large databases; data always being added and modified; different sources of information

• Markers and drug discoveries:• Gastric cancer

• Bladder cancer

• CRC

• www.pedblnz.com

Page 26

A specialized gene expression profiling SIFTWARE based on ECOS

www.peblnz.com

www.kedri.info

Page 27

Case study on gene expression data modelling and profiling: The DLBCL Lymphoma data set

(M. Shipp et al., Nature Medicine, vol. 8, no. 1, January 2002, pp. 68-74)

Page 28

Rule extraction from a trained ECOS on the DLBCL Lymphoma data set

(M. Shipp et al., Nature Medicine, vol. 8, no. 1, January 2002, pp. 68-74)

IF X1 is (2: 0.84) and X2 is (2: 0.81) and X3 is (1: 0.91) and X4 is (1: 0.91) and X5 is (3: 0.91) and X6 is (1: 0.89) and X7 is () and X8 is (1: 0.91) and X9 is (1: 0.91) and X10 is (3: 0.87) and X11 is (1: 0.86)
THEN Class is [1] - Fatal

IF X1 is (2: 0.60) and X2 is (1: 0.73) and X3 is (1: 0.91) and X4 is (3: 0.91) and X5 is (1: 0.64) and X6 is () and X7 is (2: 0.65) and X8 is (2: 0.90) and X9 is (1: 0.91) and X10 is (1: 0.62) and X11 is (1: 0.91)
THEN Class is [2] - Cured

Where: X1, …, X11 are known genes; 1 - small, 2 - medium, 3 - large

Page 29

Comparative Analysis of Global, Local and Personalised Modelling on the Lymphoma Gene Expression Case Study

Model | IPI | 11 gene variables | IPI + 11 genes
Inductive global MLR | 73 (87, 58) | 79 (91, 65) | 82 (83, 81)
Inductive global SVM | 73 (87, 58) | 83 (88, 78) | 86 (90, 81)
Inductive local ECF | 46 (0, 100) | 86 (88, 84) | 88 (83, 92)
Transductive WKNN, K=8 | 50 (87, 8) | 74 (91, 54) | 77 (90, 62)
Transductive WKNN, K=26, Pthr=0.5 | 73 (87, 56) | 73 (93, 47) | 76 (100, 50) *
Transductive MLR, K=8 | 50 (87, 8) | 66 (66, 65) | 57 (60, 54)
Transductive MLR, K=26 | 73 (87, 58) | 78 (81, 73) | 79 (80, 77)
Transductive SVM, K=8 | 46 (100, 0) | 76 (91, 58) | 77 (93, 58)
Transductive SVM, K=26 | 73 (87, 58) | 78 (91, 62) | 84 (93, 73)
Transductive ECF, K=8 | 61 (63, 58) | 78 (81, 73) | 75 (83, 65)
Transductive ECF, K=26 | 46 (0, 100) | 83 (91, 73) | 77 (87, 65)

* For IPI + 11 genes, WKNN at other thresholds: Pthr=0.4: 77 (73, 81); Pthr=0.45: 82 (97, 65)

Page 30

Gene regulatory network modeling and discovery

Case study:

•Leukemia cell line U937 (experiments done at the NCI, NIH, Frederick, USA, Dr Dimitrov’s lab)

•Two different clones of the same cell line treated with retinoic Acid

• 12,680 genes expressed over time

• 4 time points for the MINUS clone (where the cells died) and

• 6 time points for the PLUS clone (cancer)

Page 31

Cluster analysis of time course gene expression data reduces the variables in the GRN

• Genes that share similar functions usually show similar gene expression profiles and cluster together

• Different clustering techniques:
• Exact clusters vs fuzzy clusters
• Pre-defined number of clusters vs evolving
• Batch vs on-line
• Using different similarity or correlation measures

Page 32

The Goal is to Discover Gene Regulatory Networks and Gene State Transitions

Page 33

Gene networks and reverse engineering

• GNs describe the regulatory interactions between genes

• DNA transcription, RNA translation and protein folding and binding – all are part of the process of gene regulation

• Here we use only RNA gene expression data

• Reverse engineering – from gene expression data to GN.

• It is assumed that gene expression data reflects the underlying genetic regulatory network

• Co-expressed genes over time – either one regulates the other, or both are regulated by same other genes

• Problems:
• What is the time unit?
• Appropriate data and a validation procedure are needed
• Data is usually insufficient – special methods are needed

• Correct interpretation of the models may generate new biological knowledge

Page 34

Evolving fuzzy neural networks for GN modeling (Kasabov and Dimitrov, ICONIP, 2002)

G(t) → EFuNN → G(t+dt)

• On-line, incremental learning of a GN

• Adding new inputs/outputs (new genes)

• The rule nodes capture clusters of input genes that are related to the output genes

• Rules can be extracted that explain the relationship between G(t) and G(t+dt), e.g.:
IF g13(t) is High (0.87) and g23(t) is Low (0.9)
THEN g87(t+dt) is High (0.6) and g103(t+dt) is Low

• Playing with the threshold will give stronger or weaker patterns of relationship

Page 35

Rules extracted are turned into state transition graphs

[Figures: state transition graphs for the PLUS cell line and the MINUS cell line]

Page 36

Using DENFIS (Dynamic, evolving neuro-fuzzy inference system) for GRN modeling

(IEEE Trans. FS, April, 2002)

• DENFIS vs EFuNN

• G(t) → gj(t+dt)

• Dynamic partitioning of the input space

• Takagi-Sugeno fuzzy rules, e.g.:

IF X1 is (0.63 0.70 0.76) and X2 is (0.71 0.77 0.84) and X3 is (0.71 0.77 0.84) and X4 is (0.59 0.66 0.72)
THEN Y = 1.84 - 1.26 X1 - 1.22 X2 + 0.58 X3 - 0.03 X4

Page 37

Using a GRN model to predict the expression of genes in a future time

[Figure: predicted expression of genes 33, 8, 27 and 21 of the PLUS cell line over 18 time units]

Page 38

4. ECOS for Neuroinformatics

• Why ECOS for brain study?

• Modeling brain states of individuals or groups of individuals from EEG, fMRI and other information

• Example: an epileptic seizure of a patient; data from 8 EEG channels is shown

Page 39

EEG data and modeling perception

• Standard EEG electrode systems

• In the experiment here, four classes of brain perception states are used, with 37 single trials each, including the following stimuli:

• Class1 - Auditory Stimulus;

• Class2 - Visual Stimulus;

• Class3 - Mixed Auditory and visual stimuli;

• Class 4 - No stimulus.

• (With van Leeuwen, RIKEN BSI, Tokyo)

Page 40

ECF for building individual cognitive models

Table 2. The correctly recognized data samples by an ECF model trained on 80% and tested on 20% of a single person's data (person A) – 65 variables: 64 electrodes plus time

Stimulus | A | V | AV | No | Accuracy
A | 81.2 | 1.3 | 0.1 | 0.2 | 98%
V | 1.1 | 82.4 | 2.9 | 1.8 | 93.4%
AV | 0.6 | 3.3 | 75 | 1.4 | 93.4%
No | 0.4 | 1.5 | 1.3 | 80.5 | 96.2%

Page 41

5. Medical decision support systems

• Large amounts of data are available in clinical practice

• The need for intelligent decision support systems - a market demand

• Web-based learning and decision support systems

• Palmtops can be used to download and run an updated decision support system

• Examples: Cardiovascular risk analysis; trauma data analysis and prognosis; sarcoma prognostic systems

• N.Kasabov, R.Walton, et al, AI in Medicine, submitted, 2005

Page 42

The case study on GFR prediction for renal medical decision support

A real data set from a medical institution is used here for experimental analysis. The data set has 447 samples, collected at hospitals in New Zealand and Australia. Each record includes six input variables – age, gender, serum creatinine, serum albumin, race and blood urea nitrogen concentration – and one output: the glomerular filtration rate (GFR). All experimental results reported here are based on 10 cross-validation experiments with the same model and parameters, and the results are averaged. In each experiment 70% of the whole data set is randomly selected as training data and the other 30% as testing data.

Page 43

Adaptive Renal Function Evaluation System: GFR-ECOS (Song, Ma, Marshall, Kasabov, Kidney International, 2005)

Page 44

Comparative Analysis of Global, Local and Personalised Modelling on the Case Study of GFR Decision Support

Model | Neurons or rules | Testing RMSE | Testing MAE | Weights of input variables (Age, Sex, SCr, Surea, Race, Salb)
MDRD | – | 7.74 | 5.88 | 1, 1, 1, 1, 1, 1
MLP | 12 | 8.44 | 5.75 | 1, 1, 1, 1, 1, 1
ANFIS | 36 | 7.49 | 5.48 | 1, 1, 1, 1, 1, 1
DENFIS | 27 | 7.29 | 5.29 | 1, 1, 1, 1, 1, 1
TNFI | 6.8 (average) | 7.31 | 5.30 | 1, 1, 1, 1, 1, 1
TWNFI | 6.8 (average) | 7.11 | 5.16 | 0.89, 0.71, 1, 0.92, 0.31, 0.56

Page 45

A GFR personalised model of a patient obtained with the use of the TWNFI

Input variables: Age 58.9; Sex Female; SCr 0.28; Surea 28.4; Race White; Salb 38

Weights of input variables (TWNFI): 0.91, 0.73, 1, 0.82, 0.52, 0.46

Results: GFR (desired) 18.0; MDRD 14.9; TWRBF 16.6

Page 46

6. Computational Neurogenetic Modelling (CNGM)

[Diagram: a small gene regulatory network of four interacting nodes with positive and negative interaction weights (e.g. 0.5, 0.7, 0.8, 0.3, -0.2, -0.4, -0.6, -0.7)]

Page 47

What is going on in a neuron?

[Diagram: DNA is transcribed into mRNA in the nucleus; mRNA is translated into protein at the ribosome; the protein passes through the endoplasmic reticulum and the Golgi complex to its final destination in the neuron]

Page 48

Computational Neurogenetic Modelling

GRN; CNGM as a SNN

Page 49

Genes in the model: c-jun, mGluR3, GABRA, GABRB, AMPAR, NMDAR, NaC, KC, ClC, Jerky, BDNF, FGF-2, IGF-I, GALR1, NOS, S100beta

• 10 genes related to neuronal parameters and some specific genes
• Linear vs non-linear gene interaction
• Initial gene expression values: random in [-0.1, 0.1]
• Initial weight matrix: random in [-1, 1], with constraints
• Input/output: sigmoid [-1, 1]

Gene Regulatory Network in a neuronal cell

Page 50

A CNGM of a spiking neuron (IJCNN 2005)

g_j(t + Δt) = Σ_{k=1..n} w_jk · g_k(t)

P_j(t) = P_j(0) · p_j(t)

u_i(t) = Σ_j Σ_{t_j ∈ F_j} J_ij · ε_ij(t − t_j − Δ_ij^ax)

ε_ij(s) = A_synapse · [exp(−s / τ_decay^synapse) − exp(−s / τ_rise^synapse)]

ϑ_i(t) = ϑ_0 · k · exp(−(t − t_i) / τ_decay)
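The equations above can be simulated roughly as follows: the membrane potential sums double-exponential postsynaptic kernels from input spike times, and the neuron fires when the potential crosses a threshold that rises after each spike and then decays. All constants and the threshold dynamics here are illustrative assumptions, not values from the paper:

```python
import math

def eps(s, a=1.0, tau_decay=10.0, tau_rise=2.0):
    """Synaptic kernel eps(s) = A [exp(-s/tau_decay) - exp(-s/tau_rise)]."""
    return a * (math.exp(-s / tau_decay) - math.exp(-s / tau_rise)) if s > 0 else 0.0

def simulate(spike_times, J=1.0, theta0=0.5, tau_theta=20.0, t_end=50):
    """Return output spike times of one neuron driven by input spikes."""
    out, last_spike = [], -1e9
    for t in range(t_end):
        u = sum(J * eps(t - tf) for tf in spike_times)   # u_i(t): summed PSPs
        # Threshold decays back to theta0 after the neuron's last spike
        theta = theta0 * (1 + math.exp(-(t - last_spike) / tau_theta))
        if u >= theta:                                   # fire
            out.append(t)
            last_spike = t
    return out
```

In the full CNGM, the gene expression values g_k(t) would modulate the neuronal parameters (here J, theta0 and the time constants) through P_j(t) = P_j(0) · p_j(t), linking the GRN level to the spiking level.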

Page 51

Obtained GRN

[Diagram: the obtained gene regulatory network over genes g1–g10]

Page 52

7. ECOS for adaptive signal processing, speech and image

• New methods for signal processing, filtering, feature selection

• Adaptive speech recognition, image and video data processing

• ECOVIS prototype system
• Multimodal (face, fingerprint, iris and voice) person verification system

• Future research: other modality recognition models and systems, such as odour, DNA, waves, personal magnetic fields

• Language translation: English and Maori translation system:

http://translator.kedri.info/

Page 53

8. ECOS for adaptive, autonomous mobile robots:

Interactive robots that learn

• Continuous learning of the environment

• Interactive communication in a spoken language

• Rokel - an experimental robot Pioneer

Page 54

Multiple robots – RoboCup: 1 gold and 2 silver medals for the Singapore Polytechnic at the World RoboCup in Seoul, 2004, using ECOS; a bronze medal in Japan, 2005 (Loulin)

Page 55

9. ECOS for adaptive, on-line financial time series prediction

• Weekly prediction of the exchange rate R Euro/US$ (1,2,3 and 4 weeks ahead)

• 3 input variables: ERate, Euro/Yen, Stock-E/US, with 4 time lags each

• Data for years 1999 and 2000 only

• 14 nodes evolved

• Overall RMSE=0.13

• RMSE for the last 4 weeks is 0.05

• RMSE for the random walk method is 0.075

• (collaboration with the University of Trento)

[Figures: desired vs actual exchange rate over 70 weeks, and the number of rule nodes evolved over time]

Page 56

10. Conclusions and Future Research

• Real-life problems → new methods of Computational Intelligence (CI)

• An integrated, knowledge based approach to modelling and knowledge discovery

• Computational Intelligence and Ontology systems

• Personalised modelling, personalised drug design and personalised medicine

• Multi-agent systems where each agent is an ECOS

Page 57

11. References

1. N.Kasabov, Evolving connectionist systems: Methods and Applications in Bioinformatics, Brain study and intelligent machines, Springer Verlag, London, New York, Heidelberg, 2002 (www.springer.de)

2. N.Kasabov, Foundations of neural networks, fuzzy systems and knowledge engineering, MIT Press, Cambridge, MA, 1996 (www.mitpress.edu).

3. N. Kasabov, L. Benuskova, Computational Neurogenetics, International Journal of Theoretical and Computational Nanoscience, Vol. 1 (1) American Scientific Publisher, 2004 (www.asp.com)

4. N.Kasabov, L.Benuskova, S.Wysoski, A Computational Neurogenetic Model of a Spiking Neuron, IJCNN 2005 Conf. Proc., IEEE Press, 2005

5. S. Ozawa, S.Too, S.Abe, S. Pang and N. Kasabov, Incremental Learning of Feature Space and Classifier for Online Face Recognition, Neural Networks, August, 2005 (also IJCNN 2005)

6. N.Mohan and N. Kasabov, Transductive Modelling with GA parameter optimisation, IJCNN 2005 Conf. Proceed., IEEE Press, 2005

7. Software: www.theneucom.com, www.kedri.info

Page 58

Acknowledgement: The Knowledge Engineering and Discovery Research Institute – KEDRI, AUT, NZ, http://www.kedri.info

• Photo of KEDRI…staff

• Established March-June 2002.

• Funded by AUT, NERF (FRST), NZ industry.

• Approx. NZ$1M p.a.

• 25 research staff and graduate students; 25 associated researchers

• Both fundamental and applied research (theory + practice)

• 85 publications in 2002-03

• 450 publications all together

• Multicultural environment (12 ethnic origins)

• Strong national and international collaboration:

• NZ: PEBL (www.pebl.co.nz); ViaLactia; GENESIS

• Japan – RIKEN; KIT; Ritsumeikan

• USA: UCBerkeley; NCI, NIH;

• Singapore: NTU; InfoCom;

• EI-asia Ltd (www.ei-asia.net)

• Germany: U. Kaiserslautern; Humboldt U.