
Page 1:

Model-based learning: Theory and an application to sequence learning

Zoltán Somogyvári and Péter Érdi

Hungarian Academy of Sciences, Research Institute for Particle and Nuclear Physics, Department of Biophysics
P.O. Box 49, 1525 Budapest, Hungary
http://cneuro.rmki.kfki.hu

Page 2:

Model-based learning: A new framework

1. Background
2. Theory
3. Algorithm
4. Application to sequence learning:
   4.1 Evaluation of convergence speed
   4.2 How to avoid sequence ambiguity
   4.3 Storage of multiple sequences
5. Outlook

Page 3:

Model-based learning: Background I.

Learning algorithms: supervised and unsupervised.

Symbol linking: learning by linking neurons with existing and fixed receptive fields. Examples: the Hopfield network and attractor networks.

Symbol generation: learning by receptive field generation. Examples: topographic projection generation, ocular dominance formation, the Kohonen map.

Page 4:

Model-based learning: Background II.

In many (if not all) symbol-generating learning algorithms, a built-in connection structure determines the formation of receptive fields.

Lateral inhibition appears in a wide variety of learning algorithms: 'Mexican hat' lateral interaction in topographic map formation algorithms and in ocular dominance generation.

It appears most explicitly in Kohonen's self-organizing map.

Page 5:

A symbol-generating learning algorithm: self-organizing maps

Internal layer: a 2-dimensional grid of neurons with an internal connection structure.

Input layer, the 'external world': samples from an N-dimensional vector space.

Connections run between the internal and external layers.

Learning: modification of the connections between neurons of the external and internal layers, i.e. changes in the receptive fields (a minimal sketch follows below).
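To make this concrete, here is a minimal SOM sketch in Python/NumPy. The deck contains no code, so the function name, grid size, and parameter values are all illustrative assumptions, and a Gaussian neighbourhood stands in for the 'Mexican hat' lateral interaction:

```python
import numpy as np

def train_som(inputs, grid_h=10, grid_w=10, epochs=50, lr0=0.5, sigma0=3.0):
    """Minimal self-organizing map sketch (illustrative, not the talk's code).

    inputs: (n_samples, N) array -- samples from the N-dimensional
    'external world'. Returns the receptive-field centres, one per
    neuron of the internal 2-D grid, shape (grid_h, grid_w, N).
    """
    rng = np.random.default_rng(0)
    n_samples, n_dims = inputs.shape
    w = rng.random((grid_h, grid_w, n_dims))   # random initial receptive fields
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]      # grid coordinates of the internal layer
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = lr0 * frac                        # decaying learning rate
        sigma = sigma0 * frac + 0.5            # shrinking neighbourhood radius
        for x in inputs[rng.permutation(n_samples)]:
            d = np.linalg.norm(w - x, axis=2)  # distance of every receptive field to x
            by, bx = np.unravel_index(d.argmin(), d.shape)  # best-matching unit
            # Gaussian neighbourhood on the internal grid, a smooth
            # stand-in for the 'Mexican hat' interaction.
            h = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
            # Pull receptive fields toward the input: the net wraps the input space.
            w += lr * h[..., None] * (x - w)
    return w
```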

Page 6:

Self-organizing maps II.

Stages of learning: the internal net stretches out to wrap the input space.

Page 7:

Self-organizing maps III.

The result of learning: the neural grid is fitted to the input space.

The result of the learning is stored in the internal-external connections, in the locations of the receptive fields.

Each neuron in the internal layer represents a part of the external world: map formation.

Page 8:

Model-based learning principle: Encounter between internal and external structures

From this unusual point of view, it is a natural generalization to extend the set of applicable internal connection structures and to use them as built-in models or schemata.

In this way, the learning procedure becomes an encounter between an internal model and the structures in the signals coming from the 'external world.'

The result of the learning is a correspondence between neurons of the internal layer and elements of the input space.

Page 9:

Model-based learning: Internal models

Any connection structure can be fitted to the signal, and the same input can be represented in many ways, even in parallel.

The models may represent different reference frames, hierarchical structures, periodic processes...

Page 10:

Model-based learning: Application to sequence learning

One of the most important types of internal model structure is a linear chain of neurons connected by directed connections. A directed linear chain of neurons is able to represent a temporal sequence.

The question: an instantaneous position in the state space. The answer: the prediction of the following state, or even the prediction of the whole sequence.

If the system is able to use addressing, it can theoretically access any of the following states in one step, or even the preceding states.

Page 11:

Model-based learning: Basic algorithm

L cells in a chain, an N-dimensional input, and $L \cdot N$ connections to modify.

Notation: $a^i_n(t)$ is the internal activity of cell $n$, $a^e(t)$ is the external input, $W_n(t)$ is the weight vector (receptive field) of cell $n$, and $\eta_n(t)$ is its learning rate.

Initial activity: $a^i_n(t=1) = \delta_{1,n}$

Internal dynamics: $a^i_{n+1}(t+1) = a^i_n(t)$

Learn when internally activated: $W_n(t+1) = W_n(t) + a^i_n(t)\,\eta_n(t)\,\big(a^e(t) - W_n(t)\big)$

Decreasing learning rate: $\eta_n(t+1) = \eta_n(t) - a^i_n(t)\,(1-d)\,\eta_n(t)$
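A minimal sketch of these update rules in Python/NumPy, assuming the Kohonen-style difference rule as reconstructed above and a chain whose length matches the sequence period (so cell n is the internally active cell at step n); the function name and default values are illustrative:

```python
import numpy as np

def learn_sequence(seq, T=100, eta0=0.5, d=0.9, seed=0):
    """Sketch of the chain-model learning rule above.

    seq: (L, N) array -- one period of the external input a^e(t).
    A chain of L cells is activated one after the other; the active
    cell pulls its weight vector W_n toward the current input, and its
    learning rate eta_n shrinks to d * eta_n each time it fires.
    Returns W, shape (L, N): the learned receptive fields.
    """
    rng = np.random.default_rng(seed)
    L, N = seq.shape
    W = rng.random((L, N))        # random initial receptive fields
    eta = np.full(L, eta0)        # per-cell learning rates eta_n
    for _ in range(T):            # T iterations over the whole sequence
        for n in range(L):        # internal dynamics: activity steps along the
                                  # chain, so cell n is the active one at step n
            W[n] += eta[n] * (seq[n] - W[n])   # learn when internally activated
            eta[n] *= d                        # decreasing learning rate
    return W
```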

Page 12:

Model-based learning: Simple sequence learning task

A Lissajous curve applied as input: $x = \cos 2\pi t$, $y = \sin 4\pi t$, with N = 5 and L = 12.

Steps of learning without noise, from the random initial distribution of receptive fields. T = 100 is the number of iteration steps. During one iteration the whole sequence is presented, so each iteration requires $N \cdot L$ weight modifications.

The final distribution of the receptive fields.
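Reusing the hypothetical `learn_sequence` sketch from the algorithm slide, the task could be reproduced roughly as follows; for simplicity this feeds the curve's two coordinates directly (N = 2), since the slide's N = 5 embedding is not specified:

```python
import numpy as np

t = np.arange(12) / 12.0                        # L = 12 sample points of one period
seq = np.column_stack([np.cos(2 * np.pi * t),   # the Lissajous input
                       np.sin(4 * np.pi * t)])
W = learn_sequence(seq, T=100)
print(np.abs(W - seq).max())  # small: receptive fields settled onto the curve
```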

Page 13:

Model-based learning: Sequence with noise

The same input with additive noise. The steps of learning.

The result of the learning is very good, but (of course) less precise.

Page 14:

Model-based learning: Noise dependence of convergence

[Figure: error (Err) vs. iterations. The time evolution of the error for different noise amplitudes: 0, 0.05, 0.1, 0.3, and 0.5.]

Page 15:

Noise dependence of speed

[Figure: iterations vs. noise amplitude.] The required number of iterations to reach a given precision increases slightly with the noise amplitude.

Page 16:

Sequence length dependence of speed

[Figure: iterations vs. length of sequence, L.] The required number of iterations to reach a given precision does not depend on the length of the sequence.

Page 17:

Input dimension dependence of speed

[Figure: iterations vs. input dimension, N.] The required number of iterations to reach a given precision does not depend on the dimension of the input.

Page 18:

Model-based learning: Evaluation of learning speed

Since the algorithm performs $L \cdot N$ operations during an iteration, and the required number of iterations to reach a given precision depends neither on the length of the sequence (L) nor on the dimension of the input (N), the whole learning procedure works with O(LN) operations.

Page 19:

Model-based learning avoids sequence ambiguity

The task is to learn a self-crossing sequence. The sequence is noisy.

The result of the learning.

The usual way of solving this problem is to extend the state space with the recent past of the system (a minimal sketch of this follows below).
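For comparison, a hedged sketch of that traditional fix (it is not part of the talk's own method): each state is stacked with its most recent predecessors, so the two passes through a crossing point become distinct extended states. The function name is illustrative:

```python
import numpy as np

def delay_embed(x, k):
    """Extend the state space with the recent past.

    x: (T, N) array of states; returns a (T - k, N * (k + 1)) array in
    which each row concatenates k + 1 consecutive states, i.e. a state
    together with its k predecessors. A self-crossing in x then maps
    to two different rows of the extended state space.
    """
    T = len(x)
    return np.column_stack([x[i:T - k + i] for i in range(k + 1)])
```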

Page 20:

Model-based learning avoids sequence ambiguity II.

Two portions of the sequence overlap. Of course, the sequence is noisy.

The result of the learning.

This problem can be solved if the state space is extended with the derivative of the signal.

Page 21:

Model-based learning avoids sequence ambiguity III.

Two portions of the sequence overlap and their directions are the same. The noisy signal.

This type of problem is hard to solve with traditional methods, because the length of the overlapping parts is not known in advance.

The well-trained connections.

Page 22:

Model-based learning: Multiple sequences

Learning multiple sequences requires:

A set of built-in neuron chains as models of sequences.

An organizer algorithm to conduct this orchestra.

Different strategies can exist, but its most important functions are the following (a hedged sketch follows this list):

Initiating a model's activity.

Terminating it.

Harmonizing the predictions of different models with each other and with the external world.
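One possible organizer strategy, sketched below purely as an assumption (the slides deliberately leave the strategy open; every name and the threshold rule are hypothetical): an inactive chain is initiated at its best-matching cell, an active chain advances only while its current cell matches the input, mismatching chains are terminated, and the best-matching active chain supplies the harmonized prediction.

```python
import numpy as np

def organizer_step(models, x, threshold=0.1):
    """One hypothetical organizer step over a set of trained chains.

    models: list of dicts {'W': (L, N) receptive fields, 'pos': current
    position in the chain, or None while inactive}; x: current external
    input. Returns the predicted next state, or None if no model matches.
    """
    best_pred, best_err = None, np.inf
    for m in models:
        W, pos = m['W'], m['pos']
        if pos is None:
            # Initiation: an inactive model starts at its best-matching cell.
            errs = np.linalg.norm(W - x, axis=1)
            m['pos'] = int(errs.argmin()) if errs.min() < threshold else None
            continue
        match = np.linalg.norm(W[pos] - x)
        if match > threshold:
            m['pos'] = None            # termination: the model lost track
            continue
        m['pos'] = (pos + 1) % len(W)  # internal dynamics: step along the chain
        if match < best_err:
            # Harmonization: the best-matching active model predicts next state.
            best_pred, best_err = W[m['pos']], match
    return best_pred
```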

Page 23:

Model-based learning: Outlook