
University of Hamburg

MIN Faculty

Department of Informatics

Introduction AL 64-360

Outlook

- Networks with feedback connections
- Associative memory
- Hopfield model
- Self-organizing maps
- Dimensionality reduction
- Principal Component Analysis
- Independent Component Analysis

Hendrich 1


Remember: Eight major aspects of a PDP model

- a set of processing units
- the current state of activation
- an output function for each unit
- a pattern of connectivity between units
- a propagation rule for propagating patterns of activities through the network of connectivities
- an activation rule for combining the inputs impinging on a unit
- a learning rule whereby patterns of connectivity are modified by experience
- the environment within which the system must operate

(Rumelhart, McClelland, et al. 1986)


Remember: Multi-layer perceptrons

Multi-layer perceptrons:

- map an input space into an output space
- using only feed-forward computations
- continuous neuron activation functions (e.g., sigmoid)
- backpropagation learning algorithm
- therefore, a neighborhood of input vector x is mapped to a neighborhood of output vector y (the image of x)
- continuous mapping networks
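The feed-forward mapping described above can be sketched in a few lines (a minimal sketch; the 2-3-1 network shape and the random weights are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    # continuous activation function, maps R into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, weights, biases):
    # feed-forward only: propagate the input layer by layer
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# hypothetical 2-3-1 network with random weights
rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 2)), rng.normal(size=(1, 3))]
biases = [np.zeros(3), np.zeros(1)]
y = mlp_forward(np.array([0.5, -0.2]), weights, biases)
```

Because every layer is a continuous function, a small perturbation of the input moves the output y only slightly, which is the continuity property noted above.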


Associative Memory AL 64-360

Associative memories

- goal of learning is to associate known input vectors (patterns) with given output vectors
- input vectors x′ near to x should be mapped to y
- e.g., noise reduction: noisy input vectors are mapped to the original patterns
- similar to clustering algorithms
- implemented with networks with or without feedback
- but feedback networks produce better results


Variants of associative networks


BAM: network

- hetero-associative: x ↦ y ↦ x′ ↦ y′ ↦ x″ ↦ y″ …

(Kosko 1988)
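A minimal sketch of this bidirectional recall, assuming the usual outer-product (Kosko-style) coupling matrix; the stored pattern pair is made up:

```python
import numpy as np

def sgn(v):
    # bipolar threshold (ties resolved to +1)
    return np.where(v >= 0, 1, -1)

# store one bipolar pattern pair (x, y) with the outer-product rule
x = np.array([1, -1, 1, -1])
y = np.array([1, 1, -1])
W = np.outer(x, y)               # n x m couplings between the two layers

# hetero-associative recall: x -> y -> x' -> y' -> ... until stable
x_t = np.array([1, -1, 1, 1])    # noisy version of x (one bit flipped)
for _ in range(5):
    y_t = sgn(W.T @ x_t)         # forward pass
    x_t = sgn(W @ y_t)           # backward pass
```

Here the noisy input falls back onto the stored pair (x, y) after the first forward/backward sweep.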


Auto-associative memory

- complete the whole memory from a small fragment
- complete the whole memory from noisy input
- considered a main function of the human cortex


Auto-associative memory


- popular example: word-completion task
- first and last letters are more important than middle letters
- etc.


Pure feedback network: architecture

- network initialized with external input, S(t = 0) = I
- next states calculated via neuron activation
- output read out once stable (attractor states)
- or output read out after a fixed number of iterations
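The iteration scheme above can be written as one loop (a sketch; the bipolar threshold activation and the iteration limit are illustrative choices):

```python
import numpy as np

def run_feedback(W, s0, max_iter=100):
    # iterate s(t+1) = act(W s(t)); stop at a fixed point (attractor)
    # or after a fixed number of iterations
    act = lambda v: np.where(v >= 0, 1, -1)   # bipolar threshold
    s = s0.copy()
    for _ in range(max_iter):
        s_next = act(W @ s)
        if np.array_equal(s_next, s):         # stable attractor state reached
            break
        s = s_next
    return s

# two mutually inhibiting neurons: [1, -1] is a stable state
s = run_feedback(np.array([[0, -1], [-1, 0]]), np.array([1, -1]))
```

Note that with asymmetric couplings (or synchronous updates) the loop may never reach a fixed point; the fallback is then the fixed-iteration readout.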


Feedback network: iterations

- example: pattern recognition, pattern completion
- network state updated several times
- iterative improvement of the current network state
- more economical than a feed-forward network with many layers


Feedback networks: unfolding

network with feedback connections:

- equivalent to an n-layer network with the same forward connections
- but only for a fixed number of feedback iterations


Jordan network, Elman network

- feed-forward architecture with some feedback connections
- internal state (context)
- complex behaviour; see SNNS/JavaNNS for an implementation


Brain modeling: mostly associative?

(Amit 1986)


Basins of attraction AL 64-360

Network state evolution

- network initialized in a (random) input state
- will the network stabilize at all?
- will the network converge to the nearest attractor?
- how fast will the network converge?
- basins of attraction


Energy landscape model

(figure: energy landscape E(x) over states x, with minima M1, M2, M3, saddle points Q1, Q2, and the basin of attraction of M1)


Basins of attraction: single iteration

- example network, n = 10, p = 1…4 stored patterns


Basins of attraction: five iterations


Basins of attraction: multiple iterations


Basins of attraction: few patterns (n = 10)

- all patterns stable
- recognition rate vs. Hamming distance of input patterns


Basins of attraction: saturated network (n = 10)

- all patterns still stable
- but the basins of attraction are smaller


Basins of attraction (n = 100)


BAM network AL 64-360

BAM: network

- hetero-associative: x ↦ y ↦ x′ ↦ y′ ↦ x″ ↦ y″ …


BAM: energy function
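The formula on this slide is not reproduced in this text version; for reference, the standard BAM energy function (as given by Kosko) is

```latex
E(x, y) = -x^{\top} W y = -\sum_{i}\sum_{j} w_{ij}\, x_i\, y_j
```

Each forward or backward update step can only decrease this energy or leave it unchanged, which is why the bidirectional recall iteration settles into a stable pair.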


BAM: energy function (with neuron thresholds θ)
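Again, the slide's formula is only in the figure; with neuron threshold vectors θx and θy, the standard form extends to

```latex
E(x, y) = -x^{\top} W y + \theta_x^{\top} x + \theta_y^{\top} y
```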


BAM: stable states


Digression: chaos AL 64-360

Digression: dynamic systems and chaos

- dynamical systems: differential equations, ẋ = Φ(x)
- dynamical maps: discrete time, x(t + 1) = Φ(x(t))
- study the time evolution of the systems
- but non-linear systems typically show chaotic behaviour


Example: logistic map

- logistic map: x_{n+1} = r x_n (1 − x_n), with x ∈ [0, 1]
- models population growth, with growth parameter r
- 0 < r < 1: population eventually dies
- 1 < r < 2: population stabilizes at (r − 1)/r
- 2 < r < 3: population oscillates, finally reaches (r − 1)/r
- 3 < r < 1 + √6: population oscillates between two values
- 3.45 < r < 3.54: oscillations between four values
- 3.57 < r < 4: chaos
- 4 < r: population diverges
- typical behaviour for non-linear systems with feedback

(Wikipedia: Logistic map)
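The regimes listed above are easy to check numerically (a small sketch; the starting point and iteration counts are arbitrary choices):

```python
def logistic_orbit(r, x0=0.2, transient=500, keep=8):
    # iterate x_{n+1} = r * x_n * (1 - x_n), discard the transient,
    # and return the next few values of the orbit
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 4))
    return orbit

fixed = logistic_orbit(1.5)   # settles at (r - 1)/r = 1/3
cycle = logistic_orbit(3.2)   # period-2 oscillation between two values
```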


Logistic map: bifurcation diagram


Chaotic brain?!

- chaos is typical even in the 1-dimensional case
- gets a lot worse in high-dimensional systems
- neurons are highly non-linear
- the brain has about 10^11 neurons
- with lots of feedback connections
- how to explain stable behaviour at all?
- how to explain memory?


Asynchronous dynamics: energy decreases


Energy: equivalent forms


Energy: limit cycles with asymmetric couplings


Energy: non-zero diagonal elements


Energy: asymmetric couplings

- network oscillates
- note: we can try to use asymmetric couplings to store state sequences in the network
- note: possible, but doesn't work very well


The Hopfield model AL 64-360

Hopfield model

- fully-connected network
- binary neurons, s_i ∈ {−1, +1}
- network initialized with external input state s(0)
- asynchronous network update
- s_i(t + 1) = sgn(Σ_j w_ij s_j(t))
- symmetric couplings, zero diagonal (w_ii = 0)
- allows to define an energy function
- stable attractors
- use Hebb learning to store a set of patterns x^μ

(Hopfield 1982)
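Putting the update rule and Hebb learning together (a minimal sketch; the two stored patterns are made up, and the neurons are swept in fixed order so the run is deterministic):

```python
import numpy as np

def hebb_store(patterns):
    # Hebb rule: w_ij = (1/n) * sum_mu x_i^mu * x_j^mu, zero diagonal
    n = patterns.shape[1]
    W = (patterns.T @ patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, s, sweeps=10):
    # asynchronous update: s_i <- sgn(sum_j w_ij s_j), one neuron at a time
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = hebb_store(patterns)
noisy = patterns[0].copy()
noisy[0] = -noisy[0]                   # flip one bit
recalled = hopfield_recall(W, noisy)   # falls back into the stored attractor
```

The symmetric, zero-diagonal couplings are exactly what makes the energy argument work: each single-neuron update can only lower the energy, so the dynamics must end in a stable state.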


Feedback network: architecture

- network initialized via a separate input layer
- network dynamics via feedback connections


Hopfield model: symmetric couplings


Hopfield model: energy function
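The energy function itself is only in the slide's figure; the standard form for symmetric couplings with zero diagonal (and neuron thresholds θ_i) is

```latex
E(s) = -\frac{1}{2}\sum_{i \ne j} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i
```

Under asynchronous updates, flipping a single neuron according to its local field can only lower E or leave it unchanged, so the network state descends into a stable attractor.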


Hopfield model: flipflop network


Hopfield model: flipflop network energy function

- one of two stable states (lowest energy)
- assumes "smooth" activation functions (more about this later)


Hopfield model: OR function

- computations with the Hopfield network?
- example: OR function


Hopfield model: XOR function tables


Hopfield model: XOR network diagram


Hopfield model: Ising spin model

- the neuron model is equivalent to the Ising model of magnetism
- magnetic field h_i = Σ_{j=1}^{n} w_ij x_j + h*
- can reuse techniques from theoretical physics for network analysis
- possible to calculate the storage capacity of the network


Hopfield model: Hebbian learning


Hopfield model: Minover learning


Hopfield model: letter recognition


Hopfield model: letter recognition

- example input patterns and stable attractors
- append additional index bits to the patterns


Hopfield model limitations AL 64-360

Hopfield model: Perceptron equivalence


Hopfield model: applications?

- the model works well for associative memory
- other applications?
- in particular, fast parallel computation of difficult problems?
- interest triggered by a paper on the traveling-salesman problem

(Hopfield 1984)


Hopfield model: eight rooks problem

- put n rooks on an n×n chessboard
- so that they cannot take each other
- solve with a Hopfield network of n×n neurons
- consider the neurons sorted into rows and columns
- an active neuron suppresses the other neurons in the same row/column
- the network converges to a solution
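A minimal sketch of this row/column inhibition, assuming the coupling w = −2 and threshold −1 quoted for the queens variant; a cell then switches on exactly when its row and column are otherwise empty:

```python
import numpy as np

def rooks_network(n, sweeps=5):
    # n x n neurons; neuron (i, j) fires iff
    # -2 * (active cells in the same row/column) + 1 >= 0,
    # i.e. iff its row and column are otherwise empty
    S = np.zeros((n, n), dtype=int)   # start from the empty state
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                others = S[i].sum() + S[:, j].sum() - 2 * S[i, j]
                S[i, j] = 1 if -2 * others + 1 >= 0 else 0
    return S

S = rooks_network(4)   # converges to one rook per row and per column
```

Any stable state of this dynamics is a full n-rook placement: an empty row would require every column to be blocked, which is impossible with at most one rook per row and column.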


Hopfield model: eight rooks problem


Hopfield model: eight queens problem


Hopfield model: eight queens problem

- w_ij = −2 for neurons on the same row, column, or diagonal
- neuron threshold −1
- problem: this doesn't always work
- the energy function has local minima with fewer than n queens


Hopfield model: traveling salesman problem

- use continuous neuron activation functions (e.g., sigmoid)


Hopfield model: solving the TSP


Hopfield model: storage capacity


Hopfield model: stochastic noise

- noise can improve network convergence
- because the network can escape from local minima
