
Page 1

Slide 1

Lecture 2

ASSOCIATIONS, RULES, AND MACHINES

Victor Eliashberg

Consulting professor, Stanford University, Department of Electrical Engineering

Page 2

[Figure: decomposition into W, D, and B. W is the external world; D are the sensorimotor devices; B is a computing system simulating the work of the human nervous system. The pair (W,D) is the external system; the pair (D,B) is the human-like robot.]

Slide 2

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth.” (Sherlock Holmes)

SCIENTIFIC / ENGINEERING APPROACH

Page 3

ZERO-APPROXIMATION MODEL

Slide 3

[Figure: block diagram of the zero-approximation model; the diagram shows the state transition from s(ν) to s(ν+1).]

Page 4

BIOLOGICAL INTERPRETATION

Slide 4

[Figure: biological interpretation of the model. Labels: working memory, episodic memory, and mental imagery; blocks AS and AM; motor control.]

Page 5

PROBLEM 1: LEARNING TO SIMULATE THE TEACHER. This problem is simple: system AM needs to learn only a manageable number of fixed rules (a toy rule table is sketched after the figure below).

Slide 5

[Figure: system AM learning to simulate the Teacher. AM observes the symbol read and the current state of mind and must produce the next state of mind, the move, and the symbol to type. Other labels in the diagram: NM1, NM.y, X11, X12, Xy, 0y, sel.]
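To make "a manageable number of fixed rules" concrete, here is a minimal Python sketch, not taken from the original slides; the states, symbols, and moves are illustrative and assume the Teacher behaves like a Turing-machine-style rule table.

# Hypothetical example: the Teacher's behavior as a small fixed rule table.
# Keys: (current state of mind, symbol read)
# Values: (next state of mind, move, symbol to type)
TEACHER_RULES = {
    ("q0", "0"): ("q0", "right", "0"),
    ("q0", "1"): ("q1", "right", "0"),
    ("q1", "0"): ("q1", "right", "1"),
    ("q1", "1"): ("q0", "right", "1"),
}

def simulate_teacher(state, symbol):
    """Look up the fixed rule associated with (state, symbol)."""
    return TEACHER_RULES[(state, symbol)]

# AM only has to memorize len(TEACHER_RULES) associations -- a manageable number.
print(simulate_teacher("q0", "1"))   # ('q1', 'right', '0')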

Page 6

PROBLEM 2: LEARNING TO SIMULATE THE EXTERNAL SYSTEM. This problem is hard: the number of fixed rules needed to represent a RAM with n locations grows exponentially with n, because the RAM's response depends on its entire memory contents, so a state-free rule is needed for each of the exponentially many possible memory states. (A toy illustration follows the note below.)

Slide 6


NOTE. System (W,D) shown in slide 3 has the properties of a random access memory (RAM).
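A rough illustration, not from the slide: the class ExternalRAM below is an invented toy stand-in for the external system (W,D). Its response to a read request depends on everything written earlier, and with n locations holding symbols from an alphabet of size k there are k**n possible memory states, which is why a fixed-rule description explodes.

# Toy RAM-like external system: the response to ("read", i) depends on history,
# so it cannot be captured by a small set of fixed stimulus -> response rules.
class ExternalRAM:
    def __init__(self, n, default="0"):
        self.cells = [default] * n

    def step(self, action, i, symbol=None):
        if action == "write":
            self.cells[i] = symbol
            return None
        return self.cells[i]          # "read"

w = ExternalRAM(n=8)
w.step("write", 3, "A")
print(w.step("read", 3))             # 'A' -- determined by an earlier write

n, k = 8, 2
print(k ** n)                        # 256 distinct memory states even for tiny n and k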

Page 7

Programmable logic array (PLA): a logic implementation of a local associative memory (solves problem 1 from slide 5)

Slide 7
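As a rough illustration of the PLA idea, and not a reconstruction of the slide's circuit: an AND plane matches the input vector against stored product terms, and an OR plane asserts the output bits of every matched term. The bit patterns below are made up for the example.

# Toy PLA: AND plane = stored input terms (None = "don't care"),
# OR plane = output bits asserted by each matched term.
AND_PLANE = [
    (1, 0, None),   # term 0 matches inputs 1, 0, *
    (None, 1, 1),   # term 1 matches inputs *, 1, 1
]
OR_PLANE = [
    (1, 0),         # term 0 drives outputs (y0, y1) = (1, 0)
    (0, 1),         # term 1 drives (0, 1)
]

def pla(inputs):
    """Evaluate the PLA: OR together the outputs of all matched terms."""
    y = [0, 0]
    for term, outs in zip(AND_PLANE, OR_PLANE):
        if all(t is None or t == x for t, x in zip(term, inputs)):
            y = [a | b for a, b in zip(y, outs)]
    return y

print(pla((1, 0, 0)))   # [1, 0]
print(pla((0, 1, 1)))   # [0, 1]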

Page 8

BASIC CONCEPTS FROM THE AREA OF ARTIFICIAL NEURAL NETWORKS

Slide 8

Page 9

Typical neuron

A neuron is a highly specialized cell. There are several types of neurons, with different shapes and different types of membrane proteins. A biological neuron is a complex functional unit. However, it is helpful to start with a simple artificial neuron (next slide).

Slide 9

Page 10

A neuron as a first-order linear threshold element:

Inputs: x_k ∈ R'; output: y ∈ R'; parameters (synaptic gains): g_1, … , g_m ∈ R', where R' is the set of non-negative real numbers.

Equations:

τ du/dt + u = Σ_{k=1..m} g_k x_k        (1)

y = L(u)        (2)

L(u) = u if u > 0, and 0 otherwise        (3)

[Figure: two equivalent drawings of the neuron (the second is a more convenient notation): inputs x_1 … x_k … x_m with gains g_1 … g_k … g_m, total postsynaptic current s, postsynaptic potential u, and output y.]

Notation:

x_k is the k-th component of the input vector
g_k is the gain (weight) of the k-th synapse
s = Σ_{k=1..m} g_k x_k is the total postsynaptic current
u is the postsynaptic potential
y is the neuron output
τ is the time constant of the neuron

Slide 10
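A minimal numerical sketch of equations (1)–(3) from slide 10, not from the original material; it uses simple Euler integration, and the time step, gains, and input values are arbitrary illustrative choices.

# First-order linear threshold neuron:
#   tau * du/dt + u = sum_k g[k] * x[k],   y = L(u) = max(u, 0)
def simulate_neuron(x_sequence, g, tau=10.0, dt=1.0, u0=0.0):
    u = u0
    outputs = []
    for x in x_sequence:                          # x is the input vector at this step
        s = sum(gk * xk for gk, xk in zip(g, x))  # total postsynaptic current
        u += (dt / tau) * (s - u)                 # Euler step of tau*du/dt = s - u
        outputs.append(max(u, 0.0))               # y = L(u)
    return outputs

# Example: three synapses, constant input held for 50 steps.
g = [0.5, 1.0, 0.2]
x_seq = [(1.0, 0.5, 0.0)] * 50
y = simulate_neuron(x_seq, g)
print(round(y[-1], 3))   # approaches the steady state s = 1.0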

Page 11

Input synaptic matrix, input long-term memory (ILTM) and DECODING

s_i = Σ_{k=1..m} gx_ik x_k ,   i = 1, … , n        (1)

An abstract representation of (1):

f_dec : X × Gx → S        (2)

[Figure: DECODING (computing similarity): the input signals x_1 … x_k … x_m pass through the input synaptic matrix (ILTM) with gains gx_ik and produce the similarity signals s_1 … s_i … s_n.]

Notation:

x = (x_1, … , x_m) are the signals from input neurons (not shown)
gx = (gx_ik), i = 1, … , n, k = 1, … , m, is the matrix of synaptic gains; we postulate that this matrix represents input long-term memory (ILTM)
s = (s_1, … , s_n) is the similarity function

Slide 11
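A minimal sketch of the decoding step (1) from slide 11; the ILTM matrix gx and the input x below are illustrative values, not data from the lecture.

# DECODING: s_i = sum_k gx[i][k] * x[k] -- similarity of input x to each stored row.
def decode(gx, x):
    return [sum(g_ik * x_k for g_ik, x_k in zip(row, x)) for row in gx]

gx = [             # ILTM: each row is one stored input pattern (n=3, m=4)
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
]
x = [1, 0, 0, 1]
print(decode(gx, x))   # [2, 0, 1] -- row 0 is the most similar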

Page 12

Layer with inhibitory connections as the mechanism of the winner-take-all (WTA) choice

Slide 12

Note. Small white and black circles represent excitatory and inhibitory synapses, respectively.

[Figure: a layer of n neurons with inputs s_1 … s_i … s_n, outputs d_1 … d_i … d_n, postsynaptic potentials u_1 … u_i … u_n, time constants τ, excitatory gains α, inhibitory gains β, and a common inhibitory signal x_inh with gain q. Equations (1)–(3) describing the layer dynamics appear only as an image in the original slide.]

Procedural representation: RANDOM CHOICE

iwin : { i | s_i = max_j s_j > 0 }        (4)

if (i == iwin) d_i = 1; else d_i = 0;        (5)

where ":" denotes a random, equally probable choice from the set on the right-hand side.
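A minimal procedural sketch of (4)–(5), not from the original material: the winner is drawn uniformly at random among the indices attaining the positive maximum, standing in for the ":" choice; the similarity values in the example are illustrative.

import random

# RANDOM CHOICE: iwin is drawn equally probably from { i | s_i = max_j s_j > 0 };
# d_i = 1 for the winner and 0 elsewhere (no winner if all s_i <= 0).
def wta_choice(s):
    s_max = max(s)
    if s_max <= 0:
        return [0] * len(s)            # no neuron wins
    winners = [i for i, si in enumerate(s) if si == s_max]
    iwin = random.choice(winners)      # random, equally probable choice
    return [1 if i == iwin else 0 for i in range(len(s))]

print(wta_choice([2, 0, 2]))   # e.g. [1, 0, 0] or [0, 0, 1], chosen at random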

Page 13

Output synaptic matrix, output long-term memory (OLTM) and ENCODING

y_k = Σ_{i=1..n} gy_ki d_i ,   k = 1, … , p        (1)

An abstract representation of (1):

f_enc : D × Gy → Y        (2)

[Figure: ENCODING (data retrieval): the signals d_1 … d_i … d_n from the WTA layer pass through the output synaptic matrix (OLTM) with gains gy_ki and produce the output signals y_1 … y_k … y_p.]

NOTATION:

d = (d_1, … , d_n) are the signals from the WTA layer (see previous slide)
gy = (gy_ki), k = 1, … , p, i = 1, … , n, is the matrix of synaptic gains; we postulate that this matrix represents output long-term memory (OLTM)
y = (y_1, … , y_p) is the output vector

Slide 13
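A minimal sketch of the encoding step (1) from slide 13: with a one-hot d from the WTA layer, y is simply the OLTM column associated with the winner. The matrix gy and the vector d below are illustrative values.

# ENCODING: y_k = sum_i gy[k][i] * d[i] -- retrieve the output stored for the winner.
def encode(gy, d):
    return [sum(g_ki * d_i for g_ki, d_i in zip(row, d)) for row in gy]

gy = [             # OLTM: p=2 output lines, n=3 stored items (columns)
    [5, 0, 7],
    [1, 3, 0],
]
d = [1, 0, 0]      # one-hot signal from the WTA layer: item 0 won
print(encode(gy, d))   # [5, 1] -- the output pattern stored in column 0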

Page 14

A neural implementation of a local associative memory (solves problem 1 from slide 5) (WTA.EXE)

Slide 14

[Figure: block diagram of the network: DECODING (addressing by content) via the input long-term memory (ILTM), RANDOM CHOICE, and ENCODING (retrieval) via the output long-term memory (OLTM). Labels S21(i,j) and N1(j) mark the synapses and neurons of the network.]

Page 15

A functional model of the previous network [7],[8],[11]

(WTA.EXE)

[Equations (1)–(5) of the functional model appear only as images in the original slide.]

Slide 15

Page 16

Slide 17

Representation of local associative memory in terms of three “one-step” procedures: DECODING, CHOICE, ENCODING
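Putting the three one-step procedures together, here is a compact, self-contained sketch of one read cycle of the local associative memory. It is not code from references [7], [8], or [11]; the memory contents and the function name read_cycle are illustrative.

import random

def read_cycle(gx, gy, x):
    """One read cycle: DECODING -> (random) CHOICE -> ENCODING."""
    s = [sum(g * xi for g, xi in zip(row, x)) for row in gx]              # DECODING
    s_max = max(s)
    if s_max <= 0:
        return [0] * len(gy)                                              # nothing recalled
    iwin = random.choice([i for i, si in enumerate(s) if si == s_max])    # CHOICE
    return [row[iwin] for row in gy]                                      # ENCODING

# Illustrative memory storing two associations x -> y:
gx = [[1, 0, 1], [0, 1, 0]]        # ILTM rows = stored input patterns
gy = [[9, 4], [2, 8]]              # OLTM columns = stored output patterns
print(read_cycle(gx, gy, [1, 0, 1]))   # [9, 2] -- the output associated with row 0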

Page 17

Slide 18

HOW CAN WE SOLVE THE HARD PROBLEM 2 from slide 6?