
Hyperdimensional Cognitive Computing: A New Approach to

Some Very Old Problems

Simon D. Levy
Washington & Lee University

Lexington, Virginia, USA

Tuesday, April 14, 2009

Inspiration(s)

[I]t turns out that we don’t think the way we think we think! ... The scientific evidence coming in all around us is clear: Symbolic conscious reasoning, which is extracted through protocol analysis from serial verbal introspection, is a myth.

− J. Pollack (2005)

[W]hat kinds of things suggested by the architecture of the brain, if we modeled them mathematically, could give some properties that we associate with mind?

− P. Kanerva (to appear)


What is Mind?


The Need for New Representational Principles

• Ecological affordances (Gibson 1979); exploiting the environment (Clark 1998)

• Distributed/Connectionist Representations (PDP 1986)

• Holographic Representations (Gabor 1971; Plate 2003)

• Fractals / Attractors / Dynamical Systems (Tabor 2000; Levy & Pollack 2001)


Pitfalls to Avoid

1. The “Short Circuit” (Localist Connectionist) Approach

i) Traditional models of phenomenon X (language) use entities A, B, C, ... (Noun Phrase, Phoneme, ...)

ii) We wish to model X in a more biologically realistic way.

iii) Therefore our model of X will have a neuron (pool) for A, one for B, one for C, etc.


a.k.a. The Reese’s Peanut Butter Cup Model


E.g. Neural Blackboard Model (van der Velde & de Kamps 2006)


Benefits of Localism (Page 2000)

• Transparent (one node, one concept)

• Supports lateral inhibition / winner-take-all dynamics


Lateral Inhibition (WTA)

[Figure: units A, B, C in layers L1 and L2, with mutually inhibitory lateral connections]


Problems with Localism

• Philosophical problem: “fresh coat of paint on old rotting theories” (MacLennan 1991): what new insights does “neuro-X” provide?

• Engineering problem: recruiting new hardware for each new concept or combination leads to a combinatorial explosion (Stewart & Eliasmith 2008)


The Appeal of Distributed Representations (Rumelhart & McClelland 1986)


WALK → WALKED

ROAR → ROARED

SPEAK → SPOKE

GO → WENT

ignores(mary, john)

Mary won’t give John the time of day.


Challenges (Jackendoff 2002)


I. The Binding Problem


II. The Problem of Two


III. The Problem of Variables

ignores(X, Y)

X won’t give Y the time of day.


IV. Binding in Working Memory vs. Long-Term Memory

Long-term memory (idiosyncratic): ignores(X, Y) ↔ X won’t give Y the time of day.

Working memory (“rule-based”): transitive-action(X, Y, Z) ↔ X Verb Y Z

Instance: Mary gave John an apple.


Vector Symbolic Architectures

(Plate 1991; Kanerva 1994; Gayler 1998)


Tensor Product Binding (Smolensky 1990)


Binding


Bundling



Unbinding (query)


Lossy


Cleanup: Hebbian / Hopfield / Attractor Net
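The three core operations and the cleanup step can be sketched with random ±1 hypervectors, as in the binary architectures cited on the following slides. This is a minimal numpy sketch, not from the talk itself; the names role, filler, and other are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
DIM = 10_000

def rand_vec():
    # random hypervector over {+1, -1}
    return rng.choice([-1, 1], size=DIM)

role, filler, other = rand_vec(), rand_vec(), rand_vec()

trace = role * filler + other   # binding (elementwise *), then bundling (+)
noisy = role * trace            # unbinding: equals filler + role*other, i.e. lossy

# cleanup memory: pick the stored item most similar to the noisy result
memory = {'filler': filler, 'other': other, 'role': role}
best = max(memory, key=lambda name: memory[name] @ noisy)
```

Because the unbound result is only an approximation of the stored filler, the dot-product cleanup step is what maps it back to an exact, known vector.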


Reduction (Holographic; Plate 2003)


Reduction (Binary; Kanerva 1994, Gayler 1998)


Composition / Recursion
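Composition is what makes the scheme recursive: a reduced proposition vector can itself fill a role in another proposition. A minimal numpy sketch under the same ±1 conventions (the role and filler names here are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(4)
DIM = 10_000

def rand_vec():
    return rng.choice([-1, 1], size=DIM)

ACTION, AGENT, PATIENT = rand_vec(), rand_vec(), rand_vec()   # roles
KISS, MARY, JOHN, SEE, BILL = (rand_vec() for _ in range(5))  # fillers

# kiss(mary, john) reduced to a single vector
kiss_mj = np.sign(ACTION*KISS + AGENT*MARY + PATIENT*JOHN)

# the whole proposition fills the PATIENT role of another proposition:
# see(bill, kiss(mary, john))
see_b = ACTION*SEE + AGENT*BILL + PATIENT*kiss_mj

# query: who is the agent of the embedded proposition?
embedded = PATIENT * see_b   # ~ kiss_mj + noise
answer = AGENT * embedded    # ~ MARY + noise
vocab = {'KISS': KISS, 'MARY': MARY, 'JOHN': JOHN, 'SEE': SEE, 'BILL': BILL}
best = max(vocab, key=lambda name: vocab[name] @ answer)
```

The embedded structure survives two levels of unbinding with enough signal for cleanup, even though every intermediate result is noisy.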


Variables

[Figure: the variable X bound to the value john]


Scaling Up

• With many (> 10K) dimensions, we get:

• An astronomically large number of mutually orthogonal vectors (symbols)

• Surprising robustness to noise
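Both properties are easy to check numerically. A minimal numpy sketch at dimension 10,000: two random hypervectors are nearly orthogonal, while a heavily corrupted copy of a vector remains easy to recognize.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 10_000

a = rng.choice([-1, 1], size=DIM)
b = rng.choice([-1, 1], size=DIM)

# cosine similarity of two random hypervectors has mean 0,
# standard deviation 1/sqrt(DIM) = 0.01: nearly orthogonal
cos_ab = (a @ b) / DIM

# robustness: flip 30% of a's components; the copy is still recognizable
noisy = a.copy()
flip = rng.choice(DIM, size=3000, replace=False)
noisy[flip] *= -1
cos_noisy = (a @ noisy) / DIM   # 7000 matches minus 3000 mismatches: 0.4
```

A similarity of 0.4 against a chance level of about ±0.01 is an enormous margin, which is why so many distinct symbols can coexist in the same space.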


Pitfalls to Avoid

2. The Homunculus Problem, a.k.a. the Ghost in the Machine (Ryle 1949)

In cognitive modeling, the homunculus is the researcher: supervises learning, hand-builds representations, etc.


Banishing the Homunculus


Step I: Automatic Variable Substitution

• If A is a vector over {+1,-1}, then A*A = vector of 1’s (multiplicative identity)

• Supports substitution of anything for anything: everything (names, individuals, structures, propositions) can be a variable!
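The self-inverse property is a one-line check (minimal numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 10_000

A = rng.choice([-1, 1], size=DIM)
B = rng.choice([-1, 1], size=DIM)

identity = A * A         # every component is (+/-1)^2 = +1
recovered = A * (A * B)  # binding with A twice recovers B exactly
```

Since any vector, including the representation of a whole structure, is its own multiplicative inverse, substitution never needs to know whether it is substituting for a "variable" or a "value".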


“What is the Dollar of Mexico?” (Kanerva, to appear)

• Let X = <country>, Y = <currency>, A = <USA>, B = <Mexico>

• Then A = X*U + Y*D and B = X*M + Y*P, so

D*A*B = D*(X*U + Y*D) * (X*M + Y*P)

= (D*X*U + D*Y*D) * (X*M + Y*P)

= (D*X*U + Y) * (X*M + Y*P)

= D*X*U*X*M + D*X*U*Y*P + Y*X*M + Y*Y*P

= P + noise
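The derivation above runs directly in numpy; a minimal sketch with U, D, M, P standing for USA, dollar, Mexico, and peso as on the slide, and a brute-force cleanup over the four fillers:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 10_000

def rand_vec():
    return rng.choice([-1, 1], size=DIM)

X, Y = rand_vec(), rand_vec()                # roles: country, currency
U, D, M, P = (rand_vec() for _ in range(4))  # USA, dollar, Mexico, peso

A = X*U + Y*D        # the USA record
B = X*M + Y*P        # the Mexico record

answer = D * A * B   # "what is the dollar of Mexico?"

# cleanup: the answer is P plus three noise terms
vocab = {'USA': U, 'dollar': D, 'Mexico': M, 'peso': P}
best = max(vocab, key=lambda name: vocab[name] @ answer)
```

The three noise terms each contribute a dot product of order sqrt(DIM), while the P term contributes DIM, so cleanup retrieves the peso with a wide margin.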


Learning Grammatical Constructions from a Single Example (Levy, to appear)

• Given

• Meaning: KISS(MARY, JOHN)

• Form: Mary kissed John

• Lexicon: KISS/kiss, MARY/Mary, ...

• What is the form for HIT(BILL, FRED) ?



(ACTION*KISS + AGENT*MARY + PATIENT*JOHN) *

(P1*Mary + P2*kissed + P3*John) *

(KISS*kissed + MARY*Mary + JOHN*John + BILL*Bill + FRED*Fred + HIT*hit) *

(ACTION*HIT + AGENT*BILL + PATIENT*FRED)

= ...

= (P1*Bill + P2*hit + P3*Fred) + noise


Step II: Distributed “Lateral Inhibition”

• Analogical mapping as holistic graph isomorphism (Gayler & Levy, in progress)

cf. Pelillo (1999)

[Figure: two graphs to be mapped, with vertices A, B, C, D and P, Q, R, S]


Possibilities x: A*P + A*Q + A*R + A*S + ... + D*S

Evidence w: A*B*P*Q + A*B*P*R + ... + B*C*Q*R + ... + C*D*R*S

x*w = A*Q + B*R + ... + A*P + ... + D*S


[Figure: iterative circuit in which the state x_t is multiplied by the evidence vector w, passed through cleanup and normalization (∑), and fed back as x_{t+1}]


Future Work: Automatic Decomposition

MSC (Arathorn 2002)
