
Page 1

High Performance Associative Neural Networks:

Overview and Library
Presented at AI'06, Quebec City, Canada, June 7-9, 2006

Oleksiy K. Dekhtyarenko1 and Dmitry O. Gorodnichy2

1 - Institute of Mathematical Machines and Systems, Dept. of Neurotechnologies, 42 Glushkov Ave., Kiev, 03187, Ukraine. [email protected]

2 - Institute for Information Technology, National Research Council of Canada, M-50 Montreal Rd, Ottawa, Ontario, K1A 0R6, Canada. [email protected]

http://synapse.vit.iit.nrc.ca

Page 2

Associative Neural Network Model

Features:
• Distributed storage of information → fault tolerance
• Parallel way of operation → efficient hardware implementation
• Non-iterative learning rules → fast, deterministic training

Conforms to three main principles of neural processing:
1. Non-linear processing
2. Massively distributed collective decision making
3. Synaptic plasticity

These principles allow the network:
1. to accumulate learning data over time by adjusting synapses;
2. to associate receptor to effector (using the thus-computed synaptic values).

The Associative Neural Network (AsNN) is a dynamical nonlinear system capable of processing information via the evolution of its state in a high-dimensional state space.

Page 3

Examples of Practical Applications

• Face recognition from video*

• “Electronic Nose”**

* D. Gorodnichy, "Associative Neural Networks as Means for Low-Resolution Video-Based Recognition", IJCNN'05
** A. Reznik, Y. Shirshov, B. Snopok, D. Nowicki, O. Dekhtyarenko & I. Kruglenko, "Associative Memories for Chemical Sensing", ICONIP'02

Page 4

Associative Properties
Convergence Process

The network evolves according to the state update rule:

X^{t+1} = sign(S^t) = sign(W X^t),   X^t ∈ {−1, +1}^n

V = {V^1, V^2, …, V^m} – set of memorized patterns

We want the network to retrieve data by associative similarity (to restore noisy or incomplete input data):

∀ i = 1, …, m:  Ṽ^i → V^i (convergence) for every Ṽ^i within the Attraction Radius of V^i
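The state update rule can be illustrated with a minimal C++ sketch (this is not the paper's library; the function names and the convention sign(0) = +1 are our assumptions):

```cpp
#include <cstddef>
#include <vector>

// Bipolar state vector X in {-1,+1}^n; W is a full n x n weight matrix.
// One synchronous step of the update rule X(t+1) = sign(W * X(t)).
std::vector<int> updateStep(const std::vector<std::vector<double>>& W,
                            const std::vector<int>& X) {
    const std::size_t n = X.size();
    std::vector<int> next(n);
    for (std::size_t i = 0; i < n; ++i) {
        double s = 0.0;                      // postsynaptic potential S_i
        for (std::size_t j = 0; j < n; ++j)
            s += W[i][j] * X[j];
        next[i] = (s >= 0.0) ? 1 : -1;       // sign non-linearity
    }
    return next;
}

// Iterate until a fixed point, with an iteration cap as a crude guard
// against cycles of the synchronous dynamics.
std::vector<int> converge(const std::vector<std::vector<double>>& W,
                          std::vector<int> X, int maxIter = 100) {
    for (int t = 0; t < maxIter; ++t) {
        std::vector<int> next = updateStep(W, X);
        if (next == X) break;                // reached an attractor
        X = next;
    }
    return X;
}
```

With a pattern stored, e.g., Hebbian-style (W_ij = V_i V_j / n), a state inside the attraction radius of V converges back to V.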

Page 5

Sparse Associative Neural Network

Output of neuron i can affect neuron j (w_ij ≠ 0) if and only if i ∈ N_j.

Architecture, or Connectivity Template: N = {N_i}, i = 1, …, n

Connection Density: D = (Σ_{i=1}^n |N_i|) / n²

Advantages over the Fully-Connected Model:
• Less memory needed for s/w simulation
• Quicker convergence during s/w simulation
• Fewer and/or more suitable connections for h/w implementation
• Greater biological plausibility
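A connectivity template and its density can be sketched in C++; here for a 1D cellular (ring) architecture with a given connection radius (illustrative names, not the library's API):

```cpp
#include <cstddef>
#include <vector>

// Connectivity template N_i for a 1D cellular architecture: neuron i is
// connected to all neighbours within `radius` positions on a ring of n
// neurons (no self-connections).
std::vector<std::vector<int>> cellular1D(int n, int radius) {
    std::vector<std::vector<int>> templ(n);
    for (int i = 0; i < n; ++i)
        for (int d = -radius; d <= radius; ++d)
            if (d != 0)
                templ[i].push_back(((i + d) % n + n) % n); // wrap around
    return templ;
}

// Connection density: total number of connections divided by n^2.
double connectionDensity(const std::vector<std::vector<int>>& templ) {
    std::size_t total = 0;
    for (const auto& Ni : templ) total += Ni.size();
    const std::size_t n = templ.size();
    return static_cast<double>(total) / (n * n);
}
```

For n = 256 and radius 12 (the setup used later in the slides), each neuron has 24 connections, giving a density of 24/256 ≈ 0.094.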

Page 6

Network Architectures

[Figure: examples of a Random, a 1D Cellular, and a Small-World architecture]

Ratings (1 – the worst, 5 – the best):

Architecture         Associative Performance   Memory Consumption   Hardware Friendly
Regular (cellular)   1                         5                    5
Small-World          2                         5                    4
Scale-Free           2                         5                    3
Random               3                         5                    2
Adaptive             4                         5                    2
Fully-Connected      5                         1                    1

Page 7

Compare to …

Fully connected net with n=24x24 neurons obtained by tracking and memorizing faces (of 24x24 pixel resolution) from real-life video sequences [Gorodnichy’05]

• Notice the visible inherent synaptic structure!

• This synaptic interdependency is utilized by sparse architectures.

Page 8

Some Learning Algorithms

• Projective:
  ΔW_ij = (V_i^m − S_i^m)(V_j^m − S_j^m) / ‖V^m − S^m‖²,  where S^m = W V^m

• Hebbian (Perceptron LR):
  ∀ j ∈ N_i:  ΔW_ij = d V_i^m V_j^m

• Delta Rule:
  ∀ j ∈ N_i:  ΔW_ij = d (V_i^m − S_i^m) V_j^m

• Pseudo-Inverse:
  W_i = Ṽ_i Ṽ_i^+,  where Ṽ_i = σ_i(V) and σ_i – selection operator keeping the components l ∈ N_i of an n-dimensional vector
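The projective learning rule listed above adds one pattern at a time with a rank-one update; starting from W = 0, W stays the orthogonal projector onto the span of the stored patterns. A C++ sketch under these assumptions (not the library's implementation; the function name is ours):

```cpp
#include <cstddef>
#include <vector>

// Incremental projective (pseudo-inverse) learning: after presenting
// pattern V, update W += (V - W V)(V - W V)^T / ||V - W V||^2.
void storePattern(std::vector<std::vector<double>>& W,
                  const std::vector<double>& V) {
    const std::size_t n = V.size();
    std::vector<double> r(n);                 // residual r = V - W V
    double norm2 = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        double s = 0.0;
        for (std::size_t j = 0; j < n; ++j) s += W[i][j] * V[j];
        r[i] = V[i] - s;
        norm2 += r[i] * r[i];
    }
    if (norm2 < 1e-12) return;                // V already in the span: no-op
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < n; ++j)
            W[i][j] += r[i] * r[j] / norm2;   // rank-one projector update
}
```

This non-iterative form is what makes adding a selected vector to memory cheap: each new pattern costs one matrix-vector product plus a rank-one update.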

Performance Evaluation Criteria:
• Error correction capability (associativity strength)
• Capacity
• Training complexity
• Memory requirements
• Execution time: a) in learning and b) in recognition

Page 9

Comparative Performance Analysis
Networks with Fixed Architectures

Associative performance and training complexity as a function of the number of stored patterns

Cellular 1D network with dimension 256 and connection radius 12, randomly generated data vectors

Page 10

Comparative Performance Analysis
Influence of Architecture

Sparse network with dimension 200, randomly generated data vectors, various ways of architecture selection

Associative performance as a function of connection density

• PI WS – PseudoInverse Weight Select, architecture targeting maximum informational capacity per synapse

• PI Random – Randomly set sparse architecture with PseudoInverse learning rule

• PI Cell – Cellular architecture with PseudoInverse learning rule

• PI WS Reverse – architecture constructed using the criterion opposite to that of PI WS

Page 11

Associative Neural Network Library

• Publicly available at http://synapse.vit.iit.nrc.ca/memory/pinn/library.html

• Effective C++ implementation of full and sparse associative networks

• Includes the non-iterative Pseudo-Inverse LR, with the possibility of adding/removing selected vectors to/from memory

• Different learning rules: Projective, Hebbian, Delta Rule, Pseudo-Inverse

• Different architectures: fully-connected, cellular (1D and 2D), random, small-world, adaptive

• Desaturation technique: allows increasing memory capacity up to 100%

• Different update rules: synchronous vs. asynchronous; detection of cycles

• Different testing functions: absolute and normalized radius of attraction, capacity

• Associative Classifiers: Convergence-based, Modular
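The "absolute radius of attraction" test in the feature list can be sketched generically: find the largest number k of flipped bits from which every stored pattern is still recovered. This is not the library's API; a toy synchronous update is included to keep the sketch self-contained, and only the first k bits are flipped rather than all k-subsets:

```cpp
#include <cstddef>
#include <vector>

namespace sketch {

using Vec = std::vector<int>;
using Mat = std::vector<std::vector<double>>;

// One synchronous update step X -> sign(W X).
Vec step(const Mat& W, const Vec& X) {
    const std::size_t n = X.size();
    Vec next(n);
    for (std::size_t i = 0; i < n; ++i) {
        double s = 0.0;
        for (std::size_t j = 0; j < n; ++j) s += W[i][j] * X[j];
        next[i] = (s >= 0.0) ? 1 : -1;
    }
    return next;
}

Vec converge(const Mat& W, Vec X, int maxIter = 50) {
    for (int t = 0; t < maxIter; ++t) {
        Vec next = step(W, X);
        if (next == X) break;
        X = next;
    }
    return X;
}

// Absolute attraction radius over a set of stored patterns: largest k such
// that each pattern is recovered after flipping its first k components.
int attractionRadius(const Mat& W, const std::vector<Vec>& patterns) {
    const std::size_t n = patterns.front().size();
    int radius = 0;
    for (std::size_t k = 1; k <= n; ++k) {
        for (const Vec& V : patterns) {
            Vec noisy = V;
            for (std::size_t b = 0; b < k; ++b) noisy[b] = -noisy[b];
            if (converge(W, noisy) != V) return radius;
        }
        radius = static_cast<int>(k);
    }
    return radius;
}

} // namespace sketch
```

A normalized variant would divide the resulting radius by the network dimension n.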

Page 12

Associative Neural Network Library
Hierarchy of Main Classes