Neural Networks


Page 1

Neural Networks

Page 2

Introduction

• Artificial Neural Networks (ANN)
  – Connectionist computation
  – Parallel distributed processing

• Biologically inspired computational models
• Machine learning
• Artificial intelligence

"the study and design of intelligent agents" where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.

Page 3

History

• McCulloch and Pitts proposed a simplified model of the biological neuron in 1943; Rosenblatt's perceptron (1958) built on it.

• The setback in the late 1960s: Minsky and Papert showed the limitations of the single-layer perceptron.

• The revival in the mid 1980s
  – Multi-layer perceptron
  – Back-propagation training

Page 4

Summary of Applications

• Function approximation
• Pattern recognition / classification
• Signal processing
• Modeling
• Control
• Machine learning

Page 5

Biologically Inspired

• Electro-chemical signals
• Threshold output firing

Human brain: about 100 billion (10^11) neurons and 100 trillion (10^14) synapses.

Page 6

The Perceptron

• Sum of weighted inputs
• Threshold activation function

[Figure: a biological neuron (dendrites, axon, terminal branches of the axon) next to the perceptron model: inputs x_1, ..., x_n weighted by w_1, ..., w_n and summed.]

$$u = \sum_{i=1}^{n} w_i x_i, \qquad
y = \begin{cases} 1 & \text{if } u \ge 0 \\ 0 & \text{if } u < 0 \end{cases}$$
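A minimal sketch of this unit in MATLAB; the input and weight values below are made-up examples, not from the slides:

```matlab
% Perceptron: weighted sum of the inputs followed by a hard threshold at 0.
x = [0.5; -1.2; 0.3];    % example inputs x_1..x_3 (hypothetical values)
w = [0.8;  0.4; -0.6];   % example weights w_1..w_3 (hypothetical values)

u = w' * x;              % u = sum_i w_i * x_i
y = double(u >= 0);      % y = 1 if u >= 0, otherwise y = 0
```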

Page 7

Activation Function

• The sigmoid function: logsig (MATLAB)

$$y = \mathrm{logsig}(u) = \frac{1}{1 + e^{-u}}$$

Page 8

Activation Function

• The tanh function: tansig (MATLAB)

$$y = \mathrm{tansig}(u) = \frac{2}{1 + e^{-2u}} - 1$$
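The two activations written out directly; logsig and tansig are the corresponding built-in MATLAB functions named on the slides, and the plot range is just an example:

```matlab
u = linspace(-5, 5, 201);           % example input range

y_log = 1 ./ (1 + exp(-u));         % logistic sigmoid, same curve as logsig(u)
y_tan = 2 ./ (1 + exp(-2*u)) - 1;   % tanh-shaped sigmoid, same curve as tansig(u)

plot(u, y_log, u, y_tan); grid on;  % logsig squashes into (0,1), tansig into (-1,1)
legend('logsig', 'tansig');
```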

Page 9

The multi-layer perceptron (MLP)

[Figure: a three-layer MLP. The input z_in = Y0 passes through weight matrix W1 and activation layer F1 to give Y1, then W2 and F2 give Y2, and W3 and F3 give Y3 = z_out. Each layer also receives a constant bias input of 1.]

Page 10

The multi-layer perceptron (MLP)

With Y_0 = z_in, each layer i = 1, 2, 3 computes

$$X_i = W_i Y_{i-1}, \qquad
Y_i = F_i(X_i) = \begin{pmatrix} f(X_i(1)) \\ \vdots \\ f(X_i(m_i)) \end{pmatrix}$$

and the network output is z_out = Y_3 (each layer also has a bias input of 1, as in the figure).
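A minimal forward-pass sketch of these equations in MATLAB. The layer sizes, the tansig-type hidden activations, the linear output layer, and the bias handling (appending a constant 1 to each layer input) are assumptions for illustration, not taken from the slides:

```matlab
% Hypothetical sizes: 3 inputs -> 5 hidden -> 4 hidden -> 1 output.
% Each W_i has an extra column acting on the constant bias input 1.
W1 = randn(5, 3 + 1);
W2 = randn(4, 5 + 1);
W3 = randn(1, 4 + 1);

f = @(u) 2 ./ (1 + exp(-2*u)) - 1;   % tansig-type activation (assumed)

z_in = randn(3, 1);                  % example input vector
Y0 = z_in;
Y1 = f(W1 * [Y0; 1]);                % X1 = W1*[Y0;1], Y1 = F1(X1)
Y2 = f(W2 * [Y1; 1]);                % X2 = W2*[Y1;1], Y2 = F2(X2)
Y3 =    W3 * [Y2; 1];                % linear output layer (assumed)
z_out = Y3;
```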

Page 11

Supervised Learning

• Learning a function from supervised training data: a set of input vectors Z_in and corresponding desired output vectors Z_out.

$$\hat{Z}_{out} = F(W_1, W_2, \ldots, W_n, Z_{in})$$

$$E(k) = Z_{out}(k) - \hat{Z}_{out}(k), \qquad
V(k) = \tfrac{1}{2} E(k)^T E(k)$$

• The performance function

$$V_N = \tfrac{1}{2} \sum_{k=1}^{N} E(k)^T E(k)$$
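A quick sketch of the error and performance function over a small batch; the target and prediction values are made up:

```matlab
% One column per sample k; 2 outputs, N = 3 samples (hypothetical numbers).
Z_out = [1.0  0.2 -0.5;
         0.0  1.0  0.3];             % desired outputs
Zhat  = [0.8  0.1 -0.4;
         0.1  0.9  0.5];             % network outputs

E  = Z_out - Zhat;                   % E(k) = Z_out(k) - Zhat_out(k)
Vk = 0.5 * sum(E.^2, 1);             % V(k) = 1/2 * E(k)' * E(k), per sample
VN = sum(Vk);                        % V_N = 1/2 * sum_k E(k)' * E(k)
```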

Page 12

Supervised Learning
Gradient descent backpropagation

• The Back-Propagation of Error (BPE) algorithm

$$\hat{Z}_{out} = F(W_1, W_2, \ldots, W_n, Z_{in})$$

$$w(k+1) = w(k) + \Delta w, \qquad
\Delta w = -\mu \, \frac{\partial V(k)}{\partial w}$$

where μ is the learning rate.
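A minimal sketch of this update rule on a single weight, using a made-up quadratic cost just to show the iteration (in the network, V comes from the prediction error):

```matlab
V    = @(w) (w - 2)^2;      % hypothetical cost with its minimum at w = 2
dVdw = @(w) 2 * (w - 2);    % its gradient
mu   = 0.1;                 % learning rate (assumed)

w = 0;                      % initial weight
for k = 1:50
    dw = -mu * dVdw(w);     % delta_w = -mu * dV/dw
    w  = w + dw;            % w(k+1) = w(k) + delta_w
    cost(k) = V(w);         % track how the cost falls
end
```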

Page 13

BPE learning.

Forward pass through a two-layer network (sigmoid hidden layer, linear output):

$$X_1 = W_1 z_{in}, \qquad Y_1 = F_1(X_1), \qquad \hat{Z}_{out} = W_2 Y_1$$

Sigmoid derivative (elementwise):

$$F_1'(X_1) = Y_1 (1 - Y_1)$$

With E(k) = Z_out(k) − Ẑ_out(k) as before, the error is back-propagated and the weights updated:

$$G_2 = E, \qquad W_2(k+1) = W_2(k) + \mu \, G_2 Y_1^T$$

$$G_1 = Y_1 (1 - Y_1) \odot \left( W_2^T G_2 \right), \qquad W_1(k+1) = W_1(k) + \mu \, G_1 z_{in}^T$$

[Figure: the two-layer network. The input z_in passes through W1 and the sigmoid layer F1 (giving X1 and Y1), then through W2 to the output z_out; each layer has a bias input of 1.]
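A compact MATLAB sketch of these BPE updates for one sigmoid hidden layer and a linear output, trained sample by sample. The data set, layer size, learning rate, and the separate bias vectors are assumptions chosen for illustration:

```matlab
% Hypothetical 1-D regression data.
zin  = linspace(-1, 1, 50);            % inputs (1 x N)
zout = sin(pi * zin);                  % targets (1 x N)

nh = 8;                                % hidden neurons (assumed)
W1 = 0.5 * randn(nh, 1);  b1 = zeros(nh, 1);
W2 = 0.5 * randn(1, nh);  b2 = 0;
mu = 0.05;                             % learning rate (assumed)

for epoch = 1:500
    for k = 1:numel(zin)
        % Forward pass
        X1   = W1 * zin(k) + b1;
        Y1   = 1 ./ (1 + exp(-X1));            % sigmoid hidden layer F1
        Zhat = W2 * Y1 + b2;                   % linear output layer

        % Backward pass
        E  = zout(k) - Zhat;                   % output error
        G2 = E;                                % output-layer signal
        G1 = Y1 .* (1 - Y1) .* (W2' * G2);     % back-propagated through F1'

        % Gradient-descent updates: W(k+1) = W(k) + mu * G * (layer input)'
        W2 = W2 + mu * G2 * Y1';     b2 = b2 + mu * G2;
        W1 = W1 + mu * G1 * zin(k);  b1 = b1 + mu * G1;
    end
end
```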

Page 14

Neural Networks

0. Collect data.
1. Create the network.
2. Configure the network.
3. Initialize the weights.
4. Train the network.
5. Validate the network.
6. Use the network.
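These steps map directly onto the MATLAB Neural Network Toolbox workflow; a minimal sketch with made-up data, ten hidden neurons, and trainlm as the training function (all assumptions):

```matlab
% 0. Collect data: inputs x (rows = features, columns = samples), targets t.
x = rand(3, 200);                      % hypothetical 3-input data set
t = sum(x) + 0.1 * randn(1, 200);      % hypothetical 1-output targets

net = feedforwardnet(10, 'trainlm');   % 1. Create the network.
net = configure(net, x, t);            % 2. Configure it to the data dimensions.
net = init(net);                       % 3. Initialize the weights.
[net, tr] = train(net, x, t);          % 4.-5. Train; the toolbox holds out a
                                       %       validation split (see tr).
y = net(x);                            % 6. Use the network.
```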

Page 15

Collect data.

The main problem: lack of information in the training data.

• Use as few neurons in the hidden layer as possible.
• Only use the network at working points represented in the training data.
• Use validation and test data.
• Normalize inputs/targets to fall in the range [-1, 1] or to have zero mean and unit variance.
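In the MATLAB toolbox the two normalization options above correspond to mapminmax and mapstd; a quick sketch with made-up data:

```matlab
x = 50 * rand(3, 200);                 % hypothetical raw inputs with a wide range

[xn, ps] = mapminmax(x);               % scale each row into [-1, 1]
% or: [xn, ps] = mapstd(x);            % zero mean and unit variance per row

% Reuse the stored settings ps on new data before applying the trained network:
xnew   = 50 * rand(3, 10);
xnew_n = mapminmax('apply', xnew, ps);
```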

Page 16

Create the network.
Configure the network.
Initialize the weights.

Number of neurons in the hidden layer

[Figure: validation performance V_N versus the number of neurons in the hidden layer, and a network diagram with a single hidden layer.]

Only one hidden layer.
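One way to act on these recommendations is to train single-hidden-layer networks of increasing size and compare their validation performance; a hedged sketch with made-up data and an arbitrary size range:

```matlab
x = rand(3, 200);  t = sum(x) + 0.1 * randn(1, 200);   % hypothetical data

sizes = 2:2:20;                        % candidate hidden-layer sizes (assumed)
vperf = zeros(size(sizes));
for i = 1:numel(sizes)
    net = feedforwardnet(sizes(i));    % one hidden layer of sizes(i) neurons
    [net, tr] = train(net, x, t);
    vperf(i) = min(tr.vperf);          % best validation performance reached
end
[~, best] = min(vperf);                % index of the size with the lowest validation error
```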

Page 17

Train the network.
Validate the network.

Divide the data into three subsets:
1. Training set (e.g. 70%)
2. Validation set (e.g. 15%)
3. Test set (e.g. 15%)

[Figure: V_N versus the number of training iterations.]

Training functions (MATLAB):
trainlm:  Levenberg-Marquardt
trainbr:  Bayesian Regularization
trainbfg: BFGS Quasi-Newton
trainrp:  Resilient Backpropagation
trainscg: Scaled Conjugate Gradient
traincgb: Conjugate Gradient with Powell/Beale Restarts
traincgf: Fletcher-Reeves Conjugate Gradient
traincgp: Polak-Ribière Conjugate Gradient
trainoss: One Step Secant
traingdx: Variable Learning Rate Gradient Descent
traingdm: Gradient Descent with Momentum
traingd:  Gradient Descent
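How the division ratios and the training function are set on a toolbox network; the data, the 10-neuron layer, and the choice of trainlm are assumptions:

```matlab
x = rand(3, 200);  t = sum(x) + 0.1 * randn(1, 200);   % hypothetical data

net = feedforwardnet(10, 'trainlm');   % any trainFcn from the list above
net.divideFcn = 'dividerand';          % random split into the three subsets
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

[net, tr] = train(net, x, t);          % early stopping uses the validation set
plot(tr.epoch, tr.vperf);              % validation performance vs. iterations
```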

Page 18

Other types of neural networks
The RCE net: only for classification.

[Figure: two classes of training samples (x and o) scattered in the (X1, X2) plane.]

Page 19

Other types of neural networks
The RCE net: only for classification.

[Figure: the same two-class scatter in the (X1, X2) plane, together with the corresponding RCE network diagram.]

Page 20

Parzen Estimator

[Figure: Parzen estimator network. The input X_in feeds a layer of Gaussian units; a weighted sum and a plain sum of their outputs are divided to give Y_out.]

$$D_i^2 = (X_{in} - X_i)^T (X_{in} - X_i)$$

$$Y_{out} = \frac{\sum_{i=1}^{n} Y_i \, e^{-D_i^2 / (2\sigma^2)}}
                 {\sum_{i=1}^{n} e^{-D_i^2 / (2\sigma^2)}}$$

[Figure: training samples (x) plotted in the (X_in, Y_out) plane.]
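A minimal sketch of the estimator in MATLAB; the stored pairs, the query point, and the width sigma are made-up values:

```matlab
% Stored training pairs (X_i, Y_i); 1-D inputs for simplicity.
X = [-1.0 -0.5  0.0  0.5  1.0];        % hypothetical training inputs
Y = [ 0.2  0.8  1.0  0.7  0.1];        % hypothetical training outputs
sigma = 0.3;                           % smoothing width (assumed)

Xin = 0.25;                            % query point
D2  = (Xin - X).^2;                    % D_i^2 = (X_in - X_i)'(X_in - X_i)
w   = exp(-D2 / (2 * sigma^2));        % Gaussian weight of each stored sample
Yout = sum(Y .* w) / sum(w);           % weighted average = Parzen estimate
```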