ANNs (Artificial Neural Networks)


THE PERCEPTRON


Perceptron

• X is a (column) vector of inputs.
• W is a (column) vector of weights.
• θ (or w0) is the bias or threshold weight.



Usage:

1. training/learning
   – local vs. global minima
   – supervised vs. unsupervised

2. feedforward (or testing or usage or application)
   – Indicate class i if output g(X) > 0; not class i otherwise.



$$g(X) = W \cdot X + w_0 = \sum_{i=1}^{d} w_i x_i + w_0$$
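A minimal Python sketch of this discriminant together with the feedforward decision rule from the usage list above; the weight and input values are made up for illustration.

```python
# Sketch of the perceptron discriminant g(X) = W·X + w0 and the
# decision rule "class i if g(X) > 0". Values are illustrative.

def g(W, X, w0):
    """Weighted sum of the inputs plus the bias/threshold weight."""
    return sum(w * x for w, x in zip(W, X)) + w0

W = [0.4, -0.2]   # illustrative weight vector
X = [1.0, 0.5]    # illustrative input vector
w0 = -0.1         # bias / threshold weight

print("class i" if g(W, X, w0) > 0 else "not class i")
```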



The bias or threshold can be represented as simply another weight (w0) with a constant input of 1 (x0=1).

$$g(X) = W \cdot X + w_0 = \sum_{i=1}^{d} w_i x_i + w_0 = \sum_{i=1}^{d} w_i x_i + w_0 x_0 = \sum_{i=0}^{d} w_i x_i, \quad \text{where } x_0 = 1$$



• This is the dot product of two vectors, W and X.

$$g(X) = W \cdot X = \sum_{i=0}^{d} w_i x_i, \quad \text{where } x_0 = 1$$

$$W \cdot X = \|W\|\,\|X\| \cos\theta_{WX} = \sum_i w_i x_i$$
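A quick numeric spot-check of the cosine identity in 2-D, with the angle between the vectors computed independently via atan2; the example vectors are arbitrary.

```python
import math

# Spot-check of W·X = |W| |X| cos(theta) for two made-up 2-D vectors.
W, X = [3.0, 4.0], [4.0, 3.0]
dot = sum(w * x for w, x in zip(W, X))                    # 24.0
theta = math.atan2(W[1], W[0]) - math.atan2(X[1], X[0])   # angle between W and X
print(dot, math.hypot(*W) * math.hypot(*X) * math.cos(theta))  # both 24.0
```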


Activation functions

1. linear: $\text{out} = k\,g(X)$

2. threshold: $\text{out} = 1$ if $g(X) \ge 0$; $-1$ (or $0$) otherwise

3. sigmoid: $\text{out} = \dfrac{1}{1 + e^{-g(X)}}$

4. tanh: $\text{out} = \tanh\big(g(X)\big) = \dfrac{1 - e^{-2g(X)}}{1 + e^{-2g(X)}}$
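A minimal Python sketch of the four activations applied to a net input g = g(X); the gain k for the linear unit and the unipolar threshold convention are illustrative choices.

```python
import math

# The four activation functions from the list above.

def linear(g, k=1.0):
    return k * g

def threshold(g):
    return 1 if g >= 0 else 0        # or -1 for the bipolar variant

def sigmoid(g):
    return 1.0 / (1.0 + math.exp(-g))

def tanh(g):
    return math.tanh(g)              # equals (1 - e**(-2g)) / (1 + e**(-2g))

for f in (linear, threshold, sigmoid, tanh):
    print(f.__name__, f(0.5))
```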


Relationship between sigmoid and tanh

$$s(x) = \frac{1}{1 + e^{-x}} = \frac{1 + \tanh(x/2)}{2}, \qquad \tanh(x) = \frac{2}{1 + e^{-2x}} - 1 = 2\,s(2x) - 1$$
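A quick numeric check of the identity at a few arbitrary points:

```python
import math

# Spot-check of s(x) = (1 + tanh(x/2)) / 2.
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    s = 1.0 / (1.0 + math.exp(-x))
    assert abs(s - (1 + math.tanh(x / 2)) / 2) < 1e-12
print("identity holds at all test points")
```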


Perceptron training/learning

1. Initialize weights (including the threshold weight, w0) to random numbers in [-0.5, +0.5] (uniformly distributed).

2. Select training input vector, X, and desired output, d.

3. Calculate the actual output, y.

4. Learn. (Only perform this step when the output is incorrect.)

$$w_i^{\text{new}} = w_i^{\text{current}} + \eta\,(d - y)\,x_i$$

where $\eta \in [0, 1]$ is the gain or learning rate.
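A minimal sketch of the four steps above, learning the (linearly separable) AND function; the training set, gain, and epoch count are illustrative choices, not from the slides.

```python
import random

random.seed(0)
# Step 1: initialize weights (w[0] is the bias w0, with constant input x0 = 1).
w = [random.uniform(-0.5, 0.5) for _ in range(3)]
eta = 0.2                                    # gain / learning rate in [0, 1]

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # AND

for epoch in range(100):
    for (x1, x2), d in data:                 # step 2: input X and desired d
        x = [1, x1, x2]                      # x0 = 1 carries the bias
        y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0  # step 3
        if y != d:                           # step 4: learn only when wrong
            w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

print(w)   # a weight vector that separates AND
```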


• “The proof of the perceptron learning theorem (Rosenblatt 1962) demonstrated that a perceptron could learn anything it could represent.” [3]

• So what can it represent?
– Any problem that is linearly separable.
– Are all problems linearly separable?


• Consider a binary function of n binary inputs.

– "A neuron with n binary inputs can have $2^n$ different input patterns, consisting of ones and zeros. Because each input pattern can produce two different binary outputs, 1 and 0, there are $2^{2^n}$ different functions of n variables." [3]

– How many of these are separable? Not many! For n = 6, $2^{2^6} \approx 1.8 \times 10^{19}$, but only 5,028,134 are linearly separable.

– AND and OR are linearly separable, but XOR is not, as the brute-force check below confirms for n = 2.
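For n = 2 the count can be brute-forced: of the $2^{2^2} = 16$ Boolean functions, exactly 14 are linearly separable, and the two exceptions are XOR and its complement. A sketch using the decision rule g(X) > 0; the coarse grid of candidate weights is an assumption, but it is fine enough to realize every separable function of two inputs.

```python
import itertools

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
grid = [k / 2 for k in range(-8, 9)]        # candidate weights -4.0 .. 4.0

def separable(outputs):
    # A function is separable if some (w0, w1, w2) reproduces it exactly.
    for w0, w1, w2 in itertools.product(grid, repeat=3):
        if all((w0 + w1 * x1 + w2 * x2 > 0) == bool(o)
               for (x1, x2), o in zip(inputs, outputs)):
            return True
    return False

count = sum(separable(o) for o in itertools.product([0, 1], repeat=4))
print(count)   # 14: all of the 16 functions except XOR and XNOR
```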


How about adding additional layers?

• "Multilayer networks provide no increase in computational power over a single-layer network unless there is a nonlinear function between layers." [3]

– That's because matrix multiplication is associative: $(XW_1)W_2 = X(W_1W_2)$, so two linear layers collapse into one (spot-checked below).
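A spot-check with small made-up matrices:

```python
# Two linear layers collapse into one: (X W1) W2 equals X (W1 W2).

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

X  = [[1.0, 2.0]]                              # one input row vector
W1 = [[0.5, -1.0, 2.0], [1.5, 0.0, -0.5]]      # layer 1 weights (2x3)
W2 = [[1.0], [2.0], [-1.0]]                    # layer 2 weights (3x1)

two_layers = matmul(matmul(X, W1), W2)
one_layer  = matmul(X, matmul(W1, W2))
print(two_layers, one_layer)   # identical: no extra power without a nonlinearity
```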


• “It is natural to ask if every decision can be implemented by such a three-layer network … The answer, due ultimately to Kolmogorov …, is “yes” – any continuous function from input to output can be implemented in a three-layer net, given sufficient number of hidden units, proper nonlinearities, and weights.” [1]


MULTILAYER NETWORKS


[Figure: a multilayer network; note the threshold nodes. From "Usefulness of artificial neural networks to predict follow-up dietary protein intake in hemodialysis patients," http://www.nature.com/ki/journal/v66/n1/fig_tab/4494599f1.html]


Backpropagation (learning in a multilayer network)


Sigmoid/logistic function (and its derivative)

$$s(x) = \frac{1}{1 + e^{-x}}$$

$$\begin{aligned}
\frac{d}{dx}\,s(x) &= \frac{d}{dx}\left(1 + e^{-x}\right)^{-1} \\
&= \frac{e^{-x}}{\left(1 + e^{-x}\right)^2} && \text{(reciprocal rule)} \\
&= \frac{1 + e^{-x} - 1}{\left(1 + e^{-x}\right)^2} && \text{(add \& subtract 1)} \\
&= \frac{1}{1 + e^{-x}} - \frac{1}{\left(1 + e^{-x}\right)^2} \\
&= s(x) - s(x)^2 = s(x)\left(1 - s(x)\right) && \text{(trivial to compute)}
\end{aligned}$$
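This derivative is what makes backpropagation cheap: each unit's error term reuses its own output. Below is a minimal backpropagation sketch for one hidden layer of sigmoid units trained on XOR with squared error; the layer sizes, gain, seed, and epoch count are illustrative assumptions, and convergence depends on the random initialization.

```python
import math, random

def s(x):                                   # logistic function
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
n_in, n_hid = 2, 3
# w_hid[j][i]: input i -> hidden unit j; index 0 is the bias (x0 = 1)
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
         for _ in range(n_hid)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]
eta = 0.5                                   # gain / learning rate

def forward(x1, x2):
    x = [1.0, x1, x2]                       # x0 = 1 carries the bias
    h = [s(sum(w * xi for w, xi in zip(row, x))) for row in w_hid]
    y = s(sum(w * hi for w, hi in zip(w_out, [1.0] + h)))
    return x, h, y

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]   # XOR

for epoch in range(20000):
    for (x1, x2), d in data:
        x, h, y = forward(x1, x2)
        delta_out = (y - d) * y * (1 - y)   # uses s'(net) = s(net)(1 - s(net))
        delta_hid = [delta_out * w_out[j + 1] * h[j] * (1 - h[j])
                     for j in range(n_hid)]
        hb = [1.0] + h
        for j in range(n_hid + 1):          # output-layer gradient step
            w_out[j] -= eta * delta_out * hb[j]
        for j in range(n_hid):              # hidden-layer gradient step
            for i in range(n_in + 1):
                w_hid[j][i] -= eta * delta_hid[j] * x[i]

for (x1, x2), d in data:
    print((x1, x2), d, round(forward(x1, x2)[2], 3))
```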


References

1. R.O. Duda, P.E. Hart, D.G. Stork, “Pattern Classification,” John Wiley and Sons, 2001.

2. S.K. Rogers, M. Kabrisky, “An introduction to biological and artificial neural networks for pattern recognition,” SPIE Optical Engineering Press, 1991.

3. P.D. Wasserman, “Neural computing,” Van Nostrand Reinhold, 1989.