Cooperating Intelligent Systems: Bayesian networks (Chapter 14, AIMA)


Page 1: Cooperating Intelligent Systems

Cooperating Intelligent Systems

Bayesian networks
Chapter 14, AIMA

Page 2: Cooperating Intelligent Systems

Inference

• Inference in the statistical setting means computing the probability of each possible outcome, given the available information:

• We need an efficient method for doing this, which is more powerful than the naïve Bayes model.

P(Outcome | Information)

Page 3: Cooperating Intelligent Systems

Bayesian networks

A Bayesian network is a directed graph in which each node is annotated with quantitative probability information:

1. A set of random variables, {X1,X2,X3,...}, makes up the nodes of the network.

2. A set of directed links connects pairs of nodes; a link points from a parent to a child.

3. Each node Xi has a conditional probability distribution P(Xi | Parents(Xi)) .

4. The graph is a directed acyclic graph (DAG).
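These four ingredients map directly onto a small data structure. A minimal Python sketch (the dictionary layout and the `is_dag` helper are illustrative choices, not part of the chapter), using the alarm-network CPTs that appear later in this lecture:

```python
# A minimal Bayesian-network representation: each node stores its parents
# and a CPT mapping parent-value tuples to P(node = True).
network = {
    "Burglary":   {"parents": [], "cpt": {(): 0.001}},
    "Earthquake": {"parents": [], "cpt": {(): 0.002}},
    "Alarm":      {"parents": ["Burglary", "Earthquake"],
                   "cpt": {(True, True): 0.95, (True, False): 0.94,
                           (False, True): 0.29, (False, False): 0.001}},
    "JohnCalls":  {"parents": ["Alarm"], "cpt": {(True,): 0.90, (False,): 0.05}},
    "MaryCalls":  {"parents": ["Alarm"], "cpt": {(True,): 0.70, (False,): 0.01}},
}

def is_dag(net):
    """Check requirement 4: parent links must form a directed acyclic graph."""
    visited, on_stack = set(), set()
    def visit(node):
        if node in on_stack:
            return False          # back edge -> cycle
        if node in visited:
            return True
        on_stack.add(node)
        ok = all(visit(p) for p in net[node]["parents"])
        on_stack.discard(node)
        visited.add(node)
        return ok
    return all(visit(n) for n in net)

print(is_dag(network))  # True
```

`is_dag` rejects the network if any chain of parent links loops back on itself, which is exactly requirement 4.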

Page 4: Cooperating Intelligent Systems

The dentist network

Cavity

Toothache    Catch

Weather

Page 5: Cooperating Intelligent Systems

The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

The burglar alarm responds to both earthquakes and burglars.

Two neighbors, John and Mary, have promised to call you when the alarm goes off.

John always calls when there's an alarm, and sometimes when there's not.

Mary sometimes misses the alarm (she likes loud music).

Page 6: Cooperating Intelligent Systems

The cancer network

From Breese and Koller, 1997

Age Gender

Toxics    Smoking

Cancer

SerumCalcium

LungTumour

GeneticDamage

Page 7: Cooperating Intelligent Systems

The cancer network

From Breese and Koller, 1997

Age Gender

Toxics    Smoking

Cancer

SerumCalcium

LungTumour

GeneticDamage

P(A,G) = P(A)P(G)

P(C|S,T,A,G) = P(C|S,T)

P(SC,C,LT,GD) = P(SC|C)P(LT|C,GD)P(C) P(GD)

P(A,G,T,S,C,SC,LT,GD) = P(A)P(G)P(T|A)P(S|A,G)P(C|T,S)P(GD)P(SC|C)P(LT|C,GD)

Page 8: Cooperating Intelligent Systems

The product (chain) rule

P(X1=x1, X2=x2, …, Xn=xn) = P(x1, x2, …, xn) = ∏i=1..n P(xi | parents(Xi))

(This is for Bayesian networks; the general case comes later in this lecture.)

Page 9: Cooperating Intelligent Systems

Bayes network node is a function

Parents A and B select which distribution the child C follows; here C is uniform on an interval [Min, Max] that depends on the parent values:

        a∧b   a∧¬b   ¬a∧b   ¬a∧¬b
Min     0.1   0.3    0.7    0.0
Max     1.5   1.1    1.9    0.9

P(C|¬a,b) = U[0.7, 1.9]

Page 10: Cooperating Intelligent Systems

Bayes network node is a function

A B

C

A BN node is a conditional distribution function:
• Inputs = parent values
• Output = distribution over the node's values

Any type of function from values to distributions works.

Page 11: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

Note: Each number in the tables represents a Boolean distribution, so there is a distribution output for every input combination.

Page 12: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(j, m, a, ¬b, ¬e) = P(¬b) P(¬e) P(a|¬b,¬e) P(m|a) P(j|a)
                   = 0.999 × 0.998 × 0.001 × 0.70 × 0.90 ≈ 0.00063

Probability distribution for ”no earthquake, no burglary, but alarm, and both Mary and John make the call”.
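The product on this slide can be checked directly; a minimal sketch:

```python
# P(j, m, a, not-b, not-e) by the Bayesian network chain rule:
p_not_b     = 1 - 0.001   # P(not-b)
p_not_e     = 1 - 0.002   # P(not-e)
p_a         = 0.001       # P(a | not-b, not-e)
p_m_given_a = 0.70        # P(m | a)
p_j_given_a = 0.90        # P(j | a)

p = p_not_b * p_not_e * p_a * p_m_given_a * p_j_given_a
print(round(p, 5))  # 0.00063
```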

Page 13: Cooperating Intelligent Systems

Meaning of Bayesian network

The general chain rule (always true):

P(x1, x2, …, xn) = P(xn | xn−1, …, x1) P(xn−1 | xn−2, …, x1) ⋯ P(x2 | x1) P(x1) = ∏i=1..n P(xi | xi−1, …, x1)

The Bayesian network chain rule:

P(x1, x2, …, xn) = ∏i=1..n P(xi | parents(Xi))

The BN is a correct representation of the domain iff each node is conditionally independent of its predecessors, given its parents.

Page 14: Cooperating Intelligent Systems

The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

The fully correct alarm network might look something like the figure.

The Bayesian network (red) assumes that some of the variables are independent (or that the dependencies can be neglected since they are very weak).

The correctness of the Bayesian network of course depends on the validity of these assumptions.

Alarm

Burglary Earthquake

JohnCalls MaryCalls

It is this sparse connection structure that makes the BN approach feasible (~linear growth in complexity rather than exponential).

Page 15: Cooperating Intelligent Systems

How to construct a BN?

• Add nodes in causal order (”causal” determined from expertise).

• Determine conditional independence using either (or all) of the following semantics:
  – Blocking/d-separation rule
  – Non-descendant rule
  – Markov blanket rule
  – Experience/your beliefs

Page 16: Cooperating Intelligent Systems

Path blocking & d-separation

Intuitively, if we don't know the value of Cancer, knowledge about Serum Calcium influences our belief about Cancer, which in turn influences our belief about Lung Tumour, etc.

However, if we are given the value of Cancer (i.e. C= true or false), then knowledge of Serum Calcium will not tell us anything about Lung Tumour that we don’t already know.

We say that Cancer d-separates (direction-dependent separation) Serum Calcium and Lung Tumour.

Cancer

SerumCalcium

LungTumour

GeneticDamage

Page 17: Cooperating Intelligent Systems

Path blocking & d-separation

Two nodes Xi and Xj are conditionally independent given a set Z = {X1, X2, X3, ...} of nodes if for every undirected path in the BN between Xi and Xj there is some node Xk on the path having one of the following three properties:

1. Xk ∈ Z, and both arcs on the path lead out of Xk.

2. Xk ∈ Z, and one arc on the path leads into Xk and one arc leads out.

3. Neither Xk nor any descendant of Xk is in Z, and both arcs on the path lead into Xk.

In each case, Xk blocks the path between Xi and Xj.

P(Xi, Xj | Z) = P(Xi | Z) P(Xj | Z)

Xi and Xj are d-separated if all paths between them are blocked.
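The three rules can be turned into a test for a single path. A sketch (the edge representation and the helper names `path_blocked` and `descendants` are illustrative choices):

```python
def descendants(node, edges):
    """All nodes reachable from node along directed edges (parent, child)."""
    result, stack = set(), [node]
    while stack:
        n = stack.pop()
        for u, v in edges:
            if u == n and v not in result:
                result.add(v)
                stack.append(v)
    return result

def path_blocked(path, edges, given):
    """True if some interior node Xk blocks the undirected path.

    path  : list of node names, e.g. ["SerumCalcium", "Cancer", "LungTumour"]
    edges : set of directed edges (parent, child) in the BN
    given : set of observed nodes (the conditioning set Z)
    """
    for i in range(1, len(path) - 1):
        prev, xk, nxt = path[i - 1], path[i], path[i + 1]
        into = ((prev, xk) in edges) + ((nxt, xk) in edges)  # arcs leading into Xk
        if xk in given and into <= 1:
            return True   # rules 1 and 2: Xk observed and not a collider
        if into == 2 and xk not in given and not (descendants(xk, edges) & given):
            return True   # rule 3: unobserved collider, no observed descendant
    return False

# Cancer fragment: Cancer -> SerumCalcium, Cancer -> LungTumour
edges = {("Cancer", "SerumCalcium"), ("Cancer", "LungTumour"),
         ("GeneticDamage", "LungTumour")}
path = ["SerumCalcium", "Cancer", "LungTumour"]
print(path_blocked(path, edges, given={"Cancer"}))  # True: Cancer d-separates them
print(path_blocked(path, edges, given=set()))       # False
```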

Page 18: Cooperating Intelligent Systems

Non-descendants

A node is conditionally independent of its non-descendants (Zij), given its parents.

P(Xj, Z1j, …, Zmj | U1, …, Un) = P(Xj | U1, …, Un) P(Z1j, …, Zmj | U1, …, Un)

where U1, …, Un are the parents of Xj and Z1j, …, Zmj are its non-descendants.

Page 19: Cooperating Intelligent Systems

Markov blanket

A node is conditionally independent of all other nodes in the network, given its parents, children, and children’s parents

These constitute the node's Markov blanket.

P(Xk, X1, …, Xm | U1, …, Un, Z1j, …, Znj, Y1, …, Yn)
  = P(Xk | U1, …, Un, Z1j, …, Znj, Y1, …, Yn) P(X1, …, Xm | U1, …, Un, Z1j, …, Znj, Y1, …, Yn)

where the Ui are the parents of Xk, the Yi are its children, and the Zij are the children's other parents.

[Figure: node Xk surrounded by its parents, children, and children's parents X1, …, X6, which form its Markov blanket]

Page 20: Cooperating Intelligent Systems

Efficient representation of PDs

• Boolean → Boolean
• Boolean → Discrete
• Boolean → Continuous
• Discrete → Boolean
• Discrete → Discrete
• Discrete → Continuous
• Continuous → Boolean
• Continuous → Discrete
• Continuous → Continuous

C

A

B

P(C|a,b) ?

Page 21: Cooperating Intelligent Systems

Efficient representation of PDs

Boolean → Boolean: Noisy-OR, Noisy-AND

Boolean/Discrete → Discrete: Noisy-MAX

Bool./Discr./Cont. → Continuous: Parametric distribution (e.g. Gaussian)

Continuous → Boolean: Logit/Probit

Page 22: Cooperating Intelligent Systems

Noisy-OR example: Boolean → Boolean

P(E|C1,C2,C3)

C1      0   1    0    0    1     1     0     1
C2      0   0    1    0    1     0     1     1
C3      0   0    0    1    0     1     1     1
P(E=0)  1   0.1  0.1  0.1  0.01  0.01  0.01  0.001
P(E=1)  0   0.9  0.9  0.9  0.99  0.99  0.99  0.999

Example from L.E. Sucar

The effect (E) is off (false) when none of the causes are true. The probability for the effect increases with the number of true causes.

P(E=0) = 10^−#(True)   (for this example)

Page 23: Cooperating Intelligent Systems

Noisy-OR general case: Boolean → Boolean

P(E=0 | C1, C2, …, Cn) = ∏i=1..n qi^Ci,   where Ci = 1 if cause i is true and Ci = 0 if false

Example on previous slide usedqi = 0.1 for all i.

[Figure: causes C1, C2, …, Cn with parameters q1, q2, …, qn feeding a product (PROD) node that outputs P(E|C1, …)]

Image adapted from Laskey & Mahoney 1999
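A minimal sketch of the formula, checked against the table on the previous slide (qi = 0.1 for every cause); the function name `noisy_or` is an illustrative choice:

```python
def noisy_or(causes, q):
    """P(E = 1 | causes) under the noisy-OR model.

    causes: list of 0/1 cause values C1..Cn
    q:      list of inhibition probabilities q1..qn
    P(E=0) is the product of qi over the causes that are true.
    """
    p_off = 1.0
    for c_i, q_i in zip(causes, q):
        if c_i:
            p_off *= q_i
    return 1.0 - p_off

q = [0.1, 0.1, 0.1]
print(round(noisy_or([1, 0, 0], q), 3))  # 0.9
print(round(noisy_or([1, 1, 1], q), 3))  # 0.999
```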

Page 24: Cooperating Intelligent Systems

Noisy-MAX: Boolean → Discrete

[Figure: causes C1, C2, …, Cn produce effect levels e1, e2, …, en; the observed effect is their MAX]

Effect takes on the max value from different causes

Restrictions:
– Each cause must have an off state, which does not contribute to the effect.
– Effect is off when all causes are off.
– Effect must have consecutive escalating values: e.g., absent, mild, moderate, severe.

Image adapted from Laskey & Mahoney 1999

P(E = ek | C1, C2, …, Cn) = ∏i=1..n qi,k^Ci

Page 25: Cooperating Intelligent Systems

Parametric probability densities: Boolean/Discr./Continuous → Continuous

Use parametric probability densities, e.g., the normal distribution:

P(X = x) = 1/(σ√(2π)) exp(−(x − μ)²/(2σ²)) = N(μ, σ²)

Gaussian networks (a = input to the node):

P(X = x) = 1/(σ√(2π)) exp(−(x − a)²/(2σ²))
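A minimal sketch under the parameterization above (the mean equal to the node input a); `gaussian_node` is an illustrative name:

```python
import math

def gaussian_node(x, a, sigma):
    """P(X = x) for a node whose mean is the parent input a, i.e. N(a, sigma^2)."""
    return math.exp(-(x - a) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The density is highest when the child value matches the parent input:
print(gaussian_node(0.0, 0.0, 1.0) > gaussian_node(2.0, 0.0, 1.0))  # True
```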

Page 26: Cooperating Intelligent Systems

Probit & Logit: Continuous → Boolean

If the input is continuous but the output is Boolean, use probit or logit:

Probit: P(A = a | x) = (1/√(2π)) ∫ from −∞ to (x−μ)/σ of exp(−t²/2) dt = Φ((x − μ)/σ)

Logit: P(A = a | x) = 1 / (1 + exp(−2(x − μ)/σ))

[Figure: the logistic sigmoid, P(A|x) as a function of x]
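Both squashing functions are one-liners; the probit uses the standard normal CDF, written here via math.erf. A sketch with illustrative defaults μ = 0, σ = 1:

```python
import math

def logit(x, mu=0.0, sigma=1.0):
    """P(A | x) as a logistic sigmoid with slope governed by sigma."""
    return 1.0 / (1.0 + math.exp(-2.0 * (x - mu) / sigma))

def probit(x, mu=0.0, sigma=1.0):
    """P(A | x) as the standard normal CDF evaluated at (x - mu)/sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

print(round(logit(0.0), 2), round(probit(0.0), 2))  # 0.5 0.5
```

Both curves pass through 0.5 at x = μ; the logit has slightly heavier tails than the probit.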

Page 27: Cooperating Intelligent Systems

The cancer network

Age: {1-10, 11-20, ...}
Gender: {M, F}
Toxics: {Low, Medium, High}
Smoking: {No, Light, Heavy}
Cancer: {No, Benign, Malignant}
Serum Calcium: Level
Lung Tumour: {Yes, No}

Age Gender

SmokingToxics

Cancer

SerumCalcium

LungTumour

Discrete → Discrete/Boolean

Discrete → Discrete

Discrete

Continuous → Discrete/Boolean

Page 28: Cooperating Intelligent Systems

Inference in BN

Inference means computing P(X|e), where X is a query (variable) and e is a set of evidence variables (for which we know the values).

Examples:

P(Burglary | john_calls, mary_calls)

P(Cancer | age, gender, smoking, serum_calcium)

P(Cavity | toothache, catch)

Page 29: Cooperating Intelligent Systems

Exact inference in BN

”Doable” for boolean variables: Look up entries in conditional probability tables (CPTs).

P(X | e) = P(X, e) / P(e) = α P(X, e) = α Σy P(X, e, y)

where α = 1/P(e) is a normalization constant and the sum runs over the values y of the hidden variables.

Page 30: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(B | j, m) = α P(B, j, m) = α ΣA∈{a,¬a} ΣE∈{e,¬e} P(B, E, A, j, m)

Evidence variables = {J, M}
Query variable = B

What is the probability for a burglary if both John and Mary call?

Page 31: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(B, j, m, A, E) = P(j, m | B, A, E) P(B, A, E)
                = P(j | A) P(m | A) P(B, A, E)
                = P(j | A) P(m | A) P(A | B, E) P(B) P(E)

What is the probability for a burglary if both John and Mary call?

With B = b we have P(b) = 0.001 = 10^−3, so

P(b | j, m) = α ΣA∈{a,¬a} ΣE∈{e,¬e} P(b, E, A, j, m) = α 10^−3 ΣA ΣE P(j | A) P(m | A) P(A | b, E) P(E)

Page 32: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(b, j, m) = 10^−3 ΣA∈{a,¬a} ΣE∈{e,¬e} P(j|A) P(m|A) P(A|b,E) P(E)

= 10^−3 [ P(j|a) P(m|a) P(a|b,e) P(e)
        + P(j|a) P(m|a) P(a|b,¬e) P(¬e)
        + P(j|¬a) P(m|¬a) P(¬a|b,e) P(e)
        + P(j|¬a) P(m|¬a) P(¬a|b,¬e) P(¬e) ]

= 0.5923 × 10^−3

P(¬b, j, m) = 1.491 × 10^−3

What is the probability for a burglary if both John and Mary call?

P(B | j, m) = α ΣA∈{a,¬a} ΣE∈{e,¬e} P(B, E, A, j, m)

Page 33: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(b, j, m) = 0.5923 × 10^−3

P(¬b, j, m) = 1.491 × 10^−3

P(j, m) = P(b, j, m) + P(¬b, j, m) [= 2.083 × 10^−3]

P(b | j, m) = P(b, j, m) / P(j, m) = 0.284

P(¬b | j, m) = P(¬b, j, m) / P(j, m) = 0.716

What is the probability for a burglary if both John and Mary call?

P(B | j, m) = α ΣA∈{a,¬a} ΣE∈{e,¬e} P(B, E, A, j, m)

Page 34: Cooperating Intelligent Systems

Example: The alarm network

Alarm

Burglary Earthquake

JohnCalls MaryCalls

P(B=b)

0.001

P(E=e)

0.002

A P(J=j)

a 0.90

¬a 0.05

A P(M=m)

a 0.70

¬a 0.01

B E P(A=a)

b e 0.95

b ¬e 0.94

¬b e 0.29

¬b ¬e 0.001

P(b, j, m) = 0.5923 × 10^−3

P(¬b, j, m) = 1.491 × 10^−3

P(j, m) = P(b, j, m) + P(¬b, j, m) [= 2.083 × 10^−3]

P(b | j, m) = P(b, j, m) / P(j, m) = 0.284

P(¬b | j, m) = P(¬b, j, m) / P(j, m) = 0.716

What is the probability for a burglary if both John and Mary call?

Answer: 28%

P(B | j, m) = α ΣA∈{a,¬a} ΣE∈{e,¬e} P(B, E, A, j, m)
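The whole worked example can be reproduced by brute-force enumeration over the hidden variables A and E, using the CPTs from the tables above; a minimal sketch:

```python
from itertools import product

# CPTs of the alarm network (True = the event occurs)
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(a | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(j | A)
P_M = {True: 0.70, False: 0.01}                      # P(m | A)

def joint(b, e, a, j=True, m=True):
    """Chain-rule joint P(B, E, A, J, M) for one full assignment."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# Sum out the hidden variables A and E, then normalize over B:
p = {b: sum(joint(b, e, a) for e, a in product([True, False], repeat=2))
     for b in [True, False]}
p_burglary = p[True] / (p[True] + p[False])
print(round(p_burglary, 3))  # 0.284
```

Dividing by p[True] + p[False] plays the role of the normalization constant α = 1/P(j, m).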

Page 35: Cooperating Intelligent Systems

Use depth-first search

A lot of unnecessary repeated computation...

Page 36: Cooperating Intelligent Systems

Complexity of exact inference

• By eliminating repeated calculation & uninteresting paths we can speed up the inference a lot.

• Linear time complexity for singly connected networks (polytrees).

• Exponential for multiply connected networks.– Clustering can improve this

Page 37: Cooperating Intelligent Systems

Approximate inference in BN

• Exact inference is intractable in large multiply connected BNs ⇒ use approximate inference: Monte Carlo methods (random sampling).
  – Direct sampling
  – Rejection sampling
  – Likelihood weighting
  – Markov chain Monte Carlo

Page 38: Cooperating Intelligent Systems

Markov chain Monte Carlo

1. Fix the evidence variables (E1, E2, ...) at their given values.

2. Initialize the network with values for all other variables, including the query variable.

3. Repeat the following many, many, many times:
   a. Pick a non-evidence variable at random (query Xi or hidden Yj).
   b. Select a new value for this variable, conditioned on the current values in the variable's Markov blanket.

Monitor the values of the query variables.
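A minimal sketch of these steps on the alarm network, with J and M clamped to true. Sampling from the full conditional (a ratio of joint probabilities) is equivalent to sampling from the Markov-blanket distribution; the variable names are illustrative:

```python
import random

random.seed(0)

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def joint(s):
    """Joint probability of a full assignment s with J = M = true."""
    pa = P_A[(s['B'], s['E'])] if s['A'] else 1 - P_A[(s['B'], s['E'])]
    return P_B[s['B']] * P_E[s['E']] * pa * P_J[s['A']] * P_M[s['A']]

# Steps 1-2: evidence fixed (J = M = true), other variables initialized.
state = {'B': False, 'E': False, 'A': True}
count_b = 0
n_steps = 100_000
# Step 3: repeatedly resample one non-evidence variable given all the rest.
for _ in range(n_steps):
    var = random.choice(['B', 'E', 'A'])
    w = {}
    for val in (True, False):
        state[var] = val
        w[val] = joint(state)
    state[var] = random.random() < w[True] / (w[True] + w[False])
    count_b += state['B']          # monitor the query variable B

print(count_b / n_steps)  # estimate of P(b | j, m); the exact answer is 0.284
```

With enough steps, the fraction of samples in which B is true approaches the exact answer 0.284 from the enumeration slides.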