Artificial Intelligence, Chapter 19
Reasoning with Uncertain Information
Biointelligence Lab
School of Computer Sci. & Eng.
Seoul National University
(C) 2000-2002 SNU CSE Biointelligence Lab
Outline
- Review of Probability Theory
- Probabilistic Inference
- Bayes Networks
- Patterns of Inference in Bayes Networks
- Uncertain Evidence
- D-Separation
- Probabilistic Inference in Polytrees
19.1 Review of Probability Theory (1/4)

Random variables
Joint probability over (B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE)):

(True, True, True, True)    0.5686
(True, True, True, False)   0.0299
(True, True, False, True)   0.0135
(True, True, False, False)  0.0007
…
19.1 Review of Probability Theory (2/4)

Marginal probability: sum the joint over the other variables, e.g. p(B) = Σ_{M,L,G} p(B, M, L, G)
Conditional probability: p(B | M) = p(B, M)/p(M)
Ex. The probability that the battery is charged given that the arm does not move:
p(B | ¬M) = p(B, ¬M)/p(¬M)
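The two definitions above can be sketched directly on a small joint table. This is a minimal illustration with made-up numbers for a two-variable joint over (battery charged, arm moves); it is not the chapter's (B, M, L, G) table.

```python
# Hypothetical joint over (battery_charged, arm_moves); values invented
# for illustration only.
joint = {
    (True, True): 0.80,   # charged, moves
    (True, False): 0.15,  # charged, does not move
    (False, True): 0.00,  # dead, moves
    (False, False): 0.05, # dead, does not move
}

def marginal(b):
    """p(B = b): sum the joint over the other variable M."""
    return sum(p for (bv, mv), p in joint.items() if bv == b)

def conditional(b, m):
    """p(B = b | M = m) = p(B = b, M = m) / p(M = m)."""
    p_m = sum(p for (bv, mv), p in joint.items() if mv == m)
    return joint[(b, m)] / p_m

print(marginal(True))            # 0.95
print(conditional(True, False))  # p(B | ¬M) = 0.15 / 0.20 = 0.75
```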
19.1 Review of Probability Theory (3/4)

Figure 19.1 A Venn Diagram
19.1 Review of Probability Theory (4/4)

Chain rule: p(V_1, V_2, …, V_k) = Π_{i=1}^{k} p(V_i | V_{i-1}, …, V_1)
Bayes' rule: p(A | B) = p(B | A) p(A) / p(B)
Abbreviation: p(A) is shorthand for p(A = True).
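Bayes' rule follows directly from the definition of conditional probability, and can be checked numerically on any joint. A small sketch, with an invented two-variable joint:

```python
# Numeric check of Bayes' rule, p(A|B) = p(B|A) p(A) / p(B), on an
# arbitrary joint p(A, B); values are invented for illustration.
joint = {(True, True): 0.2, (True, False): 0.3,
         (False, True): 0.1, (False, False): 0.4}

p_a = joint[(True, True)] + joint[(True, False)]  # p(A) = 0.5
p_b = joint[(True, True)] + joint[(False, True)]  # p(B) = 0.3

p_a_given_b = joint[(True, True)] / p_b           # direct definition
p_b_given_a = joint[(True, True)] / p_a
bayes = p_b_given_a * p_a / p_b                   # Bayes' rule

assert abs(p_a_given_b - bayes) < 1e-12           # the two routes agree
```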
19.2 Probabilistic Inference

The probability that some variable V_i has value v_i given the evidence ε = e: p(V_i = v_i | ε = e).

p(P, Q, R)      0.3
p(P, Q, ¬R)     0.2
p(P, ¬Q, R)     0.2
p(P, ¬Q, ¬R)    0.1
p(¬P, Q, R)     0.05
p(¬P, Q, ¬R)    0.1
p(¬P, ¬Q, R)    0.05
p(¬P, ¬Q, ¬R)   0.0

Ex. With evidence ¬R:
p(P | ¬R) = [p(P, Q, ¬R) + p(P, ¬Q, ¬R)] / p(¬R) = (0.2 + 0.1)/p(¬R) = 0.3/p(¬R)
p(¬P | ¬R) = [p(¬P, Q, ¬R) + p(¬P, ¬Q, ¬R)] / p(¬R) = (0.1 + 0.0)/p(¬R) = 0.1/p(¬R)
Since p(P | ¬R) + p(¬P | ¬R) = 1, normalizing gives
p(P | ¬R) = 0.3/(0.3 + 0.1) = 0.75
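The marginalize-and-normalize computation above is mechanical, and can be reproduced from the joint table as given on the slide:

```python
# The slide's joint table for (P, Q, R), used to reproduce the inference
# p(P | ¬R) = 0.75 by summing out Q and normalizing.
joint = {
    (True, True, True): 0.3,    (True, True, False): 0.2,
    (True, False, True): 0.2,   (True, False, False): 0.1,
    (False, True, True): 0.05,  (False, True, False): 0.1,
    (False, False, True): 0.05, (False, False, False): 0.0,
}

def infer(p_val, r_val):
    """p(P = p_val | R = r_val), marginalizing over Q."""
    num = sum(pr for (p, q, r), pr in joint.items() if p == p_val and r == r_val)
    den = sum(pr for (p, q, r), pr in joint.items() if r == r_val)
    return num / den

print(infer(True, False))  # 0.3 / 0.4 = 0.75
```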
Statistical Independence

Conditional independence: p(V_i | V_j, ε) = p(V_i | ε)
Mutual conditional independence: p(V_i | V_j, …, V_k, ε) = p(V_i | ε)
Unconditional independence: p(V_i | V_j) = p(V_i)
19.3 Bayes Networks (1/2)

A directed, acyclic graph (DAG) whose nodes are labeled by random variables.
Characteristics of Bayesian networks:
- Node V_i is conditionally independent of any subset of nodes that are not descendants of V_i, given its parents.
- Each node carries a prior probability (root nodes) or a conditional probability table (CPT).

p(V_k, V_{k-1}, …, V_1) = Π_{i=1}^{k} p(V_i | Pa(V_i))
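The factorization above can be sketched for the chapter's battery network (B → G, B → M, L → M). The values p(B) = 0.95, p(L) = 0.7, p(G|B) = 0.95, p(M|B,L) = 0.9, and p(M|B,¬L) = 0.05 are consistent with the joint-table entries shown earlier; the ¬B rows (p(G|¬B) = 0.10, p(M|¬B,·) = 0) are illustrative assumptions.

```python
# Sketch of the Bayes-network factorization p(V_k,...,V_1) = Π p(V_i|Pa(V_i))
# for the battery network B -> G, B -> M, L -> M. The ¬B entries below are
# assumptions for illustration; the rest match the chapter's joint table.
p_B = 0.95
p_L = 0.7
p_G_given_B = {True: 0.95, False: 0.10}          # p(G=True | B)
p_M_given_BL = {(True, True): 0.90, (True, False): 0.05,
                (False, True): 0.00, (False, False): 0.00}

def joint(b, m, l, g):
    """p(B, M, L, G) as a product of the network's CPT entries."""
    pb = p_B if b else 1 - p_B
    pl = p_L if l else 1 - p_L
    pg = p_G_given_B[b] if g else 1 - p_G_given_B[b]
    pm = p_M_given_BL[(b, l)] if m else 1 - p_M_given_BL[(b, l)]
    return pb * pl * pg * pm

print(round(joint(True, True, True, True), 4))  # 0.5686, matching the table
```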
19.4 Patterns of Inference in Bayes Networks (1/3)

Causal or top-down inference
Ex. The probability that the arm moves given that the block is liftable:
p(M | L) = p(M, B | L) + p(M, ¬B | L)
         = p(M | B, L) p(B | L) + p(M | ¬B, L) p(¬B | L)
         = p(M | B, L) p(B) + p(M | ¬B, L) p(¬B)   (structure of the Bayes network)
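The causal computation can be carried out numerically. A sketch assuming p(M|B,L) = 0.9 and p(M|B,¬L) = 0.05, consistent with the chapter's joint table, and p(M|¬B,·) = 0 as an illustrative assumption:

```python
# Causal (top-down) inference:
# p(M | L) = p(M|B,L) p(B) + p(M|¬B,L) p(¬B).
# The ¬B entries (0.0) are illustrative assumptions.
p_B = 0.95
p_M_given_BL = {(True, True): 0.90, (True, False): 0.05,
                (False, True): 0.00, (False, False): 0.00}

def p_M_given_L(l):
    return p_M_given_BL[(True, l)] * p_B + p_M_given_BL[(False, l)] * (1 - p_B)

print(1 - p_M_given_L(False))  # p(¬M | ¬L) = 1 - 0.0475 = 0.9525
```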
19.4 Patterns of Inference in Bayes Networks (2/3)

Diagnostic or bottom-up inference: using an effect (or symptom) to infer a cause.
Ex. The probability that the block is not liftable given that the arm does not move.
By causal reasoning, p(¬M | ¬L) = 0.9525. Then, by Bayes' rule,
p(¬L | ¬M) = p(¬M | ¬L) p(¬L) / p(¬M) = (0.9525 × 0.3)/p(¬M) = 0.28575/p(¬M)
p(L | ¬M) = p(¬M | L) p(L) / p(¬M) = 0.03665/p(¬M)
Since p(¬L | ¬M) + p(L | ¬M) = 1, normalizing gives
p(¬L | ¬M) = 0.28575/(0.28575 + 0.03665) = 0.88632
19.4 Patterns of Inference in Bayes Networks (3/3)

Explaining away: ¬B explains ¬M, making ¬L less certain.
p(¬L | ¬B, ¬M) = p(¬B, ¬M | ¬L) p(¬L) / p(¬B, ¬M)   (Bayes' rule)
             = p(¬M | ¬B, ¬L) p(¬B | ¬L) p(¬L) / p(¬B, ¬M)   (def. of conditional prob.)
             = p(¬M | ¬B, ¬L) p(¬B) p(¬L) / p(¬B, ¬M)   (structure of the Bayes network)
             = 0.30 < 0.88632
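Explaining away can be seen numerically. A sketch assuming p(M | ¬B, L) = p(M | ¬B, ¬L) = 0 (an illustrative assumption): given ¬B, the arm never moves regardless of L, so observing ¬M adds nothing and ¬L falls back to its prior.

```python
# Explaining away: with p(M | ¬B, ·) = 0 (assumption), ¬B fully explains
# ¬M, so p(¬L | ¬B, ¬M) collapses to the prior p(¬L) = 0.3.
p_L, p_B = 0.7, 0.95
p_not_M_given_not_B_not_L = 1.0      # since p(M | ¬B, ¬L) = 0
p_not_B_not_M = 1.0 * (1 - p_B)      # p(¬M | ¬B) p(¬B) = 0.05

p_not_L_given_not_B_not_M = (p_not_M_given_not_B_not_L * (1 - p_B) * (1 - p_L)
                             / p_not_B_not_M)
print(round(p_not_L_given_not_B_not_M, 2))  # 0.3, the prior p(¬L)
```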
19.5 Uncertain Evidence

Evidence nodes must represent propositions about whose truth or falsity we are certain. Each uncertain evidence node should therefore have a child node about which we can be certain.
Ex. Suppose the robot is not certain that its arm did not move.
- Introduce M′: "The arm sensor says that the arm moved." We can be certain that this proposition is either true or false.
- Use p(¬L | ¬B, ¬M′) instead of p(¬L | ¬B, ¬M).
Ex. Suppose we are also uncertain about whether or not the battery is charged.
- Introduce G: "battery gauge."
- Use p(¬L | ¬G, ¬M′) instead of p(¬L | ¬B, ¬M′).
19.6 D-Separation (1/3)

A set of nodes ε d-separates V_i and V_j if for every undirected path in the Bayes network between V_i and V_j, there is some node V_b on the path having one of the following three properties:
1. V_b is in ε, and both arcs on the path lead out of V_b.
2. V_b is in ε, and one arc on the path leads in to V_b and one arc leads out.
3. Neither V_b nor any descendant of V_b is in ε, and both arcs on the path lead in to V_b.
V_b blocks the path given ε when any one of these conditions holds.
Two nodes V_i and V_j are conditionally independent given a set of nodes ε if ε d-separates them.
19.6 D-Separation (2/3)

Figure 19.3 Conditional Independence via Blocking Nodes
19.6 D-Separation (3/3)

Ex. I(G, L | B), by rules 1 and 3: by rule 1, B blocks the (only) path between G and L, given B; by rule 3, M also blocks this path given B.
I(G, L): by rule 3, M blocks the path between G and L.
I(B, L): by rule 3, M blocks the path between B and L.
Even using d-separation, probabilistic inference in Bayes networks is, in general, NP-hard.
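The three blocking rules can be implemented directly by enumerating undirected paths. A sketch on the chapter's battery network (B → G, B → M, L → M); the node names and graph are taken from the running example, and the path-enumeration approach is only practical for small networks:

```python
# d-separation test by enumerating undirected paths and applying the three
# blocking rules of Section 19.6, on the network B -> G, B -> M, L -> M.
parents = {"B": set(), "L": set(), "G": {"B"}, "M": {"B", "L"}}

def children(v):
    return {n for n, ps in parents.items() if v in ps}

def descendants(v):
    seen, stack = set(), [v]
    while stack:
        for c in children(stack.pop()):
            if c not in seen:
                seen.add(c)
                stack.append(c)
    return seen

def undirected_paths(src, dst):
    """All simple paths between src and dst, ignoring arc directions."""
    paths, stack = [], [[src]]
    while stack:
        path = stack.pop()
        if path[-1] == dst:
            paths.append(path)
            continue
        for n in parents[path[-1]] | children(path[-1]):
            if n not in path:
                stack.append(path + [n])
    return paths

def d_separated(vi, vj, evidence):
    """True iff every undirected path from vi to vj is blocked by evidence."""
    for path in undirected_paths(vi, vj):
        blocked = False
        for k in range(1, len(path) - 1):
            vb, prev, nxt = path[k], path[k - 1], path[k + 1]
            arcs_in = (prev in parents[vb]) + (nxt in parents[vb])
            if arcs_in == 2:
                # Rule 3: head-to-head node with neither itself nor any
                # descendant in the evidence set blocks the path.
                if vb not in evidence and not (descendants(vb) & evidence):
                    blocked = True
            elif vb in evidence:
                # Rules 1 and 2: a chain or diverging node in the evidence
                # set blocks the path.
                blocked = True
        if not blocked:
            return False
    return True

print(d_separated("G", "L", {"B"}))   # True:  I(G, L | B)
print(d_separated("G", "L", set()))   # True:  I(G, L)
print(d_separated("G", "L", {"M"}))   # False: conditioning on M unblocks
```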
19.7 Probabilistic Inference in Polytrees (1/2)

Polytree: a DAG for which there is just one path, along arcs in either direction, between any two nodes in the DAG.
19.7 Probabilistic Inference in Polytrees (2/2)

A node is above Q if it is connected to Q only through Q's parents.
A node is below Q if it is connected to Q only through Q's immediate successors.
Three cases of evidence:
- All evidence nodes are above Q.
- All evidence nodes are below Q.
- There are evidence nodes both above and below Q.
Evidence Above (1/2)

Bottom-up recursive algorithm
Ex. p(Q | P5, P4):
p(Q | P5, P4) = Σ_{P6,P7} p(Q, P6, P7 | P5, P4)
             = Σ_{P6,P7} p(Q | P6, P7, P5, P4) p(P6, P7 | P5, P4)
             = Σ_{P6,P7} p(Q | P6, P7) p(P6, P7 | P5, P4)   (d-separation)
             = Σ_{P6,P7} p(Q | P6, P7) p(P6 | P5, P4) p(P7 | P5, P4)   (structure of the Bayes network)
             = Σ_{P6,P7} p(Q | P6, P7) p(P6 | P5) p(P7 | P4)   (d-separation)
Evidence Above (2/2)

Calculating p(P7 | P4) and p(P6 | P5):
p(P7 | P4) = Σ_{P3} p(P7 | P3, P4) p(P3 | P4) = Σ_{P3} p(P7 | P3, P4) p(P3)
p(P6 | P5) = Σ_{P1,P2} p(P6 | P1, P2) p(P1 | P5) p(P2)
Calculating p(P1 | P5): the evidence is "below," so here we use Bayes' rule:
p(P1 | P5) = p(P5 | P1) p(P1) / p(P5)
Evidence Below (1/2)

Top-down recursive algorithm
Ex. p(Q | P12, P13, P14, P11):
p(Q | P12, P13, P14, P11) = p(P12, P13, P14, P11 | Q) p(Q) / p(P12, P13, P14, P11)
                         = k p(P12, P13, P14, P11 | Q) p(Q)
                         = k p(P12, P13 | Q) p(P14, P11 | Q) p(Q)
where k is a normalizing constant. Then
p(P12, P13 | Q) = Σ_{P9} p(P12, P13 | P9, Q) p(P9 | Q) = Σ_{P9} p(P12, P13 | P9) p(P9 | Q)
p(P12, P13 | P9) = p(P12 | P9) p(P13 | P9)
p(P9 | Q) = Σ_{P8} p(P9 | Q, P8) p(P8)
Evidence Below (2/2)

Similarly, for the other branch:
p(P14, P11 | Q) = Σ_{P10} p(P14, P11 | P10) p(P10 | Q)
               = Σ_{P10} p(P14 | P10) p(P11 | P10) p(P10 | Q)
p(P14, P11 | P10) = p(P14 | P10) p(P11 | P10)
p(P11 | P10) = Σ_{P15} p(P11 | P10, P15) p(P15 | P10) = Σ_{P15} p(P11 | P10, P15) p(P15)
Evidence Above and Below

p(Q | ε) = p(Q | ε⁺, ε⁻) = p(ε⁻ | Q, ε⁺) p(Q | ε⁺) / p(ε⁻ | ε⁺)
         = k₂ p(ε⁻ | Q) p(Q | ε⁺)   (d-separation)
Ex. p(Q | {P5, P4}, {P12, P13, P14, P11}), with evidence ε⁺ = {P5, P4} above Q and ε⁻ = {P12, P13, P14, P11} below Q.
A Numerical Example (1/2)

p(Q | U) = k p(U | Q) p(Q)
p(P | Q) = p(P | R, Q) p(R) + p(P | ¬R, Q) p(¬R) = 0.95 × 0.01 + 0.80 × 0.99 ≈ 0.80
p(P | ¬Q) = p(P | R, ¬Q) p(R) + p(P | ¬R, ¬Q) p(¬R) = 0.90 × 0.01 + 0.01 × 0.99 ≈ 0.019
A Numerical Example (2/2)

p(U | Q) = p(U | P) p(P | Q) + p(U | ¬P) p(¬P | Q) = 0.7 × 0.80 + 0.2 × 0.20 = 0.60
p(U | ¬Q) = p(U | P) p(P | ¬Q) + p(U | ¬P) p(¬P | ¬Q) = 0.7 × 0.019 + 0.2 × 0.98 ≈ 0.21
p(Q | U) = k × 0.60 × 0.05 = 0.03k
p(¬Q | U) = k × 0.21 × 0.95 = 0.20k
k = 1/(0.03 + 0.20) = 4.35, so p(Q | U) = 4.35 × 0.03 ≈ 0.13

Other techniques: bucket elimination, Monte Carlo methods, clustering.
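The message-passing result above can be cross-checked by brute-force enumeration of the full joint, assuming the polytree structure Q → P ← R, P → U and the CPT values read off the slides (p(Q) = 0.05, p(R) = 0.01, the p(P | Q, R) table, p(U | P) = 0.7, p(U | ¬P) = 0.2):

```python
# Brute-force check of the numerical example: enumerate the joint of the
# assumed polytree R -> P <- Q, P -> U and confirm p(Q | U) ≈ 0.13.
from itertools import product

p_Q, p_R = 0.05, 0.01
p_P = {(True, True): 0.95, (True, False): 0.80,   # p(P | Q, R)
       (False, True): 0.90, (False, False): 0.01}
p_U = {True: 0.7, False: 0.2}                     # p(U | P)

def joint(q, r, p, u):
    pq = p_Q if q else 1 - p_Q
    pr = p_R if r else 1 - p_R
    pp = p_P[(q, r)] if p else 1 - p_P[(q, r)]
    pu = p_U[p] if u else 1 - p_U[p]
    return pq * pr * pp * pu

# Condition on U = True by normalizing over Q.
num = sum(joint(True, r, p, True) for r, p in product([True, False], repeat=2))
den = sum(joint(q, r, p, True) for q, r, p in product([True, False], repeat=3))
print(round(num / den, 2))  # 0.13
```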
Additional Readings (1/5)

[Feller 1968] Probability theory
[Goldszmidt, Morris & Pearl 1990] Nonmonotonic inference through probabilistic methods
[Pearl 1982a; Kim & Pearl 1983] Message-passing algorithm
[Russell & Norvig 1995, pp. 447ff] Polytree methods
Additional Readings (2/5)

[Shachter & Kenley 1989] Bayesian networks for continuous random variables
[Wellman 1990] Qualitative networks
[Neapolitan 1990] Probabilistic methods in expert systems
[Henrion 1990] Probabilistic inference in Bayesian networks
Additional Readings (3/5)

[Jensen 1996] Bayesian networks: the HUGIN system
[Neal 1991] Relationships between Bayesian networks and neural networks
[Heckerman 1991; Heckerman & Nathwani 1992] PATHFINDER
[Pradhan et al. 1994] CPCS-BN
Additional Readings (4/5)

[Shortliffe 1976; Buchanan & Shortliffe 1984] MYCIN: uses certainty factors
[Duda, Hart & Nilsson 1987] PROSPECTOR: uses sufficiency and necessity indices
[Zadeh 1975; Zadeh 1978; Elkan 1993] Fuzzy logic and possibility theory
[Dempster 1968; Shafer 1979] Dempster-Shafer combination rules
Additional Readings (5/5)

[Nilsson 1986] Probabilistic logic
[Tversky & Kahneman 1982] Humans are generally inconsistent when reasoning under uncertainty
[Shafer & Pearl 1990] Collected papers on uncertain inference
Proceedings & journals: Uncertainty in Artificial Intelligence (UAI); International Journal of Approximate Reasoning