Dutch books and epistemic events

Jan-Willem Romeijn

Psychological Methods

University of Amsterdam

ILLC 2005

Interfacing Probabilistic and Epistemic Update

- 2 -

Outline

Updating by conditioning

Violations of conditioning

External shocks to the probability

Meaning shifts in epistemic updates

A Bayesian model of epistemic updates

No-representation theorem

Concluding remarks

- 3 -

Updating by conditioning

probability assignment p

conditioning on events

If probability theory is seen as a logic, updating plays the role of a deductive inference rule.

Updating by conditioning is a consistency constraint for incorporating new facts in a probability assignment.

events A, B, C, ...

probabilistic conclusions p(· | ABC...)

- 4 -

Muddy Venn diagrams

Conditioning on the fact that A is like zooming in on the probability assignment p within the set of possible worlds A.

[Figure: zooming in from p to p(· | A) within the set A.]

Probability is represented by the size of the rectangles. Apart from normalising the probability of A, no changes are induced by the update operation.
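A minimal Python sketch of this picture (the four-world space, the events and the numbers are invented for illustration): conditioning zooms in on an event and renormalises, and iterating it yields conclusions of the form p(· | AB...).

```python
# Conditioning as zooming in and renormalising.
# The four-world space and the numbers are invented for illustration.

worlds = ["w1", "w2", "w3", "w4"]
p = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}   # prior probability assignment

A = {"w1", "w2", "w3"}   # hypothetical extension of event A
B = {"w2", "w3", "w4"}   # hypothetical extension of event B

def condition(p, event):
    """Zoom in on `event` and renormalise: p(. | event)."""
    mass = sum(p[w] for w in p if w in event)
    return {w: (p[w] / mass if w in event else 0.0) for w in p}

posterior = condition(condition(p, A), B)   # p(. | AB)
print(posterior)   # {'w1': 0.0, 'w2': 0.5, 'w3': 0.5, 'w4': 0.0}
```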

- 5 -

Violating conditioning

Bayesian conditioning is violated if, in the course of the update, we also change the probabilities within A.

The updated probability is pA(B) < p(B | A). This difference makes the agent vulnerable to a Dutch book.

[Figure: left panel p(· | A), right panel pA(·); the update redistributes probability over the B-cells within A.]
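The slide does not spell the bets out; the sketch below uses the standard Lewis/Teller diachronic Dutch book construction with invented numbers (p(A) = 1/2, p(B | A) = 1/2, and a post-update value pA(B) = 1/4) to show how the difference yields a sure loss.

```python
# Diachronic Dutch book against an agent with p_A(B) < p(B | A).
# Numbers and the bookie's strategy are the standard Lewis/Teller
# construction, added here for illustration only.

pA_prior = 0.5    # p(A)
x = 0.5           # p(B | A), the agent's prior conditional probability
y = 0.25          # p_A(B), the agent's post-update probability, y < x

def agent_payoff(a, b):
    """Agent's total payoff when A has truth value a and B has truth value b."""
    total = 0.0
    # Bet 1: agent buys a conditional bet on B given A at price x
    #        (pays 1 if A and B; price refunded if not A).
    total += -x + (1 if (a and b) else 0) + (x if not a else 0)
    # Bet 2: agent sells a bet paying (x - y) if not A, at its fair price.
    total += (x - y) * (1 - pA_prior) - ((x - y) if not a else 0)
    # Bet 3: offered only after A is learned: agent sells a bet on B at price y.
    if a:
        total += y - (1 if b else 0)
    return total

for a in (True, False):
    for b in (True, False):
        # always -0.125: a sure loss of (x - y) * p(A) in every case
        print(a, b, agent_payoff(a, b))
```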

- 6 -

Rational violations?

In particular cases, violations of conditioning may seem rational.

Violations of the likelihood principle in classical statistics, and model selection problems.

Epistemic updates: incorporating facts about knowledge states.

Can we make sense of such violations from within a Bayesian perspective?

- 7 -

Possible resolution

Violations are understandable if they result from changes in meaning. On learning A we may reinterpret B as B'.

Can we represent such a meaning shift as a Bayesian update, saying that we actually learned A'?

[Figure: does the shift from p(B | A) to p(B' | A) amount to an update from p(B | A) to p(B | A')?]

- 8 -

Probability shocks

Violations of conditioning can be understood as an external shock to the probability assignment p.

[Figure: four worlds, each with prior probability p = 1/4, carrying p' = 3/8, 3/8, 1/8 and 1/8 after the shock; the events A and B are marked as before.]

The events are associated with the same possible worlds, but these worlds are assigned new probabilities p', according to a new constraint.

- 9 -

Restricting the shock

External shocks to the probability assignment may be governed by further formal criteria, such as minimal distance between p and p'.

[Figure: p shifts to a nearby p'.]

Such criteria may be conservative, but they are not consistency constraints.
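A sketch of such a minimally distant shock, assuming the constraint behind the figure on slide 8 is p'(B) = 3/8 + 3/8 = 3/4 for B = {w1, w2}: for a single-event constraint, minimising relative entropy from p simply rescales p inside B and inside its complement.

```python
# External shock governed by minimal relative-entropy distance.
# Worlds and numbers follow the slide-8 figure; the constraint p'(B) = 3/4
# for B = {w1, w2} is an assumption about what the figure depicts.

p = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
B = {"w1", "w2"}
target = 0.75                      # the imposed constraint p'(B) = 3/4

pB = sum(p[w] for w in B)
p_shocked = {
    w: p[w] * (target / pB if w in B else (1 - target) / (1 - pB))
    for w in p
}
print(p_shocked)   # {'w1': 0.375, 'w2': 0.375, 'w3': 0.125, 'w4': 0.125}
```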

- 10 -

Choosing premises

From a logical point of view, the update procedure comes down to choosing new premises.

This is the extra-logical domain of objective Bayesianism: formally constrained prior probabilities.

premise p

events A, B, C, ...

conclusion p(· | ABC...)

premise p'

events A, B, C, ...

conclusion p'(· | ABC...)

- 11 -

Meaning shifts

The update operation can also be seen as a change to the semantics: p(B' | A) < p(B | A).

[Figure: the four worlds keep probability 1/4 each; the event B is reinterpreted as B', with A marked as before.]

The probabilities of possible worlds remain the same, but the update induces an implicit change of the facts involved.
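A minimal sketch of this meaning-shift reading, with invented extensions for A, B and the reinterpreted B': the worlds keep their probabilities, yet the conditional probability drops.

```python
# Meaning shift: world probabilities untouched, only the extension of B changes.
# All extensions below are invented for illustration.

p = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
A       = {"w1", "w2"}
B       = {"w1", "w3"}        # the event as originally understood
B_prime = {"w3", "w4"}        # the event as reinterpreted after learning A

def conditional(p, event, given):
    """Standard conditional probability p(event | given)."""
    return sum(p[w] for w in event & given) / sum(p[w] for w in given)

print(conditional(p, B, A))        # p(B  | A) = 0.5
print(conditional(p, B_prime, A))  # p(B' | A) = 0.0  <  p(B | A)
```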

- 12 -

Epistemic updates

Consider two research groups, 1 and 2, that try to discover which of A, B, or C holds:

[Figure: world-states A, B and C, each with probability 1/3; D1 marks the region in which group 1 is in doubt, D2 the region in which group 2 is in doubt.]

The groups use different methods, delivering doubt or certainty in differing sets of possible worlds.

- 13 -

Conditional probability

According to the standard definition of conditional probability, we have p(D2 | D1) = 1/2:

But is this also the appropriate updated probability?

[Figure: the prior with A, B and C at 1/3 each and the events D1 and D2, next to the result of conditioning on D1, with A and B at 1/2 each.]
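Assuming, as a reading of the diagram, that D1 = {A, B} (group 1 doubts between A and B) and D2 = {B, C} (group 2 doubts between B and C), the standard conditional probability can be checked directly:

```python
# The two-group example in the naive semantics.  The extensions of D1 and D2
# are an assumption about the figure.

p  = {"A": 1/3, "B": 1/3, "C": 1/3}
D1 = {"A", "B"}     # group 1 is in doubt
D2 = {"B", "C"}     # group 2 is in doubt

p_D2_given_D1 = sum(p[w] for w in D1 & D2) / sum(p[w] for w in D1)
print(p_D2_given_D1)   # 0.5, the standard conditional probability
```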

- 14 -

Updated probability

It seems that after an update with D1, the second research group has very little to doubt about:

Updating induces a meaning shift D2 → D'2, and the correct updated probability is p(D'2 | D1) = 0.

[Figure: after conditioning on D1, A and B carry probability 1/2 each; the original D2 is replaced by the shifted event D'2.]

- 15 -

Epistemic events

The meaning shift D2 → D'2 can be understood by including epistemic states in the semantics.

The diagram shows the accessible epistemic states in the world-state B.

[Figure: the world-state B together with the epistemic states of groups 1 and 2 that are accessible from it, and the events D1 and D2.]

- 16 -

External states

After learning that D1, we may exclude world-state C from the state space.

[Figure: the epistemic states of groups 1 and 2 before and after world-state C is excluded from the state space W.]

- 17 -

Epistemic update

But a full update also comprises conditioning on the accessible epistemic states of both research groups.

[Figure: conditioning on the accessible epistemic states of both groups in the reduced state space.]

This latter step brings about the event change D2 → D'2.
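Under the same assumed reading of the figures, the full epistemic update can be sketched by giving each group an information cell in every world-state and pruning those cells once D1 is learned; the shifted doubt event D'2 then comes out empty, matching slide 14.

```python
# Sketch of the full epistemic update: each group's method fixes an
# information cell per world-state; learning D1 prunes world-state C from
# every cell.  The cells below are an assumed reading of the diagrams.

p = {"A": 1/3, "B": 1/3, "C": 1/3}

cell1 = {"A": {"A", "B"}, "B": {"A", "B"}, "C": {"C"}}       # group 1
cell2 = {"A": {"A"},      "B": {"B", "C"}, "C": {"B", "C"}}  # group 2

def doubt(cells):
    """World-states in which a group is in doubt, i.e. its cell is not a singleton."""
    return {w for w, c in cells.items() if len(c) > 1}

D1 = doubt(cell1)                          # {'A', 'B'}
p_D1 = sum(p[w] for w in D1)

# Learning D1 excludes C and shrinks every remaining information cell.
cell2_updated = {w: cell2[w] & D1 for w in D1}
D2_prime = doubt(cell2_updated)            # set(): nothing left to doubt

p_D2prime_given_D1 = sum(p[w] for w in D2_prime) / p_D1
print(p_D2prime_given_D1)                  # 0.0, as on slide 14
```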

- 18 -

Bayesian conditioning

There is no violation of conditioning in the example.

[Figure: the epistemic model for groups 1 and 2.]

It is simply unclear which event we are supposed to update with upon learning that group 1 is in doubt: D1 or D'1.

[Figure: the two candidate events, D1 in the full model and D'1 in the restricted model W; which one is learned?]

- 19 -

Choosing semantics

Many puzzles on the applicability of Bayesian updating can be dealt with by making explicit the exact events we update upon.

We must choose the semantics so as to include all these events. Is that always possible?

[Figure: two four-world diagrams with probability 1/4 each; is the event we update on A, or a reinterpreted event A'?]

- 20 -

Judy Benjamin updates

In updating a probability p to p' by distance minimisation under a partition of constraints γ, we may have

pγ(B) > p(B)

for some B and all γ. Now suppose that we can associate the constraints with a partition of events Gγ:

pγ(·) = p(· | Gγ),  with  ∫ p(Gγ) dγ = 1.
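A sketch of such an update in the usual Judy Benjamin setting (these specifics are the textbook toy example, not taken from the slide): the prior is 1/2, 1/4, 1/4 over Blue, Red-HQ and Red-2nd, and the constraint γ fixes the odds within Red at q : (1 − q). For every q ≠ 1/2 the minimum-relative-entropy posterior gives Blue more than its prior 1/2, which is the pattern pγ(B) > p(B) above.

```python
# Judy-Benjamin-style update by relative-entropy (KL) minimisation,
# computed by brute-force grid search; the prior and the constraint form
# are the standard toy example, used here as an assumption.

from math import log

def mre_posterior_blue(q, steps=20000):
    """Grid-minimise KL(p' || p) over the posterior Red mass r, with the
    within-Red ratio fixed at q : (1 - q); return the posterior of Blue."""
    prior = {"Blue": 0.5, "RedHQ": 0.25, "Red2nd": 0.25}
    best_r, best_kl = None, float("inf")
    for i in range(1, steps):
        r = i / steps                     # candidate posterior probability of Red
        post = {"Blue": 1 - r, "RedHQ": q * r, "Red2nd": (1 - q) * r}
        kl = sum(post[w] * log(post[w] / prior[w]) for w in post if post[w] > 0)
        if kl < best_kl:
            best_r, best_kl = r, kl
    return 1 - best_r

for q in (0.6, 0.75, 0.9):
    # Blue ends up strictly above its prior 1/2 for every q != 1/2
    print(q, round(mre_posterior_blue(q), 3))
```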

- 21 -

No-representation theorem

In Bayesian conditioning on events Aγ from a partition, the prior is always a convex combination of the posteriors:

p(B) = ∫ p(Aγ) p(B | Aγ) dγ.

But because p(B | Gγ) > p(B) for all but one γ, we have

p(B) < ∫ p(Gγ) p(B | Gγ) dγ.

It thus seems that there is no set of events Gγ that can mimic distance minimisation on the constraints γ.
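A small numerical check of the clash, with invented prior and posterior values: if every pγ(B) strictly exceeds p(B), no non-negative weights p(Gγ) summing to one can recover the prior as a mixture of the posteriors.

```python
# Numerical check of the no-representation argument.  The prior and the
# posterior values are invented for illustration.

import random

p_B = 0.4                          # prior probability of B
posteriors = [0.45, 0.55, 0.70]    # p_gamma(B), all strictly above p(B)

random.seed(0)
for _ in range(5):
    raw = [random.random() for _ in posteriors]
    weights = [r / sum(raw) for r in raw]          # candidate p(G_gamma), summing to 1
    mixture = sum(w * q for w, q in zip(weights, posteriors))
    print(round(mixture, 3), mixture > p_B)        # the mixture always exceeds p(B)
```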

- 22 -

In closing

Some considerations for further research:

• There is a large gap between the epistemic puzzles and cases like model selection.

• It is unclear what kind of event is behind violations of the likelihood principle, as in the stopping-rule problem.

• Probabilistic consistency may not be the only virtue if we object to a principled distinction between epistemology and logic.
