REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA


Page 1: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

REASONING WITH CAUSE AND EFFECT

Judea Pearl, Department of Computer Science

UCLA

Page 2: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

• Modeling: Statistical vs. Causal

• Causal Models and Identifiability

• Inference to three types of claims:

1. Effects of potential interventions

2. Claims about attribution (responsibility)

3. Claims about direct and indirect effects

• Actual Causation and Explanation

• Falsifiability and Corroboration

OUTLINE

Page 3: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

TRADITIONAL STATISTICAL INFERENCE PARADIGM

Data → Inference → Q(P) (Aspects of P);  P = joint distribution

e.g., Infer whether customers who bought product A would also buy product B.
Q = P(B | A)

Page 4: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE CAUSAL INFERENCE PARADIGM

Data → Inference → Q(M) (Aspects of M);  M = data-generating model

Some Q(M) cannot be inferred from P.
e.g., Infer whether customers who bought product A would still buy A if we double the price.

Page 5: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

[Diagram: Data, joint distribution, and inferences from passive observations, linked by Probability and Statistics]

Probability and statistics deal with static relations.

Page 6: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

[Diagram: Data, joint distribution, and inferences from passive observations, linked by Probability and Statistics]

Probability and statistics deal with static relations.

Causal analysis deals with changes (dynamics), i.e., what remains invariant when P changes.

• P does not tell us how it ought to change.

e.g., curing symptoms vs. curing diseases; analogy: mechanical deformation.

Page 7: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

[Diagram: Data, joint distribution, and inferences from passive observations, linked by Probability and Statistics]

Probability and statistics deal with static relations.

Causal analysis deals with changes (dynamics):

[Diagram: Causal assumptions + Data / Experiments → Causal Model →]
1. Effects of interventions
2. Causes of effects
3. Explanations

Page 8: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

CAUSAL: Spurious correlation, Randomization, Confounding / Effect, Instrument, Holding constant, Explanatory variables

STATISTICAL: Regression, Association / Independence, “Controlling for” / Conditioning, Odds and risk ratios, Collapsibility

1. Causal and statistical concepts do not mix.


Page 9: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL: Spurious correlation, Randomization, Confounding / Effect, Instrument, Holding constant, Explanatory variables

STATISTICAL: Regression, Association / Independence, “Controlling for” / Conditioning, Odds and risk ratios, Collapsibility

1. Causal and statistical concepts do not mix.


3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

2. No causes in – no causes out (Cartwright, 1989)

causal assumptions + statistical assumptions + data  ⇒  causal conclusions

Page 10: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL: Spurious correlation, Randomization, Confounding / Effect, Instrument, Holding constant, Explanatory variables

STATISTICAL: Regression, Association / Independence, “Controlling for” / Conditioning, Odds and risk ratios, Collapsibility

1. Causal and statistical concepts do not mix.

4. Non-standard mathematics:
   a) Structural equation models (SEM)
   b) Counterfactuals (Neyman-Rubin)
   c) Causal diagrams (Wright, 1920)

3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

2. No causes in – no causes out (Cartwright, 1989)

causal assumptions + statistical assumptions + data  ⇒  causal conclusions

Page 11: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

WHAT'S IN A CAUSAL MODEL?

Oracle that assigns truth value to causal sentences:

Action sentences: B if we do A.

Counterfactuals: B would be different if A were true.

Explanation: B occurred because of A.

Optional: with what probability?

Page 12: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION

[Diagram: a circuit over variables X, Y, Z mapping INPUT to OUTPUT]

Page 13: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND CAUSAL DIAGRAMS

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U.

Page 14: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND CAUSAL DIAGRAMS

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U.

Example (price/demand equations), with background variables U1, U2, I, W and PAQ = {P, I}:
  q = b1 p + d1 i + u1
  p = b2 q + d2 w + u2

Page 15: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U,
(iv) Mx = ⟨U, V, Fx⟩,  X ⊆ V,  x a realization of X,
      where Fx = {fi : Vi ∉ X} ∪ {X = x}
      (replace all functions fi corresponding to X with the constant functions X = x).

CAUSAL MODELS AND MUTILATION

Page 16: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND MUTILATION

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U.

Example (price/demand equations), with background variables U1, U2, I, W:
  q = b1 p + d1 i + u1
  p = b2 q + d2 w + u2

Page 17: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND MUTILATION

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U,
(iv) the mutilation do(P = p0) yields the mutilated model Mp0:
  q = b1 p + d1 i + u1
  p = p0
(the equation for P is replaced by the constant P = p0).
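To make the mutilation operator concrete, here is a minimal Python sketch (not part of the original slides) built around the price/demand equations above; all coefficient and background values are arbitrary illustrative numbers.

```python
# Minimal sketch of a structural causal model with a do() mutilation operator.
# The price/demand equations follow the slide; coefficients are illustrative.

def make_model():
    """Return the model as a dict: variable -> structural function of a value dict."""
    b1, d1 = 0.5, 1.0   # demand equation coefficients (made up)
    b2, d2 = -0.3, 0.8  # price equation coefficients (made up)
    return {
        "q": lambda v: b1 * v["p"] + d1 * v["i"] + v["u1"],
        "p": lambda v: b2 * v["q"] + d2 * v["w"] + v["u2"],
    }

def do(model, var, value):
    """Mutilation: replace the equation for `var` with the constant `value`."""
    mutilated = dict(model)
    mutilated[var] = lambda v: value
    return mutilated

def solve(model, background, order):
    """Solve the equations in a given causal order (works once the model is recursive)."""
    v = dict(background)
    for name in order:
        v[name] = model[name](v)
    return v

# Post-intervention: do(P = p0) cuts the dependence of P on Q,
# making the mutilated model recursive and directly solvable.
u = {"i": 1.0, "w": 2.0, "u1": 0.1, "u2": -0.2}
print(solve(do(make_model(), "p", 3.0), u, order=["p", "q"]))   # M_p with P = p0 = 3.0
```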

Page 18: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, …, Vn} endogenous variables,
(ii) U = {U1, …, Um} background variables,
(iii) F = a set of n functions, fi : V \ Vi ∪ U → Vi,
      vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U,
(iv) Mx = ⟨U, V, Fx⟩,  X ⊆ V,  x a realization of X,
      where Fx = {fi : Vi ∉ X} ∪ {X = x}
      (replace all functions fi corresponding to X with the constant functions X = x).

Definition (Probabilistic Causal Model): ⟨M, P(u)⟩, where P(u) is a probability assignment to the variables in U.

PROBABILISTIC CAUSAL MODELS

Page 19: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND COUNTERFACTUALS

Definition (Potential Response): The sentence “Y would be y (in unit u), had X been x,” denoted Yx(u) = y, is the solution for Y in the mutilated model Mx, with the equations for X replaced by X = x (a “unit-based potential outcome”).

Page 20: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND COUNTERFACTUALS

Definition (Potential Response): The sentence “Y would be y (in unit u), had X been x,” denoted Yx(u) = y, is the solution for Y in the mutilated model Mx, with the equations for X replaced by X = x (a “unit-based potential outcome”).

Joint probabilities of counterfactuals:

  P(Yx = y, Zw = z) = Σ_{u: Yx(u) = y, Zw(u) = z} P(u)

Page 21: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL MODELS AND COUNTERFACTUALS

Definition (Potential Response): The sentence “Y would be y (in unit u), had X been x,” denoted Yx(u) = y, is the solution for Y in the mutilated model Mx, with the equations for X replaced by X = x (a “unit-based potential outcome”).

Joint probabilities of counterfactuals:

  P(Yx = y, Zw = z) = Σ_{u: Yx(u) = y, Zw(u) = z} P(u)

In particular:

  P(y | do(x)) ≜ P(Yx = y) = Σ_{u: Yx(u) = y} P(u)

  P(Yx′ = y′ | x, y) = Σ_{u: Yx′(u) = y′} P(u | x, y)

Page 22: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

3 STEPS TO COMPUTING COUNTERFACTUALS

S5. If the prisoner is dead, he would still be dead had A not shot:  D ⇒ D¬A

[Diagram: U (Court order) → C (Captain) → A, B (Riflemen) → D (Prisoner)]

Abduction: from the evidence D = TRUE, infer U = TRUE.

Page 23: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

3 STEPS TO COMPUTING COUNTERFACTUALS

S5. If the prisoner is dead, he would still be dead had A not shot:  D ⇒ D¬A

[Diagram: U (Court order) → C (Captain) → A, B (Riflemen) → D (Prisoner)]

Abduction:   from the evidence D = TRUE, infer U = TRUE.
Action:      set A = FALSE (mutilate the model).
Prediction:  with U = TRUE and A = FALSE, the model still yields D = TRUE.

Page 24: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

COMPUTING PROBABILITIES OF COUNTERFACTUALS

P(S5). The prisoner is dead. How likely is it that he would be dead had A not shot?  P(D¬A | D) = ?

[Diagram: U (Court order) → C (Captain) → A, B (Riflemen) → D (Prisoner)]

Abduction:   update the prior P(u) to P(u | D).
Action:      set A = FALSE (mutilate the model).
Prediction:  in the mutilated model, using P(u | D), compute P(D¬A | D).
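A small Python sketch (illustrative, not from the slides) of the abduction-action-prediction recipe on the firing-squad model; the prior P(U = 1) = 0.6 is an arbitrary number chosen for the example.

```python
# Sketch of abduction-action-prediction on the firing-squad model
# (U: court order, C: captain, A, B: riflemen, D: death). Prior on U is made up.

P_U = {0: 0.4, 1: 0.6}          # prior over the background variable U

def solve(u, do_A=None):
    """Deterministic structural equations; optionally force A (the action step)."""
    c = u                        # captain signals iff the court ordered execution
    a = c if do_A is None else do_A
    b = c
    return int(a or b)           # prisoner dies if either rifleman shoots

# Abduction: posterior over U given the evidence D = 1.
evidence = {u: P_U[u] for u in P_U if solve(u) == 1}
z = sum(evidence.values())
posterior = {u: p / z for u, p in evidence.items()}

# Action + prediction: set A = 0, propagate, and weight by the posterior.
p_counterfactual = sum(p for u, p in posterior.items() if solve(u, do_A=0) == 1)
print("P(D_{not-A} | D) =", p_counterfactual)   # -> 1.0 in this model
```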

Page 25: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAUSAL INFERENCE MADE EASY (1985-2000)

1. Inference with Nonparametric Structural Equations made possible through Graphical Analysis.

2. Mathematical underpinning of counterfactuals through nonparametric structural equations

3. Graphical-Counterfactuals symbiosis

Page 26: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

IDENTIFIABILITY

Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions.

Q is identifiable relative to A iff

  P(M1) = P(M2)  ⇒  Q(M1) = Q(M2)

for all M1, M2 that satisfy A.

Page 27: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

IDENTIFIABILITY

Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions.

Q is identifiable relative to A iff

  P(M1) = P(M2)  ⇒  Q(M1) = Q(M2)

for all M1, M2 that satisfy A.

In other words, Q can be determined uniquely from the probability distribution P(v) of the endogenous variables, V, and assumptions A.

Page 28: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

IDENTIFIABILITY

Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions.

Q is identifiable relative to A iff

  P(M1) = P(M2)  ⇒  Q(M1) = Q(M2)

for all M1, M2 that satisfy A.

In this talk:
A: Assumptions encoded in the diagram
Q1: P(y | do(x))          Causal effect (= P(Yx = y))
Q2: P(Yx = y | x, y)      Probability of necessity
Q3: E(Y_{x, Z_{x′}})      Direct effect

Page 29: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE

Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as

  P(v1, v2, …, vn) = Πi P(vi | pai)

where pai are the (values of the) parents of Vi in the causal diagram associated with M.

Page 30: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE

Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as

  P(v1, v2, …, vn) = Πi P(vi | pai)

where pai are the (values of the) parents of Vi in the causal diagram associated with M.

Corollary (Truncated Factorization, Manipulation Theorem): The distribution generated by an intervention do(X = x) in a Markovian model M is given by the truncated factorization

  P(v1, v2, …, vn | do(x)) = Π_{i | Vi ∉ X} P(vi | pai), evaluated at X = x.
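A minimal sketch of the truncated factorization on a three-variable Markovian chain X → Z → Y; all probability tables are made-up illustrative numbers.

```python
# Sketch of the truncated factorization for a Markovian chain X -> Z -> Y.
# All probability tables below are invented for illustration only.

from itertools import product

P_x = {0: 0.7, 1: 0.3}                                       # P(x)
P_z_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}     # P(z | x)
P_y_given_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}     # P(y | z)

def pre_intervention(x, z, y):
    """P(x, z, y) = P(x) P(z | x) P(y | z)."""
    return P_x[x] * P_z_given_x[x][z] * P_y_given_z[z][y]

def post_intervention(z, y, x):
    """P(z, y | do(X = x)): drop the factor for the intervened variable X."""
    return P_z_given_x[x][z] * P_y_given_z[z][y]

# The post-intervention distribution still sums to one over (z, y).
print(sum(post_intervention(z, y, x=1) for z, y in product([0, 1], repeat=2)))
print("P(Y=1 | do(X=1)) =", sum(post_intervention(z, 1, x=1) for z in [0, 1]))
```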

Page 31: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

RAMIFICATIONS OF THE FUNDAMENTAL THEOREM

Given P(x, y, z), should we ban smoking?

[Diagrams, pre- and post-intervention: Smoking (X) → Tar in Lungs (Z) → Cancer (Y), with an unobserved U affecting both Smoking and Cancer; in the post-intervention model X is set to x]

Page 32: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

RAMIFICATIONS OF THE FUNDAMENTAL THEOREM

Given P(x, y, z), should we ban smoking?

[Diagrams, pre- and post-intervention, as above]

Pre-intervention:   P(x, y, z) = Σu P(u) P(x | u) P(z | x) P(y | z, u)
Post-intervention:  P(y, z | do(x)) = Σu P(u) P(z | x) P(y | z, u)

Page 33: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

RAMIFICATIONS OF THE FUNDAMENTAL THEOREM

Given P(x, y, z), should we ban smoking?

[Diagrams, pre- and post-intervention, as above]

Pre-intervention:   P(x, y, z) = Σu P(u) P(x | u) P(z | x) P(y | z, u)
Post-intervention:  P(y, z | do(x)) = Σu P(u) P(z | x) P(y | z, u)

To compute P(y, z | do(x)), we must eliminate u (a graphical problem).

Page 34: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE BACK-DOOR CRITERION

Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in GX (the subgraph of G with all arrows emanating from X removed).

[Diagrams: G and GX over variables Z1–Z6, X, Y, with a candidate adjustment set Z]

Page 35: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE BACK-DOOR CRITERION

Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in GX (the subgraph of G with all arrows emanating from X removed).

[Diagrams: G and GX over variables Z1–Z6, X, Y, with a candidate adjustment set Z]

Moreover, P(y | do(x)) = Σz P(y | x, z) P(z)   (“adjusting” for Z)
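A hedged sketch of back-door adjustment on synthetic data in which Z confounds X and Y; the data-generating probabilities are invented solely for illustration.

```python
# Sketch of back-door adjustment on synthetic data where Z confounds X and Y.
# Illustrative model: Z ~ Bern(0.5); X ~ Bern(0.2 + 0.6 Z); Y ~ Bern(0.1 + 0.3 X + 0.4 Z).

import random
random.seed(0)

def sample():
    z = int(random.random() < 0.5)
    x = int(random.random() < 0.2 + 0.6 * z)
    y = int(random.random() < 0.1 + 0.3 * x + 0.4 * z)
    return x, y, z

data = [sample() for _ in range(100_000)]

def p(pred):
    """Empirical probability of an event over the dataset."""
    return sum(pred(*row) for row in data) / len(data)

def p_y1_given(x0, z0):
    num = p(lambda x, y, z: x == x0 and z == z0 and y == 1)
    den = p(lambda x, y, z: x == x0 and z == z0)
    return num / den

# Adjustment formula: sum_z P(y | x, z) P(z)
adjusted = sum(p_y1_given(1, z0) * p(lambda x, y, z: z == z0) for z0 in (0, 1))
naive = p(lambda x, y, z: x == 1 and y == 1) / p(lambda x, y, z: x == 1)

print("naive P(Y=1 | X=1)       =", round(naive, 3))     # biased by confounding
print("adjusted P(Y=1 | do(X=1)) =", round(adjusted, 3))  # ~ 0.1 + 0.3 + 0.4*0.5 = 0.6
```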

Page 36: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

RULES OF CAUSAL CALCULUS

Rule 1 (Ignoring observations):
  P(y | do{x}, z, w) = P(y | do{x}, w)   if (Y ⊥⊥ Z | X, W) in G with all arrows into X deleted.

Rule 2 (Action/observation exchange):
  P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)   if (Y ⊥⊥ Z | X, W) in G with all arrows into X and all arrows out of Z deleted.

Rule 3 (Ignoring actions):
  P(y | do{x}, do{z}, w) = P(y | do{x}, w)   if (Y ⊥⊥ Z | X, W) in G with all arrows into X and into Z(W) deleted, where Z(W) is the set of Z-nodes that are not ancestors of any W-node once the arrows into X are removed.

Page 37: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

DERIVATION IN CAUSAL CALCULUS

Smoking → Tar → Cancer, with Genotype (Unobserved) affecting Smoking and Cancer

P(c | do{s}) = Σt P(c | do{s}, t) P(t | do{s})                    (Probability axioms)
             = Σt P(c | do{s}, do{t}) P(t | do{s})                (Rule 2)
             = Σt P(c | do{s}, do{t}) P(t | s)                    (Rule 2)
             = Σt P(c | do{t}) P(t | s)                           (Rule 3)
             = Σs′ Σt P(c | do{t}, s′) P(s′ | do{t}) P(t | s)     (Probability axioms)
             = Σs′ Σt P(c | t, s′) P(s′ | do{t}) P(t | s)         (Rule 2)
             = Σs′ Σt P(c | t, s′) P(s′) P(t | s)                 (Rule 3)
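The derivation ends in the front-door expression P(c | do{s}) = Σs′ Σt P(c | t, s′) P(s′) P(t | s); the sketch below simply evaluates that expression on made-up probability tables for the smoking/tar/cancer example.

```python
# Sketch: evaluate the front-door expression derived above,
#   P(c | do(s)) = sum_t P(t | s) * sum_s' P(c | t, s') P(s'),
# on invented tables for Smoking (s), Tar (t), Cancer (c).

P_s = {0: 0.5, 1: 0.5}                                        # P(s)
P_t_given_s = {0: {0: 0.95, 1: 0.05}, 1: {0: 0.2, 1: 0.8}}    # P(t | s)
P_c1_given_ts = {                                             # P(c = 1 | t, s)
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.4, (1, 1): 0.6,
}

def p_c1_do_s(s):
    total = 0.0
    for t in (0, 1):
        inner = sum(P_c1_given_ts[(t, s2)] * P_s[s2] for s2 in (0, 1))
        total += P_t_given_s[s][t] * inner
    return total

print("P(c=1 | do(s=1)) =", p_c1_do_s(1))
print("P(c=1 | do(s=0)) =", p_c1_do_s(0))
```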

Page 38: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

OUTLINE

• Modeling: Statistical vs. Causal

• Causal models and identifiability

• Inference to three types of claims:

1. Effects of potential interventions,

2. Claims about attribution (responsibility)

3. Claims about direct and indirect effects

Page 39: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

DETERMINING THE CAUSES OF EFFECTS(The Attribution Problem)

• Your Honor! My client (Mr. A) died BECAUSE he used that drug.

Page 40: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

DETERMINING THE CAUSES OF EFFECTS(The Attribution Problem)

• Your Honor! My client (Mr. A) died BECAUSE he used that drug.

• Court to decide if it is MORE PROBABLE THANNOT that A would be alive BUT FOR the drug!

P(? | A is dead, took the drug) > 0.50

Page 41: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE PROBLEM

Theoretical Problems:

1. What is the meaning of PN(x,y):“Probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.”

Page 42: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE PROBLEM

Theoretical Problems:

1. What is the meaning of PN(x,y):“Probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.”

Answer:

  PN(x, y) ≜ P(Yx′ = y′ | x, y) = P(Yx′ = y′, X = x, Y = y) / P(X = x, Y = y)

Page 43: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE PROBLEM

Theoretical Problems:

1. What is the meaning of PN(x,y):“Probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.”

2. Under what condition can PN(x,y) be learned from statistical data, i.e., observational, experimental and combined.

Page 44: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

WHAT IS INFERABLE FROM EXPERIMENTS?

Simple Experiment:   Q = P(Yx = y | z),  Z nondescendants of X.

Compound Experiment: Q = P(YX(z) = y | z)

Multi-Stage Experiment:etc…

Page 45: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?

• Nonexperimental data: drug usage predicts longer life
• Experimental data: drug has negligible effect on survival

                  Experimental            Nonexperimental
                  do(x)      do(x′)       x         x′
Deaths (y)           16         14         2         28
Survivals (y′)      984        986       998        972
Total             1,000      1,000     1,000      1,000

• Court to decide (given both data): Is it more probable than not that A would be alive but for the drug?

    PN ≜ P(Yx′ = y′ | x, y) > 0.50 ?

• Plaintiff: Mr. A is special.
    1. He actually died.
    2. He used the drug by choice.

Page 46: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

TYPICAL THEOREMS (Tian and Pearl, 2000)

• Bounds given combined nonexperimental and experimental data:

    max{ 0, [P(y) − P(yx′)] / P(x, y) }  ≤  PN  ≤  min{ 1, [P(y′x′) − P(x′, y′)] / P(x, y) }

• Identifiability under monotonicity (combined data):

    PN = [P(y | x) − P(y | x′)] / P(y | x)  +  [P(y | x′) − P(yx′)] / P(x, y)

  (corrected excess risk ratio)
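A small sketch that evaluates the bounds above on the counts in the table on the previous slide, assuming (for illustration only) equal-sized drug-user and non-user groups, i.e., P(x) = P(x′) = 0.5.

```python
# Sketch: evaluate the PN bounds on the counts from the table above,
# assuming (illustratively) P(x) = P(x') = 0.5 in the observational data.

n = 1000
# Experimental arms
P_y_do_x   = 16 / n          # P(y | do(x))
P_y_do_xp  = 14 / n          # P(y_{x'})
P_yp_do_xp = 1 - P_y_do_xp   # P(y'_{x'})
# Nonexperimental (observational) data
P_x = P_xp = 0.5
P_y_given_x  = 2 / n
P_y_given_xp = 28 / n
P_xy   = P_x * P_y_given_x             # P(x, y)
P_xpyp = P_xp * (1 - P_y_given_xp)     # P(x', y')
P_y    = P_x * P_y_given_x + P_xp * P_y_given_xp

lower = max(0.0, (P_y - P_y_do_xp) / P_xy)
upper = min(1.0, (P_yp_do_xp - P_xpyp) / P_xy)
print(f"{lower:.3f} <= PN <= {upper:.3f}")   # -> 1.000 <= PN <= 1.000
```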

Page 47: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

SOLUTION TO THE ATTRIBUTION PROBLEM (Cont)

• WITH PROBABILITY ONE:  P(Yx′ = y′ | x, y) = 1

• From population data to individual case
• Combined data tell more than each study alone

Page 48: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

OUTLINE

• Modeling: Statistical vs. Causal

• Causal models and identifiability

• Inference to three types of claims:

1. Effects of potential interventions,

2. Claims about attribution (responsibility)

3. Claims about direct and indirect effects

Page 49: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

QUESTIONS ADDRESSED

• What is the semantics of direct and indirect effects?

• Can we estimate them from data? Experimental data?

Page 50: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

TOTAL, DIRECT, AND INDIRECT EFFECTS HAVE SIMPLE SEMANTICS IN LINEAR MODELS

[Diagram: X → Z → Y, with coefficients b on X → Z, c on Z → Y, and a on X → Y]

  z = b x + ε1
  y = a x + c z + ε2

  TE ≜ (∂/∂x) E(Y | do(x))          = a + b c
  DE ≜ (∂/∂x) E(Y | do(x), do(z))   = a       (z-independent)
  IE ≜ TE − DE                      = b c
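A quick Monte Carlo check of the linear-model identities TE = a + bc, DE = a, IE = bc; the coefficients and standard-normal disturbances are assumptions made for illustration.

```python
# Numerical check of TE = a + b*c, DE = a, IE = b*c in the linear model
#   z = b*x + e1,  y = a*x + c*z + e2   (coefficients chosen arbitrarily).

import random
random.seed(1)
a, b, c = 0.5, 2.0, 0.7

def mean_y(x, z_forced=None, n=100_000):
    """E[Y] under do(X=x), optionally also holding Z fixed by do(Z=z_forced)."""
    total = 0.0
    for _ in range(n):
        e1, e2 = random.gauss(0, 1), random.gauss(0, 1)
        z = b * x + e1 if z_forced is None else z_forced
        total += a * x + c * z + e2
    return total / n

TE = mean_y(1) - mean_y(0)                                    # total effect
DE = mean_y(1, z_forced=0) - mean_y(0, z_forced=0)            # controlled direct effect
print(f"TE ~ {TE:.2f} (a + b*c = {a + b*c}),  DE ~ {DE:.2f} (a = {a}),  IE ~ {TE - DE:.2f} (b*c = {b*c})")
```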

Page 51: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

SEMANTICS BECOMES NONTRIVIAL IN NONLINEAR MODELS
(even when the model is completely specified)

[Diagram: X → Z → Y]

  z = f(x, ε1)
  y = g(x, z, ε2)

  TE ≜ (∂/∂x) E(Y | do(x))
  DE ≜ (∂/∂x) E(Y | do(x), do(z))   Dependent on z?
  IE ≜ ???                          Void of operational meaning?

Page 52: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE OPERATIONAL MEANING OF DIRECT EFFECTS

[Diagram: X → Z → Y]

  z = f(x, ε1)
  y = g(x, z, ε2)

“Natural” Direct Effect of X on Y: the expected change in Y per unit change of X, when we keep Z constant at whatever value it attained before the change:

  NDE ≜ E[ Y_{x1, Z_{x0}} − Y_{x0} ]

In linear models, NDE = Controlled Direct Effect.
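A sketch estimating the natural direct effect E[Y_{x1, Z_{x0}} − Y_{x0}] by Monte Carlo in a fully specified nonlinear model; the functional forms and noise are made up for illustration.

```python
# Sketch: estimate the natural direct effect NDE = E[ Y_{x1, Z_{x0}} - Y_{x0} ]
# in a fully specified nonlinear model (all functional forms are illustrative).

import random
random.seed(2)

def f_z(x, e1):            # z = f(x, e1)
    return 1.0 if x + e1 > 0.5 else 0.0

def g_y(x, z, e2):         # y = g(x, z, e2)
    return x * (1 + z) + 0.5 * z + e2

def estimate_nde(x0, x1, n=100_000):
    total = 0.0
    for _ in range(n):
        e1, e2 = random.gauss(0, 1), random.gauss(0, 1)
        z_x0   = f_z(x0, e1)                  # mediator value under the baseline x0
        y_nat  = g_y(x1, z_x0, e2)            # X set to x1 but Z held at Z_{x0}
        y_base = g_y(x0, f_z(x0, e1), e2)     # Y_{x0}
        total += y_nat - y_base
    return total / n

print("NDE(x0=0 -> x1=1) ~", round(estimate_nde(0, 1), 3))
```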

Page 53: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

POLICY IMPLICATIONS (Who cares?)

[Diagram: X (Gender) → Z (Qualification) → Y (Hiring), with a direct arrow X → Y; the indirect path through Z is to be ignored]

What is the direct effect of X on Y?
The effect of Gender on Hiring if sex discrimination is eliminated.

Page 54: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE OPERATIONAL MEANING OF INDIRECT EFFECTS

[Diagram: X → Z → Y]

  z = f(x, ε1)
  y = g(x, z, ε2)

“Natural” Indirect Effect of X on Y: the expected change in Y when we keep X constant, say at x0, and let Z change to whatever value it would have attained under a unit change in X:

  NIE ≜ E[ Y_{x0, Z_{x1}} − Y_{x0} ]

In linear models, NIE = TE − DE.
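The same machinery estimates the natural indirect effect E[Y_{x0, Z_{x1}} − Y_{x0}]; the sketch below reuses the illustrative nonlinear model from the direct-effect sketch.

```python
# Sketch: estimate the natural indirect effect NIE = E[ Y_{x0, Z_{x1}} - Y_{x0} ]
# by Monte Carlo in the same illustrative nonlinear model as before.

import random
random.seed(3)

def f_z(x, e1):
    return 1.0 if x + e1 > 0.5 else 0.0

def g_y(x, z, e2):
    return x * (1 + z) + 0.5 * z + e2

def estimate_nie(x0, x1, n=100_000):
    total = 0.0
    for _ in range(n):
        e1, e2 = random.gauss(0, 1), random.gauss(0, 1)
        z_x1       = f_z(x1, e1)                  # mediator value it would attain under x1
        y_switched = g_y(x0, z_x1, e2)            # X held at x0, Z responding to x1
        y_base     = g_y(x0, f_z(x0, e1), e2)     # Y_{x0}
        total += y_switched - y_base
    return total / n

print("NIE(x0=0 -> x1=1) ~", round(estimate_nie(0, 1), 3))
```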

Page 55: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

``The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of different race (age, sex, religion, national origin etc.) and everything else had been the same’’

[Carson versus Bethlehem Steel Corp. (70 FEP Cases 921, 7th Cir. (1996))]

x = male, x′ = female;  y = hire, y′ = not hire;  z = applicant’s qualifications

LEGAL DEFINITIONS TAKE THE NATURAL CONCEPTION

(FORMALIZING DISCRIMINATION)

NO DIRECT EFFECT:  Y_{x′, Zx} = Yx  and  Y_{x, Zx′} = Yx′

Page 56: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS

Consider the quantity

  Q = E_u [ Y_{x, Z_{x*}(u)}(u) ]

Given ⟨M, P(u)⟩, Q is well defined:

  given u, Z_{x*}(u) is the solution for Z in M_{x*}, call it z;
  Y_{x, Z_{x*}(u)}(u) is then the solution for Y in M_{xz}.

Can Q be estimated from data?  (experimental / nonexperimental)

Page 57: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

ANSWERS TO QUESTIONS

• Graphical conditions for estimability from experimental / nonexperimental data.

• Graphical conditions hold in Markovian models

Page 58: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

ANSWERS TO QUESTIONS

• Graphical conditions for estimability from experimental / nonexperimental data.

• Useful in answering a new type of policy question, involving mechanism blocking instead of variable fixing.

• Graphical conditions hold in Markovian models

Page 59: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

THE OVERRIDING THEME

1. Define Q(M) as a counterfactual expression.
2. Determine conditions for the reduction

     Q(M) ⇒ Q(P(M))   or   Q(M) ⇒ Q(P_exp(M)).

3. If the reduction is feasible, Q is inferable.

• Demonstrated on three types of queries:

  Q1: P(y | do(x))          Causal effect (= P(Yx = y))
  Q2: P(Yx = y | x, y)      Probability of necessity
  Q3: E(Y_{x, Z_{x′}})      Direct effect

Page 60: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

ACTUAL CAUSATION AND THE COUNTERFACTUAL TEST

"We may define a cause to be an object followed byanother,..., where, if the first object had not been, thesecond never had existed."

Hume, Enquiry, 1748

Lewis (1973): "x CAUSED y " if x and y are true, and y is false in the closest non-x-world.

Structural interpretation:
  (i) X(u) = x;
  (ii) Y(u) = y;
  (iii) Yx′(u) ≠ y for x′ ≠ x.

Page 61: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

PROBLEM WITH THE COUNTERFACTUAL DEFINITION

The back-up (W) shoots iff the captain (X) does not shoot at 12:00 noon.

[Diagram: X (Captain) and W (Back-up) both affect Y (Prisoner); X also determines W]

Page 62: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

PROBLEM WITH THE COUNTERFACTUAL DEFINITION

The back-up (W) shoots iff the captain (X) does not shoot at 12:00 noon.

[Diagram: X (Captain) and W (Back-up) both affect Y (Prisoner); X also determines W]

Scenario: Captain shot before noon; prisoner is dead.
  X = 1, W = 0, Y = 1

Page 63: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

PROBLEM WITH THE COUNTERFACTUAL DEFINITION

The back-up (W) shoots iff the captain (X) does not shoot at 12:00 noon.

[Diagram: X (Captain) and W (Back-up) both affect Y (Prisoner); X also determines W]

Scenario: Captain shot before noon; prisoner is dead.
  X = 1, W = 0, Y = 1

Q: Is the Captain’s shot the cause of death?
A: Yes, but the counterfactual test fails!

Intuition: the back-up might fall asleep – a structural contingency.

Page 64: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

SELECTED STRUCTURAL CONTINGENCIES AND SUSTENANCE

x sustains y against W iff:
  (i) X(u) = x;
  (ii) Y(u) = y;
  (iii) Yxw(u) = y for all w; and
  (iv) Yx′w′(u) ≠ y for some x′ ≠ x and some w′.

[Diagrams: the actual scenario X = 1, W = w = 0, Y = 1, and the contingency X = 0, W = w = 0, Y = 0]

Page 65: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

Definition: The explanatory power of a proposition X=x relative to an observed event Y=y is given by P(K{x,y}|x), the pre-discovery probability of the set of contexts K in which x is the actual cause of y.

EXPLANATORY POWER (Halpern and Pearl, 2001)

[Diagram: Oxygen and Match jointly (AND) cause FIRE]

  K_{O,F} = K_{M,F} = K
  EP(O) = P(K | O) << 1        EP(M) = P(K | M) ≈ 1

Page 66: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CORRECTNESS and CORROBORATION

Data D corroborates structure S if S is (i) falsifiable and (ii) compatible with D.

Falsifiability: P*(S) ⊂ P*

[Diagram: the set P*(S) of distributions compatible with S, inside P*, together with the data D; the gap is the set of constraints implied by S]

Types of constraints:
  1. conditional independencies
  2. inequalities (for restricted domains)
  3. functional

e.g., for a structure over w, x, y, z, a functional constraint of the form

  Σx P(z | w, x, y) P(x | w) = f(y, z)     (independent of w)

Page 67: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

e.g., an un-corroborated structure in which a is identifiable:

  x → y (coefficient a):                a = rYX
  x → y with correlated errors rs:      a = rYX − rs

Intuitively, the claim a = rYX is not corroborated because the assumptions that entail the claim are not falsifiable, i.e., no data falsifies the assumption rs = 0.

Page 68: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

A corroborated structure can imply uncorroborated claims.

e.g., [Diagrams: x → y and x → y → z, each carrying the coefficient a on x → y]

Page 69: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

Some claims can be more corroborated than others.

e.g., [Diagrams: the chain x → y → z with coefficients a (x → y) and b (y → z), and the reduced structure x → y with coefficient a]

b = rZY is corroborated because the assumptions needed for entailing this claim constrain the data by

  rZX = rYX · rZY

Page 70: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

Definition: An identifiable claim C is corroborated by data if the union of all minimal sets of assumptions sufficient for identifying C is corroborated by the data.

Some claims can be more corroborated than others.

e.g., [Diagrams: structures over x, y, z carrying the claims a and b]

Page 71: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

GRAPHICAL CRITERION FOR CORROBORATED CLAIMS

Theorem: An identifiable claim C is corroborated by data if the intersection of all maximal supergraphs sufficient for identifying C is corroborated by the data.

e.g., [Diagrams: structures over x, y, z carrying the claims a and b]

Page 72: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

GRAPHICAL CRITERION FOR CORROBORATED CLAIMS

Theorem: An identifiable claim C is corroborated by data if the intersection of all maximal supergraphs sufficient for identifying C is corroborated by the data.

Maximal supergraphs: [Diagrams: several supergraphs over x, y, z containing the claims a and b]

Intersection: [Diagram: x → y]

Page 73: REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

CONCLUSIONS

Structural-model semantics, enriched with logic and graphs, leads to formal interpretation and practical assessment of a wide variety of causal and counterfactual relationships.