CAUSAL INFERENCE AS A MACHINE LEARNING EXERCISE


Judea Pearl
Computer Science and Statistics, UCLA
www.cs.ucla.edu/~judea/

OUTLINE

• Learning: Statistical vs. Causal concepts

• Causal models and identifiability

• Learnability of three types of causal queries:

1. Effects of potential interventions,

2. Queries about attribution (responsibility)

3. Queries about direct and indirect effects

TRADITIONAL MACHINE LEARNING PARADIGM

Data  →(Learning)→  P (Joint Distribution)  →  Q(P) (Aspects of P)

e.g., Learn whether customers who bought product A would also buy product B:  Q = P(B|A)
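For concreteness, here is a minimal Python sketch of this paradigm (the transaction records and field names are hypothetical, not from the talk): Q = P(B|A) is estimated purely by counting in the observed data, with no model of how the data were generated.

# Hypothetical purchase records: did the customer buy product A? product B?
transactions = [
    {"A": True,  "B": True},
    {"A": True,  "B": False},
    {"A": False, "B": True},
    {"A": True,  "B": True},
    {"A": False, "B": False},
]

# Q = P(B | A): among customers who bought A, the fraction who also bought B.
bought_a = [t for t in transactions if t["A"]]
q = sum(t["B"] for t in bought_a) / len(bought_a)
print(f"Estimated P(B | A) = {q:.2f}")   # 2 of the 3 A-buyers also bought B -> 0.67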

THE CAUSAL ANALYSIS PARADIGM

Data  →(Learning)→  M (Data-generating Model)  →  Q(M) (Aspects of M)

Some Q(M) cannot be inferred from P.
e.g., Learn whether customers who bought product A would still buy A if we double the price.

• Data-mining vs. knowledge mining


THE SECRETS OF CAUSAL MODELS

Causal Model = Data-generating model satisfying:
1. Modularity (Symbol-mechanism correspondence)
2. Uniqueness (Variable-mechanism correspondence)

Causal Model                          Joint Distribution
  PR = f(CP, PS, ε1)                  P(PR, PS, CP, ME, QS)
  QS = g(ME, PR, ε2)                  P(ε1, ε2)

[Diagram: Cost Projection (CP), Previous Sale (PS), and Others (ε1) determine Price (PR) through f; Marketing (ME), Price (PR), and Others (ε2) determine Quantity Sold (QS) through g.]

Q1: P(QS | PR=2)      computable from P (and M)
Q2: P(QS | do(PR=2))  computable from M (not P)
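To make the contrast between Q1 and Q2 concrete, the sketch below simulates a toy structural version of this model. Only the structure PR = f(CP, PS, ε1), QS = g(ME, PR, ε2) is taken from the slide; the linear functional forms, the parameters, and the shared background factor u that correlates the two disturbances are invented for illustration (and the quantities are reported as expectations for brevity). Conditioning on PR = 2 in the observational samples (Q1) then differs from forcing PR = 2 by intervention (Q2), which requires rerunning the model with the price equation replaced.

import random

random.seed(0)

def simulate(n=100_000, do_price=None):
    """Sample (PR, QS) from a toy SCM with structure PR = f(CP, PS, e1), QS = g(ME, PR, e2).
    If do_price is given, the price equation is replaced by the constant PR = do_price
    (the mutilated model); all other mechanisms stay intact (modularity)."""
    samples = []
    for _ in range(n):
        u  = random.gauss(0, 1)            # illustrative common background factor
        cp = random.gauss(0, 1)            # cost projection
        ps = random.gauss(0, 1)            # previous sale
        me = random.gauss(0, 1)            # marketing
        e1 = u + random.gauss(0, 0.5)      # "others" affecting price
        e2 = u + random.gauss(0, 0.5)      # "others" affecting quantity sold
        pr = do_price if do_price is not None else 1.0 + 0.5 * cp + 0.5 * ps + e1
        qs = 10.0 - 1.5 * pr + 1.0 * me + e2
        samples.append((pr, qs))
    return samples

obs = simulate()
# Q1: E[QS | PR ~ 2], estimable from the observational distribution P alone.
near_2 = [qs for pr, qs in obs if abs(pr - 2.0) < 0.1]
print("E[QS | PR = 2]     ~", round(sum(near_2) / len(near_2), 2))

# Q2: E[QS | do(PR = 2)], requires the model M: rerun it with PR forced to 2.
intv = simulate(do_price=2.0)
print("E[QS | do(PR = 2)] ~", round(sum(qs for _, qs in intv) / len(intv), 2))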

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

Probability and statistics deal with static relations:
  Data  →  joint distribution  →  predictions/inferences from passive observations

Causal analysis deals with changes (dynamics), i.e., what remains invariant when P changes:
  Data + causal assumptions (causal model, experiments)  →
    1. Effects of interventions
    2. Causes of effects
    3. Explanations

• P does not tell us how it ought to change.
  e.g., curing symptoms vs. curing diseases
  e.g., analogy: mechanical deformation

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

CAUSAL concepts:       Spurious correlation, Randomization, Confounding / Effect, Instrument, Holding constant, Explanatory variables
STATISTICAL concepts:  Regression, Association / Independence, "Controlling for" / Conditioning, Odds and risk ratios, Collapsibility, Propensity score

1. Causal and statistical concepts do not mix.
2. No causes in – no causes out (Cartwright, 1989):
     statistical assumptions + data + causal assumptions  ⇒  causal conclusions
3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.
4. Non-standard mathematics:
   a) Structural equation models (SEM)
   b) Counterfactuals (Neyman-Rubin)
   c) Causal Diagrams (Wright, 1920)

WHAT'S IN A CAUSAL MODEL?

Oracle that assigns truth values to causal sentences:
  Action sentences:  B if we do A.
  Counterfactuals:   B would be different if A were true.
  Explanation:       B occurred because of A.

Optional: with what probability?

FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION

[Diagram: a logic-circuit-style model with INPUT and OUTPUT variables X, Y, Z.]

CAUSAL MODELS AND CAUSAL DIAGRAMS

Definition: A causal model is a 3-tuple
  M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
  (i)   V = {V1, ..., Vn} endogenous variables,
  (ii)  U = {U1, ..., Um} background variables,
  (iii) F = set of n functions, fi : V \ Vi × U → Vi,
        vi = fi(pai, ui),   PAi ⊆ V \ Vi,   Ui ⊆ U.

Example (price-demand model):
  q = b1 p + d1 i + u1
  p = b2 q + d2 w + u2
[Diagram: I and U1 point into Q, W and U2 point into P, and Q and P point into each other; PAQ = {I, P}.]

CAUSAL MODELS AND MUTILATION

Definition: A causal model is a 3-tuple
  M = ⟨V, U, F⟩
with a mutilation operator do(x): M → Mx, where:
  (i)   V = {V1, ..., Vn} endogenous variables,
  (ii)  U = {U1, ..., Um} background variables,
  (iii) F = set of n functions, fi : V \ Vi × U → Vi,
        vi = fi(pai, ui),   PAi ⊆ V \ Vi,   Ui ⊆ U,
  (iv)  Mx = ⟨U, V, Fx⟩, X ⊆ V, x ∈ X,
        where Fx = {fi : Vi ∉ X} ∪ {X = x}
        (Replace all functions fi corresponding to X with the constant functions X = x).

Example: the mutilated model Mp0 replaces the price equation with the constant P = p0:
  q = b1 p + d1 i + u1
  p = p0

PROBABILISTIC CAUSAL MODELS

Definition (Probabilistic Causal Model): ⟨M, P(u)⟩, where P(u) is a probability assignment to the variables in U.

CAUSAL MODELS AND COUNTERFACTUALS

Definition (Potential Response): The sentence "Y would be y (in unit u), had X been x," denoted Yx(u) = y, is the solution for Y in the mutilated model Mx, with the equations for X replaced by X = x. ("unit-based potential outcome")

Joint probabilities of counterfactuals:
  P(Yx = y, Zw = z) = Σ_{u: Yx(u) = y, Zw(u) = z} P(u)

In particular:
  P(y | do(x)) = P(Yx = y) = Σ_{u: Yx(u) = y} P(u)
  PN = P(Y_{x'} = y' | x, y) = Σ_{u: Y_{x'}(u) = y'} P(u | x, y)
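These summations can be carried out mechanically for a small discrete model. The sketch below uses an invented two-variable binary model (it is not the talk's example): for each background state u it solves the mutilated model Mx and then weights by P(u), or by P(u | x, y) for PN, exactly as in the formulas above.

from itertools import product

# Toy binary model (illustrative): X = U1,  Y = X AND U2,
# with P(u) uniform over the four background states (u1, u2).
def f_x(u1):        return u1
def f_y(x, u2):     return x & u2

P_u = {(u1, u2): 0.25 for u1, u2 in product([0, 1], repeat=2)}

def Y_x(x, u):
    """Potential response Yx(u): solve for Y in the mutilated model Mx,
    i.e., with the equation for X replaced by the constant X = x."""
    return f_y(x, u[1])

def P_Yx(x, y):
    # P(Yx = y) = sum of P(u) over {u : Yx(u) = y}
    return sum(p for u, p in P_u.items() if Y_x(x, u) == y)

print("P(Y_1 = 1) =", P_Yx(1, 1))                       # 0.5

def PN(x, y):
    # PN = P(Y_{x'} = y' | x, y): keep only units consistent with the evidence
    # X = x, Y = y, renormalize, then evaluate the counterfactual Y_{x'} on them.
    x_, y_ = 1 - x, 1 - y
    evidence = {u: p for u, p in P_u.items()
                if f_x(u[0]) == x and f_y(f_x(u[0]), u[1]) == y}
    total = sum(evidence.values())
    return sum(p for u, p in evidence.items() if Y_x(x_, u) == y_) / total

print("PN(x=1, y=1) =", PN(1, 1))                       # 1.0 in this toy model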

CAUSAL INFERENCE MADE EASY (1985-2000)

1. Inference with Nonparametric Structural Equations made possible through Graphical Analysis.

2. Mathematical underpinning of counterfactuals through nonparametric structural equations.

3. Graphical-Counterfactuals symbiosis.

NON-PARAMETRIC STRUCTURAL MODELS

Given P(x,y,z), should we ban smoking?

[Diagram: X (Smoking) → Z (Tar in Lungs) → Y (Cancer); U1 affects both X and Y, U2 affects Z, U3 affects Y.]

Linear Analysis            Nonparametric Analysis
  x = u1,                    x = f1(u1),
  z = x + u2,                z = f2(x, u2),
  y = z + u1 + u3.           y = f3(z, u1, u3).

Find: P(y | do(x))

Given P(x,y,z), should we ban smoking?

Find: P(y | do(x)) = P(Y = y) in the mutilated model
  x = const.,
  z = f2(x, u2),
  y = f3(z, u1, u3).

[Diagram: the same graph with X set to x and the arrow into X removed: X = x (Smoking) → Z (Tar in Lungs) → Y (Cancer), with U1, U2, U3 as before.]

LEARNING THE EFFECTS OF ACTIONS

IDENTIFIABILITY

Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions.

Q is identifiable relative to A iff
  P(M1) = P(M2)  ⇒  Q(M1) = Q(M2)
for all M1, M2 that satisfy A.

In other words, Q can be determined uniquely from the probability distribution P(v) of the endogenous variables, V, and assumptions A.

In this talk:
  A:  Assumptions encoded in the diagram
  Q1: P(y | do(x))          Causal Effect  (= P(Yx = y))
  Q2: P(Yx = y | x, y)      Probability of necessity
  Q3: E(Y_{x, Z_{x'}})      Direct Effect

THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE

Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as
  P(v1, v2, ..., vn) = Π_i P(vi | pai)
where pai are the (values of the) parents of Vi in the causal diagram associated with M.

Corollary (Truncated Factorization, Manipulation Theorem): The distribution generated by an intervention do(X = x) (in a Markovian model M) is given by the truncated factorization
  P(v1, v2, ..., vn | do(x)) = Π_{i: Vi ∉ X} P(vi | pai) |_{X = x}
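A small sketch of the truncated factorization at work. The Markovian model W → X, W → Y, X → Y and all of its conditional probability tables are invented for illustration; only the factorization rule itself comes from the theorem. Dropping the factor P(x | w) and clamping X gives the interventional distribution, which differs here from plain conditioning because W confounds X and Y.

from itertools import product

# Hypothetical CPTs for a Markovian model W -> X, W -> Y, X -> Y (all binary).
# Only P(var = 1 | parents) is stored; the complement gives P(var = 0 | parents).
P_w1 = 0.4
P_x1_given_w = {0: 0.2, 1: 0.8}
P_y1_given_xw = {(0, 0): 0.1, (1, 0): 0.5, (0, 1): 0.4, (1, 1): 0.9}   # key: (x, w)

def pw(w):       return P_w1 if w else 1 - P_w1
def px(x, w):    return P_x1_given_w[w] if x else 1 - P_x1_given_w[w]
def py(y, x, w): return P_y1_given_xw[(x, w)] if y else 1 - P_y1_given_xw[(x, w)]

def p_joint(w, x, y):
    # Causal Markov factorization: P(w, x, y) = P(w) P(x | w) P(y | x, w)
    return pw(w) * px(x, w) * py(y, x, w)

def p_do(x0, w, x, y):
    # Truncated factorization: drop the factor P(x | w) and clamp X = x0.
    return pw(w) * py(y, x, w) if x == x0 else 0.0

p_do_x1 = sum(p_do(1, w, x, 1) for w, x in product([0, 1], repeat=2))
p_obs   = sum(p_joint(w, 1, 1) for w in [0, 1]) / \
          sum(p_joint(w, 1, y) for w, y in product([0, 1], repeat=2))
print("P(Y=1 | do(X=1)) =", round(p_do_x1, 3))   # 0.66: sum_w P(w) P(Y=1 | X=1, w)
print("P(Y=1 | X=1)     =", round(p_obs, 3))     # 0.791: differs, W confounds X and Y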

RAMIFICATIONS OF THE FUNDAMENTAL THEOREM

Given P(x,y,z), should we ban smoking?

[Diagram: X (Smoking) → Z (Tar in Lungs) → Y (Cancer), with U (unobserved) affecting both X and Y; in the post-intervention graph X is set to x.]

Pre-intervention:   P(x, y, z) = Σ_u P(u) P(x | u) P(z | x) P(y | z, u)
Post-intervention:  P(y, z | do(x)) = Σ_u P(u) P(z | x) P(y | z, u)

To compute P(y, z | do(x)), we must eliminate u (a graphical problem).

THE BACK-DOOR CRITERION

Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in Gx (the subgraph of G obtained by deleting all arrows emanating from X).

[Diagram: a graph G over {Z1, ..., Z6, X, Y} and the corresponding subgraph Gx, with a back-door admissible set Z highlighted.]

Moreover,
  P(y | do(x)) = Σ_z P(y | x, z) P(z)    ("adjusting" for Z)
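A minimal sketch of the adjustment formula applied to a finite sample (the records and the assumption that the single covariate Z is back-door admissible are hypothetical): estimate P(y | x, z) and P(z) by counting and combine them as above.

# Hypothetical observational records (x, z, y); Z is assumed back-door admissible.
data = [(1, 0, 1), (1, 0, 0), (1, 1, 1), (1, 1, 1), (0, 0, 0),
        (0, 0, 1), (0, 1, 0), (0, 1, 1), (1, 0, 1), (0, 1, 0)]

def p_y_do_x(records, x, y):
    """Back-door adjustment: P(y | do(x)) = sum_z P(y | x, z) P(z)."""
    n = len(records)
    total = 0.0
    for z in {r[1] for r in records}:
        p_z = sum(1 for r in records if r[1] == z) / n
        cell = [r for r in records if r[0] == x and r[1] == z]
        if not cell:
            continue                     # no support for this (x, z) cell
        p_y_given_xz = sum(1 for r in cell if r[2] == y) / len(cell)
        total += p_y_given_xz * p_z
    return total

print("P(Y=1 | do(X=1)) ~", round(p_y_do_x(data, x=1, y=1), 3))   # 0.833 on this sample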

RULES OF CAUSAL CALCULUS

Rule 1 (Ignoring observations):
  P(y | do{x}, z, w) = P(y | do{x}, w)
  if (Y ⊥ Z | X, W) holds in G_X̄ (the graph with all arrows into X deleted).

Rule 2 (Action/observation exchange):
  P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)
  if (Y ⊥ Z | X, W) holds in G_X̄Z̲ (arrows into X and arrows out of Z deleted).

Rule 3 (Ignoring actions):
  P(y | do{x}, do{z}, w) = P(y | do{x}, w)
  if (Y ⊥ Z | X, W) holds in G_X̄Z̄(W), where Z(W) is the set of Z-nodes that are not ancestors of any W-node in G_X̄.

DERIVATION IN CAUSAL CALCULUS

[Diagram: Smoking → Tar → Cancer, with Genotype (unobserved) affecting both Smoking and Cancer.]

P(c | do{s}) = Σ_t P(c | do{s}, t) P(t | do{s})                     Probability Axioms
             = Σ_t P(c | do{s}, do{t}) P(t | do{s})                 Rule 2
             = Σ_t P(c | do{s}, do{t}) P(t | s)                     Rule 2
             = Σ_t P(c | do{t}) P(t | s)                            Rule 3
             = Σ_{s'} Σ_t P(c | do{t}, s') P(s' | do{t}) P(t | s)   Probability Axioms
             = Σ_{s'} Σ_t P(c | t, s') P(s' | do{t}) P(t | s)       Rule 2
             = Σ_{s'} Σ_t P(c | t, s') P(s') P(t | s)               Rule 3
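The last line is the front-door adjustment formula, P(c | do{s}) = Σ_t P(t | s) Σ_{s'} P(c | t, s') P(s'). A sketch of evaluating it from empirical frequencies (the (smoking, tar, cancer) records below are hypothetical; note that the unobserved genotype never appears in them, which is the whole point):

def front_door(records, s, c):
    """Front-door adjustment: P(c | do{s}) = sum_t P(t | s) * sum_s' P(c | t, s') P(s')."""
    n = len(records)
    s_vals = {r[0] for r in records}
    t_vals = {r[1] for r in records}
    p_s = {sv: sum(1 for r in records if r[0] == sv) / n for sv in s_vals}

    def p_t_given_s(t, sv):
        rows = [r for r in records if r[0] == sv]
        return sum(1 for r in rows if r[1] == t) / len(rows)

    def p_c_given_ts(cv, t, sv):
        rows = [r for r in records if r[1] == t and r[0] == sv]
        return sum(1 for r in rows if r[2] == cv) / len(rows) if rows else 0.0

    return sum(p_t_given_s(t, s) *
               sum(p_c_given_ts(c, t, sv) * p_s[sv] for sv in s_vals)
               for t in t_vals)

# Hypothetical (smoking, tar, cancer) records; the genotype confounder is unobserved.
data = [(1, 1, 1), (1, 1, 0), (1, 0, 0), (0, 0, 0), (0, 0, 1),
        (0, 1, 1), (1, 1, 1), (0, 0, 0), (1, 0, 1), (0, 0, 0)]
print("P(cancer=1 | do(smoking=1)) ~", round(front_door(data, s=1, c=1), 3))   # 0.65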

A RECENT IDENTIFICATION RESULT

Theorem [Tian and Pearl, 2001]: The causal effect P(y|do(x)) is identifiable whenever the ancestral graph of Y contains no confounding path (bidirected arc) between X and any of its children.

[Diagrams (a), (b), (c): example graphs over X, Y, Z1, Z2 illustrating when this condition holds.]

OUTLINE

• Learning: Statistical vs. Causal concepts

• Causal models and identifiability

• Learnability of three types of causal queries:

1. Effects of potential interventions,

2. Queries about attribution (responsibility)

3. Queries about direct and indirect effects

DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)

• Your Honor! My client (Mr. A) died BECAUSE he used that drug.

• Court to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug!

  P(? | A is dead, took the drug) > 0.50

THE PROBLEM

Theoretical Problems:

1. What is the meaning of PN(x,y): "Probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur."

   Answer:
     PN(x, y) = P(Y_{x'} = y' | x, y) = P(Y_{x'} = y', X = x, Y = y) / P(X = x, Y = y)

2. Under what conditions can PN(x,y) be learned from statistical data, i.e., observational, experimental and combined.

WHAT IS LEARNABLE FROM EXPERIMENTS?

Simple Experiment:       Q = P(Yx = y | z), Z nondescendants of X.
Compound Experiment:     Q = P(Y_{X(z)} = y | z)
Multi-Stage Experiment:  etc.

CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?

• Nonexperimental data: drug usage predicts longer life
• Experimental data: drug has negligible effect on survival

                   Experimental              Nonexperimental
                   do(x)       do(x')        x           x'
  Deaths (y)          16          14            2           28
  Survivals (y')     984         986          998          972
                   1,000       1,000        1,000        1,000

• Plaintiff: Mr. A is special.
  1. He actually died
  2. He used the drug by choice

• Court to decide (given both data): Is it more probable than not that A would be alive but for the drug?

  PN = P(Y_{x'} = y' | x, y) > 0.50 ?

TYPICAL THEOREMS (Tian and Pearl, 2000)

• Bounds given combined nonexperimental and experimental data:

  max{0, [P(y) − P(y_{x'})] / P(x,y)}  ≤  PN  ≤  min{1, [P(y'_{x'}) − P(x',y')] / P(x,y)}

• Identifiability under monotonicity (combined data):

  PN = [P(y|x) − P(y|x')] / P(y|x)  +  [P(y|x') − P(y_{x'})] / P(x,y)

  (corrected Excess-Risk-Ratio)
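Plugging the frequency table above into these formulas (as reconstructed here) gives the court a concrete answer; a short sketch of that arithmetic, with x = drug taken and y = death:

# Experimental arms (1,000 subjects each) and nonexperimental groups (1,000 each).
P_y_do_x   = 16 / 1000      # P(y_x):   deaths under do(x)
P_y_do_x_  = 14 / 1000      # P(y_x'):  deaths under do(x')
P_y__do_x_ = 986 / 1000     # P(y'_x'): survivals under do(x')

P_y_given_x  = 2 / 1000     # observational death rate among drug takers
P_y_given_x_ = 28 / 1000    # observational death rate among non-takers
P_xy   = 2 / 2000           # P(x, y):   took the drug and died
P_x_y_ = 972 / 2000         # P(x', y'): did not take the drug and survived
P_y    = (2 + 28) / 2000    # overall observational death rate

# Bounds from combined experimental and nonexperimental data:
lower = max(0.0, (P_y - P_y_do_x_) / P_xy)
upper = min(1.0, (P_y__do_x_ - P_x_y_) / P_xy)
print(f"{lower:.3f} <= PN <= {upper:.3f}")        # 1.000 <= PN <= 1.000

# Under monotonicity: corrected excess-risk-ratio.
pn = (P_y_given_x - P_y_given_x_) / P_y_given_x + (P_y_given_x_ - P_y_do_x_) / P_xy
print(f"PN (monotonicity) = {pn:.3f}")            # 1.000: more probable than not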

SOLUTION TO THE ATTRIBUTION PROBLEM (Cont)

• WITH PROBABILITY ONE:  PN = P(Y_{x'} = y' | x, y) = 1

• From population data to individual case

• Combined data tell more than each study alone

OUTLINE

• Learning: Statistical vs. Causal concepts

• Causal models and identifiability

• Learnability of three types of causal queries:

1. Effects of potential interventions,

2. Queries about attribution (responsibility)

3. Queries about direct and indirect effects

QUESTIONS ADDRESSED

• What is the semantics of direct and indirect effects?

• Can we estimate them from data? Experimental data?

[Diagram: X → Z → Y and X → Y, with
  z = f(x, ε1)
  y = g(x, z, ε2)]

THE OPERATIONAL MEANING OF DIRECT EFFECTS

"Natural" Direct Effect of X on Y: The expected change in Y per unit change of X, when we keep Z constant at whatever value it attains before the change:

  NDE = E[Y_{x1, Z_{x0}} − Y_{x0}]

In linear models, NDE = Controlled Direct Effect.

[Diagram: X → Z → Y and X → Y, with z = f(x, ε1), y = g(x, z, ε2).]

THE OPERATIONAL MEANING OF INDIRECT EFFECTS

"Natural" Indirect Effect of X on Y: The expected change in Y when we keep X constant, say at x0, and let Z change to whatever value it would have attained under a unit change in X:

  NIE = E[Y_{x0, Z_{x1}} − Y_{x0}]

In linear models, NIE = TE − DE.

LEGAL DEFINITIONS TAKE THE NATURAL CONCEPTION (FORMALIZING DISCRIMINATION)

"The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of a different race (age, sex, religion, national origin etc.) and everything else had been the same."
[Carson versus Bethlehem Steel Corp. (70 FEP Cases 921, 7th Cir. (1996))]

x = male, x' = female
y = hire, y' = not hire
z = applicant's qualifications

NO DIRECT EFFECT:  Y_{x', Z_x} = Y_x,   Y_{x, Z_{x'}} = Y_{x'}

SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS

Consider the quantity
  Q = E_u [ Y_{x, Z_{x*}(u)}(u) ]

Given ⟨M, P(u)⟩, Q is well defined:
  Given u, Z_{x*}(u) is the solution for Z in M_{x*}; call it z.
  Y_{x, Z_{x*}(u)}(u) is then the solution for Y in M_{xz}.

Can Q be estimated from data (experimental / nonexperimental)?

ANSWERS TO QUESTIONS

• Graphical conditions for estimability from experimental / nonexperimental data.

• Graphical conditions hold in Markovian models

IDENTIFICATION IN MARKOVIAN MODELS

[Diagram: X → Z → Y and X → Y (Markovian model).]

DE(x, x*; Y) = E(Y_{x, Z_{x*}}) − E(Y_{x*})
             = Σ_z [E(Y | x, z) − E(Y | x*, z)] P(z | x*)

IE(x, x*; Y) = E(Y_{x*, Z_x}) − E(Y_{x*})
             = Σ_z E(Y | x*, z) [P(z | x) − P(z | x*)]
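A short sketch of these two formulas evaluated from conditional estimates E(Y | x, z) and P(z | x). The numbers below are invented for illustration; only the formulas come from the slide. It compares x = 1 against the baseline x* = 0.

# Hypothetical estimates for a binary mediator Z.
E_y = {(0, 0): 1.0, (0, 1): 3.0, (1, 0): 2.0, (1, 1): 5.0}    # E(Y | x, z), key: (x, z)
P_z = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}               # P(z | x),    key: x -> z

def DE(x, x_star):
    # DE(x, x*; Y) = sum_z [E(Y | x, z) - E(Y | x*, z)] P(z | x*)
    return sum((E_y[(x, z)] - E_y[(x_star, z)]) * P_z[x_star][z] for z in (0, 1))

def IE(x, x_star):
    # IE(x, x*; Y) = sum_z E(Y | x*, z) [P(z | x) - P(z | x*)]
    return sum(E_y[(x_star, z)] * (P_z[x][z] - P_z[x_star][z]) for z in (0, 1))

print("Natural direct effect   DE(1, 0; Y) =", round(DE(1, 0), 3))   # 1.3
print("Natural indirect effect IE(1, 0; Y) =", round(IE(1, 0), 3))   # 1.0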

ANSWERS TO QUESTIONS

• Graphical conditions for estimability from experimental / nonexperimental data.

• Useful in answering new types of policy questions, involving mechanism blocking instead of variable fixing.

• Graphical conditions hold in Markovian models

CONCLUSIONS

• General theme:
  1. Define Q(M) as a counterfactual expression.
  2. Determine conditions for the reduction
       Q(M) → Q(P(M))   or   Q(M) → Q(P_exp(M)).
  3. If reduction is feasible, Q is learnable.

• Demonstrated on three types of queries:
  Q1: P(y | do(x))          Causal Effect  (= P(Yx = y))
  Q2: P(Yx = y | x, y)      Probability of necessity
  Q3: E(Y_{x, Z_{x'}})      Direct Effect
