Causal Inference and Ambiguous Manipulations Richard Scheines Grant Reaber, Peter Spirtes Carnegie Mellon University
1. Motivation Wanted: Answers to Causal Questions: • Does attending Day Care cause Aggression? • Does watching TV cause obesity? • How can we answer these questions empirically? • When and how can we estimate the size of the effect? • Can we know our estimates are reliable?
Causation & Intervention P(Lung Cancer | Tar-stained teeth = no) ≠ P(Lung Cancer | Tar-stained teeth set= no) Conditioning is not the same as intervening.
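To see the difference concretely, here is a minimal simulation sketch (Python, with made-up numbers and a hypothetical model in which Smoking is a common cause of tar-stained teeth and lung cancer, and teeth have no effect on cancer):

```python
import random

random.seed(0)

def draw(p):
    return random.random() < p

def sample(set_teeth=None):
    # Hypothetical model: Smoking -> Teeth, Smoking -> Lung Cancer.
    smoker = draw(0.3)
    teeth = draw(0.8) if smoker else draw(0.05)
    if set_teeth is not None:  # ideal intervention: override Teeth, ignoring Smoking
        teeth = set_teeth
    cancer = draw(0.2) if smoker else draw(0.01)
    return teeth, cancer

N = 200_000
conditioned = [c for t, c in (sample() for _ in range(N)) if not t]
intervened = [c for t, c in (sample(set_teeth=False) for _ in range(N))]

print(sum(conditioned) / len(conditioned))  # ~0.026: clean teeth are evidence of not smoking
print(sum(intervened) / len(intervened))    # ~0.067: whitening teeth leaves cancer risk unchanged
```

Conditioning selects a subpopulation (mostly non-smokers); intervening changes teeth while leaving the smoking distribution untouched.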
Causal Inference: Experiments Gold Standard: Randomized Clinical Trials - Intervene: randomly assign treatment - Observe response - Estimate P(Response | Treatment assigned)
Causal Inference: Observational Studies Collect a sample on - Potential Causes (X) - Response (Y) - Covariates (potential confounders Z) Estimate P(Y | X, Z) • Highly unreliable • We can estimate sampling variability, but we don’t know how to estimate specification uncertainty from data
2. Progress 1985 – Present • Representing causal structure, and connecting it to probability • Modeling Interventions • Indistinguishability and Discovery Algorithms
Representing Causal Structures Causal Graph G = {V,E} Each edge X → Y represents a direct causal claim: X is a direct cause of Y relative to V
Direct Causation X is a direct cause of Y relative to S, iff ∃ z, x1 ≠ x2 such that P(Y | X set= x1, Z set= z) ≠ P(Y | X set= x2, Z set= z), where Z = S - {X,Y}
Causal Bayes Networks The Joint Distribution Factors According to the Causal Graph, i.e., P(V) = ∏X in V P(X | Immediate Causes of X). For the graph YF ← S → LC:
P(S = 0) = .7, P(S = 1) = .3
P(YF = 0 | S = 0) = .99, P(YF = 1 | S = 0) = .01
P(YF = 0 | S = 1) = .20, P(YF = 1 | S = 1) = .80
P(LC = 0 | S = 0) = .95, P(LC = 1 | S = 0) = .05
P(LC = 0 | S = 1) = .80, P(LC = 1 | S = 1) = .20
P(S, YF, LC) = P(S) P(YF | S) P(LC | S)
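To make the factorization concrete, here is a minimal Python sketch using the conditional probability tables above (reading S, YF, LC as Smoking, Yellow Fingers, and Lung Cancer is our assumption; the sketch is illustrative, not part of the original slides):

```python
from itertools import product

P_S  = {0: 0.7, 1: 0.3}
# CPTs keyed by (child_value, s): P(YF | S) and P(LC | S) from the slide.
P_YF = {(0, 0): 0.99, (1, 0): 0.01, (0, 1): 0.20, (1, 1): 0.80}
P_LC = {(0, 0): 0.95, (1, 0): 0.05, (0, 1): 0.80, (1, 1): 0.20}

def joint(s, yf, lc):
    # P(S, YF, LC) = P(S) * P(YF | S) * P(LC | S)
    return P_S[s] * P_YF[(yf, s)] * P_LC[(lc, s)]

# The eight joint probabilities sum to 1, as a factorization must.
assert abs(sum(joint(*v) for v in product((0, 1), repeat=3)) - 1.0) < 1e-12
```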
Modeling Ideal Interventions: Interventions on the Effect [figure: pre-experimental vs. post-intervention system, Room Temperature → Wearing Sweater, intervening on Wearing Sweater]
Modeling Ideal Interventions: Interventions on the Cause [figure: pre-experimental vs. post-intervention system, Room Temperature → Wearing Sweater, intervening on Room Temperature]
Interventions & Causal Graphs • Model an ideal intervention by adding an “intervention” variable outside the original system • Erase all arrows pointing into the variable intervened upon [figure: pre-intervention graph Exp → Inf → Rash, and the post-intervention graph after intervening to change Inf]
Calculating the Effect of Interventions Pre-manipulation joint distribution: P(Exp, Inf, Rash) = P(Exp) P(Inf | Exp) P(Rash | Inf) Intervention on Inf Post-manipulation joint distribution: P(Exp, Inf, Rash) = P(Exp) P(Inf | I) P(Rash | Inf)
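A sketch of this "graph surgery" in Python may help: intervening on Inf replaces its conditional P(Inf | Exp) with the intervention distribution (here a point mass), while every other factor is reused unchanged. The CPT numbers below are hypothetical placeholders, not from the slides:

```python
P_Exp  = {0: 0.9, 1: 0.1}
P_Inf  = {(0, 0): 0.99, (1, 0): 0.01, (0, 1): 0.40, (1, 1): 0.60}  # (inf, exp)
P_Rash = {(0, 0): 0.95, (1, 0): 0.05, (0, 1): 0.30, (1, 1): 0.70}  # (rash, inf)

def pre(exp, inf, rash):
    # Pre-manipulation: P(Exp) P(Inf | Exp) P(Rash | Inf)
    return P_Exp[exp] * P_Inf[(inf, exp)] * P_Rash[(rash, inf)]

def post(exp, inf, rash, inf_set=1):
    # Post-manipulation: the Exp -> Inf edge is erased, so Inf no longer
    # depends on Exp; P(Inf | I) is a point mass at inf_set.
    return P_Exp[exp] * (1.0 if inf == inf_set else 0.0) * P_Rash[(rash, inf)]

# P(Rash = 1) before vs. after setting Inf = 1:
print(sum(pre(e, i, 1) for e in (0, 1) for i in (0, 1)))   # ~0.095
print(sum(post(e, i, 1) for e in (0, 1) for i in (0, 1)))  # 0.70
```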
Equivalence Class with Latents: PAGs (Partial Ancestral Graphs) • Assumptions: acyclic graphs, latent variables, sample selection bias • Equivalence: independence over measured variables
Causal Inference from Observational Studies: knowing when we know enough to calculate the effect of interventions. The Prediction Algorithm (SGS, 2000).
3. The Ambiguity of Manipulation • Assumptions: • Causal graph known (Cholesterol is a cause of Heart Condition) • No unmeasured common causes • Therefore the manipulated and unmanipulated distributions are the same: P(H | TC = x) = P(H | TC set= x)
The Problem with Predicting the Effects of Acting Problem – the cause is a composite of causes that don’t act uniformly, e.g., Total Blood Cholesterol (TC) = HDL + LDL • The observed distribution over TC is determined by the unobserved joint distribution over HDL and LDL • Ideally intervening on TC does not determine a joint distribution for HDL and LDL
The Problem with Predicting the Effects of Setting TC • P(H | TC set1= x) puts NO constraints on P(H | TC set2= x), where set1 and set2 are two different ways of setting TC to x (e.g., different LDL/HDL combinations) • P(H | TC = x) puts NO constraints on P(H | TC set= x) • Nothing in the data tips us off about our ignorance, i.e., we don’t know that we don’t know.
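A toy numeric sketch (a made-up risk model, for illustration only) shows how two realizations of "TC set= 200" can disagree:

```python
def p_heart_disease(ldl, hdl):
    # Assumed toy model: LDL raises risk, HDL lowers it.
    return min(1.0, max(0.0, 0.002 * ldl - 0.001 * hdl))

set1 = p_heart_disease(ldl=180, hdl=20)   # TC set= 200, mostly LDL
set2 = p_heart_disease(ldl=40, hdl=160)   # TC set= 200, mostly HDL
print(set1, set2)  # ~0.34 vs. 0.0: same TC value, very different effects
```

Any data set recording only TC and heart disease is consistent with both; that is the sense in which we don’t know that we don’t know.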
Possible Ways Out • Causal Graph is Not Known: • Cholesterol does not really cause Heart Condition • Confounders (unmeasured common causes) are present: • LDL and HDL are confounders
Cholesterol is not really a cause of Heart Condition Relative to a set of variables S (and a background), X is a cause of Y iff ∃ x1 ≠ x2 such that P(Y | X set= x1) ≠ P(Y | X set= x2) • By this definition, Total Cholesterol is a cause of Heart Disease
Cholesterol is not really a cause of Heart Condition Is Total Cholesterol a direct cause of Heart Condition relative to {TC, LDL, HDL, HD}? • TC is logically related to LDL and HDL, so manipulating it once LDL and HDL are set is impossible.
LDL, HDL are confounders • No way to manipulate TC without affecting HDL, LDL • HDL, LDL are logically related to TC
Logico-Causal Systems • S: Atomic Variables • independently manipulable • effects of all manipulations are unambiguous • S’: Defined Variables • defined logically from variables in S • For example: • S: LDL, HDL, HD, Disease1, Disease2 • S’: TC
Logico-Causal Systems: Adding Edges S: LDL, HDL, HD, D1, D2 S’: TC [figures: system over S, and system over S ∪ S’] Add TC → HD iff manipulations of TC are unambiguous wrt HD
Logico-Causal Systems: Unambiguous Manipulations For each variable X’ in S’, let Parents(X’) be the set of variables in S that logically determine X’, i.e., X’ = f(Parents(X’)), e.g., TC = LDL + HDL. Inv(x’) = the set of all values p of Parents(X’) s.t. f(p) = x’. A manipulation of a variable X’ in S’ to a value x’ wrt another variable Y is unambiguous iff ∀ p1 ≠ p2 ∈ Inv(x’): P(Y | Parents(X’) set= p1) = P(Y | Parents(X’) set= p2). TC → HD iff all manipulations of TC are unambiguous wrt HD.
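The definition can be checked mechanically. Below is a small sketch (using the toy risk models shown, which are our assumption, not from the slides) that enumerates Inv(x’) over a grid and tests whether every parent setting realizing TC = x’ yields the same probability for HD:

```python
from itertools import product

def inv(tc, grid=range(0, 201, 20)):
    # Inv(tc): all (LDL, HDL) values p with f(p) = LDL + HDL = tc.
    return [(l, h) for l, h in product(grid, grid) if l + h == tc]

def unambiguous(tc, p_hd):
    # Unambiguous iff P(HD | set= p) is the same for every p in Inv(tc).
    risks = {round(p_hd(l, h), 12) for l, h in inv(tc)}
    return len(risks) <= 1

print(unambiguous(200, lambda l, h: 0.001 * (l + h)))        # True: HD depends on the sum only
print(unambiguous(200, lambda l, h: 0.002 * l - 0.001 * h))  # False: the LDL/HDL split matters
```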
Logico-Causal Systems: Removing Edges S: LDL, HDL, HD, D1, D2 S’: TC [figures: system over S, and system over S ∪ S’] Remove LDL → HD iff LDL _||_ HD | TC
Logico-Causal Systems: Faithfulness Faithfulness: independences are entailed by structure, not by special parameter values. Crucial to inference. • If the effect of TC on HD is unambiguous, we get unfaithfulness: LDL _||_ HD | TC, because LDL and TC determine HDL, and similarly, HDL and TC determine LDL.
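A quick numeric check of this unfaithful independence (same toy model as above, assuming HD depends on LDL and HDL only through their sum, i.e., the effect of TC is unambiguous):

```python
# Holding TC = 200 fixed and varying the LDL/HDL split leaves P(HD) unchanged,
# so LDL _||_ HD | TC even though LDL -> HD is an edge in the graph over S.
p_hd = lambda ldl, hdl: 0.001 * (ldl + hdl)
print({p_hd(l, 200 - l) for l in range(0, 201, 50)})  # a single value: {0.2}
```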
Effect on Prediction Algorithm Observed system: TC, HD, D1, D2 Still sound – but less informative
Effect on Prediction Algorithm Observed system: TC, HD, D1, D2, X Not completely sound. No general characterization of when the Prediction Algorithm, suitably modified, is still informative and sound; conjectures, but no proof yet. • Example: if the observed system has no deterministic relations, all orientations due to marginal independence relations are still valid.
Effect on Causal Inference of Ambiguous Manipulations • Experiments, e.g., RCTs: manipulating treatment is • unambiguous → sound • ambiguous → unsound • Observational studies, e.g., Prediction Algorithm: manipulation is • unambiguous → potentially sound • ambiguous → potentially sound
References
• Spirtes, P., Glymour, C., and Scheines, R. (2000). Causation, Prediction, and Search, 2nd Edition. MIT Press.
• Pearl, J. (2000). Causality: Models, Reasoning, and Inference. Cambridge University Press.
• Spirtes, P., Scheines, R., Glymour, C., Richardson, T., and Meek, C. (2004). “Causal Inference,” in Handbook of Quantitative Methodology in the Social Sciences, ed. David Kaplan, Sage Publications, 447-478.
• Spirtes, P., and Scheines, R. (2004). Causal Inference of Ambiguous Manipulations. In Proceedings of the Philosophy of Science Association Meetings, 2002.
• Reaber, Grant (2005). The Theory of Ambiguous Manipulations. Master’s Thesis, Department of Philosophy, Carnegie Mellon University.