Is Bayesian Statistics Ready to Become the Standard Methodological Framework of Psychological Research?
John Miyamoto, Department of Psychology, University of Washington, Seattle, WA, USA
Kyoto University, Kyoto, Japan: June 25, 2012
Outline
• What is Bayesian statistics?
• Will Bayesian statistics become the new methodological standard in research psychology departments?
• Illustrate the basic concepts of Bayesian statistical analyses.
Bayes Rule
• Reverend Thomas Bayes, 1702–1761: British Protestant minister & mathematician.
• Bayes Rule is fundamentally important to:
  • Bayesian statistics
  • Bayesian decision theory
  • Bayesian models in psychology
Bayes Rule – Explanation
• The rule relates four quantities: the posterior probability of the hypothesis, the prior probability of the hypothesis, the likelihood of the data, and a normalizing constant.
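For reference, a reconstruction of the equation those labels annotate (the standard statement of Bayes Rule, with H a hypothesis and D the data):

\[
  P(H \mid D) \;=\; \frac{P(D \mid H)\, P(H)}{P(D)}
\]

Here P(H | D) is the posterior probability of the hypothesis, P(H) the prior probability, P(D | H) the likelihood of the data, and P(D) the normalizing constant.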
Bayes Rule (Odds Form)
• Posterior odds = prior odds (base rate) × likelihood ratio (diagnosticity).
• H = a hypothesis, e.g., H = the hypothesis that the patient has cancer.
• ¬H = the negation of the hypothesis, e.g., ¬H = the hypothesis that the patient does not have cancer.
• D = the data, e.g., D = a positive result on a cancer test.
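In symbols, a reconstruction of the odds form the slide describes:

\[
  \underbrace{\frac{P(H \mid D)}{P(\neg H \mid D)}}_{\text{posterior odds}}
  \;=\;
  \underbrace{\frac{P(H)}{P(\neg H)}}_{\text{prior odds}}
  \;\times\;
  \underbrace{\frac{P(D \mid H)}{P(D \mid \neg H)}}_{\text{likelihood ratio}}
\]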
Simple Example of Bayesian Statistical Inference
• We will flip a coin many times. The probability of “heads” (winning) is unknown.
• Statistical model: θ = P(win), θ is unknown; N = number of flips, N is known.
• Data: the number of wins (heads) in N flips.
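A minimal sketch of how this example works out in R (illustrative numbers, not data from the talk): with a uniform Beta(1, 1) prior on θ, observing k heads in N flips gives a Beta(1 + k, 1 + N − k) posterior, which base R can summarize directly.

    # Hypothetical data (illustrative numbers, not from the talk): 14 heads in 20 flips
    N <- 20
    k <- 14

    # With a uniform Beta(1, 1) prior on theta, the posterior is Beta(1 + k, 1 + N - k)
    a.post <- 1 + k
    b.post <- 1 + N - k

    # Posterior mean and a 95% credible interval for theta = P(win)
    a.post / (a.post + b.post)
    qbeta(c(0.025, 0.975), a.post, b.post)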
General Structure of Bayesian Statistical Inference
• More complex models have more parameters and more complex relationships among the parameters, but the ideas are the same.
Why Did 20th-Century Social Science Prefer Classical Statistics over Bayesian Statistics?
• Some argued that Bayesian prior probability distributions were arbitrary and subjective (not objective).
  • Bayesian response: reference priors, chosen to be minimally controversial.
  • Sensitivity analysis: consider the implications of alternative prior distributions.
• Bayesian inference was computationally intractable except in a few simple cases.
  • Bayesian response: development of Markov chain Monte Carlo (MCMC) methods for approximating a posterior distribution.
  • Computing a Bayesian statistical inference is becoming increasingly easy using R, OpenBUGS, and JAGS.
What Is This Lecture About?
• Is Bayesian statistics ready to become the main methodological framework for a research psychology department?
• Disclaimer: I am not a statistician, and not a Bayesian statistician, but I have taught graduate statistics for many years in the UW psychology department.
• What are the requirements for a methodological framework in a psychology department?
What Is Required from the Methodological Framework of a Graduate Psychology Program?
• Rigorous statistical analysis
• Applicable to many types of psychological research data
• Understandable by research psychologists
  • Must require only a moderate level of mathematical expertise
  • Must be computationally accessible to psychologists
• Instructional context
  • Instructors with training in both psychology and statistics
  • Textbooks
  • Available software that is not too expensive
• (Slide annotations mark some of these requirements as addressed in today’s talk, and note of the rest that work has begun: this is our job.)
My Goals for This Talk
• Thesis: We are close to the time when Bayesian statistics can become the main methodological framework in a research psychology department.
• Demonstrate that Bayesian statistics gives intuitive, transparent answers to psychological research questions.
• Demonstrate that modern software (R, OpenBUGS, JAGS) makes it easy to compute Bayesian analyses.
• Discuss briefly the advantages of a Bayesian approach to statistics.
Kahneman & Tversky’s Lawyer/Engineer Problem
• This is a famous example of base rate neglect, but for this lecture it serves as an example of a problem in data analysis.
Lawyer/Engineer Problem (K&T, 1973)
• Subjects are shown a description of Jack.
• Likelihood factor:
  • Typical Engineer Condition: the description sounds like a typical engineer.
  • Typical Lawyer Condition: the description sounds like a typical lawyer.
• Base rate factor:
  • High Base Rate for Engineer (30:70 Condition): Jack's description was drawn at random from a set of 30 descriptions of lawyers and 70 descriptions of engineers.
  • Low Base Rate for Engineer (70:30 Condition): Jack's description was drawn at random from a set of 70 descriptions of lawyers and 30 descriptions of engineers.
Bayesian Analysis of the Lawyer/Engineer Problem
• High Base Rate Condition versus Low Base Rate Condition: the likelihood ratio for the description is the same in both conditions; only the prior odds (the base rates) differ between conditions.
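Spelled out with the odds form of Bayes Rule (Eng = “Jack is an engineer”, D = Jack's description), the likelihood ratio is identical in the two conditions and only the prior odds differ:

\[
  \text{High base rate (30:70):}\qquad
  \frac{P_H(\text{Eng} \mid D)}{P_H(\neg\text{Eng} \mid D)}
  \;=\; \frac{70}{30} \times \frac{P(D \mid \text{Eng})}{P(D \mid \neg\text{Eng})}
\]
\[
  \text{Low base rate (70:30):}\qquad
  \frac{P_L(\text{Eng} \mid D)}{P_L(\neg\text{Eng} \mid D)}
  \;=\; \frac{30}{70} \times \frac{P(D \mid \text{Eng})}{P(D \mid \neg\text{Eng})}
\]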
Experimental Results
• The diagonal line represents complete neglect of the base rate.
• Black dots: conditions where the description favored either engineer or lawyer.
Experimental Results (continued) – two odd points on the graph
• In one condition, subjects were given no description of Jack; they were only told that a name had been sampled at random from a 30:70 or 70:30 pile.
• In the other, subjects were given an uninformative description of Jack, e.g., Jack is 53 years old and is married.
Derive the Exact Bayesian Response
• [Figure: judged probability of engineer in the high base rate condition plotted against the low base rate condition, with the annotation “Exact Bayesian responses are on this line.”]
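A reconstruction of the derivation these slides sketch: dividing the high base rate equation by the low base rate equation cancels the shared likelihood ratio, leaving

\[
  \frac{P_H(\text{Eng} \mid D)\,/\,P_H(\neg\text{Eng} \mid D)}
       {P_L(\text{Eng} \mid D)\,/\,P_L(\neg\text{Eng} \mid D)}
  \;=\; \frac{70/30}{30/70} \;=\; \frac{49}{9} \;\approx\; 5.44 .
\]

So an exactly Bayesian judge's posterior odds in the high base rate condition must be 49/9 times the posterior odds in the low base rate condition, whatever the description says. Tracing this constraint over all possible likelihood ratios gives the curve on which exact Bayesian responses must lie.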
Statistical Hypotheses in the Lawyer/Engineer Problem
Notation:
• Prob.lo = P_L(Eng|D) = population mean of the judged posterior probability of engineer when the base rate of engineers is LOW.
• Prob.hi = P_H(Eng|D) = population mean of the judged posterior probability of engineer when the base rate of engineers is HIGH.
• Hypothesis of Base Rate Neglect: Prob.lo = Prob.hi.
• Hypothesis of Perfect Bayesian Judgment: Prob.hi and Prob.lo satisfy the Bayes Rule relation derived above.
Classical Approach to Testing Base Rate Neglect
• Hypothesis of Base Rate Neglect: H0: Prob.lo = Prob.hi.
• Independent-samples t test: the standard null hypothesis test in the classical statistical framework.
• 95% confidence interval for Prob.lo − Prob.hi.
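A minimal classical analysis in R, assuming the judgments are stored in two hypothetical vectors (the values below are illustrative, not the study's data):

    # Hypothetical judged probabilities (0-100 scale), one value per subject;
    # the first three values in each vector echo the data table below, the rest are made up
    judged.lo <- c(51, 70, 74, 62, 58, 66)
    judged.hi <- c(87, 73, 82, 79, 91, 76)

    # Independent-samples t test of H0: Prob.lo = Prob.hi,
    # which also reports a 95% confidence interval for the difference in means
    t.test(judged.lo, judged.hi)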
Classical Approach to Testing Perfect Bayesian Responding
• Hypothesis of Perfect Bayesian Judgment: the judged probabilities satisfy the Bayes Rule relation derived above.
• Classical test: ???
• 95% confidence interval for the departure from the Bayesian prediction: how do we compute it?
Data Structure
• Judgments in the High Base Rate Condition: Normal( Prob.hi, sigma.hi )
• Judgments in the Low Base Rate Condition: Normal( Prob.lo, sigma.lo )
• Data structure:

    Subject   Low Base Rate     Subject    High Base Rate
    1         51                N1 + 1     87
    2         70                N1 + 2     73
    3         74                N1 + 3     82
    ...       ...               ...        ...
    N1        ...               N1 + N2    ...
A Simple Statistical Model for the Lawyer/Engineer Problem
• Likelihood of the data:
    data.hi ~ Normal( Prob.hi, sigma.hi )
    data.lo ~ Normal( Prob.lo, sigma.lo )
• Prior probability distributions:
    Prob.hi  ~ Uniform( 0, 100 )
    Prob.lo  ~ Uniform( 0, 100 )
    sigma.hi ~ Uniform( 0.1, 25.0 )
    sigma.lo ~ Uniform( 0.1, 25.0 )
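The same model written in JAGS syntax might look like the following sketch (hypothetical code, not the script from the talk); note that JAGS parameterizes dnorm by precision rather than standard deviation, so sigma is converted to a precision tau inside the model.

    # A sketch of the model in JAGS syntax (hypothetical code, not the talk's script).
    # JAGS's dnorm takes a precision (1 / variance), so sigma is converted to tau.
    model.string <- "
    model {
      for (i in 1:N.hi) { data.hi[i] ~ dnorm(Prob.hi, tau.hi) }
      for (j in 1:N.lo) { data.lo[j] ~ dnorm(Prob.lo, tau.lo) }
      Prob.hi  ~ dunif(0, 100)
      Prob.lo  ~ dunif(0, 100)
      sigma.hi ~ dunif(0.1, 25)
      sigma.lo ~ dunif(0.1, 25)
      tau.hi <- 1 / (sigma.hi * sigma.hi)
      tau.lo <- 1 / (sigma.lo * sigma.lo)
    }
    "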
Evaluating Bayes’ Theorem
• Four-parameter model: Prob.hi, Prob.lo, sigma.hi, sigma.lo.
• Bayes Theorem gives the posterior distribution of these parameters given the data.
• How can we evaluate Bayes Theorem for this particular case?
  • Mathematical analysis (only possible in a few cases)
  • Sample from the posterior distribution (see example)
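In symbols, writing θ = (Prob.hi, Prob.lo, sigma.hi, sigma.lo) for the parameter vector and D for the data, the posterior distribution we need is

\[
  p(\theta \mid D) \;=\; \frac{p(D \mid \theta)\, p(\theta)}{\int p(D \mid \theta')\, p(\theta')\, d\theta'}
  \;\propto\; p(D \mid \theta)\, p(\theta).
\]

The integral in the denominator is what makes exact mathematical analysis intractable for most models; MCMC sampling sidesteps it.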
Demonstration: Approximating a Distribution by Sampling
• Run a demonstration of random sampling to approximate a distribution.
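One minimal way to run such a demonstration in R (an illustrative sketch, not the demonstration file from the talk): draw many samples from a known distribution and compare their histogram with the true density.

    # Draw many samples from a known Normal(100, 15) distribution and
    # compare the histogram of the samples with the true density
    set.seed(1)
    x <- rnorm(10000, mean = 100, sd = 15)

    hist(x, breaks = 50, freq = FALSE,
         main = "Samples approximate the distribution", xlab = "x")
    curve(dnorm(x, mean = 100, sd = 15), add = TRUE, lwd = 2)

    # With more samples, the sample summaries converge to the true values
    mean(x); sd(x)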
Basic Structure of Bayesian Computation in R
• R: data preparation and analysis of results.
• R package functions (BRugs for WinBUGS/OpenBUGS, rjags for JAGS) pass the model and data to the sampler and return the samples to R.
• WinBUGS, OpenBUGS, or JAGS: computes an approximate posterior distribution of the model parameters.
Sampling from the Posterior Distribution of the Parameters
• OpenBUGS & JAGS use Markov chain Monte Carlo (MCMC) to sample from the posterior distribution of a statistical model.
• [Run the R and JAGS analysis of the Lawyer/Engineer study.]
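A sketch of what such a run could look like with the rjags package, reusing the hypothetical model.string and data vectors from the sketches above (this is an illustration, not the script used in the talk):

    library(rjags)   # interface from R to JAGS

    # Hypothetical data (the same illustrative vectors as before)
    judged.hi <- c(87, 73, 82, 79, 91, 76)
    judged.lo <- c(51, 70, 74, 62, 58, 66)
    data.list <- list(data.hi = judged.hi, N.hi = length(judged.hi),
                      data.lo = judged.lo, N.lo = length(judged.lo))

    # Compile the model (model.string from the sketch above), burn in, then sample
    jags.fit <- jags.model(textConnection(model.string), data = data.list, n.chains = 3)
    update(jags.fit, n.iter = 1000)                     # burn-in
    post <- coda.samples(jags.fit,
                         variable.names = c("Prob.hi", "Prob.lo", "sigma.hi", "sigma.lo"),
                         n.iter = 10000)

    summary(post)   # posterior means, SDs, and quantiles for each parameter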
Output from OpenBUGS or JAGS
• [Spreadsheet display of the MCMC samples of the model parameters.]
Output from OpenBUGS or JAGS (continued)
• When we have a very large sample (possibly millions), we will have a very good approximation to the posterior distribution of the model.
Example: Posterior Distribution of Prob.hi
• The figure shows a histogram and a smooth density approximation for samples from the posterior distribution of Prob.hi.
Classical Versus Bayesian Confidence Intervals
• Classical statistics:
  • Correct: The 95% confidence interval was computed by a method that has a 95% chance of including the true value of the parameter.
  • Mistake: There is a 95% chance that the true value of the parameter is in the 95% confidence interval.
• Bayesian statistics:
  • Correct: There is a 95% chance that the true value of the parameter is in the 95% interval.
  • The truth of this assertion depends on the validity of the assumptions.
  • Bayesians use the term “credible interval” instead of “confidence interval”.
Posterior Distributions of Prob.hi and Prob.lo
• Before we had any data, we had only a vague idea of the true values of the parameters Prob.hi and Prob.lo.
• These graphs show our new knowledge about the values of these parameters after observing data.
Next: Test the Hypothesis of Base Rate Neglect
• We will look at a 95% credible interval for Difference = Prob.hi − Prob.lo.
Posterior Distribution for Difference = Prob.hi − Prob.lo
• 99% credible interval: an interval that contains 99% of the posterior probability.
• Conclusion: For these hypothetical data, it is highly likely that the judged posterior probability of engineer is higher in the high base rate condition than in the low base rate condition.
• This conclusion is based on the fact that the 99% credible interval does not include zero, and most of the posterior probability mass is distinctly greater than zero.
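Continuing the hypothetical rjags sketch above, the difference score and its credible interval come directly from the posterior samples:

    # Pool the MCMC chains from the hypothetical rjags run into one matrix of samples
    samples <- as.matrix(post)

    # Difference in population means, one value per posterior sample
    diff.samples <- samples[, "Prob.hi"] - samples[, "Prob.lo"]

    # 99% credible interval and the posterior probability that the difference is positive
    quantile(diff.samples, probs = c(0.005, 0.995))
    mean(diff.samples > 0)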
Hypothesis of Perfect Bayesian Judgment
• Test statistic: Difference = Prob.hi − Predicted.hi, where Predicted.hi is the value of Prob.hi that Bayes Rule predicts from Prob.lo.
• Graph: the posterior distribution of Prob.hi − Predicted.hi.
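One way to compute Predicted.hi from the posterior samples, assuming the odds relation derived earlier (a sketch; the computation shown in the talk may differ):

    # Using the pooled samples from the sketch above:
    # convert Prob.lo samples (0-100 scale) to probabilities, then to odds
    p.lo    <- samples[, "Prob.lo"] / 100
    odds.lo <- p.lo / (1 - p.lo)

    # Under perfect Bayesian judgment, moving from the 70:30 to the 30:70 condition
    # multiplies the odds of "engineer" by (70/30) / (30/70) = 49/9
    odds.pred    <- odds.lo * (49 / 9)
    Predicted.hi <- 100 * odds.pred / (1 + odds.pred)

    # Posterior distribution of the departure from perfect Bayesian responding
    diff.bayes <- samples[, "Prob.hi"] - Predicted.hi
    quantile(diff.bayes, probs = c(0.005, 0.995))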
Posterior Distribution of Prob.hi − Predicted.hi
• Conclusion: The posterior distribution of Prob.hi − Predicted.hi is clearly different from zero; there is essentially no probability mass in the vicinity of zero.
• We can confidently reject the hypothesis that all subjects conform to the requirements of Bayesian reasoning. (Reminder: the data are hypothetical.)
Will Bayesian Methods Become the New Standard in Research Psychology Departments?
• Bayesian methods provide better answers to research questions than do classical methods.
  • This was not argued in this lecture, but others, like Kruschke and Gelman, have argued this point.
  • The epistemological status of prior probabilities is no longer controversial.
• Bayesian reasoning is a better framework for psychological research than classical statistics.
  • Strong relations to research into judgment and decision making.
  • Strong relations to neuroscience generally and to neuroeconomics in particular.
Will Bayesian Methods Become the New Standard in Research Psychology Departments? (continued)
• Bayesian analyses are easy to compute with free, high-quality software.
• Bayesian analyses are easy to interpret.
• Of course, there are difficulties:
  • Need to think carefully about model structure.
  • Need to think carefully about prior distributions.
  • Need to examine whether the MCMC samples have converged.
• Learning Bayesian thinking helps students understand statistical and probabilistic models.
  • Benefits their statistical work.
  • Benefits their understanding of cognitive and neuroscience theory.

END