Thomas Bayes to the rescue st5219: Bayesian hierarchical modelling lecture 1.4
Bayes theorem: maths alert (You know this already, right?)
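For reference (the formula itself is standard), with events A and B the theorem reads:

```latex
f(A \mid B) = \frac{f(B \mid A)\, f(A)}{f(B)}
```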
Bayes theorem: application • You are a GP in a country like SP • A foreign worker comes for an HIV test • The HIV test result comes back +ve • Does the worker have HIV? How to work it out? Test sensitivity is 98%, test specificity is 96%, i.e. f(test +ve | HIV +ve) = 0.98 f(test +ve | HIV -ve) = 0.04
Bayes theorem: application • Analogy to hypothesis testing • Null hypothesis: not infected • Test statistic: the test result • p-value is 4% • Reject the hypothesis of non-infection; conclude infected But we calculated f(+ test | infected), NOT f(infected | + test)
Bayes theorem: application How to work it out? Test sensitivity is 98%, test specificity is 96%, infection rate is 1%, i.e. f(test +ve | HIV +ve) = 0.98 f(test +ve | HIV -ve) = 0.04 f(HIV +ve) = 0.01
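Plugging these three numbers into Bayes' theorem gives the probability that actually matters, f(HIV +ve | test +ve). A minimal sketch in Python:

```python
# Posterior probability of infection given a positive test, via Bayes'
# theorem, using the numbers from the slide.
sens = 0.98   # f(test +ve | HIV +ve), sensitivity
spec = 0.96   # f(test -ve | HIV -ve), specificity
prev = 0.01   # f(HIV +ve), infection rate: the prior

# f(test +ve) by the law of total probability
p_pos = sens * prev + (1 - spec) * (1 - prev)

# f(HIV +ve | test +ve) = f(test +ve | HIV +ve) * f(HIV +ve) / f(test +ve)
p_hiv_given_pos = sens * prev / p_pos

print(round(p_hiv_given_pos, 3))  # about 0.198
```

Despite the positive test, the posterior probability of infection is only about 20%, because infection is rare to begin with.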
AIDS and H0s • Frequentists are happy to use Bayes' formula here • But unhappy to use it to estimate parameters • But... if you think it is wrong to use the probability of a positive test given non-infection to decide whether someone is infected given a positive test, why use the probability of (imaginary) data given a null hypothesis to decide whether the null hypothesis is true given the data?
The Bayesian Id and the frequentist Ego • How do you normally estimate parameters? Maximum likelihood: theta-hat maximises the likelihood f(data | theta) • Is theta-hat the most likely parameter value?
The Bayesian Id and the frequentist Ego • The parameter that maximises the likelihood function is not necessarily the most likely parameter value • How can we get the distribution of the parameters given the data? • Bayes' formula tells us: f(theta | data) = f(data | theta) f(theta) / f(data), i.e. posterior = likelihood x prior / f(data), where f(data) is a constant
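A grid approximation makes the formula concrete: multiply likelihood by prior pointwise, then divide by the constant. The binomial data (7 successes in 10 trials) and the flat prior here are illustrative assumptions, not from the lecture:

```python
import numpy as np

# Grid approximation of posterior proportional to likelihood * prior for a
# binomial success probability theta (illustrative data: 7/10 successes).
theta = np.linspace(0.001, 0.999, 999)
prior = np.ones_like(theta)              # flat prior over the grid
likelihood = theta**7 * (1 - theta)**3   # binomial kernel for 7 successes, 3 failures
unnormalised = likelihood * prior
posterior = unnormalised / unnormalised.sum()  # dividing by the constant

print(theta[np.argmax(posterior)])  # posterior mode; equals the MLE 0.7 under a flat prior
```

With a flat prior the posterior mode coincides with the maximum likelihood estimate; an informative prior would pull it away.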
Updating information via Bayes • Can also work with • Start with information before the experiment: the prior • Add information from the experiment: the likelihood • Update to get final information: the posterior • If more data come along later, the posterior becomes the prior for the next time
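The prior → likelihood → posterior cycle can be sketched with a conjugate beta-binomial model (an illustrative choice; the slides do not fix a specific model). A Beta(a, b) prior updated with s successes and f failures gives a Beta(a + s, b + f) posterior, which then serves as the prior for the next batch of data:

```python
# Sequential Bayesian updating with a conjugate beta-binomial model
# (illustrative; the counts below are made-up example data).
def update(a, b, successes, failures):
    # Beta(a, b) prior + binomial data -> Beta(a + s, b + f) posterior
    return a + successes, b + failures

a, b = 1, 1                  # Beta(1, 1): flat prior before any data
a, b = update(a, b, 7, 3)    # experiment 1: 7 successes, 3 failures
a, b = update(a, b, 2, 8)    # experiment 2: yesterday's posterior is today's prior
print(a, b)                  # Beta(10, 12); posterior mean a / (a + b)
```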
Summarising the posterior • Mean: E(theta | data) = integral of theta f(theta | data) dtheta • Median: the m with f(theta <= m | data) = 1/2 • Mode: the theta maximising f(theta | data)
Summarising the posterior • 95% credible interval: chop 2.5% off each tail of the posterior
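For a posterior with a known form, all of these summaries are one-liners. A sketch using an illustrative Beta(10, 12) posterior (hypothetical numbers, chosen only to make the example concrete):

```python
from scipy.stats import beta

# Summaries of an illustrative Beta(10, 12) posterior.
post = beta(10, 12)

mean = post.mean()                  # posterior mean
median = post.ppf(0.5)              # posterior median (50% quantile)
mode = (10 - 1) / (10 + 12 - 2)     # Beta(a, b) mode: (a - 1) / (a + b - 2)
lo, hi = post.ppf([0.025, 0.975])   # 95% credible interval: 2.5% off each tail

print(round(mean, 3))  # 10 / 22, about 0.455
```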
Summarising the posterior Bye bye delta approximations!!!
Sounds too easy! What’s the catch?! • Here is where the difficulties are: • 1. building the model • 2. obtaining the posterior • 3. model assessment • Difficulties 1 and 3 arise in frequentist statistics too, and estimating MLEs and CIs is difficult for non-standard problems anyway • Let’s see an example! Back to AIDS!