Bayesian Linear Regression
In regression analysis, we look at the conditional distribution of the response variable at different levels of a predictor variable.
Bayesian Linear Regression
• Response variable
  - Also called "dependent" or "outcome" variable
  - What we want to explain or predict
  - In simple linear regression, the response variable is continuous
• Predictor variables
  - Also called "independent" variables or "covariates"
  - In simple linear regression, the predictor variable is usually also continuous
• Which variable is the response and which is the predictor depends on our research question
Quick review of linear functions
Y = β0 + β1·X
Y is a response variable that is a linear function of the predictor variable X.
β0: intercept; the value of Y when X = 0
β1: slope; how much Y changes when X increases by 1 unit
Intro to Bayesian simple linear regression
Likelihood
Prior distribution
The posterior distribution is not straightforward to obtain in closed form, so we implement MCMC techniques with WinBUGS.
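As a concrete illustration (the slide's own likelihood and prior equations were shown as images), a minimal WinBUGS sketch of a Bayesian simple linear regression with vague priors might look like this; the names y, x, n and the specific prior precisions are assumptions, not the exact specification from the slides:

model {
  for (i in 1:n) {
    y[i] ~ dnorm(mu[i], tau)          # likelihood: normal errors with precision tau
    mu[i] <- beta0 + beta1 * x[i]     # linear predictor
  }
  beta0 ~ dnorm(0, 0.001)             # vague normal priors on the coefficients
  beta1 ~ dnorm(0, 0.001)
  tau ~ dgamma(0.01, 0.01)            # vague gamma prior on the error precision
  sigma <- 1 / sqrt(tau)              # residual standard deviation, for reporting
}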
Examples
Willingness to Pay for Environmental Improvements in a Large City. For example, we can study the social benefits of a set of environmental and urban improvements planned for the waterfront of the City of Valencia (Spain):
Response variable: How much are you willing to pay for this policy?
Covariates: Sex, Age, Income
Data: 80 individuals
Discrete choice experiment
• Random Utility Model
• Probit Model
• Logit Model
Objectives:
• Revealed preference models use random utilities
• Probit models assume that the utilities are multivariate normal
• Probit MCMC generates latent, random utilities
• Logit models assume that the random utilities have extreme value distributions
• Logit MCMC uses the Metropolis-Hastings algorithm
Random Utility Model
Utility for subject i on choice occasion j for alternative m is Uijm = Vijm + εijm, a systematic component plus a random error, where there are n subjects in the sample, M+1 alternatives in the choice set, and Ji choice occasions for subject i.
Subject i picks alternative k if Uijk ≥ Uijm for all m.
The probability of selecting k is P(Uijk ≥ Uijm for all m).
Statistical Models:
• {εijm} are Normal → Probit Model
• {εijm} are Extreme Value → Logit Model
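For reference (not shown on the slide), when the errors εijm are independent type-I extreme value, the choice probabilities take the familiar closed form of the logit model, with Vijm the systematic utility component introduced above:

P(y_{ij} = k) = \frac{\exp(V_{ijk})}{\sum_{m=0}^{M} \exp(V_{ijm})}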
Logit model in WinBUGS Likelihood: Prior distributions:
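A minimal sketch of what such a WinBUGS logit model could look like, assuming a Bernoulli yes/no response y, a design matrix x with K columns, and vague normal priors (the slide's exact likelihood and prior equations were shown as images):

model {
  for (i in 1:N) {
    y[i] ~ dbern(p[i])                      # Bernoulli likelihood for the yes/no choice
    logit(p[i]) <- inprod(beta[], x[i, ])   # linear predictor on the logit scale
  }
  for (k in 1:K) {
    beta[k] ~ dnorm(0, 0.001)               # vague normal priors on the coefficients
  }
}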
Logit model in WinBUGS
Example: Discrete choice experiment
To study the value that car consumers place upon environmental concerns when purchasing a car.
Response variable: Yes/No
Attributes: safety (Yes/No), carbon dioxide emissions, acceleration from 0 to 100 km/h (< 10 sec and < 7.5 sec), second hand, and annual cost (900€, 1400€, 2000€)
Sample size: 150
Probit model in WinBUGS Likelihood: Prior distributions:
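A corresponding probit sketch, under the same assumptions about the data layout; WinBUGS accepts the probit() link directly, or equivalently p[i] <- phi(...) with the standard normal CDF (the latent-utility data-augmentation formulation mentioned in the objectives is another option):

model {
  for (i in 1:N) {
    y[i] ~ dbern(p[i])                       # Bernoulli likelihood
    probit(p[i]) <- inprod(beta[], x[i, ])   # probit link on the linear predictor
  }
  for (k in 1:K) {
    beta[k] ~ dnorm(0, 0.001)                # vague normal priors on the coefficients
  }
}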
Hierarchical Logit
The hierarchical logistic regression model is a straightforward extension of the standard logit.
Likelihood
  yij ~ Bernoulli(pij), logit(pij) <- b1j + b2j x2ij + … + bkj xkij
Priors
  bkj ~ N(Bkj, Tk) for all k, j
  Bkj <- γk1 + γk2 zj2 + … + γkm zjm
  γkr ~ N(0, .001) for all k, r
  Tk ~ Gamma(.01, .01)
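A sketch of how this hierarchical specification might be written in WinBUGS, assuming for concreteness three group-specific coefficients, two individual-level covariates x2 and x3, a single group-level covariate z, and the vague priors above (all names are illustrative):

model {
  for (j in 1:J) {                                   # groups (e.g. subjects)
    for (i in 1:n) {                                 # observations within group j
      y[i, j] ~ dbern(p[i, j])
      logit(p[i, j]) <- b[1, j] + b[2, j] * x2[i, j] + b[3, j] * x3[i, j]
    }
    for (k in 1:3) {
      b[k, j] ~ dnorm(B[k, j], T[k])                 # group-specific coefficients
      B[k, j] <- gamma[k, 1] + gamma[k, 2] * z[j]    # mean depends on group covariate z
    }
  }
  for (k in 1:3) {
    gamma[k, 1] ~ dnorm(0, 0.001)                    # vague priors on hyper-coefficients
    gamma[k, 2] ~ dnorm(0, 0.001)
    T[k] ~ dgamma(0.01, 0.01)                        # vague priors on the precisions
  }
}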