GLMs, Binomial & Poisson
‘Least-squares’ linear regression
• outcome variable (y) plotted against the x (‘causal’) variable
• the line should go ‘close’ to the points ‘collectively’
• measure the ‘deviations’ of the points from the line
• place the line so that the sum of the squares of the deviations is a minimum => the ‘least-squares’ fit (unique)
• this least-squares fit is the MLE when the errors follow a normal distribution
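For concreteness (this example is not from the original slides), a minimal least-squares fit in Python; the x/y values are invented for illustration, and NumPy is assumed to be available:

```python
import numpy as np

# invented example data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# least-squares solution: minimises the sum of squared deviations
# (equivalently, the MLE under normally distributed errors)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b = beta
print(f"intercept a = {a:.3f}, slope b = {b:.3f}")
```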
Poisson regression: E(Y) v X
• for data where Y follows a Poisson distribution, with the expectation modelled by X
• used for counts (e.g. of deaths), where var(Y) = E(Y)
• E(Y) ≥ 0; to respect that, use the model E(Y) = exp(a + bX)
• the inverse of that is the log transform: log(E(Y)) = a + bX
• log here is the ‘link’ function, which ‘straightens out’ the model
• the relation between the variance and the mean is known
• the effect of the link function on the variance is known
• => the fit reduces to a weighted least-squares fit of a linear model
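A minimal sketch of a Poisson regression fit, assuming the statsmodels package; the count data below are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

# invented count data (e.g. deaths at increasing exposure levels)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
deaths = np.array([2, 3, 6, 10, 19, 33])

# log link (the default for the Poisson family): log(E(Y)) = a + bX
X = sm.add_constant(x)
model = sm.GLM(deaths, X, family=sm.families.Poisson())
result = model.fit()  # fitted by iteratively reweighted least squares
print(result.params)  # estimates of a (intercept) and b (slope)
```

statsmodels fits this by iteratively reweighted least squares, which is exactly the weighted least-squares reduction described above: the known mean-variance relation and link function supply the weights.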
Binomial regression: E(Y) v X
• for data where Y follows a binomial distribution, with the expectation modelled by X
• used for R/N data, where E(Y) = P and var(Y) = P(1-P)/N
• 0 ≤ E(Y) ≤ 1; to respect that, use the model E(Y) = 1/(1 + exp[-(a + bX)])
• the inverse of that is the logit transform: log(P/(1-P)) = a + bX
• logit here is the ‘link’ function, which ‘straightens out’ the model
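A matching sketch for binomial (logistic) regression, again assuming statsmodels; the R/N data are invented. statsmodels accepts binomial responses as a two-column array of (successes, failures):

```python
import numpy as np
import statsmodels.api as sm

# invented R/N data: R successes out of N trials at each x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
r = np.array([2, 5, 12, 20, 27])
n = np.array([30, 30, 30, 30, 30])

# logit link (the default for the Binomial family): log(P/(1-P)) = a + bX
y = np.column_stack([r, n - r])  # (successes, failures)
X = sm.add_constant(x)
model = sm.GLM(y, X, family=sm.families.Binomial())
result = model.fit()
print(result.params)  # estimates of a and b on the logit scale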
Significance test for the comparison of two binomial proportions
• which is the ‘more significant’ difference?
• 30/50 v 120/300 (60% v 40%)
• 30/150 v 20/200 (20% v 10%)
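To check the question numerically, here is a sketch of the standard pooled two-proportion z-test in plain Python (the helper function name is my own):

```python
from math import sqrt, erfc

def two_proportion_z(r1, n1, r2, n2):
    """Pooled z-test for the difference of two binomial proportions."""
    p1, p2 = r1 / n1, r2 / n2
    p_pool = (r1 + r2) / (n1 + n2)  # pooled proportion under H0: P1 = P2
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail
    return z, p_value

print(two_proportion_z(30, 50, 120, 300))   # 60% v 40%
print(two_proportion_z(30, 150, 20, 200))   # 20% v 10%
```

Running this shows that both comparisons give the same z statistic (z = √7 ≈ 2.65), so by this test neither difference is ‘more significant’ than the other.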