Bias Sam Bracebridge
By the end of the lecture fellows will be able to • Define bias • Identify different types of bias • Explain how bias affects risk estimates • Critique study designs for bias • Develop strategies to minimise bias
Epidemiologic Study What do epidemiologists do? • Measure effects • Attempt to define a cause • produce an estimate of the truth • Implement public health measures
Estimated effect: the truth? Mayonnaise → Salmonella, RR = 4.3 • True association? • Bias? • Chance? • Confounding?
Warning! • Chance and confounding can be evaluated quantitatively • Bias is much more difficult to evaluate • Minimise bias through the design and conduct of the study • Increased sample size will not eliminate bias
Definition of bias Any systematic error in the design or conduct of an epidemiological study resulting in a conclusion which is different from the truth
Errors in epidemiological studies [Figure: total error versus study size — random error (chance) shrinks as study size increases, while systematic error (bias) does not. Source: Rothman, 2002]
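To make that figure concrete, here is a minimal simulation sketch (not part of the lecture; the 30% true prevalence and 10-point measurement shift are made-up values) showing that a larger sample tightens the spread of estimates but leaves the systematic error untouched:

```python
# Minimal simulation sketch (hypothetical values only): estimating a true
# exposure prevalence of 30% with a measurement process that systematically
# over-reports exposure by 10 percentage points.
import random

random.seed(1)
TRUE_PREVALENCE = 0.30      # the truth we are trying to estimate
SYSTEMATIC_SHIFT = 0.10     # hypothetical bias in the measurement process

def biased_estimate(sample_size):
    """Return the observed exposure prevalence in one simulated sample."""
    reported = sum(
        random.random() < TRUE_PREVALENCE + SYSTEMATIC_SHIFT
        for _ in range(sample_size)
    )
    return reported / sample_size

for n in (50, 500, 5_000):
    estimates = [biased_estimate(n) for _ in range(100)]
    mean = sum(estimates) / len(estimates)
    spread = max(estimates) - min(estimates)
    # Random error (the spread) shrinks as n grows;
    # the gap between the mean and 0.30 (the bias) does not.
    print(f"n={n:>5}  mean estimate={mean:.3f}  spread={spread:.3f}")
```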
Main sources of bias • Selection bias • Information bias
Selection bias Two main reasons: • Selection of study subjects • Factors affecting study participation • association between exposure and disease differs between those who participate and those who don’t
Types of selection bias • Sampling bias • Ascertainment bias • referral, admission • Diagnostic/surveillance • Participation bias • self-selection (volunteerism) • non-response, refusal • survival
Selection of controls Estimate the association between alcohol intake and cirrhosis. How representative are hospitalised trauma patients of the population which gave rise to the cases? OR = 6
Selection of controls A higher proportion of controls drink alcohol in the trauma ward than in a non-trauma ward • OR = 6 with trauma-ward controls vs OR = 36 with non-trauma-ward controls
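As a minimal sketch of how this happens, the counts below are hypothetical (not taken from the slides, though chosen to reproduce the two odds ratios quoted): drawing controls from a ward where drinking is common inflates cell b and drags the odds ratio toward the null.

```python
# Hypothetical counts only, for illustration: 100 cirrhosis cases
# (80 drinkers) compared against 100 controls under two sampling schemes.
def odds_ratio(a, b, c, d):
    """a = exposed cases, b = exposed controls, c = unexposed cases, d = unexposed controls."""
    return (a * d) / (b * c)

a, c = 80, 20                      # cirrhosis cases: 80 drink, 20 do not
# Controls from a non-trauma ward: 10% drinkers (hypothetical)
print(odds_ratio(a, 10, c, 90))    # OR = (80*90)/(10*20) = 36.0
# Controls from the trauma ward, where drinking is more common: 40% drinkers
print(odds_ratio(a, 40, c, 60))    # OR = (80*60)/(40*20) = 6.0
```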
Some worked examples • Work in pairs • In 2 minutes: • Identify the reason for bias • How will it affect your study estimate? • Discuss strategies to minimise the bias
Diagnostic bias Oral contraceptives and uterine cancer • You are aware OC use can cause breakthrough bleeding • OC use → breakthrough bleeding → increased chance of testing for & detecting uterine cancer • Overestimation of “a” → overestimation of OR
Admission bias Asbestos and lung cancer • Prof. “Pulmo”, head of a specialist respiratory referral unit, has 145 publications on asbestos and lung cancer • Lung cancer cases exposed to asbestos are referred to his unit, so they are not representative of lung cancer cases in general • Overestimation of “a” → overestimation of OR
Healthy worker effect [Figure: association between occupational exposure X and disease Y. Source: Rothman, 2002]
Prospective cohort study – Year 1

              Lung cancer
              yes    no     Total
Smoker         90    910     1000
Non-smoker     10    990     1000
Loss to follow-up – Year 2

              Lung cancer
              yes    no     Total
Smoker         45    910      955
Non-smoker     10    990     1000

50% of the cases that smoked were lost to follow-up
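A quick calculation from the two tables above shows how this differential loss to follow-up roughly halves the observed risk ratio:

```python
# Risk ratio before and after differential loss to follow-up,
# using the counts from the two tables above.
def risk_ratio(cases_exposed, total_exposed, cases_unexposed, total_unexposed):
    return (cases_exposed / total_exposed) / (cases_unexposed / total_unexposed)

print(risk_ratio(90, 1000, 10, 1000))   # Year 1: (90/1000)/(10/1000) = 9.0
print(risk_ratio(45, 955, 10, 1000))    # Year 2: (45/955)/(10/1000) ≈ 4.7
```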
Minimising selection bias • Clear definition of study population • Explicit case, control and exposure definitions • Case-control: cases and controls from the same population • Same possibility of exposure • Cohort: selection of exposed and non-exposed without knowing disease status
Sources of bias • Selection bias • Information bias
Information bias arises during data collection: differences in the measurement of exposure data between cases and controls, or of outcome data between exposed and unexposed
Information bias Arises if the information about or from study subjects is erroneous
Information bias • 3 main types: • Recall bias • Interviewer bias • Misclassification
Recall bias Cases remember exposure differently than controls • e.g. risk of malformation: mothers of children with malformations remember past exposures better than mothers of healthy children • Overestimation of “a” → overestimation of OR
Interviewer bias Investigator asks cases and controls differently about exposure • e.g. soft cheese and listeriosis: the investigator, knowing the hypothesis, may probe listeriosis cases more closely about consumption of soft cheese • Overestimation of “a” → overestimation of OR

                           Cases of       Controls
                           listeriosis
Eats soft cheese                a              b
Does not eat soft cheese        c              d
Misclassification Measurement error leads to assigning the wrong exposure or outcome category
Misclassification • Systematic error • Misclassification of exposure DIFFERS between cases and controls • Misclassification of outcome DIFFERS between exposed & unexposed => Measure of association distorted in any direction
Misclassification OR = ad/bc = 3.0; RR = [a/(a+b)] / [c/(c+d)] = 1.6
Misclassification OR = ad/bc = 1.5; RR = [a/(a+b)] / [c/(c+d)] = 1.2
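The counts behind these two slides are not reproduced above, so the sketch below uses purely hypothetical numbers to show the same formulas before and after exposure misclassification that differs by outcome status (in this example the distortion happens to be toward the null):

```python
# Hypothetical 2x2 counts only (the slides' original tables are not shown here).
# Layout: a = exposed with outcome, b = exposed without outcome,
#         c = unexposed with outcome, d = unexposed without outcome.
def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

def risk_ratio(a, b, c, d):
    return (a / (a + b)) / (c / (c + d))

# Correctly classified (hypothetical) data
a, b, c, d = 60, 40, 30, 70
print(odds_ratio(a, b, c, d), risk_ratio(a, b, c, d))       # OR = 3.5, RR = 2.0

# 20 exposed subjects with the outcome are recorded as unexposed:
# misclassification of exposure that differs between those with and without the outcome
a2, c2 = a - 20, c + 20
print(odds_ratio(a2, b, c2, d), risk_ratio(a2, b, c2, d))   # OR = 1.4, RR = 1.2
```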
Minimising information bias • Standardise measurement instruments • questionnaires + train staff • Administer instruments equally to • cases and controls • exposed / unexposed • Use multiple sources of information
Summary: Controls for Bias • Choose a study design that minimises the chance of bias • Clear case and exposure definitions • Define clear categories within groups (e.g. age groups) • Set up strict guidelines for data collection • Train interviewers
Summary: Controls for Bias • Direct measurement • registries • case records • Optimise questionnaire • Minimize loss to follow-up
The epidemiologist’s role • Reduce error in your study design • Interpret studies with open eyes: • Be aware of sources of study error • Question whether they have been addressed
Bias: the take home message • Should be prevented! • At the PROTOCOL stage • Difficult to correct for bias at the analysis stage • If bias is present: • Incorrect measure of the true association • Should be taken into account in interpretation of results • Magnitude: overestimation? underestimation?
References • Rothman KJ. Epidemiology: An Introduction. Oxford University Press; 2002: 94–101. • Hennekens CH, Buring JE. Epidemiology in Medicine. Lippincott-Raven Publishers; 1987: 272–285.