C2 Training: May 9 – 10, 2011 Data Evaluation: Initial screening and Coding Adapted from David B. Wilson and Mark W. Lipsey
Overview • Coding protocol: essential feature of systematic review • Goal: transparent and replicable • description of studies • extraction of findings • Forms should be part of C2 protocol
Topics • Eligibility criteria and screening form • Development of coding protocol • Assessing reliability of coding • Common mistakes
Study Eligibility Criteria • Flow from research question • Identify specifics of: • Defining features of the program/policy/intervention • Eligible designs; required methods • Key sample features • Required outcomes • Required statistical data • Geographical/linguistic restrictions, if any • Time frame, if any • Also explicitly states what is excluded
Study Eligibility Screening Form • Develop a screening form with criteria • Complete form for all studies retrieved as potentially eligible • Modify criteria after examining sample of studies (controversial) • Double-code eligibility • Maintain database on results for each study screened • Example from MST review in handouts
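As a rough illustration of maintaining a database of screening results, the sketch below appends one record per screened study to a CSV log. The column names and decision fields are hypothetical and should be adapted to match the actual eligibility screening form.

```python
# Minimal sketch of a screening log; field names are illustrative, not prescribed.
import csv
import os

FIELDS = ["study_id", "screener", "design_ok", "population_ok",
          "outcome_ok", "eligible", "exclusion_reason"]

def append_screening_record(path, record):
    """Append one screening decision to the log, writing a header for a new file."""
    write_header = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(record)

# Example record for a study excluded for lacking an eligible outcome.
append_screening_record("screening_log.csv", {
    "study_id": "S001", "screener": "coder_A", "design_ok": 1,
    "population_ok": 1, "outcome_ok": 0, "eligible": 0,
    "exclusion_reason": "no eligible outcome reported",
})
```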
Screening Form: Effects of Multisystemic Therapy (MST), Initial Screening Form 1.0
Effects of Multisystemic Therapy (MST): Eligibility Screening Form 1.2
Screening Coding Guide for “Internet-based Interventions for English Language Learners”
Coding practice exercise 1 • For the articles provided, code Levels 1 and 2 from the MST coding sheet • Use Brunk and either Borduin or Henggeler & Melton
Development of Coding Protocol • Goal of protocol • Describe studies • Differentiate studies • Extract findings (effect sizes if possible) • Coding forms and manual • Both important • Sample coding item from form • Sample manual instructions for item
Development of Coding Protocol • Types of Information to Code • Setting, study context, authors, publication date and type, etc. • Methods and method quality • Program/intervention • Participants/clients/sample • Outcomes • Findings, effect sizes
Types of Information to Code • Setting, study context, authors, publication date and type, etc. • Multiple publications; “study” vs “report” • Geographical/national setting; language • Publication type and publication bias issue • Publication date vs study date • Research, demonstration, practice studies • Example from MST review in handouts
Types of Information to Code • Methods: Basic research design • Nature of assignment to conditions • Attrition, crossovers, dropouts, other changes to assignment • Nature of control condition • Multiple intervention and/or control groups • Design quality dimensions • Initial and final comparability of groups • Treatment-control contrast • treatment contamination • blinding
Types of Information to Code • Methods: Other aspects • Issues depend on specific research area • Procedural, e.g., • monitoring of implementation, fidelity • credentials, training of data collectors • Statistical, e.g., • statistical controls for group differences • handling of missing data
Types of Information to Code • Method quality ratings (or not) • More than 200 scales and checklists available, few if any appropriate for systematic reviews (Deeks et al., 2003) • Overall study quality scores have questionable reliability/validity (Jüni et al., 2001) • Conflate different methodological issues and study design/implementation features, which may have different impacts on reliability/validity • Preferable to examine potential influence of key components of methodological quality individually • Weighting results by study quality scores is not advised!
Cochrane risk of bias framework • Focus on identifying potential sources of bias in studies: • Selection bias - Systematic differences between groups at baseline • Performance bias - Something other than the intervention affects groups differently • Attrition bias - Participant loss affects initial group comparability • Detection bias - Method of outcome assessment affects group comparisons • Reporting bias - Selective reporting of outcomes
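One way to act on the advice above, examining methodological components individually rather than collapsing them into a single quality score, is to record a separate judgment for each risk-of-bias domain. The sketch below is only illustrative; the domain and judgment labels are assumptions, not a prescribed Cochrane data format.

```python
# Illustrative domain-by-domain risk-of-bias record; labels are assumptions.
RISK_OF_BIAS_DOMAINS = {"selection_bias", "performance_bias", "attrition_bias",
                        "detection_bias", "reporting_bias"}
JUDGMENTS = {"low", "unclear", "high"}

def make_rob_record(study_id, **judgments):
    """Return one study's domain-level judgments, checking the labels used."""
    for domain, judgment in judgments.items():
        if domain not in RISK_OF_BIAS_DOMAINS:
            raise ValueError(f"unknown domain: {domain}")
        if judgment not in JUDGMENTS:
            raise ValueError(f"unknown judgment: {judgment}")
    return {"study_id": study_id, **judgments}

record = make_rob_record("S001", selection_bias="low", attrition_bias="high",
                         detection_bias="unclear")
print(record)
```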
GRADE system for method quality • Quality of evidence across trials • Outcome-specific • Considers: sparse data, consistency/inconsistency of results across trials, study designs, reporting bias, possible influence of confounding variables • Software available at: www.ims.cochrane.org/revman/gradepro • Also see: www.gradeworkinggroup.org
Types of Information to Code • Program/Intervention • General program type (mutually exclusive or overlapping?) • Specific program elements (present/absent) • Any treatment received by the comparison group • Treatment implementation issues • integrity • amount, “dose” • Goal is to differentiate across studies • Examples
Types of Information to Code • Participants/clients/sample • Data are at the aggregate level • Mean age, age range • Gender mix • Racial/ethnic mix • Risk, severity • Restrictiveness; special groups (e.g., clinical) • Examples
Types of Information to Code • Outcome measures • Construct measured • Measure or operationalization used • Source of information • Composite or single indicator (item) • Scale: dichotomous, count, discrete ordinal, continuous • Reliability and validity • Time of measurement (e.g., relative to treatment) • Examples
Types of Information to Code • Findings • Compute effect sizes when possible • May need to aggregate data or reconfigure findings • Add back the “dropouts” • Compute weighted means of subgroups (e.g., boys and girls) • Code the data on which computations are based (common situations) • We will look at this part of the coding in the next section
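As a preview of the effect-size work covered in the next section, here is a minimal sketch of combining two reported subgroups (e.g., boys and girls) into a single group mean and standard deviation before computing an effect size. The pooling formulas are the standard ones for combining subgroup summary statistics; the numbers are purely illustrative.

```python
# Minimal sketch: combine two subgroups reported with n, mean, and SD.
def combine_subgroups(n1, m1, sd1, n2, m2, sd2):
    """Combine two subgroups' summary statistics into one group's n, mean, SD."""
    n = n1 + n2
    mean = (n1 * m1 + n2 * m2) / n  # n-weighted mean
    # Pooled variance includes both within-subgroup and between-subgroup spread.
    ss = (n1 - 1) * sd1**2 + (n2 - 1) * sd2**2 + (n1 * n2 / n) * (m1 - m2)**2
    sd = (ss / (n - 1)) ** 0.5
    return n, mean, sd

# Example: treatment-group boys and girls reported separately (toy values).
print(combine_subgroups(n1=30, m1=12.0, sd1=4.0, n2=25, m2=10.0, sd2=5.0))
```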
Development of Coding Protocol • Iterative nature of development • Structuring data • Data hierarchical (findings within studies) • Coding protocol needs to allow for this complexity • Analysis of effect sizes needs to respect this structure • Flat-file (example) • Relational hierarchical file (example)
Data Extraction • Double data extraction • Cohen’s kappa • Agreement on key decisions • Study inclusion/exclusion, key characteristics, risk of bias, coding of results • Pilot-test and refine codes!
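A minimal sketch of computing Cohen’s kappa for two coders’ include/exclude decisions follows; the decision labels and example data are illustrative.

```python
# Minimal sketch: Cohen's kappa for double-coded screening decisions.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical decisions on the same studies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(freq_a) | set(freq_b)
    # Chance agreement expected from each coder's marginal frequencies.
    expected = sum(freq_a[label] * freq_b[label] for label in labels) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["include", "exclude", "exclude", "include", "exclude", "include"]
b = ["include", "exclude", "include", "include", "exclude", "include"]
print(round(cohens_kappa(a, b), 2))  # agreement on 5 of 6 decisions
```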
Example of a Flat File • Multiple ESs are handled by having multiple variables, one for each potential ES. Note that there is only one record (row) per study.
Example of a Hierarchical Structure • Study-Level Data File and Effect-Size-Level Data File. Note that a single record in the study-level file is “related” to five records in the effect-size-level file.
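To make that relationship concrete, the sketch below builds a toy study-level table and effect-size-level table (fabricated values) and merges them on a study identifier, so each study record links to several effect-size records for analysis while each piece of information is entered only once.

```python
# Minimal sketch of a hierarchical (relational) structure; all values are toy data.
import pandas as pd

studies = pd.DataFrame({
    "study_id": [1, 2],
    "pub_year": [1995, 1997],
    "design":   ["randomized", "quasi-experimental"],
})

effect_sizes = pd.DataFrame({
    "study_id": [1, 1, 1, 2, 2],
    "outcome":  ["arrests", "drug use", "peer relations", "arrests", "drug use"],
    "es":       [0.35, 0.12, 0.20, 0.28, 0.05],
})

# Merging attaches the study-level descriptors to every effect-size record.
analysis_file = effect_sizes.merge(studies, on="study_id", how="left")
print(analysis_file)
```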
Coding exercise 2 • For either Borduin or Henggeler & Melton, please code the Level 3 items (do not do the outcomes and effect sizes) • Report back: what was easy/difficult?