CaDEA Workshop 3 Input
Brad Cousins, University of Ottawa
October 2010
Overview
• Evaluation design options
• Data quality assurance
  • validity/credibility
  • reliability/dependability
• Instrument development and validation
• Data collection strategies
Design Choices
• Comparison groups?
  • Yes, no, hybrid
  • Black box, grey box, glass box
• Data collected over time?
  • Yes, no, hybrid
• Mixed methods
  • Quant., qual.; simultaneous, sequential
Evaluation design alternatives
• One-shot, post only
  X  O1
• Comparative post only
  X  O1
     O2
• Randomized control trial
  R  X  O1
  R     O2
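The notation above follows the usual convention: X = intervention, O = observation, R = random assignment. As a minimal sketch (not from the workshop materials), a randomized post-only comparison can be analyzed as a simple difference in group means; all scores below are simulated for illustration.

```python
# Minimal sketch: analyzing a randomized post-only comparison (R X O1 / R O2).
# Scores are simulated purely for illustration; in practice O1/O2 would be
# post-test scores from the treatment and control groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical post-test scores (treatment group assumed to score higher).
treatment = rng.normal(loc=72, scale=10, size=40)   # received X
control   = rng.normal(loc=65, scale=10, size=40)   # no intervention

# With random assignment, the difference of group means estimates the effect.
effect = treatment.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"Estimated effect: {effect:.1f} points (t = {t_stat:.2f}, p = {p_value:.3f})")
```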
Evaluation design alternatives
• Time series design
  O1 O2 O3 O4  X  O5 O6 O7 O8
• Pre-post comparative group design
  O1  X  O3
  O2      O4
• Delayed treatment group design
  O1  X  O3      O5
  O2      O4  X  O6
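One way to read the pre-post comparative group design (O1 X O3 over O2 O4) is as a difference-in-differences estimate: the change in the treated group minus the change in the comparison group. A minimal sketch with invented scores:

```python
# Minimal sketch: difference-in-differences reading of the pre-post
# comparative group design (O1 X O3 over O2 O4). All scores are invented.
pre_treated, post_treated = 54.0, 68.0   # O1, O3
pre_control, post_control = 55.0, 60.0   # O2, O4

change_treated = post_treated - pre_treated    # 14.0
change_control = post_control - pre_control    #  5.0

# The comparison group's change approximates what would have happened without
# the intervention (maturation, history), so the difference of the two
# changes estimates the program effect.
effect = change_treated - change_control
print(f"Difference-in-differences estimate: {effect:.1f} points")  # 9.0
```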
Major Concepts: VALIDITY/CREDIBILITY
• Key points
  • Degrees on a continuum
  • Describes the results or inferences, NOT the instrument
  • Depends on the instrument and the process
  • Involves evidence and judgment
• Internal validity/credibility
  • Attribution: how confident can we be that the observed effects are attributable to the intervention?
Threats to internal validity
• Actual but non-program-related changes in participants
  • Maturation
  • History
• Apparent changes dependent on who was observed
  • Selection
  • Attrition
  • Regression
• Changes related to methods of obtaining observations
  • Testing
  • Instrumentation
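Statistical regression (regression to the mean) is easy to see in a simulation: if participants are selected because they scored at the extreme on a noisy pre-test, their post-test scores drift back toward the average even with no program effect at all. A minimal sketch, with all distributions invented for illustration:

```python
# Minimal sketch: regression to the mean as a threat to internal validity.
# Participants are selected on a low pre-test score; with no intervention,
# their retest scores still rise toward the population mean.
import numpy as np

rng = np.random.default_rng(0)
true_ability = rng.normal(loc=50, scale=10, size=10_000)
pretest  = true_ability + rng.normal(scale=8, size=10_000)   # noisy measure
posttest = true_ability + rng.normal(scale=8, size=10_000)   # independent noise

selected = pretest < 35                       # "low scorers" admitted to the program
print(f"Selected group pre-test mean:  {pretest[selected].mean():.1f}")
print(f"Selected group post-test mean: {posttest[selected].mean():.1f}")
# The apparent "gain" here is entirely an artifact of selecting on extreme scores.
```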
Instrument Development
• General principles
  • Build on existing instruments and resources
  • Ensure validity: face, content, construct
  • Ensure reliability (eliminate ambiguity)
  • Consider task demands
  • Obtrusive vs. unobtrusive measures
  • Use of conceptual framework as guide
  • Demographic information solicited at end
  • Pilot test
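For a multi-item scale, internal-consistency reliability is commonly summarized with Cronbach's alpha during pilot testing. A minimal sketch of the calculation, using an invented 5-respondent by 4-item response matrix:

```python
# Minimal sketch: Cronbach's alpha for a pilot-tested multi-item scale.
# The response matrix below (rows = respondents, columns = items) is invented.
import numpy as np

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
], dtype=float)

k = responses.shape[1]                           # number of items
item_vars = responses.var(axis=0, ddof=1)        # variance of each item
total_var = responses.sum(axis=1).var(ddof=1)    # variance of scale totals
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)

print(f"Cronbach's alpha = {alpha:.2f}")         # values near .80+ are usually considered acceptable
```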
Questionnaire Development
• Scales: nominal, ordinal, interval
• Selected response
  • Multiple choice (tests)
  • Fixed option:
    • Check all that apply
    • Check ONE option only
  • Likert-type rating scales
    • Frequency (observation): N R S F A (never, rarely, sometimes, frequently, always)
    • Agreement (opinion): SD D A SA (strongly disagree, disagree, agree, strongly agree)
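Agreement-type Likert items are usually coded numerically before analysis, with negatively worded items reverse-scored. A minimal sketch, assuming a 4-point coding (SD=1 to SA=4) and hypothetical item names:

```python
# Minimal sketch: coding a 4-point agreement scale (SD=1, D=2, A=3, SA=4)
# and reverse-scoring a negatively worded item. Items and responses are hypothetical.
AGREEMENT = {"SD": 1, "D": 2, "A": 3, "SA": 4}

respondent = {"q1_clear_goals": "A", "q2_too_difficult": "D", "q3_useful": "SA"}
reverse_keyed = {"q2_too_difficult"}             # negatively worded item

scored = {}
for item, answer in respondent.items():
    value = AGREEMENT[answer]
    if item in reverse_keyed:
        value = 5 - value                        # 1<->4, 2<->3 on a 4-point scale
    scored[item] = value

print(scored)                                    # {'q1_clear_goals': 3, 'q2_too_difficult': 3, 'q3_useful': 4}
print("Scale mean:", sum(scored.values()) / len(scored))
```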
Questionnaire Development
• Selected response (cont.)
  • Rank-ordered preferences (avoid)
  • Paired comparison
• Constructed response
  • Open-ended comments
    • Structured
    • Unstructured
  • If 'other', ask respondents to specify
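Where rank-ordered preferences are hard for respondents, paired comparisons present two options at a time; the simplest analysis just tallies how often each option is preferred. A minimal sketch with hypothetical options:

```python
# Minimal sketch: tallying paired-comparison responses. Each tuple records the
# option a respondent preferred in one pairing; the options are hypothetical.
from collections import Counter

# (preferred, not_preferred) judgments pooled across respondents
judgments = [
    ("workshops", "webinars"), ("workshops", "manuals"),
    ("webinars", "manuals"),   ("workshops", "webinars"),
    ("manuals", "webinars"),   ("workshops", "manuals"),
]

wins = Counter(preferred for preferred, _ in judgments)
for option, count in wins.most_common():
    print(f"{option}: preferred {count} times")
```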
Questionnaire Development
• Data collection formats
  • Hardcopy – data entry format
  • Hardcopy – scannable format
  • Internet format
• Over-specify instructions
• Judicious use of bold/italics and font variation
• Response options on the right-hand side
• Stapling: booklet > upper left > left margin
• Judicious determination of length (8 pages max.)
Interview / Focus Group Instrument Development
• Review of purpose / expectations
• Spacing of questions to permit response recording
• Questions vs. prompts
• Use of quantification
Data Collection Ethics
• Ethics review board procedures/protocols
• Letters of informed consent
  • Purpose
  • How/why selected
  • Demands / right to refusal
  • Confidential vs. anonymous
  • Contact information
• Issues and tensions
Data Collection
• Interview tips
  • Small talk – set the tone
  • Audio recording – obtain permission
  • Develop shorthand or symbolic field-note skills
  • Permit some wandering but keep on track
  • Minimize redundancy
Sampling
• Quantitative: for representation
  • proportionate to population
  • random
• Qualitative: to maximize variation
  • Purposive sampling: based on prior knowledge of case(s)
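A minimal sketch of a proportionate stratified random sample: each stratum contributes cases in proportion to its share of the population. The strata and population sizes below are invented for illustration.

```python
# Minimal sketch: proportionate stratified random sampling. Strata and
# population counts are invented; each stratum's share of the sample
# matches its share of the population.
import random

random.seed(1)
population = {                                   # stratum -> list of member IDs
    "urban":  [f"u{i}" for i in range(600)],
    "rural":  [f"r{i}" for i in range(300)],
    "remote": [f"m{i}" for i in range(100)],
}
total = sum(len(members) for members in population.values())
sample_size = 100

sample = {}
for stratum, members in population.items():
    n = round(sample_size * len(members) / total)   # proportionate allocation
    sample[stratum] = random.sample(members, n)

print({stratum: len(cases) for stratum, cases in sample.items()})
# {'urban': 60, 'rural': 30, 'remote': 10}
```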
Useful References
Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons.
Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory Into Practice, 39(3), 124-130.
Fraenkel, J. R., & Wallen, N. E. (2003). How to design and evaluate research in education. New York: McGraw-Hill.
McMillan, J. H. (2004). Educational research (4th ed.). Toronto: Pearson Allyn and Bacon, pp. 172-174.
Shultz, K. S., & Whitney, D. J. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: SAGE Publications.