Focus On… “Data Collection Choices” Presented by: Tom Chapel
This Module… • Why and how of: • Developing indicators • Making good data collection choices • Using mixed methods effectively
CDC’s Evaluation Framework • Indicator development bridges evaluation focus and data collection [Framework diagram: the six steps (Engage stakeholders, Describe the program, Focus the evaluation design, Gather credible evidence, Justify conclusions, Ensure use and share lessons learned) surrounding the four standards (Utility, Feasibility, Propriety, Accuracy)]
What is an indicator? A specific, observable, and measurable characteristic that shows progress toward a specified activity or outcome.
Why Indicators? • Indicators fill the “gray” area between the abstract concepts framed in evaluation questions and the methods/sources of data collection • Indicators “operationalize”: they restate abstract concepts in a tangible, measurable way • Tangible indicators make it easier to find and match appropriate data sources/methods • Indicators may, but need not, be S-M-A-R-T objectives (a sketch of the operationalizing step follows)
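To make the “operationalize” step concrete, here is a minimal sketch in Python showing an abstract concept (“training reach”) restated as a computable indicator. The record type, field names, and data are all hypothetical, invented purely for illustration.

```python
# Minimal sketch: turning an abstract concept into a computable indicator.
# All names and data below are hypothetical.
from dataclasses import dataclass

@dataclass
class ProviderRecord:
    provider_id: str
    trainings_attended: int

def training_reach(records: list[ProviderRecord]) -> float:
    """Indicator: percent of providers attending at least one training."""
    if not records:
        return 0.0
    reached = sum(1 for r in records if r.trainings_attended >= 1)
    return 100.0 * reached / len(records)

roster = [
    ProviderRecord("p1", 2),
    ProviderRecord("p2", 0),
    ProviderRecord("p3", 1),
]
print(f"Training reach: {training_reach(roster):.1f}%")  # 66.7%
```

The abstract concept (“did our training reach providers?”) becomes a concrete, observable quantity that a data source can be matched to.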
Selecting Good Indicators 1. Construct Validity The indicator measures an important dimension of the activity or outcome, e.g., its “quality” or “timeliness”.
Selecting Good Indicators 2. Measure the activity or outcome itself, NOT the “fruits” or “so what” of the activity or outcome. For example: What constitutes a measure of good training? “Successful training implementation” is an indicator for good training. “Did participants learn something?” is a fruit of good training.
Selecting Good Indicators 3. There must be at least one indicator for each activity or outcome of interest, BUT you may need multiple indicators. The use of multiple indicators is called “triangulation”.
Provider Education: Our Evaluation Focus • Activities: Conduct trainings; MD peer education and rounds; Nurse Educator presentation to LHD • Outcomes: Provider KAB increase; Provider policies; Providers know the registry and their role in it • Activities: Providers attend trainings and rounds; Providers receive and use Tool Kits; LHD nurses do private provider consults • Outcomes: Providers’ motivation to do immunizations increases
Provider Education: Possible Indicators • Activity: Providers attend trainings and rounds • Indicators: Number of participants in trainings and rounds; Number of participants completing the series of trainings; Percent of participants by discipline; Percent of participants by region
Provider Education: Possible Indicators • Activity: Providers receive and use Tool Kits • Indicators: Percent of providers who report use of the toolkit; Number of “call-to-action” cards received from the toolkit
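As a minimal sketch, the participation indicators above can be computed directly from a registration log. The record fields, series length, and data below are assumptions for illustration, not the program’s actual data layout.

```python
# Minimal sketch: computing the participation indicators from a
# hypothetical registration log. Field names and data are invented.
from collections import Counter

registrations = [
    {"name": "A", "discipline": "MD",    "region": "North", "sessions": 3},
    {"name": "B", "discipline": "Nurse", "region": "South", "sessions": 1},
    {"name": "C", "discipline": "Nurse", "region": "North", "sessions": 3},
]
SERIES_LENGTH = 3  # assumed number of sessions in the full training series

n_participants = len(registrations)
n_completed = sum(1 for r in registrations if r["sessions"] >= SERIES_LENGTH)
by_discipline = Counter(r["discipline"] for r in registrations)
by_region = Counter(r["region"] for r in registrations)

print("Participants:", n_participants)
print("Completed series:", n_completed)
for disc, n in by_discipline.items():
    print(f"% {disc}: {100 * n / n_participants:.0f}%")
for region, n in by_region.items():
    print(f"% {region}: {100 * n / n_participants:.0f}%")
```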
Data Collection Choices The Framework approach emphasizes use of findings: Not “Collect Data”, BUT “Gather Credible Evidence” Not “Analyze Data”, BUT “Justify Conclusions”
Characterizing Data Collection Methods and Sources • Primary vs. secondary • primary: collecting data for the first time, for the purposes of this project • secondary: making use of pre-existing data • Obtrusive vs. unobtrusive • to what extent does the respondent know that data are being collected? • Quantitative vs. qualitative • quantitative: deals with numbers • qualitative: deals with descriptions
Quantitative and Qualitative • Quantitative → Quantity • Numbers: data that can be measured • Length, height, area, volume, weight, speed, time, temperature, humidity, sound level, cost • Qualitative → Quality • Descriptions: data that can be observed but not measured • Colors, textures, smells, tastes, appearance, beauty, etc.
Six (Most) Common Ways to Collect Data • Surveys • Interviews • Focus groups • Observation • Document review • Secondary data
CDC’s Evaluation Framework • The standards inform good choices at Step 4, “Gather credible evidence” [Framework diagram: the six steps surrounding the four standards (Utility, Feasibility, Propriety, Accuracy)]
Choosing Methods—Cross-Walk to Evaluation Standards • Utility: What is the purpose of the data collection? • Feasibility: How much time? How much cost/budget? • Propriety: Any ethical considerations? • Accuracy: How valid and reliable do the data need to be? What do “valid” and “reliable” mean in the context of this study?
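One informal way to run this cross-walk is to rate each candidate method against the four standards and compare totals. The sketch below is purely illustrative: the methods, ratings, and weights are hypothetical placeholders for the judgments the standards call for, not scores the Framework prescribes.

```python
# Minimal sketch: cross-walking candidate methods against the four standards.
# All ratings (1-5) and weights are hypothetical placeholders.
candidates = {
    "survey":      {"utility": 4, "feasibility": 5, "propriety": 4, "accuracy": 3},
    "interviews":  {"utility": 5, "feasibility": 2, "propriety": 4, "accuracy": 4},
    "observation": {"utility": 3, "feasibility": 3, "propriety": 3, "accuracy": 5},
}
weights = {"utility": 1.0, "feasibility": 1.0, "propriety": 1.0, "accuracy": 1.0}

def score(ratings: dict) -> float:
    """Weighted total across the four standards."""
    return sum(weights[s] * ratings[s] for s in weights)

for method, ratings in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    print(f"{method}: {score(ratings):.1f}")
```

As the examples that follow show, the right ratings depend entirely on the situation, so the same method can score best for one question and worst for another.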
Trade-offs of Different Data Collection Methods • Methods: personal interview, focus groups, document review, phone survey, mail survey, secondary data, observation • Factors: time, cost, sensitive issues, Hawthorne effect, ethics [The original slide presents a table rating each method against each factor]
Example 1: Sexual Behavior of High School Males • Point-in-time estimate: sexual behavior of high school males • Indicator: What % of high school males have had a sexual encounter by the end of their junior year? • Criterion: Sensitive issue (consider accuracy)
Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? • WHY?
Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Focus groups • WHY? Sensitive issue - peer group is likely to distort responses.
Example 1: Sexual Behavior of High School Males • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Surveys • WHY? Anonymous (more accurate)
Example 2: Intimate Partner Violence • Understanding context—intimate partner violence • Indicator: • Understand context and identify patterns of intimate partner violence. • Criterion: • Sensitive issue (consider accuracy)
Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? • WHY?
Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Surveys • WHY? Unethical and will not elicit the data we need (consider utility).
Example 2: Intimate Partner Violence • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Interviews or focus groups • WHY? Build rapport through shared experiences
Example 3: Reduce Lead Burden in Household • Aggressive housekeeping and nutrition behaviors to reduce lead burden. • Indicator: • Assess adoption of housekeeping and nutrition behaviors. • Criterion: • Sensitive issue • Hawthorne effect
Example 3: Reduce Lead Burden in Household • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is WORST? Surveys, interviews • WHY? Inaccurate (desire to give socially acceptable responses)
Example 3: Reduce Lead Burden in Household • Possible methods: • Surveys • Interviews • Focus groups • Observation • Which method is BEST? Observation (garbage, coupons) • WHY? Passive and unobtrusive
The Best Method Depends on the Specific Situation • All three examples involve a sensitive issue: • sexual behavior • intimate partner violence • good nutrition and housekeeping • Even though the criterion (sensitive issue) was the same, the best data collection method was different for each situation.
Provider Education: Our Evaluation Focus and Possible Indicators (recap of the activities, outcomes, and indicators from the earlier slides)
Provider Education: Possible Methods • Activity: Providers attend trainings and rounds • Indicators: Number of participants in trainings and rounds; Number of participants completing the series of trainings; Percent of participants by discipline; Percent of participants by region • Methods/Sources: Training logs; Registration info
Provider Education: Possible Methods • Activity: Providers receive and use Tool Kits • Indicator: Percent of providers who report use of the toolkit → Method/Source: Survey of providers • Indicator: Number of “call-to-action” cards received from the toolkit → Method/Source: Analysis/count of call-to-action cards
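A minimal sketch of computing these two toolkit indicators from their stated sources follows. The survey field names and card identifiers are invented for illustration.

```python
# Minimal sketch: the two toolkit indicators, computed from a hypothetical
# provider survey and a count of returned call-to-action cards.
survey_responses = [
    {"provider_id": "p1", "uses_toolkit": True},
    {"provider_id": "p2", "uses_toolkit": False},
    {"provider_id": "p3", "uses_toolkit": True},
]
call_to_action_cards = ["card-101", "card-102"]  # cards returned from toolkits

pct_report_use = (
    100 * sum(r["uses_toolkit"] for r in survey_responses) / len(survey_responses)
)
n_cards = len(call_to_action_cards)

print(f"% providers reporting toolkit use: {pct_report_use:.0f}%")  # 67%
print(f"Call-to-action cards received: {n_cards}")
```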
Mixed Methods: Definition • A combination of methods that has complementary strengths and non-overlapping weaknesses • The purpose is for the methods to supplement or complement one another, strengthening the validity and reliability of the information
Why Mixed Methods? • “The Cs and the Es” • Corroboration and Clarification: understanding more defensibly, validly, credibly (“triangulation”) • Explanation and Exploration: understanding more clearly; understanding the “why” behind the “what”
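As a closing sketch, mixed methods can be thought of as pairing a quantitative result with qualitative themes so that each corroborates or explains the other. The indicator value and interview themes below are invented examples, not program findings.

```python
# Minimal sketch: pairing a quantitative indicator with qualitative themes.
# The value and themes are invented for illustration.
quantitative = {"indicator": "% providers reporting toolkit use", "value": 67}
qualitative_themes = [
    "Providers say the toolkit checklist saves time during visits",
    "Some providers never received the full toolkit mailing",
]

# Corroboration: the number and the themes should tell a consistent story.
# Explanation: the themes suggest *why* the rate is not higher.
print(f"{quantitative['indicator']}: {quantitative['value']}%")
for theme in qualitative_themes:
    print(" -", theme)
```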