Supplemental slides and activity: developing a baseline questionnaire To accompany MEASURE Evaluation PHE M&E Training Guide
Learning objectives of this module By the end of this module, participants will be able to: • Identify what can be measured through a questionnaire (adapting/using existing tools; determining gaps in individual-level or household-level information) • Use the M&E plan to match information needs to questionnaire components • Write effective, appropriate, valid, and reliable questions • Adapt an existing questionnaire tool to fit specific programmatic needs
Elements of Questionnaire Design • Identify the questionnaire content • Write or select questions to measure variables of interest • Construct the questionnaire
A Questionnaire Is More than the Questions! • Opening/cover page • Instructions (skip patterns, probes, optional wording) • Introductions to questions • Definitions and explanations • Privacy concerns
Before Designing a Questionnaire • Decide the study’s purpose (aims, research questions, hypotheses) • Identify what you need to measure • Use your M&E plan to determine what information you will need to measure • What individual-level or household-level information can you *not* get somewhere else? (think: internal versus external data) • Develop a preliminary analysis plan • Decide the data collection mode (e.g., interview, paper/pencil, computer-assisted)
Designing Questionnaires • Don’t reinvent the wheel! – Use existing tools where possible • Type of questionnaire/questions depends on the study design • Self-administered survey • Self-reported test results • Clinical measurements • Pre-/post-tests for training • When the survey takes place (before, after, during, or unrelated to a clinical visit) • Motivations: inform policy vs. inform programs
Objectives When Writing Questions • To get reliable and valid reports of respondents’ experiences • Good survey questions provide consistent (reliable) and accurate (valid) measures • When two respondents are in the same situation, they should answer the question the same way.
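As an illustrative aside (not part of the training guide), the small Python sketch below shows one simple way to check the “consistent” part of this idea: percent agreement when the same respondents answer the same question twice (a basic test-retest check). The function name and data are hypothetical.

```python
# Minimal sketch (hypothetical data): percent agreement between two
# administrations of the same question -- a rough test-retest reliability check.

def percent_agreement(round1, round2):
    """Share of respondents who give the same answer both times."""
    pairs = list(zip(round1, round2))
    return sum(a == b for a, b in pairs) / len(pairs)

# Answers from the same five respondents, two weeks apart (made-up values)
round1 = ["yes", "no", "yes", "yes", "no"]
round2 = ["yes", "no", "no", "yes", "no"]
print(percent_agreement(round1, round2))  # 0.8 -> same answer from 80% of respondents
```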
Reasons Why Respondents Report Events with Less than Perfect Accuracy • They do not know the information • They cannot recall it, although they do know it • They do not understand the question • They do not want to report the answer in the context in which they are being asked it.
Considerations When Writing Questions • Type of question • Response formats • Question wording
Types of Questions • Open-ended questions • Close-ended questions • Ordered response categories • Unordered response categories • Partially close-ended questions
Open-ended Questions • Responses are not provided to the respondent • Advantages: • Researcher does not need to know universe of possible answers • Respondent not influenced by specific alternatives suggested • Respondent can reveal what is most salient • Useful in exploratory work • Can be used to build rapport in interview
Open-ended Questions Disadvantages: • Effort required of respondent • Respondents may vary in ability/willingness to articulate • Respondents may be reluctant to reveal detailed information or socially unacceptable opinions or behaviors • Large amount of information may be revealed, information may be vague or irrelevant • Difficulties in recording and in reducing and coding material
Close-ended Questions • A list of acceptable responses is provided to the respondent • Advantages: • Easier for respondent • Communicates same frame of reference to respondents • Standardization • Less variability in interviewer performance • Less time to administer and record response
Close-ended Questions Disadvantages: • Need to know appropriate response categories in advance • Less spontaneity permitted to the respondent • Respondent may be forced into an unnatural frame of reference • May suggest response categories the respondent has not thought of • Respondent may not feel as involved or motivated by the questionnaire
Response Formats • Multiple categories that exhaust all meaningful answers and are mutually exclusive • During a typical work week (40 hours), how many hours do you spend on health promotion in the community? • _____ less than 10 hours • _____ 10 to 19 hours • _____ 20 to 29 hours • _____ 30 to 39 hours • _____ 40 hours or more
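To make the “mutually exclusive and exhaustive” point concrete, here is a minimal Python sketch (illustrative only, not part of the example questionnaire) that codes the work-hours item above: every possible number of hours maps to exactly one category.

```python
# Sketch: coding the work-hours item above. Category codes are illustrative only.

def code_hours(hours):
    """Map reported weekly hours to exactly one response category code."""
    if hours < 10:
        return 1   # less than 10 hours
    elif hours < 20:
        return 2   # 10 to 19 hours
    elif hours < 30:
        return 3   # 20 to 29 hours
    elif hours < 40:
        return 4   # 30 to 39 hours
    else:
        return 5   # 40 hours or more

print(code_hours(25))  # 3 -- falls in the "20 to 29 hours" category
```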
Ordered Response Formats • Response categories are ordered along a gradient. Examples: • Strongly agree to strongly disagree (3- to 7-point scale; with or without a neutral response category) • Excellent, Good, Fair, Poor • Numerical rating scales, e.g., 1 to 10, anchored at 1 = "No confidence at all" and 10 = "Complete confidence"
Unordered Response Format • No single dimension underlies the response categories; the respondent must evaluate each. Example: • Which one of the following do you think is most responsible for the long waiting period in the clinic? (Choose only ONE answer.) • Low staff morale • Poor staff training • Many patients • No other healthcare options nearby
Partially Close-ended Questions • Answer choices are provided and respondents have the opportunity to create their own responses. Example: • What is your position at this school? • Classroom teacher • Principal • Guidance counselor • Nurse • Other position (Please specify:____________)
Visual Analog Scale • Subjective format for collecting data • Instead of defining all the categories, you define only the extremes • Leads to a more personal (but more variable) perspective on responses • Often used to assess pain levels • Sometimes uses recognizable symbols, especially for children or respondents with low literacy
Formatting • Create a form to obtain consent • Explain reasons for study • Explain how to respond to questions • Put questions in blocks that are related • Begin with emotionally neutral questions – demographics • Remember questionnaire fatigue and don’t put anything important at the end
Consider skip patterns • Skip patterns route respondents past questions that are not appropriate for everyone • Using them may cause confusion • Their use may depend on whether the questionnaire is self-administered or administered by a computer or interviewer
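In a computer-assisted questionnaire, a skip pattern is just branching logic. The sketch below is a hypothetical Python illustration, assuming a clinic-visit filter question; it is not drawn from the example PHE questionnaire.

```python
# Hypothetical skip pattern: respondents who have not visited the clinic
# skip the follow-up question about the reason for their visit.

def apply_skip(visited_clinic, visit_reason=None):
    """Record answers, applying the skip rule for the follow-up question."""
    if visited_clinic == "no":
        visit_reason = "N/A (skipped)"  # follow-up question is never asked
    return {"visited_clinic": visited_clinic, "visit_reason": visit_reason}

print(apply_skip("yes", "child immunization"))  # follow-up asked and recorded
print(apply_skip("no"))                         # follow-up skipped automatically
```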
Wording • Use language that is simple, free of ambiguity, and encourages accurate and honest answers • Avoid embarrassing or offending the respondent • Make sure it is clear how questions should be read aloud (if interviewer-led) • Use vocabulary appropriate for your audience • Write questions like people talk, not like people write • Translate and back-translate
Question Wording - DO • Use techniques for enhancing recall • Shorten the reference period • Use landmarks to aid dating • Provide a helpful context • Provide cues to stimulate recall • Ask about typical behavior
Question Wording – DON’T • Don’t use double-barreled questions. Examples: • Where do you go to get information about agricultural technologies and obtain seeds? • When I get ill, I know it is because I have not been eating right or washing my hands.
Question Wording – DON’T Do not use leading questions • Do you agree that all children should be vaccinated? • With economic conditions the way they are these days, is it fair to have more than one or two children? • Can you tell me when you last visited the clinic?
Question Wording – DON’T • Don’t provide incomplete or overlapping response categories. Example: Where have you received health care in the past 12 months? ____ Health clinic ____ Hospital ____ Private clinic *** Use a check-all-that-apply format
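One common way to record a check-all-that-apply item is as a set of yes/no variables, one per response category. The Python sketch below illustrates that coding for the health-care question above; the variable names are made up for the example.

```python
# Sketch: coding a check-all-that-apply item as one yes/no variable per category.
categories = ["health_clinic", "hospital", "private_clinic"]

# A respondent who checked "health clinic" and "private clinic" (made-up answer)
checked = {"health_clinic", "private_clinic"}

response = {category: (1 if category in checked else 0) for category in categories}
print(response)  # {'health_clinic': 1, 'hospital': 0, 'private_clinic': 1}
```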
Question Wording – DON’T Do not ask respondents to make unnecessary calculations. • Out of 100 women your age, how many do you think take pills? Do not use loaded questions or loaded words. • Have you ever stolen anything?
Practical Standards for Evaluating Questions • Is this a question that will mean the same thing to everyone? • Is this a question that people can answer? • Is this a question that people will be willing to answer, given the data collection procedures (e.g., sexual health or income questions)?
Constructing the Questionnaire: Putting the Questions in Order • Beginning – inviting, interesting, non-threatening questions • Demographic information • Middle – most important questions; put difficult and sensitive ones toward the end • Sexual health • Income • Closing – easy questions again, often routine, background • Participation in community activities
Question Design • Include instructions, as needed, with questions - not at the beginning of the questionnaire • Clearly differentiate response categories from questions • Be consistent in placement of answer boxes • Ask one question at a time: don’t stack side-by-side • Number questions consecutively and simply from beginning to end
Other Elements • Consent forms • Title/cover page • General instructions • Identifiers (e.g., respondent ID) • Transitions
Creating New Questionnaires • Generate potential items for the instrument • Use qualitative data collection to inform – focus group or in-depth interviews • Test it • Correct it • Pilot it again • Train interviewers and data entry people well
Steps in Assembling Instruments • List variables that are potentially useful (conceptual framework) • Collect existing measures (justification) • Draft the survey (long version to be revised and shortened later) • Pre-test • Validate – are the items measuring what you think they are measuring?
Administering Instrument • Questionnaires vs. Interviews • Questionnaires – • Self-administered (may cause bias in responses) • Less expensive • Interviews – • Administered verbally (advantage when person is illiterate) • Helps for complicated surveys • More costly and time-consuming • Choice depends on costs and complexity of study
Interviewing • Standardize the approach • Train, train, train • Document, document, document • Standardize wording and stick to it • Avoid interviewer bias • Use neutral probing
Data coding, entry, and analysis • Beyond the scope of this workshop • Many available resources • Some are on the CD in your packet, including the UNICEF, ORC Macro, CARE, and UNAIDS survey guides
Summary • Decide what information (variables) is needed • Draft or obtain questions to elicit that information • Put questions in meaningful order • Add other elements of questionnaire • Pretest questionnaire • Repeat • Remember: you can use these guidelines for pre- and post-test too! • Allow more time than you think!
Group activity preparation discussion • Now, you will adapt components of the example PHE baseline questionnaire to monitor and evaluate your community-based PHE program. • If you already have a program/project, think about a mid-way or final program survey (or you could consider questions to use in focus groups, interviews, etc.) • Determine the general study design: what communities, where, how many people, and who (men, women, youth, etc.) • How will you collect the data? Will the survey be self-administered, interviewer-administered, etc.? • Can you use skip patterns? If so, which types of questions would you skip and for whom?
Small group activity • Go back to your M&E plan (logic model and framework) – 6 indicators • Which indicators require a household survey? Focus groups? Interviews? Records? • Which indicators are standard indicators [what number from the Guide]? • Look through the PHE baseline example questionnaire • Using the PHE baseline example questionnaire, determine: • Can you get the information for your 6 indicators from the existing tool? • Which questions/sections would you keep? • Which questions/sections would you delete? • What sections or components would you add? • Each indicator may require more than one question to get the information • Think through the numerator information • Consider the denominator information • Think about your data and indicator needs. Do costs, timing, or other constraints make you rethink your chosen indicators? Can you collect them?
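To make the numerator/denominator point concrete, here is a small worked sketch with made-up survey counts; the indicator and numbers are hypothetical and not taken from the Guide.

```python
# Worked sketch (made-up numbers): an indicator value is its numerator divided
# by its denominator, e.g. "% of surveyed households using an improved water source".
households_surveyed = 250          # denominator: all households interviewed
households_improved_water = 175    # numerator: households reporting an improved source

indicator = households_improved_water / households_surveyed
print(f"{indicator:.0%}")  # 70%
```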