
Evaluation Methods



  1. Evaluation Methods Overview, Strengths and Weaknesses

  2. Qualitative AND Quantitative • Qualitative • Interviews • Observations • Focus groups • Quantitative • Surveys • Clinical data extraction

  3. Collecting Qualitative Data • In-depth interviews • Interview guide • Trained interviewer • Recording device • Capacity to have recordings transcribed • Observations • Clear parameters of what to observe • Trained observer • Time to write field notes • Focus groups • Facilitators • Guide • Space • Content analysis

  4. When to use Qualitative Methods • Formative Evaluation • Overall purpose: to improve programs • Process Evaluation • Focus on how & why things happen • Identify how a product or outcome is produced • Identify strengths & weaknesses of a program • Create detailed description of the program • Outcome Evaluation • Add depth & meaning to quantitative analysis

  5. Analyzing Qualitative Data • Driven by goals and aims of the evaluation • Describe program • Elucidate program dynamics • Identify patterns • Confirm or disconfirm quantitative findings • Is time consuming • Working from transcribed data: Read, code, discuss, read, code, discuss
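The read-code-discuss cycle above ultimately produces tallies of theme codes across transcripts. A minimal sketch, using hypothetical codes and segments (nothing here comes from the source), of counting how often each code appears once human coders have tagged the transcribed data:

```python
from collections import Counter

# Hypothetical coded transcript segments: each segment has been tagged
# (by a human coder) with one or more theme codes during the
# read-code-discuss cycle.
coded_segments = [
    {"transcript": "interview_01", "codes": ["access", "trust"]},
    {"transcript": "interview_01", "codes": ["cost"]},
    {"transcript": "interview_02", "codes": ["trust", "cost"]},
    {"transcript": "interview_03", "codes": ["access", "trust"]},
]

def tally_codes(segments):
    """Count how often each theme code appears across all segments."""
    counts = Counter()
    for seg in segments:
        counts.update(seg["codes"])
    return counts

print(tally_codes(coded_segments).most_common())
# → [('trust', 3), ('access', 2), ('cost', 2)]
```

Counts like these only identify patterns; interpreting them (and deciding whether a pattern supports or contradicts the quantitative findings) remains the analyst's job.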

  6. While analyzing your qualitative data it is important that you continuously ask yourself the following types of questions: • What patterns/common themes emerge around specific items in the data? • How do these patterns (or lack thereof) help to shed light on the broader study question(s)? • Are there any deviations from these patterns? • If yes, what factors could explain these atypical responses? • What interesting stories emerge from the data? • How can these stories help to shed light on the broader study question? • Do any of the patterns/emergent themes suggest that additional data need to be collected? • Do any of the study questions need to be revised? • Do the patterns that emerge support the findings of other corresponding qualitative analyses that have been conducted?

  7. Qualitative - Benefits • Participant defined concepts • In-depth understanding • Good for complex phenomena • Can understand individual cases • Can conduct cross-case comparisons and analysis • “Personalizes” experiences of phenomena (i.e., the emic or insider’s viewpoint) • Rich detail embedded in local contexts • Captures context and setting factors

  8. Qualitative - Benefits • Good for understanding dynamic processes (i.e., documenting sequential patterns and change) • Contributes to Inductive understanding • Can determine how participants interpret constructs (e.g., self-esteem, IQ) • Happens in naturalistic settings • Responsive to local situations, conditions, and stakeholders’ needs • Responsive to changes that occur during the conduct of a study • “Thick description”: words and categories of participants lend themselves to exploring how and why phenomena occur • Cases are instructive • Determine idiographic causation (i.e., determination of causes of a particular event)

  9. Qualitative - Weaknesses • Findings are less likely to be generalizable • Not useful for predictions • Not conducive to hypothesis testing • Lower credibility with some audiences • Takes more time to collect data • Data analysis is often time consuming • Results are more easily influenced by the researcher’s personal biases and idiosyncrasies

  10. Collecting Quantitative Data • Surveys • Well-constructed instrument • A useful strategy for judging whether you have the right questions is to create the tables for the final report before finalizing the survey. • Well-defined population • Well-defined sample • Data capture capacity (paper, electronic) • Analysis capacity • Clinical data • Access to datasets • Data extraction capacity • Analysis capacity

  11. When to use Quantitative Methods • Process Evaluation • To predict outcomes • Outcome Evaluation • When practicable, generalizable results are needed

  12. Good survey design… • Gives clear instructions • Keeps question structure simple • Asks one question at a time • Maintains a parallel structure for all questions • Defines terms before asking the question • Is explicit about the period of time being referenced by the question • Provides a list of acceptable responses to closed questions • Ensures that response categories are both exhaustive and mutually exclusive • Labels response categories with words rather than numbers • Asks for number of occurrences, rather than providing response categories such as often, seldom, never • Saves personal and demographic questions for the end of the survey
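The "exhaustive and mutually exclusive" rule can be checked mechanically for numeric response categories. A minimal sketch, with hypothetical integer brackets (e.g. "How many visits last month?"), flagging gaps and overlaps between adjacent ordered categories:

```python
def check_brackets(brackets):
    """Flag gaps or overlaps in ordered integer response categories.

    Each bracket is (low, high); high=None marks an open-ended final
    category such as "6 or more".
    """
    problems = []
    for (lo1, hi1), (lo2, _) in zip(brackets, brackets[1:]):
        if hi1 is None:
            problems.append(f"open-ended bracket before ({lo2}, ...)")
        elif lo2 > hi1 + 1:
            problems.append(f"gap between {hi1} and {lo2}")
        elif lo2 <= hi1:
            problems.append(f"overlap at {lo2}")
    return problems

print(check_brackets([(0, 0), (1, 2), (3, 5), (6, None)]))  # → []
print(check_brackets([(0, 1), (1, 3), (5, None)]))
# → ['overlap at 1', 'gap between 3 and 5']
```

A respondent with the value 1 fits two categories in the second example, and a respondent with the value 4 fits none; both are exactly the defects the design rule guards against.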

  13. Bad survey design… • Uses jargon or complex phrases • Frames questions in the negative • Uses abbreviations, contractions or symbols • Mixes different words for the same concept • Uses “loaded” words or phrases • Combines multiple response dimensions in the same question • Gives the impression that you are expecting a certain response • Bounces around between topics or time periods • Inserts unnecessary graphics or mixes many font styles and sizes • Forgets to provide instructions for returning the completed survey!

  14. Quantitative - Benefits • Good for testing and validating already constructed theories • Good for testing hypotheses that are constructed before the data are collected • (often) Generalizable • Allows quantitative predictions to be made • Cause and effect can be determined (in experimental designs) • Data collection is relatively quick (e.g., telephone interviews) • Provides precise, quantitative, numerical data • Data analysis is relatively less time consuming (using statistical software) • Results are relatively independent of the researcher (e.g., statistical significance) • May have higher credibility (e.g., administrators, politicians, people who fund programs) • Facilitates large evaluations
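The "statistical significance" bullet can be illustrated with a standard two-proportion z-test. A sketch using only the standard library and hypothetical pre/post survey counts (not data from the source): 40 of 100 respondents answer a knowledge item correctly before a program and 60 of 100 after.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical survey counts: 40/100 correct before, 60/100 correct after.
z, p = two_proportion_z(40, 100, 60, 100)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = -2.83, p = 0.0047
```

With p < 0.05 the pre/post difference would conventionally be called statistically significant, which is the sense in which such results are "relatively independent of the researcher".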

  15. Quantitative - Weaknesses • The researcher’s concepts may not match those of local constituencies • May cause confirmation bias (focus on theory or hypothesis testing rather than on theory or hypothesis generation) • Knowledge produced may not be directly applicable to local situations, contexts, and individuals

  16. Mixed Methods - Strengths • Helps with interpretation • Words, pictures, and narrative can be used to add meaning to numbers. • Numbers can be used to add precision to words, pictures, and narrative. • Brings strengths from both types of methods • Facilitates theory testing. • Broader, more complex evaluation and understanding (practice). • Methods “complement” each other. • Triangulation • May increase generalizability of the results.

  17. Mixed Methods - Weaknesses • Daunting, intensive • Requires multiple skill sets • Methodological “purity” bias • Expense • Time • Innovation means venturing into the unknown

  18. UCSF CHR Definitions • DHHS Regulations define research as a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge. (45 CFR 46.102(d)) • For purposes of CHR review, the UCSF HRPP further defines the following terms: • A “systematic investigation” as an activity involving a prospective plan that incorporates: • the organized collection of quantitative and/or qualitative data, or biological specimens, and analysis (or anticipation of analysis) of those data or specimens to answer a question or questions. • “Generalizable knowledge” is information based on results or findings that are expected: to be reproducible, and apply broadly with the expectation of predictable outcomes.

  19. Case 1: The Bigger Picture Campaign • To evaluate the effectiveness of The Bigger Picture Campaign, a youth-targeted diabetes prevention campaign that uses youth-generated “spoken word” messages around key environmental and social type 2 diabetes prevention targets. • Aims to increase: • youth knowledge about social determinants of type 2 diabetes • motivation to participate in activities related to diabetes prevention • discussion of and prioritization of diabetes prevention

  20. Case 2: The Tribal Health Initiative as a Vehicle for Non-Communicable Disease Control in India To assess the community-level impact of: 1. An "old age" program, in which persons over the age of 60 have been selectively targeted for education and control of conditions such as hypertension, coronary artery disease, and COPD associated with modifiable lifestyle factors such as tobacco use and diet; 2. A mental health outreach program, in which persons known or suspected to suffer from mental illness can be locally evaluated for care such as therapy and, at times, medications; and 3. A community-wide hypertension screening project that includes otherwise healthy individuals as well as those admitted to hospital for other reasons. Persons who screen positive receive education, follow-up, and, in some cases, medication.

  21. Case 3: Evaluation of the “Warmline” • The goal of the program is to deliver clinical consultation to health care providers with clinical questions related to the treatment of people with HIV disease. • AIMS: • Increase knowledge of the individual clinician on an individual case • Improve understanding of treatment, testing, and prevention for future cases • Increase knowledge of clinicians who will then go on to increase the knowledge of other clinicians • Promote positive changes in clinical practice behavior
