
Evaluation


Presentation Transcript


  1. Evaluation By Jennifer Hillebrand@emcdda.europa.eu

  2. Purpose of presentation • To broaden understanding of the basic concepts of evaluation • To introduce the logic model

  3. What is evaluation? • A process that attempts to determine, as systematically and objectively as possible, the relevance, effectiveness, and impact of activities in the light of their objectives

  4. Why is evaluation important? • To find out whether an intervention is making a difference • Are targets being met? • What can be done better? • Evaluation can clarify an intervention's or programme's goals and purpose • Other reasons?

  5. Empowerment evaluation - why evaluate? • The capacity for self-evaluation and reflection helps develop a project and gives it strength and determination • An external evaluator may play the role of an animator in this process • It involves the stakeholders and their interests better • Quality control • Give more meaning to one's own work • Plan resources better • Be more transparent • Improve communication • Make one's own work more visible • Achieve more political weight • Consider participatory evaluation

  6. Logic model – as a basis for program development and evaluation • A logic model is a graphic representation of a program that describes the program’s essential components and expected accomplishments and conveys the logical relationship between these components and their outcomes.

  7. Logic model • Some call this program theory (Weiss, 1998) or the program's theory of action (Patton, 1997). It is a "plausible, sensible model of how a program is supposed to work." (Bickman, 1987, p. 5). • It portrays the underlying rationale of the program or initiative. (Chen, Cato & Rainford, 1998-9; Renger & Titcomb, 2002)

  8. What's the problem? → Step 1: Needs assessment • Planned outcome & theory → Step 2a: Clarify goals & working hypothesis • Components → Step 2b: Define contents • Activities, methods → Step 3: Select strategies & delivery • Coherence → Step 4: Feasibility checks • Planned delivery versus actual delivery → Step 5: Implementation & process evaluation • Planned change versus actual change → Step 6: Outcome evaluation
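The six steps form a pipeline from needs assessment to outcome evaluation. Purely as an illustrative sketch (the class and field names below are assumptions, not part of the EMCDDA material), the model can be written down as a small data structure so that each planning decision has an explicit slot:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Illustrative container for the six logic-model steps (hypothetical field names)."""
    needs_assessment: str                 # Step 1: what's the problem?
    goals_and_hypothesis: str             # Step 2a: planned outcome & theory
    components: list                      # Step 2b: contents of the programme
    strategies: list                      # Step 3: activities, methods, delivery
    feasibility_notes: str = ""           # Step 4: coherence / feasibility checks
    process_indicators: list = field(default_factory=list)   # Step 5
    outcome_indicators: list = field(default_factory=list)   # Step 6

# Hypothetical, filled-in example
model = LogicModel(
    needs_assessment="High early school-leaving and cannabis use in district X",
    goals_and_hypothesis="Strengthening social skills reduces drug use (social influence model)",
    components=["social skills", "normative beliefs"],
    strategies=["peer-led school sessions"],
    process_indicators=["retention in programme", "fidelity to plan"],
    outcome_indicators=["level of assertiveness", "intention to use drugs"],
)
print(model.outcome_indicators)
```

Writing the plan down this way makes it easy to spot a step that is still empty before implementation starts.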

  9. Why a logic model? • Brings detail to broad goals; helps in planning, evaluation, implementation, and communications. • Helps to identify gaps in our program logic and clarifies assumptions so success may be more likely. • Builds understanding and promotes consensus about what the program is and how it will work, building buy-in and teamwork. • Makes underlying beliefs explicit. • Helps to clarify what is appropriate to evaluate, and when, so that evaluation resources are used wisely. • Summarizes complex programs to communicate with stakeholders, funders, audiences. • Enables effective competition for resources. (Many funders request logic models in their grant applications.)

  10. Logic model Can be applied to: • Programme planning • Programme implementation • Programme evaluation

  11. What’s the problem? Step 1: needs assessment

  12. Needs assessment • Leads to a working hypothesis and explains why your intervention is necessary • New or large studies are not always needed • Use existing sources and interpret existing data • Environmental context: look beyond the drug problem and its extent • Global indicators and conditions → define the range of action and potentials • Instruments for needs assessment in the EIB • Qualitative research methods add explanatory elements (background information) • Plot data (e.g. on risks, problems or problem perception) on a map or in another graphical format (see the sketch below)
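A minimal sketch of "use existing sources and interpret existing data", assuming invented district counts (none of the figures or names below come from the slides); turning counts into comparable rates is often enough to define the range of action:

```python
# Hypothetical counts taken from existing sources (school survey, police records)
districts = {
    "North":  {"pupils": 1200, "early_leavers": 96, "cannabis_last_month": 180},
    "Centre": {"pupils": 2000, "early_leavers": 60, "cannabis_last_month": 160},
    "South":  {"pupils": 800,  "early_leavers": 88, "cannabis_last_month": 152},
}

# Rank districts by cannabis prevalence to see where action is most needed
ranked = sorted(districts.items(),
                key=lambda kv: kv[1]["cannabis_last_month"] / kv[1]["pupils"],
                reverse=True)
for name, d in ranked:
    print(f"{name:<7} early leaving {d['early_leavers'] / d['pupils']:.0%}  "
          f"cannabis last month {d['cannabis_last_month'] / d['pupils']:.0%}")
```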

  13. Planned outcome & theory Step 2a: Clarify goals & working hypothesis

  14. How will this work? • Theory models underlying the chosen strategy • Can be part of existing models • Based on one's own or known experiences • Can be eclectically composed • Explain how the intervention's structures are going to work

  15. Terminology ... Could be ... • Background assumptions • Working mechanisms • Theory • Model • Evidence base • Working hypothesis • Theory/model of change

  16. Some theories • Health promotion model • Cognitive models • Informative-communicative models • Reasoned action theory (Fishbein and Ajzen) • Social influence models • Social influence model (Bandura) • Life skills model (Botvin) • Sociological models • Combined models • Social development (Catalano & Hawkins) • Problem/risk behaviour (Jessor)

  17. In summary • Theories provide different elements that influence or predict drug use (problems) • Describe the interaction of these elements • Give a realistic and logical overview of how an intervention is supposed to work • Therefore they provide us with variables for measuring and following up interventions • Protect us from diffuse and instinctive actionism

  18. Components Step 2b: Define contents

  19. Components • Personal skills • Social skills • Information, awareness and knowledge • Attitudes and normative beliefs • Alternatives to drug use • Affective education • These components relate back to the theories

  20. Activities Methods Step 3: Select strategies & delivery

  21. Strategies • Mass media: deliver universal prevention to large target groups. Value does not go beyond information provision and awareness raising. • Leisure-time alternatives: a common method of reaching vulnerable groups. Can be important for delivering social influence components. • Peer-led approaches: used for school-based prevention as well as in community settings. Can entail several components, including normative beliefs. • Delivery through police officers: a typical (and contested) classroom approach, mostly information-based only.

  22. Strategies cont. • Outreach or youth work techniques are essential for reaching vulnerable young people. • Motivational interviewing, especially for vulnerable groups and in unstable settings. • Regulatory measures are important at the local level; they impact normative beliefs and social rules. • Some popular strategies have no positive effects, for instance visits from or lectures by 'experts' (including police officers) or even ex-drug addicts, one-off activities, drug days and other awareness-raising events. • → Fidelity of implementation.

  23. Define contents • Through which means, strategies and methods are you going to reach the envisaged goals? → operational objectives

  24. Coherence (double checking) Step 4: Feasibility checks

  25. Feasibility • Do your theoretical framework and needs assessment match your existing resources and the best strategies? • It is crucial to focus the efforts of your team on what is most needed and what you can realistically achieve • This helps to avoid the typical pitfall of extending the intervention beyond the capacity of your team and your financial resources • At this stage you must also decide which kind of evaluation you can carry out, or whether you will undertake any evaluation at all

  26. Planned delivery versus actual delivery Step 5: Implementation & process evaluation

  27. Implementation – process evaluation • Most frequent form of evaluation found in European projects • Most projects do not go beyond this stage • Crucial step in assuring the quality of an intervention • If resources are too scarce, you can show through process evaluation that a proven approach (i.e. one already positively evaluated elsewhere or confirmed in the research literature) has been successfully and correctly implemented, so that a positive effect can be assumed

  28. Implementation – process evaluation • Process indicators are indicators regarding the intervention itself, e.g. the accuracy of implementation, adherence to the original plan, the extent to which timetables are being met and tasks achieved, and whether data collection is running smoothly and correctly • Balance the importance of fidelity against the need for flexibility (responding to the specific needs of the target group).

  29. Programme indicators: variables related directly to the programme (process indicators) • Number of participants & participant involvement • Intensity of participation • Retention in programme • Participants' opinion about the programme → questionnaire • Fidelity to plan (I. Martínez, CEPS)
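As an illustration only (the attendance figures and the 80% completion threshold below are invented, not taken from the slides), indicators such as retention and intensity of participation can be computed from simple session records:

```python
# Hypothetical attendance records: participant id -> number of sessions attended
attendance = {"p01": 10, "p02": 4, "p03": 9, "p04": 0, "p05": 7}
planned_sessions = 10

enrolled = len(attendance)
retained = sum(1 for s in attendance.values() if s > 0)                         # attended at least once
completed = sum(1 for s in attendance.values() if s >= 0.8 * planned_sessions)  # assumed threshold

print(f"Retention rate:  {retained / enrolled:.0%}")
print(f"Completion rate: {completed / enrolled:.0%}")
print(f"Mean intensity:  {sum(attendance.values()) / enrolled:.1f} of {planned_sessions} sessions")
```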

  30. Planned change and actual change Step 6: Outcome evaluation

  31. Outcome evaluation • The most frequently asked evaluation question: is the intervention effective? • Indicators are derived from objectives • This explains why objectives should not be formulated exclusively in terms of drug consumption, and why they should be realistic • Most theories propose large sets of intermediate (or mediating) variables that predict or explain drug use • That's why theories are relevant: you can measure these variables, but you can't always directly measure drug use

  32. Outcomes are… • The tangible results of a program • Ultimately what we want to achieve with the program • What we need to measure to know whether we are achieving what we want to achieve • Short, medium and longer term

  33. Outcomes

  34. Outcome related variables • Prevalence rates of alcohol, tobacco, medicines & drug use (before/after) • Intention to change risk behaviours • Intention to use drugs in the future • Number of cigarettes smoked per week • Number of times drunk in the last year • Effects in the classroom / school (before/after) • Depressiveness (Kandel scale) • Rate of suicide attempts • Perception of well-being in the school & family environments • Aggressive behaviour, robbery, vandalism in the last year • Amount of money spent in bars and discos each week • Decrease in students' academic stress • Number of students mentioning personal changes
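A minimal before/after sketch for one such variable, e.g. last-month prevalence; all responses below are invented for illustration:

```python
# Hypothetical survey answers (1 = used in the last month, 0 = did not), same group measured twice
before = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
after  = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]

prevalence_before = sum(before) / len(before)
prevalence_after = sum(after) / len(after)

print(f"Prevalence before: {prevalence_before:.0%}")
print(f"Prevalence after:  {prevalence_after:.0%}")
print(f"Absolute change:   {prevalence_after - prevalence_before:+.0%}")
```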

  35. From objectives to indicators (an example) • Objective • Increase the social skills of school pupils by 30% (from a baseline) by 2006 • Indicator • Level of assertiveness • Instrument • Questionnaire/scale on assertiveness
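Following that example, a hypothetical check of the "30% increase from baseline" target against assertiveness-scale scores (all numbers below are invented) could look like this:

```python
# Hypothetical assertiveness-scale scores (higher = more assertive)
baseline_scores  = [2.1, 2.4, 1.9, 2.6, 2.2]
follow_up_scores = [2.9, 3.1, 2.5, 3.3, 2.8]

baseline_mean = sum(baseline_scores) / len(baseline_scores)
follow_up_mean = sum(follow_up_scores) / len(follow_up_scores)
relative_change = (follow_up_mean - baseline_mean) / baseline_mean

target = 0.30  # the objective: +30% from baseline by 2006
print(f"Relative change: {relative_change:.0%} (target: {target:.0%})")
print("Objective met" if relative_change >= target else "Objective not met")
```

The objective supplies the target, the indicator supplies what is measured, and the instrument (here a questionnaire/scale) supplies the raw scores.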

  36. Objectives vs. indicators • Indicators reduce an objective to a measurable unit • Their selection should be based on the literature or previous experience • They measure either the quality/fidelity of implementation (process evaluation) or outcomes in the target group • They look beyond drug use: social inclusion, delinquency, social relations, school performance, traffic accidents, etc.

  37. Types of Indicators • Global level • Target group level • Variables related to the community I. Martínez (CEPS)

  38. Environment indicators (direct/close environment or social context variables) • Drug use (peers, family)… • Norms about drugs • Drug use approval I. Martínez (CEPS)

  39. Target group: drug use indicators (drug-related variables → behaviours & cognitions) • Drug use & intentions of use • Beliefs about drug use consequences • Perception of risks of drug use • Perception of drug use in the group of friends I. Martínez (CEPS)

  40. Global indicators (social variables and variables related to the health aspects of drugs) • Prevalence of use, risk perception & availability perception → Source: national or regional drug use surveys • Related health problems → Source: national health surveys • Social problems related to drugs → Source: arrests for drug-related crime & drug seizures • Promotion of legal drugs → Source: number of activities sponsored by the alcohol & tobacco industry I. Martínez (CEPS)

  41. Community indicators • Opinion about drug measures held by key persons in the community • Perception of the extent of the drug problem by key persons in the community • Perceived need for prevention by key leaders I. Martínez (CEPS)

  42. Target group: intermediate indicators (not related directly to drug use) • Problem behaviour • Health behaviour • Self-control • Assertiveness • Cognitive & social skills: decision-making, coping, problem solving … • School performance / school grades • Bonding to family & school I. Martínez (CEPS)

  43. The linkage • The choice of indicators should be made before the intervention begins • Indicators mirror the objectives and the components (or theoretical model) of the intervention • They mirror an intervention model only approximately (i.e. they INDICATE): often, no direct measurement of objectives and components is possible • The choice of indicators shows whether a programme is logical and does what it promises (in terms of theory) • Indicators are therefore not merely a matter of theory: they testify whether the programme leader (or evaluator) knows what he or she is doing

  44. Indicators should be: • Specific regarding quantities, quality, time and situation • Verifiable by statistical data, observation, registries • Relevant in the context of the intervention • In short, they have to be SMART: Specific, Measurable, Appropriate, Realistic, Time-bound.
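As a toy illustration (the dictionary keys below are assumptions, not part of the slides), the SMART criteria can be phrased as a checklist over an indicator definition:

```python
def smart_check(indicator: dict) -> dict:
    """Which SMART criteria does this indicator definition satisfy? (illustrative only)"""
    return {
        "Specific":    bool(indicator.get("what") and indicator.get("target_group")),
        "Measurable":  bool(indicator.get("instrument")),
        "Appropriate": bool(indicator.get("linked_objective")),
        "Realistic":   bool(indicator.get("feasible")),
        "Time-bound":  bool(indicator.get("deadline")),
    }

indicator = {
    "what": "level of assertiveness",
    "target_group": "school pupils",
    "instrument": "questionnaire/scale on assertiveness",
    "linked_objective": "increase social skills by 30% from baseline",
    "feasible": True,
    "deadline": "2006",
}
print(smart_check(indicator))  # every criterion is True for this example
```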

  45. A simple logic model (weak example) • OBJECTIVE: Develop personal and social skills • MAIN COMPONENT: Alternatives to drug use • Theoretical MODEL: Health Belief Model • INDICATORS: Level of decision-making skills (EIB) • OUTCOME: The results indicate that the teachers need education on the programme and the pupils need more time to work with the commercial

  46. Theories & concepts (interest map / concept-map slide) • Basic concepts: logic model; universal, selective and indicated prevention • Models and theories: social influences, cognitive-informational, comprehensive/combined; components and their effectiveness; mediating factors; settings; delivery (intensity, interactivity); fidelity & adaptation in implementation; efficiency, efficacy, relevance, impact • Logic model steps: needs assessment, working hypothesis, define contents (focus efforts), feasibility check, process evaluation, outcome evaluation • Indicators: global indicators (situational, community epidemiology, risk mapping (NIDA MAP), risk and protective factors, risk groups, EDDRA examples); group and individual indicators (common target groups: ethnicity, truancy, gender; EDDRA examples); project and community indicators; qualitative tools; EDDRA examples • Resources: prioritizing (CSAP), coordination, integration, existing programmes, info sources, EDDRA examples • Practice examples: EDDRA best practices and promising practice, model programmes (SAMHSA)

  47. Thanks to Gregor Burkhart (gregor.burkhart@emcdda.eu.int)

  48. Good evaluation reflects clear thinking. END
