Planning for Evaluation: An Introduction to Logic Models Public Health Institute, LJMU 4th July 2019
Learning outcomes for the event: • Understand evaluation and the role of logic models in the first step of the planning process. • Understand how to create a logic model. • Understand how to use a logic model to inform the evaluation planning process, including: • Confirming key stakeholders. • Identifying the scale and scope of the evaluation. • Identifying what data are already available/being collected. • Identifying what types of data to collect and when.
The Evidence Ecosystem
[Diagram: primary research, real world evidence and big data feed into systematic reviews, which in turn inform guidance, recommendations and professional standards.]
Examples: (1) The Digital and Trustworthy Evidence Ecosystem; (2) How to achieve more effective services: the evidence ecosystem; (3) Evidence for the Frontline.
Actors in the Ecosystem
• Evidence producers: universities, government departments, Research Councils, the private sector (primary research, real world evidence, big data).
• Evidence synthesisers: universities, government departments, NGOs/charities, the private sector, NICE/'What Works' Centres.
• Evidence processors and disseminators: professional bodies/networks, policy organisations, NGOs, the media, the private sector, local government.
• Evidence implementers: practitioners, professional bodies/networks, local commissioners.
Evaluation in the evidence ecosystem
"Choosing an evidence-based intervention is the foundation, but there are additional necessary tools that adept agencies/organisations must wield to successfully construct an intervention program." (Dr Carolyn Webster-Stratton)
Commissioning Cycle https://www.england.nhs.uk/participation/resources/commissioning-engagement-cycle/
What is evaluation?
• Conducted to define or judge current care.
• Explores current standards.
• Measures a service without reference to another.
• Involves an intervention designed and delivered in accordance with guidance and professional standards.
• Uses existing data, but may include new data.
• No allocation to an intervention; no randomisation.
Why evaluate?
• Assess service users' and/or service providers' actual experience of a service.
• Assess how the objectives of the service or project are being met, and any areas where they are not.
• Assess value for money.
• Assess whether a service is progressing according to plan.
• Identify opportunities for improvement.
• Document lessons to be learned for others and for the future.
• Establish a baseline of performance against which the impact of future initiatives can be compared.
What are the questions you need to answer? • Are things going according to plan? Why or why not? • Are there things we could do to improve what we are doing? • Is what we are doing making any difference? How do you know? • Is the difference we are making worth the time/effort/money? Can you show this?
Why think about this from the start?
• Are things going according to plan? Why or why not? Are you collecting the right information from the beginning that will help you understand why it is working (or not)?
• Are there things we could do to improve what we are doing? Are you setting up the necessary processes from the beginning to identify areas for improvement as you go along?
• Is what we are doing making any difference? How do you know? Have you thought about what success will look like before you start? What information do you need to gather to demonstrate this when the time comes?
• Is the difference we are making worth the time/effort/money? Can you show this? Will you keep a detailed record of the resources invested so that you can make this judgement further down the line?
Evaluation needs to…
• Be planned from the start.
• Collect data, reflect and refine throughout the life cycle of the programme.
• Identify scope: what to include and what to exclude.
• Have specific aims and objectives.
• Have a clear purpose and focus.
• Have a clear time-frame.
• Use stakeholder involvement (PPI): identify stakeholders (individuals, significant others, wider stakeholders), with clearly defined roles, responsibilities and resources, and consider different perspectives.
• Ideally use mixed methods.
• Have clear reporting deadlines.
Process and outcome evaluation should be carried out throughout to ensure ongoing programme development, e.g.: Are the right people attending? Is the targeting appropriate? Is the delivery right for your population?
Logic models help with this… • Identify programme gaps in activity • Identify whether the right data are being collected
This process (a logic model) helps identify programme gaps in activity and whether the right data are being collected to evidence outcomes. It can be desk-based or captured with stakeholders during a meeting. This is your theory of change, i.e. the delivery of these activities will achieve these outcomes in the short, medium and longer term. Evaluation will then test whether this happens and explore how and why.
What are logic models?
• A convincing picture of what you are trying to achieve that shows the links between your intended activities, outputs and outcomes.
• A framework for integrating planning, delivery and evaluation.
• Not reality, but your best prediction of what needs to happen to get to your outcomes.
• Part of a wider planning and performance cycle.
What does a logic model look like?
• A display of boxes and arrows, vertical or horizontal.
• Any shape is possible.
• Level of detail: simple or complex.
Input → Output → Outcome
• Input: the stuff that is done.
• Output: the results that are seen.
• Outcome: the impact you are looking for.
Why use them?
• Evidence-based storytelling (a road map).
• Communicate your (agreed) vision and plans.
• Provide clarity about activities and outcomes.
• Engage and sustain stakeholders (inspiration).
• Aid planning and management.
• Focus and improve implementation.
• Know what resources are needed, and when.
• Highlight assumptions and risks.
• Show similarities and differences to other programme activities.
• Link with the bigger picture.
Developing a logic model… Define the outcomes: the changes achieved as a result of the activities.
Developing a logic model… Define the activities: what does the programme actually do?
Developing a logic model… Define the outputs: the countable products.
Now, let's consider whether we are collecting the right data to evidence whether the outcomes are achieved. Use the arrows to connect:
• the activities to outputs
• the outputs to outcomes
(a minimal worked sketch of this linking exercise follows below)
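As a minimal sketch of the exercise above: the logic model can be written down as data and checked for outcomes that nothing links to. The programme, activity, output and outcome names below are invented for illustration (they do not come from the slides).

```python
# A minimal sketch of a logic model as data, using a hypothetical
# smoking-cessation programme. All names here are illustrative.

logic_model = {
    "activities": {"run weekly clinics", "train advisors"},
    "outputs": {"sessions delivered", "advisors trained"},
    "outcomes": {"quit attempts made", "sustained quitting at 12 weeks"},
    # The "arrows": activities -> outputs, outputs -> outcomes.
    "activity_to_output": {
        "run weekly clinics": {"sessions delivered"},
        "train advisors": {"advisors trained"},
    },
    "output_to_outcome": {
        "sessions delivered": {"quit attempts made"},
        # Nothing yet links to "sustained quitting at 12 weeks".
    },
}

def find_outcome_gaps(model):
    """Return outcomes that no output connects to: the programme
    gaps (or missing data) that the logic model exercise surfaces."""
    linked = set()
    for outcomes in model["output_to_outcome"].values():
        linked |= outcomes
    return model["outcomes"] - linked

print(find_outcome_gaps(logic_model))
# {'sustained quitting at 12 weeks'}
```

Run as written, this prints the one outcome with no incoming arrow, which is exactly the kind of gap the next slide asks you to consider.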
Now consider the gaps… How do you prioritise what to evaluate? RE-AIM (Glasgow, Boles & Vogt, 1999):
• Reach
• Effectiveness
• Adoption
• Implementation
• Maintenance
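One hedged way to apply RE-AIM when prioritising: score each candidate evaluation question against the five dimensions and rank by total. The candidate questions and the 1-5 scores below are invented for illustration; RE-AIM itself does not prescribe a scoring scheme.

```python
# Illustrative RE-AIM prioritisation: rank candidate evaluation
# questions by their total score across the five dimensions.

DIMENSIONS = ["reach", "effectiveness", "adoption",
              "implementation", "maintenance"]

candidates = {  # question -> invented 1-5 scores per dimension
    "Are the right people attending?":      [5, 2, 3, 2, 1],
    "Is delivery consistent across sites?": [2, 3, 4, 5, 3],
    "Do outcomes persist after 12 months?": [1, 4, 2, 2, 5],
}

for question, scores in sorted(candidates.items(),
                               key=lambda kv: -sum(kv[1])):
    profile = ", ".join(f"{d}={s}" for d, s in zip(DIMENSIONS, scores))
    print(f"{sum(scores):2d}  {question}  ({profile})")
```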
Key points:
• Develop a shared sense of purpose amongst key stakeholders: identify and acknowledge roles and responsibilities within the delivery of a programme. This includes commissioners, but also others who would benefit from and/or be affected by the delivery of a programme.
• Who will collect, analyse and report on the data?
• Up to 10% of a programme budget should be set aside for evaluation (for example, up to £20,000 of a £200,000 programme).
Next steps:
• Designing data collection tools.
• How and when to collect different types of data.
• How to analyse and interpret different types of data.
Types of process evaluation data to collect may include (qualitative: interviews, focus groups, surveys, monitoring data):
• Service user:
  • How did they find out about the service?
  • Why did they attend?
  • How easy was it to attend?
  • What was their experience of the service?
  • Were their needs met?
  • Reach.
• Service provider:
  • How easy was it to implement the service?
• Non-service users:
  • Awareness of, and barriers to, use.
Types of outcome evaluation data to collect may include (quantitative & qualitative: interviews, focus groups, surveys, monitoring data):
• Service user:
  • Achievement of intended outcomes.
  • Unintended outcomes.
  • Impact on quality of life.
• Service provider:
  • Intended and unintended outcomes.
• Wider system-level outcomes:
  • Impact on partnerships and pathways.
  • Do other organisations benefit from the intervention?
• Significant others:
  • Impact on quality of life.
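A small sketch of how the two slides above could be recorded as a structured data-collection plan, so gaps in "who, what, how, when" are easy to spot. The stakeholder groups and data types follow the slides; the specific methods and timings attached to each entry are illustrative assumptions.

```python
# Illustrative evaluation data-collection plan as structured data.
from dataclasses import dataclass

@dataclass
class PlanItem:
    stakeholder: str   # e.g. service user, service provider
    evaluation: str    # "process" or "outcome"
    question: str      # what you want to know
    method: str        # interview, focus group, survey, monitoring data
    when: str          # point(s) in the programme life cycle

plan = [
    PlanItem("service user", "process",
             "What was their experience of the service?",
             "focus groups", "mid-delivery"),
    PlanItem("service user", "outcome",
             "Achievement of intended outcomes",
             "survey", "baseline and follow-up"),
    PlanItem("non-service users", "process",
             "Awareness of, and barriers to, use",
             "interviews", "early delivery"),
]

# Quick check: which stakeholder groups have no outcome data planned?
covered = {p.stakeholder for p in plan if p.evaluation == "outcome"}
print({p.stakeholder for p in plan} - covered)  # {'non-service users'}
```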
More information: Public Health Institute Faculty of Health, Education and Community Liverpool John Moores University 0151 2314382 H.Timpson@ljmu.ac.uk