Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation
United Way of Peel Region Roundtable Discussion
For more information, contact: Jason Newberry jason@communitybasedresearch.ca
Centre for Community Based Research
Purpose of Today’s Session
• To identify and discuss challenges associated with measuring the outcomes of anonymous services, or of services whose clients are hard to reach
• To provide agencies with strategies and tips for improving their outcome evaluation of anonymous services
• To engage United Way staff and service providers in a roundtable discussion of the possibilities and limitations of measuring outcomes for anonymous services
A ROUGH AGENDA
• Who you are: the types of agencies present and the services offered.
• Evaluation that you have done in the past.
• What you are expected to demonstrate through evaluation (by funders, by your board, by the community, etc.).
• Common evaluation problems.
• Generating solutions: what are reasonable expectations for evaluating these types of services?
INTRODUCTIONS
• Who you are…
• Who you serve…
• What you provide, or do…
• What you expect to achieve immediately with participants…
• What you expect to achieve in the longer term with participants…
What we have in common…
• Providing services in which the organization does not know, exactly, who is being served.
• Providing services in which confidentiality is guaranteed and where sensitive information is exchanged.
• Providing single-time services (no formal follow-up contact).
• Providing “low dose” services, where many other social and personal factors contribute to outcomes.
• Providing services to a mobile, transient population, often with very difficult needs.
• Providing services that are crisis-oriented and preventive; the focus is on maintenance and may not be on improvement.
• Offering very common program types for which quality evaluation is nonetheless scarce.
• Depending on other agencies for the success of our services.
Expectations of Evaluation
Funding bodies, boards, the community, and government expect:
• Data about people served (who they are, what they are like, how many of them, etc.)
• Data about impact (how did people improve as a result of services?)
Why do you think your organization is making a difference?
• Evidence from others (other services, other evaluations, research literature)
• Evidence from ourselves (informal and formal evaluation)
• Logic, reason, intuition
Given these circumstances, challenges, and expectations, what resources and strategies are available to agencies so that they can speak to program impact?
• Needs assessment
• Program logic and theory
• Evidence from the literature
• Detailed process evaluation
• Theoretically important immediate outcomes
• Strategic use of qualitative data
• Innovative, program-specific ideas about outcome evaluation
Needs assessment
• Ongoing demonstration of community need suggests that the community believes in the importance of the program and that it carries benefits.
• Key informant interviews, focus groups, and community surveys help warrant the program by building buy-in from potential service users.
Program logic and theory
• Even if you cannot collect data on outcomes, you can still comprehensively describe the logic of your program: the links between what you do and the subsequent impacts on people.
• Create a program logic model linking activities to outcomes.
• Provide a list of validity assumptions that support all the links made in your model.
Program logic and theory (cont.)
Your program guarantees anonymity and confidentiality, and/or serves people who are difficult to reach, BECAUSE these features are crucial to the purpose, logic, or success of the program. In other words, anonymity is a validity assumption that, if violated, compromises the program theory. An evaluation that requires breaking anonymity is therefore not an evaluation of the program as it was designed. Conversely, if your program’s theory does not rely on anonymity, then the program need not remain anonymous for the purposes of outcome evaluation.
Evidence from the literature
• Evidence from the literature (academic, research, government, best practices) will help demonstrate that your program follows a theoretical rationale that is empirically supported.
• Often the cited research involves studies your agency could not feasibly conduct itself.
Detailed process evaluation
• A detailed process evaluation can take the place of outcome evaluation by demonstrating the theoretical conditions under which an outcome would be expected.
• Structure a process evaluation around testing the validity assumptions that link activities to short-term outcomes:
• Are we serving the right people?
• Are the services being delivered as planned?
Theoretically important immediate outcomes
• If you are engaged in direct service, there is always the theoretical possibility of observing very immediate outcomes. Where possible, data on these can be gathered and assessed against process information.
Strategic use of qualitative data
• Qualitative data is often readily accessible and can be used strategically.
• Use qualitative data to complement quantitative data:
• Testimonials from staff, volunteers, and clients
• Journals, observations, media, etc.
Innovative, program-specific ideas about assessing outcomes
Even though outcome data may be difficult to gather…
…are there still creative ways to find out about outcomes?
Central Focus of Anonymous (or Other Problematic) Evaluations
[Diagram: a logic-model chain of Activities → Immediate outcomes → Short-term outcomes → Long-term outcomes, with validity assumptions linking each stage.]
• Activities and immediate outcomes: main focus of evaluation. The focus is on process and implementation and on direct examination of validity assumptions; theoretically important immediate outcomes are assessed (if doing so does not compromise the service).
• Short-term outcomes: secondary focus of evaluation (only possible if practical/ethical constraints are addressed; requires creative innovation).
• Long-term outcomes: likely not evaluable (agencies are unlikely to have resources for systematic investigation; evaluation would probably violate the program theory; ethical considerations apply; and the program theory is weak, or diluted, at this point).
The Role of Funding Bodies
• What does all this mean for funders and their expectations of outcome evaluation?