Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation


Presentation Transcript


  1. Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation. United Way of Peel Region Roundtable Discussion. For more information, contact: Jason Newberry (jason@communitybasedresearch.ca), Centre for Community Based Research.

  2. Purpose of Today’s Session • To identify and discuss challenges associated with measuring the outcomes of anonymous services or services whose clients are hard to reach • To provide agencies with some strategies and tips for improving their outcome evaluation for anonymous services • To engage United Way staff and service providers in a roundtable discussion of the possibilities and limitations of measuring outcomes for anonymous services

  3. A ROUGH AGENDA • Who you are… the types of agencies present and the services offered. • Evaluation that you have done in the past. • What you are expected to demonstrate through evaluation (by funders, by your board, by the community, etc.). • Common evaluation problems. • Generating solutions – what are reasonable expectations for evaluating these types of services?

  4. INTRODUCTIONS • Who you are… • Who you serve… • What you provide, or do… • What you expect to achieve immediately with participants… • What you expect to achieve in the longer term with participants…

  5. What we have in common… • Providing services in which the organization does not know, exactly, who is being served. • Providing services in which confidentiality is guaranteed and where sensitive information is exchanged. • Providing single-time services (no formal follow-up contact). • Providing “low dose” services, where many other social and personal factors contribute to outcomes. • Providing services to a mobile, transient population, often with very difficult needs. • Providing services that are crisis-oriented and preventive; the focus is on maintenance and may not be on improvement. • Offering program types that are very common, yet for which quality evaluation is scarce. • Depending on other agencies for the success of services.

  6. Expectations of Evaluation • Funding bodies, boards, the community, and government expect… • Data about the people served (who they are, what they are like, how many of them, etc.) • Data about impact (how people improved as a result of services)

  7. Why do you think your organization is making a difference? • Evidence from others (other services, other evaluations, research literature) • Evidence from ourselves (informal and formal evaluation) • Logic, reason, intuition

  8. Given these circumstances, challenges, and expectations, what resources & strategies are available to agencies so that they can speak to program impact? • Needs assessment • Program logic and theory • Evidence from the literature • Detailed process evaluation • Theoretically important immediate outcomes • Strategic use of qualitative data • Innovative, program-specific ideas about outcome evaluation

  9. Needs assessment • Ongoing demonstration of community need suggests that the community believes in the importance of the program and that it carries benefits • Key informant interviews, focus groups, and community surveys help warrant the program by gaining buy-in from potential service users

  10. Program logic and theory • Even if you cannot collect data on outcomes, you can still comprehensively describe the logic of your program – the links between what you do and the subsequent impacts on people • Create a program logic model linking activities to outcomes • Provide a list of validity assumptions that support all the links made in your model

  11. Program logic and theory (cont.) Your program guarantees anonymity and confidentiality, and/or serves people who are difficult to reach, BECAUSE doing so is crucial to the purpose, logic, or success of the program. For example, anonymity is a validity assumption that, if violated, compromises the program theory. Therefore, an evaluation that requires breaking anonymity is not an evaluation of the program as it was designed. If your program’s theory does not rely on anonymity, then it need not be anonymous for the purposes of outcome evaluation.

  12. Evidence from the literature • Evidence from the literature (academic, research, government, best practices) will help demonstrate that your program follows a theoretical rationale that is empirically supported. • Often the research you cite involves studies that your agency could not feasibly conduct itself.

  13. Detailed process evaluation • A detailed process evaluation can take the place of outcome evaluation by demonstrating the theoretical conditions under which an outcome would be expected. • Structure a process evaluation around testing the validity assumptions that link activities to short-term outcomes • Are we serving the right people? • Are the services being delivered as planned?

  14. Theoretically important immediate outcomes • If you are engaged in direct service, there is always the theoretical possibility of observing very immediate outcomes. Where possible, data on these can be gathered and assessed against process information.

  15. Strategic use of qualitative data • Qualitative data is often readily accessible and can be used strategically • Use qualitative data to complement quantitative data • Testimonials from staff, volunteers, and clients • Journals, observations, media, etc.

  16. Innovative, program-specific ideas about assessing outcomes Even though outcomes may be difficult to gather… are there still creative ways to find out about them?

  17. Central Focus of Anonymous (or Other Problematic) Evaluations [Logic model: Activities → Immediate outcomes → Short-term outcomes → Long-term outcomes, with validity assumptions underlying each link] • Activities and immediate outcomes: the main focus of evaluation; focus is on process and implementation and on direct examination of validity assumptions, and theoretically important immediate outcomes are assessed (if doing so does not compromise the service). • Short-term outcomes: a secondary focus of evaluation (only possible if practical/ethical constraints are addressed; requires creative innovation). • Long-term outcomes: likely not evaluable (unlikely to have resources for systematic investigation; probable violation of program theory; ethical considerations; program theory is weak, i.e. diluted, at this point).

  18. The Role of Funding Bodies • What does this mean for funders and their expectations of outcome evaluation?
