This training covers the new FVPSA program mandate for evaluation, background on previous evaluation efforts, strategies for collecting data from survivors, and reporting data to enhance services. Learn about outcome evaluation, FVPSA program requirements, advisory workgroup insights, and prior evaluation efforts by Cris Sullivan and NRCDV. Discover tools like self-assessment and client feedback surveys to strengthen program practices. Research findings show increased safety and well-being outcomes for survivors. Join the training to improve your program evaluation processes!
Outcome Evaluation Training for FVPSA Grantees
Provided by Cris Sullivan, Ph.D., and Eleanor Lyon, Ph.D.
What This Training Covers
• The new FVPSA program mandate for evaluation
• Background information about prior efforts to evaluate domestic violence programs
• Specific strategies for collecting this (and other) information directly from survivors
• Specific strategies for reporting the data and using the information to improve services
New FVPSA Requirements
• In 2005, the Family Violence Prevention and Services (FVPSA) Program within DHHS was reviewed by the Office of Management and Budget (OMB).
• OMB found that “results were not adequately demonstrated.”
• The FVPSA Program is now required to have grantees collect outcome data.
So What Is Outcome Evaluation?
• Outcome evaluation examines the change that has occurred as a result of a service being provided.
• An outcome is a change in knowledge, attitude, skill, behavior, expectation, emotional status, or life circumstance that results from the service.
FVPSA Program Response
• In response, Bill Riley, then Director of the FVPSA Program, convened a national advisory workgroup to develop strategies for local programs.
• He wanted the new requirement to be useful to programs rather than burdensome.
Advisory Workgroup
• Consisted of coalition directors, national resource centers, state FVPSA administrators, local program directors, and evaluation specialists
• Discussed the need for outcomes that reflect the complex nature of our services
• Wanted outcomes to be evidence-based and meaningful to local programs
• Looked to prior evaluation efforts to inform this work
Prior Evaluation Efforts
• In the late 1990s, Cris Sullivan worked with PCADV and Pennsylvania programs to identify reasonable outcomes of our work and how to evaluate them
• This resulted in an outcome evaluation manual that included tools for programs to use
• She also began providing one-day workshops for programs to learn about outcome evaluation
Prior Evaluation Efforts
• Around the same time, the National Resource Center on Domestic Violence (NRCDV) was facilitating discussions among state coalition directors, women of color activists, and others to think critically about our work
• As a result, in 1998 NRCDV initiated the Documenting Our Work (DOW) Project
The People Behind DOW
• Project coordinators: Eleanor Lyon, Anne Menard, and Sujata Warrier
• A workgroup of coalition directors, local program directors, evaluation specialists, state administrators, and national resource centers
Why Document Our Work?
• To develop consensus on the definitions, goals, and outcomes of our work
• Individual funders are increasingly requiring outcome evaluation
• The results can be used to strengthen and inform program practice, policy, and research
• They can also be used to encourage accountability to survivors and their children
DOW Products
• Self-assessment tools for state coalitions and local programs
• Community assessment tools for state coalitions and local programs
• Outcome evaluation surveys for local programs to evaluate hotlines, counseling, support groups, support services & advocacy, and shelter services
Client Feedback Surveys
• Please see the DOW client feedback surveys in the manual appendix
• The surveys were created to hear specifically from survivors about their experiences, in a simple and straightforward way
• They were tested in programs across four states and found to be useful
The Advisory Group Noted That:
• Similar outcomes were identified across the DOW project and Cris’s work in Pennsylvania and other states
• Programs were finding these outcomes, trainings, manuals, and survey tools useful
Advisory Group Also Looked to Research for Guidance
• Research on the effectiveness of domestic violence services is limited.
• However, some longitudinal research has found that increasing women’s knowledge about and access to community resources decreases their risk of re-abuse and increases their well-being.
• The manual includes more information about these studies.
Research Also Shows:
• The two strategies survivors have identified as most likely to make the (abuse) situation better are contacting a domestic violence victim service program (72%) and actually staying at a domestic violence shelter (79%).
• Women who have stayed in shelters are more likely to generate escape plans, use active resistance strategies against abusers, and seek help from professionals when faced with abuse.
Workgroup Consensus
They identified two outcomes that:
• Are appropriate given the varied nature of survivors’ contact with programs (crisis and non-crisis contacts, varying lengths of contact, and contact within different service contexts such as hotline, shelter, advocacy, and support groups)
• Have been empirically shown by research to lead to the long-term outcomes of increased safety and well-being
Workgroup’s Two Recommended Outcomes
As a result of contact with the domestic violence program, 65% or more of domestic violence survivors will have more:
1. strategies for enhancing their safety.
2. knowledge of available community resources.
(A worked example of checking this 65% target appears below.)
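To make the 65% target concrete, here is a minimal sketch of how a program might tally anonymous survey answers against it. This is an illustration only: the field names (safety_strategies, community_resources) and the yes/no answer format are assumptions, not the actual DOW/FVPSA form wording.

```python
# Minimal sketch: tally the two FVPSA outcomes against the 65% target.
# Field names and answer values here are hypothetical, not the actual form items.

TARGET = 0.65

def outcome_rate(responses, question):
    """Share of survivors answering 'yes' to an outcome question,
    ignoring blank answers (the surveys are voluntary and items can be skipped)."""
    answered = [r[question] for r in responses if r.get(question)]
    return sum(1 for a in answered if a == "yes") / len(answered) if answered else None

# Example data: three anonymous survey responses
responses = [
    {"safety_strategies": "yes", "community_resources": "yes"},
    {"safety_strategies": "yes", "community_resources": ""},   # item skipped
    {"safety_strategies": "no",  "community_resources": "yes"},
]

for question in ("safety_strategies", "community_resources"):
    rate = outcome_rate(responses, question)
    print(f"{question}: {rate:.0%} (meets 65% target: {rate >= TARGET})")
```

Reporting the rate only among survivors who answered the item (rather than all forms handed out) is one reasonable design choice, since the surveys are completed voluntarily; programs should confirm the expected denominator with their FVPSA administrator.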
Additional Recommendations
• Roll out these new expectations over time
• Provide training and technical assistance to states
• Provide actual tools and databases to states
• Pilot the DOW forms across four states to see if people find them manageable and useful
Timeline for Rolling Out FVPSA Outcomes
Oct ’05 – Sept ’06:
• Introduced the outcomes to FVPSA grantees
• Collaborated with pilot sites (MO, NE, PA, WI) to identify needs related to outcome evaluation
Timeline for Rolling Out FVPSA Outcomes
Oct ’06 – Sept ’07:
• Worked with pilot sites to refine data collection strategies, data collection tools, and reporting procedures
• Added additional pilot sites (NH, VT, ME); finalized an outcome manual that includes specific strategies and tools for grantees
Timeline for Rolling Out FVPSA Outcomes
Oct ’07 – Sept ’08:
• Create a DVD training
• Provide train-the-trainer workshops
• Work with additional sites until outcome evaluation is fully implemented
By December 2009:
• All programs will be submitting a full year of data on the two outcomes to their FVPSA administrators (and then on to DHHS)
The Pilot Project
A few findings to demonstrate the type of information you can get from survivors themselves
What Did the Pilot Involve?
• In-person training of program staff
• Follow-up technical assistance (conference calls and a listserv)
• Strong encouragement to use the complete DOW-derived forms
• Use of forms: shelter (2), support group, support services/advocacy, and counseling; some were revised during training
Forms Have in Common:
• Completed voluntarily by survivors
• A checklist of services women may have wanted and what they received
• Outcomes of the service, including the two new FVPSA outcomes:
  • I know more ways to plan for my safety
  • I know more about community resources
• Respect and support received
• Overall satisfaction with the service
• Basic demographics
(A sketch of how one completed form might be stored is shown below.)
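As a rough illustration of how a program’s database might represent one completed feedback form, here is a minimal sketch. The field names, types, and the 1-5 satisfaction scale are assumptions for illustration; they do not reproduce the actual DOW survey items.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FeedbackRecord:
    """One voluntarily completed client feedback survey.
    Optional fields stay None when a survivor skips an item."""
    service_type: str                                # e.g. "shelter", "support group"
    services_wanted: list[str] = field(default_factory=list)
    services_received: list[str] = field(default_factory=list)
    # The two new FVPSA outcome items (yes/no, skippable):
    more_safety_strategies: Optional[bool] = None    # "I know more ways to plan for my safety"
    more_resource_knowledge: Optional[bool] = None   # "I know more about community resources"
    felt_respected: Optional[bool] = None            # respect and support received
    overall_satisfaction: Optional[int] = None       # assumed 1-5 rating scale
    demographics: dict[str, str] = field(default_factory=dict)

# Example: a shelter resident who answered both outcome items
record = FeedbackRecord(
    service_type="shelter",
    services_wanted=["safety planning", "housing referral"],
    services_received=["safety planning"],
    more_safety_strategies=True,
    more_resource_knowledge=True,
)
```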
Why Not Just Ask the Two Outcome Questions?
• It would look odd to clients if a survey contained only two questions
• The usefulness of the information would be limited
• It would not give programs contextual information
• It would not capture important process information (such as respect and autonomy)
Examples of Results
• Data were submitted anonymously by programs that had entered their survey data
• The data represent survivors who received services across three states in 2007
Conclusions from Pilot
• Overall, survivors found the forms easy to fill out
• Overall, staff found the process relatively simple
• Programs found the information useful
• Some programs want fewer questions
Changes Made Based on Pilot
• We have created a “menu” of questions that programs can use to create their own surveys
• Databases have been created in Access and Excel for programs using the entire DOW forms
• We have created “cheat sheets” staff can use to remind them how to gather the information
• We are continuing to translate the forms into languages other than English
Collecting the New FVPSA Outcomes: Getting Started
• Getting staff buy-in
• Deciding who on staff will do what
• Deciding what questions to ask, how often to collect data, when to collect, and from whom
• Treating clients respectfully
Staff Buy-in
The Problem:
• Staff are generally already overworked and tired of paperwork that feels meaningless
• Staff often don’t understand why they have to collect the information they do, or what happens to it
• Staff often don’t ever see the tabulated information they DO collect
Getting Staff Buy-in
• Involve them in understanding how the information can be used by the program
• Explain the new requirement and have them participate in developing a protocol for gathering the information
• Share the findings with them periodically
• Discuss with them how to make program changes based on the findings
Deciding Who on Staff Will Do What
• The manual includes a form entitled CREATING A PLAN WITH STAFF FOR COLLECTING OUTCOME EVALUATION DATA
Data Collection Protocol
• Forms should be handy and visible to the staff who will hand them out to clients
• Staff should understand when and how to ask clients to participate
• Supervision of this process, especially in the beginning, is important
What Will Be Used?
• We recommend using the forms available at http://pubs.pcadv.net/FVPSA_Outcome/. At the login screen, type:
  User name: outcomes
  Password: outcomes
• If your program does not use these forms, incorporate the two questions into the forms it already uses
• It is important that we have consistent information to share with the FVPSA administrator
When Will Data Be Collected?
• Do not collect data while clients are in crisis
• Allow enough time for change to occur
• You can’t expect change to occur after, for example, a woman attends only one support group
• But collect often enough that you don’t miss clients who receive short-term services
How Often Will Data Be Collected?
It depends on the service:
• Close to exit for shelter residents
• Every 3-6 weeks for support groups and counseling
• Support services are the most difficult to time because you often don’t know when you’ve “finished.” Allow enough time for change to occur (at least two contacts with an advocate)