Rural Health Outreach Tracking and Evaluation Program: Identifying Impacts on the Health of Rural Communities
2010 Rural Health Care Services Outreach Program and Rural Health Network Development Program Grantee Meeting
Alana Knudson, PhD
August 2 ~ Washington, DC
Partners
• NORC Walsh Center for Rural Health Analysis
  • Michael Meit, MA, MPH
  • Alana Knudson, PhD
• University of Minnesota Rural Health Research Center
  • Ira Moscovice, PhD
  • Michelle Casey, MS
  • Walt Gregg, MPH
• National Rural Health Association
• National Organization of State Offices of Rural Health
• Rural Health Resource Center
Advisory Committee
• Linda Breland, RN, MPH
• Lisa Davis, MHA
• Lynette Dickson, MS
• Paul Duncan, PhD
• Sylvia Elexpuru, BSN, RN
• Amy Elizondo, MPH
• L. Gary Hart, PhD
• Terry Hill, MPA
• John Rugge, MD
• Chris Tilden, PhD
Overview of ORHP 330A Grant Evaluation Project
ORHP 330A Outreach Authority Grant Programs
• Created as part of the Public Health Service Act of 1991
• Under the authority of Section 301
• More than $460 million awarded since program inception
• Nearly 900 consortia projects have participated and sought to:
  • Expand rural health care access
  • Coordinate resources
  • Improve rural health care service quality
• Five grant programs operate under the authority of Section 330A:
  • Rural Health Care Services Outreach (Outreach)
  • Network Development Planning (Network Planning)
  • Rural Health Network Development (Network Development)
  • Small Health Care Provider Quality Improvement (Quality)
  • Delta States Rural Development Network (Delta)
Current Evaluation Projects
• Exploring the Performance Improvement and Measurement System (PIMS) Database
• Exploring Opportunities to Strengthen the Rural Health Network Development Planning Grant Program
• Identifying Common Evaluation Metrics Across the 330A Outreach Authority Grant Programs
• Applying Evidence-Based Models in Rural Communities
Background
• The Performance Improvement and Measurement System (PIMS) began collecting grantee data in 2009
• Contains measures on:
  • Access to care
  • Population demographics
  • Under- and uninsured individuals who receive care
  • Worker recruitment
  • Sustainability
  • Quality improvement
  • Clinical measures
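The PIMS schema itself is not reproduced in these slides. As a rough illustration only, a grantee-level record covering the measure categories above might look like the sketch below; every field name here is an assumption for illustration, not an actual PIMS field.

```python
from dataclasses import dataclass

# Hypothetical, simplified shape of one grantee's annual PIMS submission.
# Field names are illustrative assumptions only.
@dataclass
class PimsRecord:
    grantee_id: str        # grantee identifier
    reporting_year: int    # PIMS began collecting data in 2009
    rurality: str          # rurality category of the grantee (assumed field)
    uninsured_served: int  # under-/uninsured individuals who received care
    workers_recruited: int # workforce recruitment count
    quality_measures: dict # clinical / quality improvement measures by name
```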
Research Questions
• What steps are needed to prepare the data in order to accurately describe grant activities?
• What are the baseline frequencies for the measures?
• Do measures vary based on:
  • Rurality of grantees?
  • Duration of grant?
  • Types of entities participating in the networks?
  • Availability of health IT?
• What additional data elements could enhance understanding of how the grant program meets its goals?
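As a hedged illustration of the baseline and subgroup analyses these questions imply, the sketch below uses pandas on a hypothetical extract. The file name and column names ("rurality", "grant_year", "uninsured_served") are placeholders, not actual PIMS fields.

```python
import pandas as pd

# Hypothetical PIMS extract; the file and columns are assumptions for illustration.
pims = pd.read_csv("pims_grantee_measures.csv")

# Baseline frequencies / summary statistics for each measure
baseline = pims.describe(include="all")

# Do measures vary by rurality of the grantee?
by_rurality = pims.groupby("rurality")["uninsured_served"].agg(["mean", "median", "count"])

# ...and by duration (year) of the grant?
by_duration = pims.groupby("grant_year")["uninsured_served"].mean()

print(baseline, by_rurality, by_duration, sep="\n\n")
```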
Exploring Opportunities to Strengthen the Rural Health Network Development Planning Grant Program
Project Overview
• Identify key challenges to rural health network development and function through a review of the relevant literature and a structured dialogue with a panel of rural health network expert informants
• Identify cross-cutting themes for the 2006–2010 Rural Health Network Planning Grantees
• Assess the extent to which 2006–2008 Rural Health Network Planning Grantees have sustained operations to address local health care needs that would otherwise be difficult to address as individual providers
• Examine the role of grant review feedback, technical assistance, and other key factors in grantee success
• Identify and recommend opportunities to strengthen the Network Development Planning Grant guidance and review process
Research Questions
• What are the common cross-cutting themes, weaknesses, and strengths of the 2009 and 2010 Rural Health Network Development Planning Grant applications?
• Are the cross-cutting themes of the 2009 and 2010 awardees similar to or different from those of applicants for funding in 2006, 2007, and 2008?
• What are the common features of network collaborations most associated with sustained efforts addressing the health care needs of rural populations?
• What is the status of Network Planning grantees who did not subsequently receive funding from the Network Development Grant Program?
  • Did they apply for funding from other sources, and if so, did they submit a project proposal similar to their Network Planning application?
  • Did they become self-sufficient or cease to exist?
Research Questions (cont.)
• How helpful were the Network Planning Grant award and related resources (e.g., feedback from the grant review process; technical assistance from ORHP, the TA contractor, and/or SORH) to grantees?
• Are there other resources or venues that could have better assisted the applicants in making their efforts more successful?
• How can the Network Development Planning Grant program guidance and review process be strengthened to help ensure that selected grantees are successful in further developing and sustaining their networks?
Identifying Common Evaluation Metrics Across the 330A Outreach Authority Grant Programs
Project Overview
• Develop and refine a protocol to identify common data elements collected through grantee-level evaluation activities
• Goals:
  • Offer recommendations to guide grantee-level evaluation activities by identifying common elements within grant programs
  • Gather shared goals and promising practices to aid effective communication of program findings for each grant program
  • Help validate current PIMS measures by identifying common elements across grant programs
  • Offer recommendations for future measures that could be adopted both in grantee-performed self-evaluations and as additional ORHP PIMS measures
Research Questions
Two levels of research questions: within and across programs
• Within programs
  • What types of evaluation questions did grantees address in their program evaluations?
  • Is there consistency in the data collected within 330A Outreach Authority grant programs?
  • What factors facilitate or inhibit grantee data collection?
  • Are the ORHP PIMS measures used to inform self-evaluation activities?
  • Based on the results of the analysis, what program-specific metrics can be added to individual program evaluations and/or ORHP PIMS?
• Across programs
  • What are the common data elements collected across 330A Outreach Authority grant programs?
  • Based on the results of the analysis, what additional data elements should be included in the ORHP PIMS?
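One simple way to operationalize the cross-program question is a set intersection over the data elements each program's grantees report. The sketch below is purely illustrative; the element names and program coverage are made up for the example, not drawn from actual grantee evaluations.

```python
# Hypothetical data elements reported in each program's grantee self-evaluations.
# Names and coverage are illustrative assumptions only.
elements_by_program = {
    "Outreach": {"clients_served", "uninsured_served", "partner_count", "screenings"},
    "Network Development": {"partner_count", "shared_services", "clients_served"},
    "Quality": {"clinical_measures", "clients_served", "uninsured_served"},
}

# Elements common to every 330A program: candidates for shared metrics or PIMS additions
common = set.intersection(*elements_by_program.values())
print(sorted(common))  # -> ['clients_served'] with the example sets above
```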
Project Goals
• Identify evidence-based models that may benefit the 330A Outreach Authority Program grantees
• Document the scope of their use in the field
• Build an Evidence-Based Model Warehouse around topic areas specific to rural health
  • Year 1: Community Health Workers
What is evaluation?
e·val·u·ate
tr.v. e·val·u·at·ed, e·val·u·at·ing, e·val·u·ates
1. To ascertain or fix the value or worth of.
2. To examine and judge carefully; appraise. See Synonyms at estimate.
3. Mathematics: To calculate the numerical value of; express numerically.
Source: The Free Dictionary
What is evaluation? "The process of determining the merit, worth, or value of something.” Source: Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage. Program evaluation: The systematic and comprehensive approach to studying the effectiveness of programs for the purposes of program improvement, making judgments about the programs, informing decision making, and generating practical knowledge. Volkov, 2009
Why are evaluations important?
• To gain an understanding of program operation
  • To verify that you're doing what you think you're doing
• To document program effectiveness
• To examine the strengths and weaknesses of the program
• To inform decisions about future funding
Source: Free Management Library
Myths of Evaluation
• Evaluations must be scientifically rigorous
• Evaluations prove success or failure
• Evaluations must be conducted by evaluation experts
• Evaluations are cost-prohibitive
It all depends on the context of the program…
Key Considerations of Evaluation
• Why evaluate?
  • What do you want to be able to decide (or to do) as a result of the evaluation?
  • How are you planning to use the evaluation findings?
  • What is the purpose (or purposes) of the evaluation?
• Who are the audiences for the information from the evaluation? Who are the key stakeholders?
  • Clients, funders, board, management, staff, others?
Key Considerations of Evaluation
• What kinds of information are needed to make the decisions you need to make?
  • Information to really understand the process/implementation of the product or program (its inputs, activities, and outputs)
  • The clients who experience the product or program
  • The staff who work with the clients
  • Strengths and weaknesses of the product or program
  • Benefits to clients (outcomes)
  • Did the outcome achieve what the program intended?
• What data should be collected?
Evaluation Data Collection Options
• Questionnaires
• Surveys
• Checklists
• Interviews
• Document review
• Observation
• Focus groups
• Case studies
Key Considerations of Evaluation
• When is the information needed?
• What resources are available to collect and analyze the information?
• What resources are available to "package" the information for different audiences?
  • Funders, boards, clients, staff, other stakeholders
• What forums are available to share promising practices or lessons learned from the evaluation?
Types of Evaluation
• Goal-based evaluations
  • Is the program meeting predetermined goals or objectives?
• Process-based evaluations
  • How does the program produce its results?
• Outcomes-based evaluations
  • Are you really doing the right program activities to bring about the outcomes you believe (or better yet, have verified) to be needed by your clients?
Evaluation Overview
• Engage stakeholders in the evaluation process
• Develop and implement an evaluation strategy and/or plan
• Begin with the end in mind
• There is no "perfect" evaluation design
• Include interviews in your evaluation methods
• Track both successes and challenges
• Share your evaluation results and lessons learned with evaluation stakeholders
"One of the great mistakes is to judge policies and programs by their intentions rather than their results."
Milton Friedman, PhD
American Economist and Statistician
Contact Information

Alana Knudson, PhD
Co-Director, Walsh Center for Rural Health Analysis
301-634-9326
Knudson-alanal@norc.org

Michael Meit, MA, MPH
Co-Director, Walsh Center for Rural Health Analysis
301-634-9324
meit-michael@norc.org

Ira Moscovice, PhD
Mayo Professor and Division Head, Division of Health Policy and Management, University of Minnesota
612-624-8618
mosco001@umn.edu