Presentation Transcript


  1. Creating National Performance Indicators that are Relevant to Stakeholders: Participatory Methods for Indicator Development, Selection, and Refinement
  Demia L. Sundra, M.P.H., Margaret Gwaltney, M.B.A., Lynda A. Anderson, Ph.D., Ross Brownson, Ph.D., Jennifer Scherer, Ph.D.
  Evaluation 2004, November 3, 2004

  2. Contributors to Project DEFINE [Developing an Evaluation Framework: Insuring National Excellence]
  • Evaluation Advisory Group: Ross Brownson, Ph.D. (co-chair); Alan Cross, M.D.; Robert Goodman, Ph.D., M.P.H.; Richard Mack, Ph.D.; Randy Schwartz, M.S.P.H.; Tom Sims, M.A.; Avra Warsofsky; Carol White, M.P.H.
  • CDC: Lynda A. Anderson, Ph.D.; Robert Hancock; Demia Sundra, M.P.H.
  • COSMOS Corporation: Jennifer Scherer, Ph.D.; Margaret Gwaltney, M.B.A. (currently with Abt Associates); Thérèse van Houten, D.S.W.; Cynthia Carter
  • Concept Systems, Inc.: Dan McLinden, Ed.D.; Mary Kane

  3. Goals of Presentation
  • Describe the context in which the performance indicators for CDC’s Prevention Research Centers (PRC) program were developed
  • Review the participatory methodology used to develop and select national indicators
  • Discuss the benefits, challenges, and lessons learned from developing performance indicators in an established, diverse, national program

  4. Background and Context

  5. Features of CDC’s Prevention Research Centers Program
  • 33 academic-based extramural research centers across the United States
  • Academic centers partner with community and state organizations to develop, conduct, and disseminate prevention research through participatory methods
  • Diversity across centers:
    • When founded (newly funded to 18 years)
    • Community setting and partners
    • Focus of research

  6. Context for Developing a National Evaluation
  • National evaluation planning project initiated in response to:
    • Institute of Medicine (IOM) report on the PRC program
    • Support for evaluation at CDC
    • Growth of the program and related need for accountability
  • Project DEFINE goals (planning phase):
    • Engage stakeholders, develop a logic model and performance indicators, and draft an evaluation plan
    • Maintain a participatory and utilization-focused approach throughout

  7. Intended Purposes of Performance Indicators
  • Individual data on each PRC
    • Evaluation
    • Monitoring
    • Technical assistance needs
  • Cross-center summary data
    • Accountability
    • Program improvement
    • Information sharing and communications with internal and external stakeholders

  8. Anticipated Challenges in Developing PRC Indicators
  • Centers strive to achieve diverse health outcomes
  • Program had few previous cross-center requirements
  • Centers are at various stages of growth and maturity
  • Interests of diverse stakeholders had to be considered
  • Concern existed about how performance indicators would be used
  • Indicators had to be meaningful and impose minimal burden on PRCs in terms of time and cost

  9. Methodology

  10. Basis of Project DEFINE
  [Concept map graphic; cluster labels: Engage the Community, Diversity & Sensitivity, Research Methods, Relationships & Recognition, Research Agenda, Training, Technical Assistance, Core Expertise & Resources, Active Dissemination]
  • Concept mapping
    • Gained national and community perspectives on the PRC program through a 2-tiered approach
    • Engaged diverse stakeholders in brainstorming statements describing the PRC program
    • Statements analyzed to create visual maps of concepts (see the sketch below)
    • Concepts used to build draft logic models
    • Community and national logic models combined
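
The analysis step above follows the general concept-mapping approach: stakeholders sort the brainstormed statements, the sorts are turned into a similarity matrix, multidimensional scaling places statements on a two-dimensional point map, and hierarchical clustering groups the points into candidate concepts. The sketch below is a minimal illustration of that pipeline; the statements, sort data, and cluster count are hypothetical, not Project DEFINE data.

```python
# Minimal sketch of a concept-mapping analysis step, assuming each
# stakeholder has sorted the brainstormed statements into piles.
# Statements, sorts, and the cluster count are illustrative only.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

statements = [
    "Centers engage community partners in setting research priorities",
    "Training is provided to community health workers",
    "Findings are disseminated through peer-reviewed publications",
    "Research methods are appropriate for minority populations",
]

# One sort per stakeholder: statement index -> pile label.
sorts = [
    {0: "community", 1: "capacity", 2: "dissemination", 3: "methods"},
    {0: "community", 1: "community", 2: "dissemination", 3: "methods"},
]

# Similarity matrix: how often two statements were placed in the same pile.
n = len(statements)
similarity = np.zeros((n, n))
for sort in sorts:
    for i in range(n):
        for j in range(n):
            if sort[i] == sort[j]:
                similarity[i, j] += 1
distance = len(sorts) - similarity  # more co-sorting -> smaller distance

# Multidimensional scaling places statements on a two-dimensional point map.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)

# Hierarchical clustering groups nearby points into candidate concepts.
clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")
for label, text in zip(clusters, statements):
    print(label, text)
```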

  11. Development of Draft Indicators
  • More than 70 indicators first drafted
  • Mapped to all components of the program logic model
  • Some indicators dropped, others refined based on input received at regional meetings and PRC contextual visits
  • 52 candidate indicators remained on the list

  12. Stakeholder Recommendations from Regional Meetings
  • Select a limited number of indicators focused on features common across Centers
  • Collect data on some components of the logic model in other ways as part of the national evaluation, rather than through indicators
  • Develop indicators through an iterative process, with multiple opportunities for input
  • Link the performance indicators to the PRC Information System

  13. Stakeholder Selection of the National Performance Indicators
  • 52 indicators listed in a structured feedback tool (workbook)
  • All stakeholder groups provided feedback and comments
    • PRCs, Community, State, and CDC
  • Planned on having core and optional indicators

  14. Results of Performance Indicator Feedback
  • 100% response rate received on workbooks
  • Comments from workbooks summarized within each stakeholder group
  • 3 of the 4 stakeholder groups recommended 8 indicators (see the tally sketch below)
  • 2 of the 4 stakeholder groups recommended an additional 11 indicators
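
As a rough illustration of the tallying described on this slide, the following sketch counts how many of the four stakeholder groups recommended each workbook indicator. The indicator IDs and group votes are hypothetical, not the actual workbook results.

```python
# Hypothetical tally of workbook feedback: which indicators were recommended
# by at least 3 of the 4 stakeholder groups, and which by exactly 2.
from collections import Counter

recommendations = {
    "PRCs":      {"I-01", "I-02", "I-07", "I-12"},
    "Community": {"I-01", "I-02", "I-05", "I-07"},
    "State":     {"I-01", "I-07", "I-12", "I-20"},
    "CDC":       {"I-02", "I-05", "I-07", "I-20"},
}

votes = Counter(ind for group in recommendations.values() for ind in group)

broad_support = sorted(i for i, v in votes.items() if v >= 3)    # 3+ of 4 groups
partial_support = sorted(i for i, v in votes.items() if v == 2)  # exactly 2 groups

print("Recommended by 3 or more groups:", broad_support)
print("Recommended by 2 groups:", partial_support)
```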

  15. Resulting National Performance Indicators
  • Evaluation advisory group selected and refined 13 indicators based on:
    • Results and feedback from the workbook
    • Map of indicators across the logic model
    • Cross-walk of recommended indicators with IOM report recommendations
  • Indicators correspond to various logic model components (see the coverage sketch below), e.g.:
    • Community input on selecting health priorities (input)
    • Existence of explicit research agenda (activity)
    • Evidence of peer-reviewed publications (output)
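
One way to read the "map of indicators across the logic model" is as a coverage check: every logic model component should be addressed by at least one indicator. The sketch below illustrates that check using only the three example indicator-to-component pairs named on this slide; the component list itself is an assumed simplification.

```python
# Rough coverage check of indicators against logic model components.
# Only the three mappings below come from the slide; the component list
# is an illustrative assumption.
indicator_components = {
    "Community input on selecting health priorities": "input",
    "Existence of explicit research agenda": "activity",
    "Evidence of peer-reviewed publications": "output",
}

logic_model_components = ["input", "activity", "output", "outcome"]

covered = set(indicator_components.values())
missing = [c for c in logic_model_components if c not in covered]
print("Components without an indicator:", missing or "none")
```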

  16. Collecting the Information
  • Performance indicators were integrated into the information system that was in development
    • Conceptualized from the beginning
    • Reinforced through stakeholder feedback
  • Fields were created in the information system for each performance indicator (see the sketch below)
  • Information system was developed and reviewed by:
    • Evaluation Contractors
    • Centers and partners (usability and pilot tests)
    • CDC staff
    • Evaluation Advisory Group
    • PRC Steering Committee
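
The sketch below suggests one simple way an indicator could be represented as a field-backed record in an information system. The field names, example identifier, and entered value are hypothetical and are not the actual PRC Information System schema.

```python
# Hypothetical representation of a performance-indicator field in an
# information system. Names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class IndicatorRecord:
    indicator_id: str            # e.g., "PI-07" (made-up identifier)
    description: str             # what each center reports on
    logic_model_component: str   # input, activity, or output
    data_type: str               # how the entry is captured
    entries: list = field(default_factory=list)

publications = IndicatorRecord(
    indicator_id="PI-07",
    description="Evidence of peer-reviewed publications",
    logic_model_component="output",
    data_type="citation list",
)

# A center enters its data into the corresponding field.
publications.entries.append("Example citation entered by a center")
print(publications)
```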

  17. Core and Optional Indicators
  • Only core indicators were developed through Project DEFINE. Consensus allowed us to:
    • Focus on 13 indicators
    • Not use resources for optional indicator development and integration into the information system
  • 11 out of 28 PRCs developed center-specific indicators on their own
    • Topic areas such as community satisfaction with partnership; funding generated; web site hits; infrastructure measures; research methods appropriate for minority populations

  18. PRC Performance Indicators: Summary
  • Specific component requirements across all grantees
  • Indicators reflect both process and outcome measures, with focus on process
  • Initial requirements as part of new funding cycle
  • Prospective evaluation
  • Assess general information across PRCs rather than specific health topics
    • Defining common outcomes, e.g., community capacity
  • Indicators will be refined during evaluation implementation

  19. Challenges, Benefits, and Lessons Learned

  20. Current Challenges with Performance Indicators
  • Requests for more guidance on how to further define indicators and collect data
  • Development of summary reports and provision of feedback to all stakeholders
  • Need to increase specificity of indicators over time
  • Balance between participatory processes and program requirements

  21. Benefits of Participatory Approach for Performance Indicator Development
  • Buy-in, support, and ownership of indicators
  • Evaluation advisory group was critical for trust and support from larger stakeholder groups
  • Community voice is reflected in indicators
  • Perspective of the PRCs’ staff and partners reflected in utility and feasibility issues surrounding indicators and the information system

  22. Lessons Learned and Recommendations
  • Utilize participatory methods for selecting and refining indicators to increase stakeholder support
  • Build sufficient time into the schedule to allow multiple opportunities for stakeholder input
  • Acknowledge the inherent challenge in developing indicators for an established program
  • Include community input in indicator development to increase accountability to partners

  23. For more information on the Prevention Research Centers Program
  • http://www.cdc.gov/prc/
    • Click on “about the program” to view the Conceptual Framework (logic model) and narrative
  • Contact information:
    • Demia Sundra: dsundra@cdc.gov

  24. PRC IS: General Information Page

  25. PRC IS: Community Committees

  26. PRC IS: Health Priorities
