
Readiness for Program Evaluation


Presentation Transcript


  1. Readiness for Program Evaluation Assessing and supporting readiness for evaluation among community partners delivering Healthy Weight Programs Session 2: Collaborative Learning Project

  2. Session 1 We Considered… • Purpose of evaluation • Types of program evaluations • Logic models and framework for evaluation • Standards and a step-by-step process for evaluation

  3. Today We Will… • Conceptualize a framework for thinking about a healthy weight program’s (HWP) readiness for evaluation • Learn how to assess an HWP’s readiness for evaluation using existing tools • Anticipate and plan for challenges that might limit readiness when evaluating in community settings • Review case studies to see “getting ready to evaluate” in action!

  4. Conceptual Frameworks What does it mean to be “ready” to conduct an evaluation? Session 2: Collaborative Learning Project

  5. Framework 1: Ready, Set, Change! Holt et al., 2010.

  6. Framework 2: The NIRN Hexagon Metz, A. & Louison, L. (2018). The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill.

  7. Assessing Readiness to Conduct an Evaluation Finding strengths and areas for improvement to prepare for a robust evaluation Session 2: Collaborative Learning Project

  8. Wilder Collaboration Inventory Mattessich and Johnson, 2018

  9. Wilder Scoring
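As a rough illustration of what "Wilder Scoring" involves: the inventory groups Likert-scale items (1-5) into collaboration factors, each factor is scored as the mean of its items, and the commonly cited interpretation bands treat factor means of 4.0 and above as strengths and 2.9 and below as concerns. The sketch below uses hypothetical factor names and responses; the bands are assumptions drawn from the published scoring guidance, not from these slides.

```python
# Sketch of factor-level scoring for a Wilder-style collaboration inventory.
# Responses and factor names are hypothetical; interpretation bands assume the
# commonly cited guidance (>= 4.0 strength, <= 2.9 concern, otherwise borderline).
from statistics import mean

# Hypothetical data: factor name -> item scores averaged across respondents
responses = {
    "Mutual respect and trust": [4.2, 4.5, 4.1],
    "Open and frequent communication": [2.7, 2.5, 3.0],
    "Sufficient funds, staff, materials, and time": [3.2, 3.4],
}

def score_factors(responses):
    """Return (factor, mean item score, interpretation) for each factor."""
    results = []
    for factor, items in responses.items():
        m = round(mean(items), 2)
        if m >= 4.0:
            label = "strength"
        elif m <= 2.9:
            label = "concern"
        else:
            label = "borderline"
        results.append((factor, m, label))
    return results

for factor, m, label in score_factors(responses):
    print(f"{factor}: {m} ({label})")
```

A program team could run this over its own survey export to see at a glance which collaboration factors need attention before an evaluation begins.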

  10. Organizational Readiness for Implementing Change (ORIC) Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. (2014). Organizational readiness for implementing change: a psychometric assessment of a new measure. Implementation Science.

  11. Measures: Other • Individual Readiness for Organizational Change (Holt et al., 2007) • Perceived Organizational Readiness for Change (Cinite et al., 2009) • Organizational Change Recipients’ Beliefs Scale (Armenakis et al., 2007) • Organizational Change Questionnaire – Process, Context, and Readiness (Bouckenooghe et al., 2009) • TCU Organizational Readiness for Change (Lehman et al., 2002)

  12. The Right Assessment for Your Program • http://readiness.knowledgetranslation.ca/

  13. Preparing to Evaluate in a Community-Based Setting Differences from traditional research and strategies to address common challenges Session 2: Collaborative Learning Project

  14. Common Challenges Faced When Evaluating in Community-Based Settings

  15. Active Recreation through Community-Healthcare Engagement Study (ARCHES)

  16. Mapping Challenges to Solutions

  17. Case Study 1 • Evaluation Readiness: Start with the Big Picture • Ihuoma Eneli, MD, MS • Center for Healthy Weight and Nutrition • Nationwide Children’s Hospital • Columbus, Ohio Session 2: Collaborative Learning Project

  18. Evaluation Readiness: Establishing the Need and Urgency • The need for evaluation must be driven by clarity about the purpose, goals, and survivability of the program • All team members have to appreciate their role in attaining the program’s purpose, goals, and survivability • Starts with defining a mission and vision (or goals) for the program • Link to a larger organization

  19. Phases and Types of Evaluation: Formative and Summative. Based on slides from Jennifer Nichols, Porter Novelli. **Credit to NCCOR Collaborative Learning Session 1

  20. Evaluability Assessment (EA) • Developed by Joseph Wholey in 1979 • Systematic process that helps identify whether program evaluation is justified, feasible, and likely to provide useful information • Cheaper than a full evaluation (10% rule of thumb) • Can provide useful information whether or not a program has an evaluation plan • Provides stakeholders with valuable information on adapting the program structure to support future evaluations • Ideally conducted by an external evaluator BUT, depending on resources, can be conducted by program staff

  21. Examples of Evaluability Questions • Does the program serve the population for whom it was designed? • Does the program have the resources described in the program design? • Are the program activities being implemented as designed? • Is there a logical connection between activities and the goals and objectives? • Does the program have the capacity to provide data for an evaluation? • How well are data collected? Are they reliable? Is the program capable of collecting and managing the data needed for an evaluation? • Who enters the data? What are their qualifications?

  22. Evaluability Case Study: Program Model Not Sound • Community-based afterschool program with programming targeted at healthy eating and physical activity • Outcome measures: fitness level and change in weight • Evaluability assessment findings: • No reliable documentation from programs to assess delivery of the lessons • Evaluation periods not standardized • Attendance was routinely tracked • Only 15% of children had both pre- and post-testing • Full evaluation plan suspended
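The 15% pre/post figure above is exactly the kind of number an evaluability assessment surfaces by linking testing rosters. A minimal sketch of that check, using entirely hypothetical child IDs (a real assessment would join the program's actual rosters):

```python
# Sketch: what fraction of enrolled children have BOTH pre- and post-test records?
# All IDs are hypothetical placeholders for a program's roster data.
pre_tested = {"c01", "c02", "c03", "c04", "c05", "c06"}
post_tested = {"c02", "c05", "c07"}
enrolled = pre_tested | post_tested | {"c08", "c09"}  # some children never tested

matched = pre_tested & post_tested          # children with both measurements
match_rate = len(matched) / len(enrolled)   # denominator: all enrolled children
print(f"{len(matched)} of {len(enrolled)} children matched ({match_rate:.0%})")
```

A match rate this low is a signal, as in the case study, that an outcome evaluation built on pre/post change is not yet feasible.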

  23. The Primary Care Obesity Network (PCON) • Network of primary care practices linked to a tertiary care obesity center at a children’s hospital to prevent and treat childhood obesity • Driven by policy initiative in the State of Ohio • Embedded into a tertiary care obesity center • Goals in both prevention and treatment • Functions as an outreach program for the hospital and center Eneli, IU, et al. (2017). The primary care obesity network: Translating expert committee guidelines on childhood obesity into practice. Clinical Pediatrics.

  24. PCON Model of Care • The Tertiary Care Hub: • Provides training and support to practices • Serves as integrator between the community and practices • Provides access to nutrition services • Stage 3 & 4 referral center • The Practices (PC): • Participate in prevention activities geared to all patients • Stage 1 & 2 support for patients • Catch patients earlier and refer as needed to tertiary care (Hub) • Link patients to community resources • The Community: • Provides community-based resources to practices • Participates in educational activities for the network, patients, and families

  25. Readiness for Evaluation: Examples of Questions Metz, A. & Louison, L. (2018) The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill.

  26. Evaluation Plan: Outcomes • Patient/client outcome: Will the proposed program or change affect the patient’s knowledge, health, or well-being? • Process outcome: Will implementing the program or change go as planned? • Balancing outcome: Will the proposed change lead to a negative outcome in another part of the process? Eneli IU, Cannon-Smith G. Systems of Care and Quality Improvement: Guiding the Care of the Child with Obesity. Clinical Care of the Child with Obesity: A Learner’s and Teacher’s Guide.

  27. Program Outcomes Eneli, IU, et al. (2017). The primary care obesity network: Translating expert committee guidelines on childhood obesity into practice. Clinical Pediatrics.

  28. Case Study 2 Evaluation Readiness: Drilling Down to the Nitty Gritty • Alexis Tindall MHA, RD, LD Center for Healthy Weight and Nutrition Nationwide Children’s Hospital Columbus, Ohio Session 2: Collaborative Learning Project

  29. Organizational Readiness • The extent to which individuals or groups are psychologically and behaviorally prepared to implement a specific program or practice • A key determinant of implementation success and a mediator of the effectiveness of implementation interventions

  30. Ready, Set, Change! Holt et al., 2010.

  31. Understanding Where You Are At The State of Childhood Obesity in Franklin County *Includes youth ages 10-17. (Remaining body mass index percentages include youth ages 6-18.) http://www.centralohiohospitals.org/documents/HealthMap_2016.pdf

  32. Translating to Columbus Healthy Weight Quick Facts • Children 2 and older • BMI at or above 95th percentile • Monthly visits • Individualized treatment plan • Physician or NP • Dietitian • Physical therapy • Social work/psychology • New U evening group programs—child and parent • ATC • Dietitian • +/- Psych • Bariatric program

  33. Strategic Roadmap

  34. The “Keys” for Assessing Program Evaluation Readiness • Consider “The Who” • Community Awareness • Physician practices, community organizations (e.g., YMCA), after school programs, stakeholders • Organizational Awareness • Leadership support, relationship building with other sub-specialists

  35. The “Keys” for Assessing Program Evaluation Readiness • Consider “The How” • Staffing needs (MD, RD, Psych, Health Coach, etc.) and the requested number of FTEs per discipline • Training and knowledge expectations • Resources needed (outside of staffing) • Space, equipment, materials, sites • Infrastructure • Electronic health record, Excel, Access

  36. The “Keys” for Assessing Program Readiness for Managing Data • Has the organization/program identified and prioritized its desired results? • Is there an established means to measure progress toward those results, or can it be created? • Is there a process for tracking and measuring progress toward its desired results, including an effective means to display data? • Is there a communication plan so that individuals working to achieve the desired results can exchange information and provide ongoing feedback? • Is there an established plan to periodically review progress? Monthly? Quarterly? • Is there a process to intervene when needed as a means to improve progress? (HRSA, n.d.)
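The tracking questions in this checklist can be satisfied with even a very small routine, regardless of whether the underlying store is an electronic health record, Excel, or Access. The sketch below uses a hypothetical target and hypothetical monthly counts to show cumulative progress toward a desired result, the kind of display a periodic review would look at.

```python
# Sketch: minimal progress tracking toward a desired result.
# Target and monthly counts are hypothetical; a real program would pull these
# from its own data system (EHR, Excel, Access, etc.).
from datetime import date

def progress_report(monthly_counts, target):
    """Return (month, cumulative count, fraction of target) rows in date order."""
    rows, cumulative = [], 0
    for month in sorted(monthly_counts):
        cumulative += monthly_counts[month]
        rows.append((month, cumulative, cumulative / target))
    return rows

# Hypothetical goal: 120 completed referrals by year end
monthly = {date(2019, 1, 1): 8, date(2019, 2, 1): 11, date(2019, 3, 1): 15}
for month, total, pct in progress_report(monthly, target=120):
    print(f"{month:%b %Y}: {total}/120 ({pct:.0%} of goal)")
```

Reviewing such a report monthly or quarterly is one concrete way to answer the checklist's "established plan to periodically review progress" question.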

  37. Evaluation is a Continuous Cycle • Continuous Quality Improvement (CQI) cycle • Planning – What actions will best reach our goals and objectives? • Performance measurement – How are we doing? • Evaluation – Why are we doing well or poorly? What do we do? ** Credit to NCCOR Session 1

  38. Staying Accountable IHI Model for Improvement • Three fundamental questions • What are we trying to accomplish? • How will we know that a change is an improvement? • What changes can we make that will result in improvement? • The Plan-Do-Study-Act (PDSA) cycle. • Key Driver Diagram (KDD)

  39. Weiner’s Theory of Organizational Readiness to Change • Change commitment: intent to change is shared • Change efficacy: shared belief that the group can jointly make the change • Change valence: how much the change is valued • Informational assessment: perception that resources are available to implement the change • Context: organizational culture, climate, resources, structure, and past experiences with implementing change

  40. Bonus Case Study: ARCHES Session 2: Collaborative Learning Project

  41. Active Recreation through Community-Healthcare Engagement Study (ARCHES) • Model to Evaluate: Bull City Fit, a clinic–parks and recreation partnership to deliver child obesity treatment • Goal: Implement and evaluate the model in 6 new community settings • Challenges: • No existing relationships in the new counties • Unclear need/priority of child obesity in the counties • No known relationships between clinics and parks/recreation centers in the communities • Likely very low capacity for new data collection • Unknown electronic medical record capacity • Resources: • Funding for a local “connector” and an equipment grant • The “Trailguide” (curriculum) and our team for technical support

  42. ARCHES Project Aims (overall pathway: children and teens with obesity → improved health) • Aim 1: Evaluate the implementation feasibility of an integrated clinic-community child obesity treatment program • Aim 2: Evaluate the clinic-community model, including its ability to meet existing child obesity treatment recommendations • Aim 3: Report patient outcomes associated with participation in the integrated model

  43. Current and Planned Sites

  44. Readiness for Evaluation • Chose sites intentionally • 1:1 conversations with key clinic and community stakeholders • Large group “kickoff” meeting with both clinic and community partners • Site visits • Administered Wilder Collaboration Inventory (pre/post)

  45. Assessment • Site: • Providers say they need something for kids with obesity; parks and recreation staff express a desire to meet the “wellness” domain • Lots of discussion about the best-“fit” recreation center • Discussion about past successes, what sites need from us, and whether they have capacity

  46. Assessment • Intervention: • Discussion of readiness to implement – staff to refer from the clinic and staff to deliver the program • Opportunity to discuss the evidence base for the program • Each site has different resource needs; the key resource identified is personnel – a referral coordinator on the clinic side, program staff on the community side

  47. Identified Gaps and Solutions • Gap: Referrals cannot be EMR-embedded, must be HIPAA-compliant, and paper forms will slow down clinicians • Solution: Created a standardized referral form that can double as a letter, with a line for parent signature • Gap: Program staff have limited skills for entering data • Solution: REDCap data entry page simplified and streamlined; only requires attendance and a program description • Gap: The IRB process is very intimidating • Solution: A research assistant helps staff through CITI modules and minimizes the number needed • Gap: Informed consent requirements leading to low accrual • Solution: E-consent, delivered by video
