
Performance Monitoring: Thoughts, Lessons, and Other Practical Considerations


Presentation Transcript


  1. Performance Monitoring: Thoughts, Lessons, and Other Practical Considerations

  2. Objectives
  • Identify key concepts
  • Discuss applications
  • Address questions and concerns

  3. Key Concepts
  • Performance monitoring
  • What you do
  • How well you do it
  • Do you accomplish something?
  • Process, quality, capacity, outcomes
  • The window
  • Baselines and standards
  • Risk or case mix adjustment

  4. Performance Monitoring
  • Part of a much larger cycle of program design and implementation
  • Performance - this is about definitions
  • Inputs, outputs, or the relationship between inputs and outputs?
  • Monitoring - this is about data collection and analysis
  • Important with respect to investment - are you getting something back?

  5. The Framework

  6. Process of Care
  • Referral, intake, and assessment
  • Service planning, link to interventions
  • Reassessment, follow-up, case closure

  7. Quality of Care
  • Human resources
  • Physical plant and equipment
  • Practice protocols - evidence base
  • Supervision
  • Consumer feedback
  • Agency management around practice model fidelity

  8. Capacity
  • Enough trained workers
  • Enough office space
  • Enough funding
  • Enough information
  • "Enough" is defined by the relationship between process, quality, and outcomes

  9. Outcomes
  • Depends on the program and intervention
  • Well-being
  • Safety
  • Family provides stable nurturing
  • Education
  • Health
  • Behavioral health

  10. Process, Quality, and Outcomes
  • Highly interdependent
  • Quality depends on a process
  • Process is different from quality
  • Quality without outcomes is 'inefficient'
  • Agencies invest in process, quality, and capacity

  11. The Window
  • Performance happens in time
  • Improvement is change in performance over time
  • Sampling in time is difficult but critical

  12. Clinical Experience in Time: (Each line represents the start and end of service within the window)

  13. Sampling
  • Inception
  • Process vs. child
  • How much time do you have to observe the process?
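To make the window and sampling ideas concrete, here is a minimal Python sketch of inception-style sampling; the case records, dates, and field names are invented for illustration. It keeps only cases whose service began inside the monitoring window and shows how much of each case's process can be observed before the window closes.

```python
from datetime import date

# Hypothetical case records: service start and (possibly open) end dates.
cases = [
    {"id": "A", "start": date(2023, 2, 1),  "end": date(2023, 7, 15)},
    {"id": "B", "start": date(2022, 11, 3), "end": date(2023, 4, 20)},
    {"id": "C", "start": date(2023, 9, 10), "end": None},  # still open
]

# The monitoring window: performance is measured for entries in this period.
window_start, window_end = date(2023, 1, 1), date(2023, 12, 31)

# Inception (entry-cohort) sampling: include only cases that started inside
# the window, so each case is observed from the beginning of its process
# rather than picked up mid-stream.
cohort = [c for c in cases if window_start <= c["start"] <= window_end]

for c in cohort:
    # Observation is censored at the window end for cases still open.
    observed_until = min(c["end"] or window_end, window_end)
    days_observed = (observed_until - c["start"]).days
    print(c["id"], "observed for", days_observed, "days within the window")
```

Cases still open when the window closes are censored there, which is the practical face of "how much time do you have to observe the process?"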

  14. Baselines and Standards
  • Baseline is a measure taken prior to intervention
  • Standards of practice and performance
  • The usual, as in "standard practice"
  • Fidelity or compliance
  • Standards are better suited to process and quality; baselines are better suited to outcomes
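As a small illustration of the distinction, the sketch below (all numbers made up) checks a process measure against a fixed practice standard and an outcome measure against its pre-intervention baseline.

```python
# Hypothetical figures for illustration only.
assessments_on_time = 182          # process: assessments completed within 30 days
assessments_due = 200
process_standard = 0.90            # fixed practice standard (fidelity/compliance)

process_rate = assessments_on_time / assessments_due
print(f"Process compliance: {process_rate:.0%} "
      f"({'meets' if process_rate >= process_standard else 'below'} the {process_standard:.0%} standard)")

baseline_reentry_rate = 0.14       # outcome measured before the intervention
current_reentry_rate = 0.11        # outcome measured during the monitoring window

# Outcomes are judged against the baseline: improvement is the change over time.
change = current_reentry_rate - baseline_reentry_rate
print(f"Re-entry rate moved {change:+.1%} relative to baseline")
```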

  15. Risk or Case Mix Adjustment
  • An important question when facing variation in performance: is the variation a function of performance or the result of client differences?
  • Children/families have different outcomes for reasons that are intrinsic to them
  • Baseline mortality rates differ by age
  • Adjustment for case mix refers to taking the intrinsic differences into account somehow when measuring outcomes

  16. Case Mix Adjustment Applied
  • Case mix adjustment makes more sense for outcomes, less so for process and quality
  • Process/quality standards apply to all children, given the process standard applies in the first instance (differential diagnosis)
  • Baselines for outcomes should be adjusted
  • Standards don't work as well for outcomes because of the random component
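One simple adjustment technique is direct standardization. The sketch below (with invented rates and age mixes) recomputes each agency's outcome rate as if both served the same reference mix of ages, so remaining differences are more plausibly about performance than about clients.

```python
# Hypothetical age-specific outcome rates (e.g., re-entry) for two agencies.
agency_rates = {
    "Agency 1": {"0-5": 0.20, "6-12": 0.10, "13-17": 0.05},
    "Agency 2": {"0-5": 0.18, "6-12": 0.09, "13-17": 0.04},
}

# Each agency's own caseload mix (proportion of cases in each age group).
caseload_mix = {
    "Agency 1": {"0-5": 0.20, "6-12": 0.30, "13-17": 0.50},
    "Agency 2": {"0-5": 0.60, "6-12": 0.30, "13-17": 0.10},
}

# A common reference mix used to standardize both agencies.
reference_mix = {"0-5": 0.40, "6-12": 0.35, "13-17": 0.25}

for agency, rates in agency_rates.items():
    # Crude rate: weighted by the agency's own caseload mix.
    crude = sum(rates[g] * caseload_mix[agency][g] for g in rates)
    # Adjusted rate: weighted by the shared reference mix.
    adjusted = sum(rates[g] * reference_mix[g] for g in rates)
    print(f"{agency}: crude {crude:.1%}, case-mix adjusted {adjusted:.1%}")
```

On these made-up numbers, the agency with the higher crude rate simply serves far more young children; once the age mix is held constant its adjusted rate is lower, which is the kind of reversal adjustment is meant to surface.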

  17. Comments, Questions, Concerns
  Thank you!
