
Performance Improvement Project Validation Process Outcome Focused Scoring Methodology and Critical Analysis



Presentation Transcript


  1. Performance Improvement Project Validation Process Outcome Focused Scoring Methodology and Critical Analysis Presenter: Christi Melendez, RN, CPHQ Associate Director, PIP Review Team

  2. CMS PIP Protocol Changes Activities III, IV, VII, and VIII have been reversed in order. • Activity III: Use a Representative and Generalizable Study Population • Activity IV: Select the Study Indicator(s) • Activity VII: Data Analysis and Interpretation of Results • Activity VIII: Improvement Strategies

  3. Activity I: Choose the Study Topic HSAG Evaluation Elements: The Study Topic • Is selected following collection and analysis of data (critical element) • Has the potential to affect member health, outcomes of care, functional status, or satisfaction

  4. Activity II: State the Study Question HSAG Evaluation Elements: The Study Question • States the problem to be studied in simple terms and is in the recommended X/Y format (critical element)

  5. Activity III: Identify the Study Population • HSAG Evaluation Elements: The Study Population • Is accurately and completely defined and captures all members to whom the study question applies (critical element)

  6. Activity IV: Select the Study Indicator HSAG Evaluation Elements: The Study Indicator • Is well-defined, objective, and measures changes in health or functional status, consumer satisfaction, or valid process alternatives (critical element) • Includes the basis on which the indicator was adopted, if internally developed • Allows for the study question to be answered (critical element)

  7. Activity V: Use Valid Sampling Techniques* • HSAG Evaluation Elements: Sampling Techniques • Specify the measurement period for the sampling methods used • Provide the title of the applicable study indicator • Identify the population size • Identify the sample size (critical element) • Specify the margin of error and confidence level • Describe in detail the methods used to select the sample • * Activity V is only scored if sampling techniques were used.
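The margin of error and confidence level specified in Activity V jointly determine the minimum sample size. The deck does not prescribe a formula, but a common approach for rate-based indicators is the standard sample-size calculation for a proportion with a finite-population correction; the Python sketch below (with illustrative numbers, not HSAG requirements) assumes that approach:

```python
import math

def required_sample_size(population_size, margin_of_error=0.05,
                         confidence_z=1.96, expected_proportion=0.5):
    """Minimum sample size for estimating a proportion.

    Uses the standard formula n0 = z^2 * p * (1 - p) / e^2,
    then applies a finite-population correction.
    expected_proportion=0.5 is the conservative (worst-case) choice.
    """
    n0 = (confidence_z ** 2) * expected_proportion \
        * (1 - expected_proportion) / (margin_of_error ** 2)
    # Finite-population correction: a small population needs fewer records.
    n = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n)

# e.g. a study population of 5,000 members, 95% confidence, ±5% margin
print(required_sample_size(5000))  # → 357
```

With the conservative expected proportion of 0.5, a 95% confidence level, and a ±5% margin of error, the familiar n ≈ 384 emerges for large populations; the finite-population correction reduces it for smaller ones.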

  8. Activity VI: Define Data Collection • HSAG Evaluation Elements: Data Collection • The data collection procedures: • Identify the data elements to be collected • Include a defined and systematic process for collecting baseline and remeasurement data

  9. Activity VI: Define Data Collection • HSAG Evaluation Elements: Data Collection • The manual data collection procedures: • Include the qualifications of staff member(s) collecting manual data • Include a manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications (critical element)

  10. Activity VI: Define Data Collection • HSAG Evaluation Elements: Data Collection • The administrative data collection procedures: • Include an estimated degree of administrative data completeness • Describe the data analysis plan

  11. Activity VII: Analyze Data and Interpret Study Results • HSAG Evaluation Elements: Data Analysis • Is conducted according to the data analysis plan in the study design • Allows for the generalization of results to the study population if a sample was selected (critical element) • Identifies factors that threaten internal or external validity of findings • Includes an interpretation of findings • * Evaluation Elements 1-5 in Activity VII are scored for PIPs that provide baseline data.

  12. Activity VII: Analyze Data and Interpret Study Results • HSAG Evaluation Elements: Interpretation of Study Results • Is presented in a way that provides accurate, clear, easily understood information (critical element) • Identifies the initial measurement and the remeasurement of study indicators • Identifies statistical differences between the initial measurement and the remeasurement • Identifies factors that affect the ability to compare the initial measurement with the remeasurement • Includes an interpretation of the extent to which the study was successful

  13. Activity VIII: Implementing Interventions and Improvement Strategies • HSAG Evaluation Elements: Improvement Strategies • Are related to causes/barriers identified through data analysis and quality improvement processes (critical element) • Are system changes that are likely to induce permanent change • Are revised if the original interventions are not successful • Are standardized and monitored if interventions are successful

  14. Activity IX: Real Improvement* • HSAG Evaluation Elements: Report Improvement • The remeasurement methodology is the same as the baseline methodology • There is documented improvement in processes or outcomes of care • There is statistical evidence that observed improvement is true improvement over baseline (critical element) • The improvement appears to be the result of planned intervention(s) • * Activity IX is scored when the PIP has progressed to Remeasurement 1 and will be scored on an annual basis until statistically significant improvement is achieved from baseline to a subsequent remeasurement for all study indicators. Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP.
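The critical element in Activity IX hinges on statistical evidence that the observed improvement over baseline is real, not chance variation. The deck does not name a specific test, but for rate-based study indicators a two-proportion z-test comparing the baseline and remeasurement rates is a common choice; the Python sketch below, using made-up numbers, assumes that approach:

```python
import math

def two_proportion_z_test(successes_base, n_base, successes_remeas, n_remeas):
    """Two-sided two-proportion z-test comparing a baseline rate
    with a remeasurement rate. Returns (z, p_value).

    Uses the pooled-proportion standard error; appropriate when both
    samples are large enough for the normal approximation.
    """
    p1 = successes_base / n_base
    p2 = successes_remeas / n_remeas
    pooled = (successes_base + successes_remeas) / (n_base + n_remeas)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_base + 1 / n_remeas))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g. a well-child visit rate moved from 60% (300/500) to 67% (335/500)
z, p = two_proportion_z_test(300, 500, 335, 500)
print(round(z, 2), round(p, 3))  # z ≈ 2.30, p ≈ 0.022: significant at 0.05
```

A p-value below 0.05 would support scoring Evaluation Element 3 as Met for that indicator; under the methodology above, every study indicator must clear this bar.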

  15. Activity X: Sustained Improvement* • HSAG Evaluation Elements: Sustained Improvement • Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant (critical element) • * HSAG will not validate Activity X until statistically significant improvement has been achieved across all study indicators. Once statistically significant improvement is achieved, the MCO will need to document a subsequent remeasurement period demonstrating that the improvement was sustained in order to receive an overall Met validation status.

  16. PIP Tool Format
  Old Tool Format:
  • 10 Activities
  • 53 Evaluation Elements
  • 13 Critical Elements
  • Activity VII: Interventions
  • Activity VIII: Data Analysis
  New Tool Format:
  • 10 Activities
  • 37 Evaluation Elements
  • 12 Critical Elements
  • Activity III: Study Population
  • Activity IV: Study Indicator(s)
  • Activity VII: Data Analysis
  • Activity VIII: Interventions

  17. Outcome Focused PIP Scoring HSAG Evaluation Tool • 37 Evaluation Elements Total • 12 Critical Elements (CE) • Activity I: 1 CE • Activity II: 1 CE • Activity III: 1 CE • Activity IV: 2 CE • Activity V: 1 CE • Activity VI: 1 CE • Activity VII: 2 CE • Activity VIII: 1 CE • Activity IX: 1 CE • Activity X: 1 CE

  18. Outcome Focused PIP Scoring Changes • Activity VII • Evaluation Element 5 is critical • MCOs should ensure that data reported in all PIPs are accurate and align with what has been reported in their IDSS. • Activity IX • Evaluation Elements 3 and 4 have been reversed • New criteria for scoring Activity IX • Activity X • New criteria for scoring Activity X

  19. Activity IX: Outcome Focused PIP Scoring • HSAG Evaluation Elements: Assessing for Real Improvement • The remeasurement methodology is the same as the baseline methodology • There is documented improvement in processes or outcomes of care • There is statistical evidence that observed improvement is true improvement over baseline and across all study indicators • The improvement appears to be the result of planned intervention(s)

  20. Activity X: Outcome Focused PIP Scoring • HSAG Evaluation Elements: Assessing for Sustained Improvement • Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant across all study indicators

  21. Outcome Focused PIP Scoring • Activity IX • Repeated measurement of the indicators demonstrates meaningful change in performance • Improvement must be statistically significant for all study indicators to receive an overall Met validation status • Is scored on an annual basis until statistically significant improvement over baseline has been achieved for all study indicators • Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP • Evaluation Elements 3 and 4 are linked

  22. Outcome Focused PIP Scoring • Activity X • Repeated measurement of the indicators demonstrates sustained improvement • HSAG will not validate Activity X until Evaluation Element 3 of Activity IX is Met • Once statistically significant improvement has been achieved for all indicators, the MCO will need to document a subsequent measurement period demonstrating sustained improvement in order to receive a Met in Activity X

  23. Outcome Focused PIP Rationale • Overall Met Validation Status • The changes align the actual outcomes of the project with the overall validation status • Emphasis on statistically significant, sustained improvement in outcomes

  24. Critical Analysis HSAG will be evaluating whether or not… • A current causal/barrier analysis was completed. MCOs should conduct an annual causal/barrier and drill-down analysis in addition to periodic analyses of their most recent data. MCOs should include the updated causal/barrier analysis outcomes in their PIPs.

  25. Critical Analysis HSAG will be evaluating whether or not… • Barriers and interventions were relevant to the focus of the study and can impact the study indicator(s) outcomes

  26. Critical Analysis For any intervention implemented, the MCO should have a process in place to evaluate the efficacy of the intervention to determine if it is having the desired effect. This evaluation process should be detailed in the PIP documentation. If the interventions are not having the desired effect, the MCO should discuss how it will be addressing these deficiencies and what changes will be made to its improvement strategies.

  27. Critical Analysis The MCO should ensure that the intervention(s) implemented will impact the study indicator(s) outcomes. • Member-focused interventions will not impact a study indicator measuring the quality of service provided by a PCP (e.g., the WCC HEDIS measure in the Childhood Obesity PIP) • Interventions focused on educating MCO staff on HEDIS measures will not impact members accessing care and seeking well-child visits

  28. Critical Analysis The MCO should be cognizant of the timing of interventions. Interventions implemented in the last few months of the year will not have been in place long enough to have an impact on the results.

  29. Questions and Answers
