Performance Improvement Projects Technical Assistance Health Maintenance Organizations & Provider Service Networks Wednesday, March 28, 2007 3:00 p.m. – 5:00 p.m. Cheryl L. Neel, RN, MPH, CPHQ Manager, Performance Improvement Projects David Mabb, MS Sr. Director, Statistical Evaluation.
Presentation Outline • PIP Overall Comments • Aggregate MCO PIP Findings • Aggregate HMO Specific Findings • Technical Assistance with Group Activities • Study Design • Study Implementation • Quality Outcomes Achieved • Questions and Answers
Key PIP Strategies • Conduct outcome-oriented projects • Achieve demonstrable improvement • Sustain improvement • Correct systemic problems
Validity and Reliability of PIP Results • Activity 3 of the CMS Validating Protocol: Evaluating overall validity and reliability of PIP results: • Met = Confidence/High confidence in reported PIP results • Partially Met = Low confidence in reported PIP results • Not Met = Reported PIP results not credible
Proportion of PIPs Meeting the Requirements for Each Activity
Aggregate Valid Percent Met by Activity (I–X)
HMO Specific Findings • 53 PIPs submitted • Scores ranged from 5% to 98% • Average score was 65% • Assessed evaluation elements were scored as Met 64% of the time
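The percentages above are HSAG's reported validation results. As a rough illustration only, the sketch below shows how element-level ratings of Met, Partially Met, and Not Met could be tallied into a per-PIP score and an aggregate percent Met; the PIP names, ratings, and simple tallying approach are assumptions for illustration, not HSAG's actual scoring methodology.

```python
from collections import Counter

# Hypothetical element-level validation ratings for a few PIPs.
# The ratings and the simple tally below are illustrative assumptions,
# not HSAG's actual scoring methodology.
pip_ratings = {
    "PIP-A": ["Met", "Met", "Partially Met", "Met", "Not Met"],
    "PIP-B": ["Met", "Met", "Met", "Met", "Partially Met"],
    "PIP-C": ["Not Met", "Partially Met", "Met", "Not Met", "Met"],
}

all_ratings = [r for ratings in pip_ratings.values() for r in ratings]
counts = Counter(all_ratings)
total = len(all_ratings)

# Share of all assessed evaluation elements rated Met across PIPs.
print(f"Elements rated Met: {100.0 * counts['Met'] / total:.0f}% of {total}")

# Per-PIP score: share of that PIP's assessed elements rated Met.
for pip, ratings in pip_ratings.items():
    score = 100.0 * ratings.count("Met") / len(ratings)
    print(f"{pip}: {score:.0f}%")
```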
Study Design Four Components: • Activity I. Selecting an Appropriate Study Topic • Activity II. Presenting Clearly Defined, Answerable Study Question(s) • Activity III. Documenting Clearly Defined Study Indicator(s) • Activity IV. Stating a Correctly Identified Study Population
Activity I. Selecting an Appropriate Study Topic – HMO Overall Score
Activity I. Selecting an Appropriate Study Topic Results: • 92 percent of the six evaluation elements were Met • 8 percent were Partially Met or Not Met • None of the evaluation elements were Not Applicable or Not Assessed
Activity I: Review the Selected Study Topic HSAG Evaluation Elements: • Reflects high-volume or high-risk conditions (or was selected by the State). • Is selected following collection and analysis of data (or was selected by the State). • Addresses a broad spectrum of care and services (or was selected by the State). • Includes all eligible populations that meet the study criteria. • Does not exclude members with special health care needs. • Has the potential to affect member health, functional status, or satisfaction. Bolded evaluation elements show areas for improvement
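One element above calls for selecting the topic only after collecting and analyzing data, for example to confirm that a candidate condition really is high-volume or high-risk in the enrolled population. A minimal sketch of that kind of check follows; the claims records, condition labels, and the idea of ranking by distinct member counts are hypothetical illustrations, not a prescribed method.

```python
# Hypothetical claims/encounter records as (member_id, condition) pairs.
# The conditions and counts are illustrative assumptions only.
claims = [
    ("M001", "asthma"), ("M002", "diabetes"), ("M003", "asthma"),
    ("M004", "asthma"), ("M005", "hypertension"), ("M002", "diabetes"),
    ("M006", "asthma"), ("M007", "diabetes"), ("M008", "asthma"),
]

# Count distinct members per condition to gauge volume in the population.
members_by_condition = {}
for member, condition in claims:
    members_by_condition.setdefault(condition, set()).add(member)

for condition, members in sorted(
    members_by_condition.items(), key=lambda kv: len(kv[1]), reverse=True
):
    print(f"{condition}: {len(members)} members")

# A condition affecting many members (or carrying high risk) supports the
# "high-volume or high-risk" element when documenting topic selection.
```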
Activity II. Presenting Clearly Defined, Answerable Study Question(s) - HMO Overall Score
Activity II. Presenting Clearly Defined, Answerable Study Question(s) Results: • 43 percent of the two evaluation elements were Met • 57 percent were Partially Met or Not Met • None of the evaluation elements were Not Applicable or Not Assessed
Activity II: Review the Study Question(s) HSAG Evaluation Elements: • States the problem to be studied in simple terms. • Is answerable. Bolded evaluation elements show areas for improvement
HMO Overall Score for Each Evaluation Element – Activity III
Activity III. Documenting Clearly Defined Study Indicator(s) Results: • 56 percent of the seven evaluation elements were Met • 27 percent were Partially Met or Not Met • 16 percent were Not Applicable or Not Assessed
Activity III: Review Selected Study Indicator(s) HSAG Evaluation Elements: • Is well defined, objective, and measurable. • Is based on practice guidelines, with sources identified. • Allows for the study question to be answered. • Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives. • Has available data that can be collected on each indicator. • Is a nationally recognized measure, such as HEDIS®, when appropriate. • Includes the basis on which each indicator was adopted, if internally developed. Bolded evaluation elements show areas for improvement
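In practice, several of these elements come down to the indicator having a clearly specified numerator and denominator that produce a measurable rate. The sketch below shows that basic computation; the member records and the eligibility and numerator flags are hypothetical stand-ins for whatever the indicator specification actually defines.

```python
# Hypothetical member-level records for one study indicator.
# "eligible" marks denominator membership per the indicator specification;
# "numerator" marks members who met the indicator (e.g., received a service).
members = [
    {"id": "M001", "eligible": True,  "numerator": True},
    {"id": "M002", "eligible": True,  "numerator": False},
    {"id": "M003", "eligible": False, "numerator": False},
    {"id": "M004", "eligible": True,  "numerator": True},
    {"id": "M005", "eligible": True,  "numerator": True},
]

denominator = sum(m["eligible"] for m in members)
numerator = sum(m["eligible"] and m["numerator"] for m in members)

rate = numerator / denominator if denominator else 0.0
print(f"Indicator rate: {numerator}/{denominator} = {rate:.1%}")
```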
Activity IV. Stating a Correctly Identified Study Population - HMO Overall Score
Activity IV. Stating a Correctly Identified Study Population Results: • 53 percent of the three evaluation elements were Met • 43 percent were Partially Met or Not Met • 3 percent of the evaluation elements were Not Applicable or Not Assessed
Activity IV: Review the Identified Study Population HSAG Evaluation Elements: • Is accurately and completely defined. • Includes requirements for the length of a member’s enrollment in the managed care plan. • Captures all members to whom the study question applies. Bolded evaluation elements show areas for improvement
Study Implementation Three Components: • Activity V. Valid Sampling Techniques • Activity VI. Accurate/Complete Data Collection • Activity VII. Appropriate Improvement Strategies
Activity V. Presenting a Valid Sampling Technique - HMO Overall Score
Activity V. Presenting a Valid Sampling Technique Results: • 12 out of the 53 PIP studies used sampling. • 11 percent of the six evaluation elements were Met. • 12 percent were Partially Met or Not Met. • 77 percent of the evaluation elements were Not Applicable or Not Assessed.
Activity V: Review Sampling Methods * This section is only validated if sampling is used. HSAG Evaluation Elements: • Consider and specify the true or estimated frequency of occurrence. (N=7) • Identify the sample size. (N=7) • Specify the confidence level to be used. (N=5) • Specify the acceptable margin of error. (N=5) • Ensure a representative sample of the eligible population. (N=5) • Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=5)
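The elements above (estimated frequency of occurrence, confidence level, and margin of error) are exactly the inputs to a standard sample-size calculation for estimating a proportion. A minimal sketch follows, using the common normal-approximation formula with a finite population correction; the 95 percent confidence level, 5 percent margin of error, 50 percent assumed occurrence, and population size are example values only, not required parameters.

```python
import math

def required_sample_size(p, margin, z=1.96, population=None):
    """Sample size for estimating a proportion.

    p          : true or estimated frequency of occurrence (as a proportion)
    margin     : acceptable margin of error (e.g., 0.05)
    z          : z-score for the chosen confidence level (1.96 ~ 95%)
    population : eligible population size, for a finite population correction
    """
    n = (z ** 2) * p * (1 - p) / (margin ** 2)
    if population is not None:
        # Finite population correction reduces n when the population is small.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# Example values only: 50% assumed occurrence (the most conservative choice),
# 95% confidence, +/-5% margin of error, eligible population of 2,000 members.
print(required_sample_size(p=0.5, margin=0.05, z=1.96, population=2000))
```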
Populations or Samples? All? Some? Generally, • Administrative (claims-based) measurement uses the entire eligible population • Hybrid (chart abstraction) measurement uses samples drawn from the population identified through administrative data
Activity VI. Specifying Accurate/Complete Data Collection - HMO Overall Score
Activity VI. Specifying Accurate/Complete Data Collection Results: • 34 percent of the eleven evaluation elements were Met • 24 percent were Partially Met or Not Met • 42 percent of the evaluation elements were Not Applicable or Not Assessed
Activity VI: Review Data Collection Procedures HSAG Evaluation Elements: • Clearly defined data elements to be collected. • Clearly identified sources of data. • A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected. • A timeline for the collection of baseline and remeasurement data. • Qualified staff and personnel to collect manual data. • A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications. Bolded evaluation elements show areas for improvement
Activity VI: Review Data Collection Procedures (cont.) HSAG Evaluation Elements: • A manual data collection tool that supports interrater reliability. • Clear and concise written instructions for completing the manual data collection tool. • An overview of the study in the written instructions. • Administrative data collection algorithms that show steps in the production of indicators. • An estimated degree of automated data completeness (important if using the administrative method). Bolded evaluation elements show areas for improvement
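One element above asks whether the manual data collection tool supports interrater reliability. A common way to check that reliability during abstraction is to have two abstractors review the same overlap sample and compare their ratings; the sketch below computes simple percent agreement and Cohen's kappa for hypothetical ratings, which are made up purely for illustration.

```python
# Hypothetical ratings from two abstractors on the same overlap sample
# (1 = indicator met, 0 = not met). Values are illustrative only.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

n = len(rater_a)
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement for Cohen's kappa, from each rater's marginal rates.
p1_a, p1_b = sum(rater_a) / n, sum(rater_b) / n
p_chance = p1_a * p1_b + (1 - p1_a) * (1 - p1_b)

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"Percent agreement: {p_observed:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
```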
Baseline Data Sources • Medical Records • Administrative claims/encounter data • Hybrid • HEDIS • Survey Data • MCO program data • Other
Activity VII. Documenting the Appropriate Improvement Strategies - HMO Overall Score
Activity VII. Documenting the Appropriate Improvement Strategies Results: • 28 percent of the four evaluation elements were Met • 21 percent were Partially Met or Not Met • 51 percent of the evaluation elements were Not Applicable or Not Assessed
Activity VII: Assess Improvement Strategies HSAG Evaluation Elements: • Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes. • System changes that are likely to induce permanent change. • Revised if original interventions are not successful. • Standardized and monitored if interventions are successful. Bolded evaluation elements show areas for improvement
Determining Interventions Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?
First, Do a Barrier Analysis What did an analysis of baseline results show? How can we relate it to system improvement? • Opportunities for improvement • Determine intervention(s) • Identify barriers to reaching improvement
How were the intervention(s) chosen? • By reviewing the literature • Evidence-based • Pros & cons • Benefits & costs • Develop a list of potential interventions: what is most effective?
Types of Interventions • Education • Provider performance feedback • Reminders & tracking systems • Organizational changes • Community level interventions • Mass media
Choosing Interventions • Balance • potential for success with ease of use • acceptability to providers & collaborators • cost considerations (direct and indirect) • Feasibility • adequate resources • adequate staff and training to ensure a sustainable effort
Physician Interventions: Multifaceted Most Effective • Most effective: real-time reminders, outreach/detailing, opinion leaders, provider profiles • Less effective: educational materials (alone), formal CME programs without enabling or reinforcing strategies
Patient Interventions • Educational programs • Disease-specific education booklets • Lists of questions to ask your physician • Organizing materials: flowsheets, charts, reminder cards • Screening instruments to detect complications • Direct mailing, media ads, websites
Evaluating Interventions • Does it target a specific quality indicator? • Is it aimed at appropriate stakeholders? • Is it directed at a specific process/outcome of care or service? • Did the intervention begin after the baseline measurement period?
Interventions Checklist • Analyze barriers (root causes) • Choose & understand target audience • Select interventions based on cost-benefit • Track intermediate results • Evaluate effectiveness • Modify interventions as needed • Remeasure
Quality Outcomes Achieved Three Components: • Activity VIII. Presentation of Sufficient Data Analysis and Interpretation • Activity IX. Evidence of Real Improvement Achieved • Activity X. Data Supporting Sustained Improvement Achieved
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation - HMO Overall Score
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation Results: • 35 percent of the nine evaluation elements were Met • 15 percent of the evaluation elements were Partially Met or Not Met • 50 percent of the evaluation elements were Not Applicable or Not Assessed
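Part of sufficient data analysis and interpretation, and of the evidence of real improvement assessed in Activity IX, is judging whether a change from baseline to remeasurement is statistically significant rather than chance variation. A minimal sketch using a two-proportion z-test is shown below; the rates and denominators are hypothetical, and a chi-square or other appropriate test would serve the same purpose.

```python
import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Z statistic and two-sided p-value for comparing two rates."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: indicator rate of 55% at baseline (330/600)
# versus 63% at remeasurement (378/600).
z, p = two_proportion_z(330, 600, 378, 600)
print(f"z = {z:.2f}, p = {p:.4f}")
```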