Used with Permission
The Post-Survey Process
George Mason University, College of Nursing and Health Science
Regulatory Requirements for Health Systems, Summer 2004
Scoring and Decision Process – Executive Summary
• EP scoring is based on a three-point scale
• Type I and supplemental recommendations are replaced with "requirements for improvement" and supplemental findings
• The surveyor leaves a final report on site
• The final decision comes after acceptance of the Evidence of Standards Compliance (ESC)
Scoring Guidelines
• The scoring guidelines divide EPs into three categories:
• Category A (EPs scored yes or no)
• Category B (EPs that address situations in which the literal intent is met, but the quality or comprehensiveness of the effort still needs to be evaluated)
• Category C (EPs that address frequency)
How is each category scored?
• EPs are scored on a three-point scale, with 0 as insufficient compliance, 1 as partial compliance, and 2 as satisfactory compliance
• "A" EPs are scored only 0 or 2 (yes or no), unless a track record issue leads to a score of 1 (partial compliance)
New Scoring and Decision Process
• 3-point vs. 5-point Element of Performance scoring scale
• Standards identified as compliant or not compliant
• Simplified aggregation process
• No grid element or summary grid score calculation
• Summary score based on the number of non-compliant standards
• Statistically based thresholds for Conditional and Preliminary Denial of Accreditation (2 and 3 standard deviations above the mean Summary Score)
• Revised accreditation decision categories
• Measure-based follow-up (Evidence of Standards Compliance and Measure of Success)
• No scores shared with the HCO
Standard-Level Scoring
• The EPs are aggregated to determine standards compliance
• Standards are either in compliance or not in compliance
• There is no partial compliance at the standard level
New Scoring and Decision Process: Overall Program Decision
• Previous model: Measurable Characteristic (MC, 5-point scale) → Standard (STD, 5-point scale) → Grid Element Score → Summary Grid Score → program decision
• Current model: Element of Performance (EP, 3-point scale) → Standard (STD, 2-point scale) → number of not-compliant standards → program decision and follow-up
Previous Aggregation and Decision Process
• Standard Score
• Worst Measurable Characteristic score
• Measurable Characteristic set score: "If at least 2 MCs are scored 5 or worse, the standard score is 5"
• Grid Element Score
• Standard scores are weighted through capping
• Worst standard score after applying caps
• Summary Grid Score
• Convert each grid element score into points (0–4)
• Add the points for each converted grid element score
• Determine the maximum number of points (number of scored elements × 4)
• Divide actual points by maximum points and multiply the result by 100
• Conditional Accreditation = Summary Grid Score < 80
• Preliminary Denial of Accreditation = Summary Grid Score < 50
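The Summary Grid Score arithmetic above can be sketched in a few lines of Python. This is an illustrative sketch only: it assumes each grid element has already been converted to its 0–4 point value (the slides do not give the conversion table), and the sample scores are invented for the example.

```python
# Sketch of the previous Summary Grid Score calculation (pre-2004 model).
# Assumes grid element scores have already been converted to 0-4 points;
# the JCAHO conversion table itself is not given in the slides.

def summary_grid_score(grid_element_points):
    """grid_element_points: list of per-element point values, each 0-4."""
    max_points = len(grid_element_points) * 4      # number of scored elements x 4
    actual_points = sum(grid_element_points)
    return actual_points / max_points * 100        # percentage of maximum

def previous_decision(score):
    """Apply the previous model's thresholds to a Summary Grid Score."""
    if score < 50:
        return "Preliminary Denial of Accreditation"
    if score < 80:
        return "Conditional Accreditation"
    return "Accredited"

# Hypothetical organization with five scored grid elements:
score = summary_grid_score([4, 4, 3, 2, 4])        # 17 / 20 * 100 = 85.0
print(score, previous_decision(score))
```

A score of 85 clears the 80-point Conditional threshold, so this hypothetical organization would be accredited under the previous model.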
Current Aggregation and Decision Process
• Standard Score
• Score each Element of Performance (0 = insufficient compliance, 1 = partial compliance, 2 = satisfactory compliance)
• Standard score = 0 (not compliant) if any one EP is scored 0 (insufficient compliance), or if a predetermined percentage of EPs are scored 1 (partial compliance)
• Summary Score = count of the standards scored 0 (not compliant)
• Conditional Accreditation = Summary Score between 2 and 3 standard deviations above the mean
• Preliminary Denial of Accreditation = Summary Score greater than 3 standard deviations above the mean
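The current aggregation can likewise be sketched in Python. Note the assumptions: the slides do not state the "predetermined percent" of partially compliant EPs that fails a standard, nor the mean and standard deviation of Summary Scores, so the 50% threshold and the mean/SD values below are illustrative placeholders only.

```python
# Sketch of the current aggregation: a standard is not compliant if any EP
# scores 0, or if the share of EPs scored 1 reaches a predetermined
# percentage. That percentage, and the mean/SD of Summary Scores across
# organizations, are NOT given in the slides; values here are assumed.

PARTIAL_THRESHOLD = 0.50   # assumed stand-in for the "predetermined percent"

def standard_compliant(ep_scores):
    """ep_scores: list of EP scores (0, 1, or 2) for one standard."""
    if 0 in ep_scores:                 # any insufficient-compliance EP fails it
        return False
    partial_share = ep_scores.count(1) / len(ep_scores)
    return partial_share < PARTIAL_THRESHOLD

def current_decision(summary_score, mean, sd):
    """summary_score: count of not-compliant standards for the organization."""
    if summary_score > mean + 3 * sd:
        return "Preliminary Denial of Accreditation"
    if summary_score > mean + 2 * sd:  # between 2 and 3 SDs above the mean
        return "Conditional Accreditation"
    return "Accredited (pending ESC)"

# Hypothetical organization with four standards, three EPs each:
standards = [[2, 2, 2], [2, 0, 2], [1, 1, 2], [2, 2, 1]]
summary = sum(not standard_compliant(eps) for eps in standards)
print(summary, current_decision(summary, mean=4.0, sd=3.0))
```

In the example, the second standard fails because one EP scored 0, and the third fails because two of its three EPs scored 1 (above the assumed 50% partial-compliance threshold), giving a Summary Score of 2.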
What kind of follow-up is there?
• For the first 18 months (January 2004 through June 2005), the HCO has 90 days* after survey to send JCAHO "evidence of standards compliance" (ESC) – i.e., what it has done to come into compliance with the standard, or what evidence proves it was in compliance at the time of the on-site survey
• At this time the HCO submits an indicator or measure of success that it will use to assess sustained compliance over time, as applicable
• Four months after approval of the ESC, the HCO submits data on its measure of success to demonstrate a track record – in all cases this is an audit process
*Note: HCOs surveyed after July 1, 2005 will be allowed 45 days for ESC submission; this timeframe is subject to revision by the Accreditation Committee after review and analysis of actual submission data
Remember, a Measure of Success (MOS) is . . .
• A numerical or other quantitative measure, usually related to an audit, that validates that an action was effective and sustained
• Submitted via the extranet
• Submitted on an electronic form with space limited to a brief indication of the numerical measure – often just a numerator and denominator, with definitions of each
What about revisions?
• There is no longer any need for revisions
• During the 90 days* after the on-site survey, the HCO can send information to JCAHO to demonstrate what it has done to come into compliance (corrective evidence), or to demonstrate that it was in compliance at the time of survey (clarifying evidence)
• During this time period, the HCO maintains its current accreditation status
*Note: HCOs surveyed after July 1, 2005 will be allowed 45 days for ESC submission; this timeframe is subject to revision by the Accreditation Committee after review and analysis of actual submission data
The final accreditation decision
• Is made after JCAHO receives and approves the HCO's evidence of standards compliance (ESC) and identified measure of success
• If an acceptable ESC is received, the HCO will receive an "Accredited" decision
If no ESC submission within the required timeframe . . .
• After 90 days*, the HCO will receive a "Provisional Accreditation" decision
• This decision will be disclosable
*Note: HCOs surveyed after July 1, 2005 will be allowed 45 days for ESC submission; this timeframe is subject to revision by the Accreditation Committee after review and analysis of actual submission data
Aligning the decision and reporting process
• The revised process differentiates between accreditation categories rather than within a single category
• The revised decision process emphasizes compliance with all standards all the time, and continuous improvement in key safety and quality areas
Accreditation Decisions
• The published category of "Accredited with Requirements for Improvement" is eliminated
• New decision rules apply for Conditional and Preliminary Denial of Accreditation (PDA) decisions
Final Report Format
For each standard out of compliance, the report states "The Primary Critical Focus Area XXX may be vulnerable as evidenced by:" followed by:
• Standard
• Standard Text
• Program
• Recommendation
• Element of Performance
• Secondary Critical Focus Area (as appropriate)
Post-Survey Sequence
Previous:
• Preliminary report left on site
• Final report mailed within 35 days
• Revision process
• Written progress reports (1, 4, and 6 months)
• 2nd-generation failures (can result in CON)
Current:
• Final report of findings left on site
• Report on extranet within 48 hours
• No revisions
• ESC in 90 days during the first 18 months
• Four months after an approved ESC, the HCO submits a Measure of Success (MOS)
Quality Reports . . .
• Will replace performance reports
• Will include new information relative to quality and safety:
• National Patient Safety Goal performance
• National quality improvement goal performance / ORYX core measures
• Optional disease-specific care or other certifications
• Special recognitions/achievements (e.g., Codman Award winner, Magnet hospital)
Public Disclosure
Performance Report:
• Decision
• Overall score
• Summary data comparisons
• Recommendations for Improvement on all surveys
Quality Report:
• Decision
• No score
• Recommendations for Improvement for Provisional, Conditional, PDA, and DA decisions
• National Patient Safety Goals
• National Quality Goals
• Quality distinctions
• Certifications, awards
Shared Visions—New Pathways: Triennial Accreditation Cycle (Example of HCO Surveyed in July 2002)
[Timeline figure: month markers running from roughly month –6 through month 44 of the cycle, anchored by the July 2002 full survey and the July 2005 full survey; individual tick values are not recoverable from the source]
Key steps across the cycle:
• Organizations submit quarterly core measure data (as core measures are implemented across accreditation programs)
• Organization completes the extranet Application for Accreditation (made available 6–9 months prior to the survey due date)
• JCAHO gives the HCO extranet access to the Periodic Performance Review and Priority Focus Process output
• Organization completes the Periodic Performance Review, identifies areas of non-compliance, and develops a corrective action plan
• Organization returns the Periodic Performance Review and corrective action plans with a definition of MOS
• Standards Interpretation Group conducts a phone interview with the HCO, then reviews and approves the corrective action plan; the organization's accreditation status is not impacted if the corrective action plan is approved
• JCAHO runs the Priority Focus Process and sends the output to the organization; PFP output is delivered to the surveyor with the itinerary for review prior to survey
• On-site survey is scheduled and conducted: tracer methodology, systems tracers on key issues, validation of implementation of the corrective action plan from the Periodic Performance Review, final report left on site
• Organization submits Evidence of Standards Compliance and Measures of Success (if recommendations during the on-site survey)
• Organization submits data for MOS to JCAHO
• Decision rendered; Quality Report posted on the web