“Maintaining Accreditation: Meeting the Challenges of Compliance”
AATOD 20th Anniversary National Conference, October 2004
Mary Cesare-Murphy, PhD, Executive Director, Behavioral Healthcare Accreditation
Megan Marx, MPA, Associate Director, OTP Accreditation Project
Joint Commission on Accreditation of Healthcare Organizations
OTP Surveys Conducted 1/1/04 – 8/31/04
• Twenty (20) OTPs received “No Requirement(s) for Improvement.”
• Forty-seven (47) OTP surveys had Requirement(s) for Improvement, with Evidence of Standards Compliance (ESC) due as follow-up.
2004 CAMBH Chapters with Non-Compliant Standards (chapter – number of non-compliant standards)
• RI (Ethics, Rights & Responsibilities) – 17
• LD (Leadership) – 7
• APR (Accreditation Participation Requirements) – 7
• HR (Management of Human Resources) – 37
• PI (Improving Organization Performance) – 11
• IC (Infection Control) – 2
• MM (Medication Management) – 11
• IM (Information Management) – 14
• EC (Environment of Care) – 5
• PC (Provision of Care, Treatment & Services) – 45
OTP Surveys Conducted 2002 – 2003
• One hundred forty-two (142) OTPs received “Accreditation with Recommendations for Improvement.”
• One hundred thirty-seven (137) OTPs received “Accreditation with Full Standards Compliance.”
• Six (6) OTPs received “Conditional Accreditation.”
Individual-Focused Functions:
• RI – Rights, Responsibilities & Ethics
• PE – Assessment
• TX – Care
• PF – Education
• CC – Continuum
Organization Functions:
• PI – Improving Organization Performance
• LD – Leadership
• EC – Management of the Environment of Care
• HR – Management of Human Resources
• IM – Management of Information
• IC – Surveillance, Prevention & Control of Infection
• PS – Behavioral Health Promotion
Comparison of overall trends and identification of problem areas within OTPs:
• Standards most frequently cited in OTPs consistently came from the Assessment/Provision of Care, Treatment and Services and the Management of Human Resources sections of the standards.
• Citations concerning licensed independent practitioners, assessment of patients’ religious or spiritual orientation, and pain management were prevalent in OTP survey findings from both 2002–2003 and 1/1/04 – 8/31/04.
Approach to OTP Education
• Accreditation education efforts for OTPs should focus on Assessment/Provision of Care, Treatment and Services and the Management of Human Resources to improve standards compliance.
• If funding is awarded, the Joint Commission will offer more topic-specific learning opportunities using user-friendly distance learning formats, in an effort to reach more OTPs.
Periodic Performance Review (PPR)
• Makes accreditation more continuous by incorporating an additional mid-cycle evaluation.
• Provides educational opportunities.
Periodic Performance Review
• Is an accreditation participation requirement.
• Will be completed between the 15th and 18th month of the accreditation cycle.
• Findings with an approved plan of action are not subject to citation during a Random Unannounced Survey within the approved timeframes.
Periodic Performance Review
• A surveyor on-site cannot overrule an approved plan of action.
• During the on-site survey, surveyors will request and review the measures of success identified at the time of the 18-month PPR.
• The process includes three alternative options as well as the full PPR.
Characteristics of Full PPR • Areas of non-compliance self-assessed by the organization and scored using the JCAHO extranet tool. • Findings submitted electronically to the Joint Commission using the extranet. • JCAHO staff review plans of action and measures of success and conduct interactive phone call.
Tips for the PPR
• Read the user guide.
• Check the applicability table.
• If unsure of applicability, leave the standard unscored and discuss it with a standards representative.
• Develop separate plans of action and measures of success (when required).
• When in doubt, score it out – material for discussion.
• Take full advantage of conference call time for questions.
Guidelines for Sampling for PPR • When assessing category “C” Elements of Performance (EPs) these guidelines are recommended: • 30 cases for a population up to 100 (If population is less than 30, sample all) • 50 cases for a population of 101-500 • 70 cases for a population over 500
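These thresholds translate directly into a small helper; a minimal sketch in Python, assuming the population count comes from your own record system (the function name is illustrative):

```python
def recommended_sample_size(population: int) -> int:
    """Recommended sample size for assessing category "C" EPs,
    following the PPR sampling guidelines above."""
    if population < 30:
        return population   # fewer than 30 cases: sample all
    if population <= 100:
        return 30
    if population <= 500:
        return 50
    return 70               # population over 500

# Example: a program with 250 active records would sample 50 cases.
print(recommended_sample_size(250))  # 50
```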
Plan of Action
• For each standard evaluated as “Not Compliant,” the organization will:
• Describe the planned action for each element of performance (EP) scored as partially or not compliant
• Develop a measure of success
Measure of Success (MOS)
• A numerical or other quantitative measure, usually related to an audit, that validates that an action was effective and sustained
• Submitted via the extranet
• Submitted on an electronic form with space limited to a brief indication of the numerical measure – expressed as a percentage
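As a hypothetical illustration of an MOS expressed as a percentage (all counts below are invented, not from the Joint Commission):

```python
# Hypothetical MOS: "95% of audited records contain a signed treatment plan."
records_audited = 30    # invented audit sample size
records_compliant = 28  # invented count of records meeting the EP

mos_percentage = 100 * records_compliant / records_audited
print(f"MOS result: {mos_percentage:.1f}% (target: 95%)")  # 93.3%
```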
Benefits of Periodic Performance Review • Employs same tool as used by surveyors • Expands intra-cycle interaction with JCAHO • Supports continuous operational improvement • Assists organization in quest for 100% compliance, 100% of the time
Link Between Periodic Performance Review and On-site Survey
• At the triennial survey, time will be devoted to reviewing measures of success
• The surveyor will ask for data related to each measure of success
• Track record requirements remain
• Surveyors do not see the organization’s specific performance review or action plans
PPR Option One
• Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR
• Organizations will self-assess compliance with standards and develop plans of action and measures of success (MOS) as applicable
• Organizations will not submit PPR data to JCAHO
PPR Option One
• Organizations will not be able to use the extranet tool to score compliance, but will be able to view and print all standards and EPs
• Organizations will be able to submit standards-related issues for discussion with JCAHO staff during an interactive, scheduled phone call, but no inference relative to compliance will be made
PPR Option Two
• Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR
• An on-site survey will take the place of the self-assessment activity
• Survey length will be approximately one third of the usual triennial survey
• The organization will submit plans of action and MOS(s) for surveyor-identified areas of non-compliance
PPR Option Two
• A conference call with JCAHO will be scheduled to review and approve plans of action and MOS(s)
• Organizations will be charged a fee to cover the costs of the on-site survey
PPR Option Three
• Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR
• A limited on-site survey will be conducted at the midpoint of the accreditation cycle
• Following the survey, the organization may elect to participate in a conference call to discuss standards-related issues
• At the time of the triennial survey, the surveyor will receive no information relating to the organization’s Option 3 survey findings
Using Data to Improve Program Performance • Planning is the key to preventing performance measurement mistakes • Ask the following questions: • What data should be collected? • Why should the data be collected? • What data are already available? • What are the sources of available data? • How will the data physically be collected? • How will the data be used?
Using Data to Improve Program Performance • Consider the following common mistakes and tips to avoid these errors in your organization: • Mistake 1 – Insufficient planning before collecting data • Tip 1 – Determine which strategic measurement areas are high priorities
Using Data to Improve Program Performance
• Mistake 2 – Insufficient resources to support data collection
• Tip 2 – Enlist leadership to ensure that adequate resources are available
• Mistake 3 – Poor data integrity
• Tip 3 – Assess the completeness of the data
Using Data to Improve Program Performance
• Mistake 4 – Overly extensive data collection
• Tip 4 – Break data collection into manageable projects
• Mistake 5 – Data collection “silos”
• Tip 5 – Investigate data sources and instruments already in place.
Using Data to Improve Program Performance
• Facts = Data
• Data combinations = Measures
• Analyzed measures = Information
• Applied information = Improvement
• Improvement generates knowledge
Using Data to Improve Program Performance
• Follow these steps to avoid common pitfalls in data collection:
• Review the specific purpose of your outcomes-focused improvement project and determine what information, measures, and data are necessary to achieve that purpose.
• Review the specific information you need, specify performance measures that will generate that information, and identify the data that compose those measures.
Using Data to Improve Program Performance
• Define indicator data elements.
• Determine the sources for all needed data.
• Create your data collection instruments.
• Determine the most effective data analysis strategies by considering what types of data need to be collected and how they will be used to improve performance.
• Document your data collection plan.
• Pilot test the data collection tool and analysis strategies.
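One way to keep such a plan documented and consistent is a simple record structure; a minimal sketch, with all field names and example values hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class DataCollectionPlan:
    """Documents one improvement project's data collection plan,
    mirroring the steps above. All example values are illustrative."""
    purpose: str
    measure: str
    data_elements: list[str] = field(default_factory=list)
    sources: list[str] = field(default_factory=list)
    analysis_strategy: str = ""

plan = DataCollectionPlan(
    purpose="Improve timeliness of intake",
    measure="Percent of consumers seen within 48 hours of request",
    data_elements=["request_date", "first_appointment_date"],
    sources=["scheduling system", "intake log"],
    analysis_strategy="Monthly trend compared against a 90% target",
)
print(plan.measure)
```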
Using Data to Improve Program Performance
The Three “T’s”
• TREND – Data over time on indicators
• TARGET – Range of performance for each indicator
• TOGETHER – Look at indicators in combination
(Joint Commission Benchmark, January 2003, pp. 1, 7)
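A brief sketch of the three “T’s” applied to a single indicator (the monthly show rates and target range below are invented for illustration):

```python
# TREND: hypothetical monthly show rates (%) for first appointments
show_rate = {"Jan": 78, "Feb": 81, "Mar": 74, "Apr": 86, "May": 88}

# TARGET: an invented acceptable range for this indicator
target_low, target_high = 80, 100

for month, rate in show_rate.items():
    flag = "" if target_low <= rate <= target_high else "  <-- below target"
    print(f"{month}: {rate}%{flag}")

# TOGETHER: in practice this indicator would be reviewed alongside
# related ones (e.g., intake timeliness) rather than in isolation.
```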
Using Data to Improve Program Performance
Types of Measurement
• Administrative Measures – Productivity
• Comparison Measures – Benchmarking
• Process Measures – Access, Satisfaction
• Functional Measures – Improvement
• Fidelity Measures – Following processes
Using Data to Improve Program Performance
Administrative Measures
• An administrative measure indicates how well your agency is following its mission, vision, and values – in short, how well the agency is doing.
• Productivity or resource utilization is one example.
Using Data to Improve Program Performance
Productivity Examples
• Direct Service Percentage
• Billed Service Percentage
• Show Rate/Keep Rate
• Percentage of Improvement Rate
• Revenue per Staff Member
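As a hedged illustration, two of these productivity measures might be computed as follows (all figures are invented):

```python
# Hypothetical monthly figures for one clinician
hours_worked = 160
hours_in_direct_service = 104
appointments_scheduled = 120
appointments_kept = 96

direct_service_pct = 100 * hours_in_direct_service / hours_worked
show_rate = 100 * appointments_kept / appointments_scheduled

print(f"Direct service percentage: {direct_service_pct:.1f}%")  # 65.0%
print(f"Show rate: {show_rate:.1f}%")                           # 80.0%
```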
Using Data to Improve Program Performance
Comparison Data
• Allows you to compare how your agency is doing relative to other agencies.
• You can choose any number of areas to compare.
Using Data to Improve Program Performance Process Measures • A process measure looks at how well your processes are meeting your goals or standards. • Examples: • Rate of meeting intake timeliness • Show rate of initial appointment • Show rate for second appointment after intake
Using Data to Improve Program Performance
Process Measure Examples
• Emergency services use
• Length of stay per diagnosis
• Time to first appointment
• Time between first and second appointments
• Percent of consumers receiving a first appointment within 48 hours of request
• Keep rate (first appointment, subsequent appointments)
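A minimal sketch of the 48-hour access measure listed above, assuming request and first-appointment timestamps are available from a scheduling system (the timestamps are invented):

```python
from datetime import datetime

# Hypothetical (request, first appointment) timestamp pairs
pairs = [
    (datetime(2004, 9, 1, 9, 0),  datetime(2004, 9, 2, 10, 0)),  # 25 hours
    (datetime(2004, 9, 1, 14, 0), datetime(2004, 9, 4, 9, 0)),   # 67 hours
    (datetime(2004, 9, 2, 8, 0),  datetime(2004, 9, 3, 16, 0)),  # 32 hours
]

within_48h = sum(
    1 for requested, seen in pairs
    if (seen - requested).total_seconds() <= 48 * 3600
)
print(f"Seen within 48 hours: {100 * within_48h / len(pairs):.0f}%")  # 67%
```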
Using Data to Improve Program Performance
Functional Measures
• A functional measure is an outcome measure.
• It can be as complicated as a formal, fee-based measurement with national norms, such as the Brief Symptom Inventory (BSI).
• It can be as simple as a “home-made” measurement using a Likert scale.
Using Data to Improve Program Performance
Construct a Likert Measure
• List the functional elements that are important in the person’s life.
• Supervisors and staff with experience with the population can help ensure that the measure will be meaningful.
• Decide on a rating scale.
• “0 to 10” is an 11-point scale; “0 to 3” is a four-point scale.
• Add descriptors to the ratings to help staff know how to score the person.
• 0 = not present, 5 = some present, 10 = totally present
Using Data to Improve Program Performance
Construct a Likert Measure
• Train staff in how to use the scale.
• Implement the scale.
• Chart pre- and post-treatment scores as a comparison outcome measure.
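A minimal sketch of scoring such a home-made Likert measure, with hypothetical functional elements and pre/post scores:

```python
# Hypothetical functional elements rated on the 0-10 scale described above
# (0 = not present, 5 = some present, 10 = totally present)
elements = ["stable housing", "employment", "family support"]

pre_scores  = {"stable housing": 3, "employment": 2, "family support": 5}
post_scores = {"stable housing": 6, "employment": 5, "family support": 7}

for e in elements:
    change = post_scores[e] - pre_scores[e]
    print(f"{e}: {pre_scores[e]} -> {post_scores[e]} ({change:+d})")

# A simple comparison outcome: mean change across elements
mean_change = sum(post_scores[e] - pre_scores[e] for e in elements) / len(elements)
print(f"Mean change: {mean_change:+.1f}")  # +2.7
```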
Using Data to Improve Program Performance
Fidelity Measures
• Fidelity is a concept used in formal research.
• In a treatment setting, fidelity measures the extent to which staff have followed your treatment guidelines.
• Fidelity measurement is important in establishing a relationship between your treatment methods and functional improvement/outcomes.
Using Data to Improve Program Performance
Fidelity: Sample Questions
Fidelity can be measured with simple yes/no questions applied to each part of your treatment protocol:
• Were required lab tests current? Y/N
• Was the practice protocol followed? Y/N
• Did the physician sign the treatment plan? Y/N
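A hedged sketch of scoring such a checklist as the percentage of protocol steps followed (the chart answers are invented):

```python
# Yes/no fidelity questions applied to one hypothetical patient chart
checklist = {
    "Required lab tests current": True,
    "Practice protocol followed": True,
    "Physician signed treatment plan": False,
}

followed = sum(checklist.values())
fidelity_pct = 100 * followed / len(checklist)

for question, answer in checklist.items():
    print(f"{question}: {'Y' if answer else 'N'}")
print(f"Fidelity: {fidelity_pct:.0f}%")  # 67%
```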
Using Data to Improve Program Performance
Readily Available Outcome Measures
• Beck Depression Inventory
• Beck Anxiety Inventory
• CAP – Children’s Attention Problems (for attention-deficit/hyperactivity disorder, ADHD)
• Conners Rating Scales (for ADHD)
• Yale-Brown Obsessive Compulsive Scale
• Michigan Alcohol Screening Test (MAST, for addiction)
Using Data to Improve Program Performance Selecting the Measures • Organizational Context • Matching measures to your needs • Measure what reflects your vision and mission
Using Data to Improve Program Performance
Organizational Context
• An organizational culture committed to data-based decision making
• Technical and management systems that are interdependent and well integrated
• Support from top levels of management
Using Data to Improve Program Performance
Measures and Your Mission
• Quality – How do you know people are improving, or at least maintaining their functional level?
• Coordinated – How can you tell whether people can get needed services?
• Responsive – What is your access goal? What is your actual access rate? What is the difference?