Accrediting Organization Validation Survey Dora Kane, Louisiana Department of Health Robin Bucknell, Washington State Department of Health Cassie Dunham, California Department of Public Health August 6, 2019
Objectives • Describe the three state agencies' perspectives on the Pilot Validation Survey • Identify advantages and disadvantages of the new pilot process • Discuss the ability to implement this process for other deemed providers
Process • Regional Office (RO) notifies the State Survey Agency (SA) & Accrediting Organization (AO) to advise of the selected survey • AO and SA independently research the provider/supplier • Includes application, survey reports, complaints, 2567s, websites, patient population, etc., to assess past issues and performance • RO liaison coordinates with the AO, SA, and B-tag Contractor (if an Acute Psychiatric Hospital) • No later than 10 days prior to the survey start date
Process • Survey team composition is a 1:1 ratio of AO surveyors to SA observers • No AO surveyor is to be unaccompanied/unobserved by the SA • Should consider pairing by professional experience • LSC/LSC, Health surveyor/Health surveyor, Physical Environment/Physical Environment • Both teams meet at a designated area & enter simultaneously • AO may pre-announce the re-accreditation survey no more than 30 minutes prior to the survey start • Survey follows the AO accreditation survey process, with the SA observing using the AO observation worksheet • Evaluate the skill, knowledge, and performance of the AO team • SA observation team may ask questions as necessary to understand the work during survey activity
Process • If SA observations and AO findings cannot be reconciled, the RO mediates based on the evidence provided and renders a final decision prior to exit • Ongoing discussions are held to confirm observations and deficiency level(s) • If observations of AO performance are rated partially met or not met: • SA documents them in the comments/notes section, including Section 5 if necessary • Exit conference is managed by the AO
Process • AO produces the final survey deficiency report • AO forwards the report and Plan of Correction (POC) to the RO/Central Office (CO) within 10 business days • If Immediate Jeopardy (IJ) exists, the surveyor/survey team must notify the RO for confirmation • AO explains to the SA observation team the process used to investigate the IJ • If IJ exists and the AO is unable to complete the investigation, the SA is responsible for finishing the IJ process • SA finalizes the observation worksheet per the CMS POD and sends it to the CMS RO and CO within 21 days • Final data from the CMS/AO worksheet is used for the AO performance data report
Process • RO will create a Sample Validation Recertification kit in ASPEN and generate Form 2802 • State Survey Agency will create survey shells within the kit to document activity in the initial comments • Pilot requires the State Survey Agency to account for hours for the event ID (Health and LSC) on Form 670 • Complete all other tabs in the kit required for upload
Louisiana • Acute Psychiatric Hospital • 3-day survey • TJC (The Joint Commission) • State surveyors: 2 Health & 1 LSC • What worked? • Good collaboration, open communication, timelines met. AO openly discussed its process and tasks during the survey. Discrepancies were discussed. • What did not work? • TJC had no way to cite issues regarding Medical Staff credentialing. • State surveyors were not aware of the TJC process for "tracers" versus regulations.
Louisiana • Facility concerns • None; receptive to the process • AO process • Frequent meetings with the provider, morning and afternoon debriefings • Opportunity for the provider to ask questions/obtain clarification • Recommend more detail on process, survey flow, and time frame be incorporated into the pre-survey call • Need front-end training on the observation sheet • Survey tool notes blank sections as "incomplete," which impacts scoring
Washington • Ambulatory Surgery Center • 1-day survey • AAAASF (American Association for Accreditation of Ambulatory Surgery Facilities) • State surveyors: 2 Health and 1 Life Safety Code • AO sent 2 Health (1 physician, 1 nurse) and 1 Life Safety Code • What worked? • Facility was receptive. Collaborative, and beneficial to observe and understand the AO process. Overall a positive experience. • What did not work? • SA surveyors found it difficult to interrupt the process to ask follow-up questions when necessary. • AO seemed pressured to conclude the survey and therefore did not ask follow-up questions. • Due to time constraints, the AO concluded the survey before the SA could get answers from the AO contact.
Washington • Facility concerns • No facility concerns • AO training issues • Unclear if the AO physician was adequately trained on Appendix Z • Facility survey manager expressed the same concern • Challenges • Training disparities and time constraints impact the ability of the SA and AO to conduct substantially equivalent surveys. • Survey method is structured differently; it was difficult to tell if all conditions were evaluated until the end of the survey process, at which point it was too late to go back and fill in the gaps due to time constraints. • AO questions consisted of yes/no answers regarding the directed method of compliance, with which the SA surveyors were only partly familiar. • There appeared to be a conflict between the Life Safety Code surveyor and the State Fire Marshal.
California • Selected to complete six Pilot Validation Surveys • Three – Ambulatory Surgery Centers • Two – Home Health Agencies • One – Acute Psychiatric Hospital
California • Ambulatory Surgery Center #1 • Two-day survey • AAAHC (Accreditation Association for Ambulatory Health Care) • State surveyors: Health Facility Evaluator Nurse and Life Safety Code • What worked? • AO and State surveyors felt there was positive feedback and collaboration. • What did not work? • Life Safety Code surveyor was unable to observe the Emergency Preparedness survey, which was done simultaneously
California • Ambulatory Surgery Center #2 • Four-day survey • TJC (The Joint Commission) • State surveyors: Medical/Physician Consultant and Life Safety Code • (AO sent one physician surveyor) • What worked? • AO surveyor was very thorough; TJC standards are very similar to the CMS Conditions for Coverage • What did not work? • Low surgery census and a single surveyor required additional days to observe processes
California • Ambulatory Surgery Center #3 • Three-day survey • AAAASF (American Association for Accreditation of Ambulatory Surgery Facilities) • State surveyors: Medical/Physician Consultant (1 day), Health Facility Evaluator Nurse, and Life Safety Code • (AO sent one physician and one nurse) • What worked? • AO and State surveyors agreed on the partially met determination • What did not work? • Questioning consisted of a yes/no format, not in line with how/when/where questions • No collaboration between the AO physician and the state surveyor; the AO later offered to let the LSC surveyor ask questions for EP
California • Acute Psychiatric Hospital • Three-day survey • TJC (The Joint Commission) • State surveyors: Medical/Physician Consultant, Health Facility Evaluator Nurse, and Life Safety Code • What worked? • Pre-survey calls were very important for coordinating the survey. Key: indicate who will observe the LSC and EP portions. AO and SA teams worked well together. • What did not work? • Health and LSC teams had to figure out how to share the electronic workbook
California • Home Health Agency • Four day survey • CHAP (Community Health Accreditation Partner) • State surveyors: Health Facility Evaluator Nurse • Results pending
California • Home Health Agency • Scheduled for two days (mid-August) • ACHC (Accreditation Commission for Health Care) • State surveyors: Health Facility Evaluator Nurse
Open discussion & Questions