
EFCOG Contractor Assurance Working Group (CAWG) Update: Maturity Evaluation Tool (MET)

Learn about the Maturity Evaluation Tool, which helps DOE contractors confirm safe and compliant work practices through a structured evaluation process. Includes evaluation criteria for CAS requirements and the feedback processes used.



Presentation Transcript


  1. November 2016 - EFCOG Contractor Assurance Working Group (CAWG) Update: Maturity Evaluation Tool (MET) - Patricia Allen, Chair; Jita Morrison, SRR CAS Lead

  2. MET Purpose DOE O 226.1B, Source Requirement (2.a) “The contractor must establish an assurance system that includes assignment of management responsibilities and accountabilities and provides evidence to assure both the Department of Energy’s (DOE) and the contractor’s managements that work is being performed safely, securely, and in compliance with all requirements; risks are being identified and managed; and that the systems of control are effective and efficient.” The Maturity Evaluation Tool can enable DOE Contractors to apply a graded approach and prioritize improvement efforts.

  3. Evaluation Process • Two Maturity Levels are established • Level 1 – Meets Requirements • Level 2 – Enhanced • Subjective in nature • Focused on understanding details of current performance and determining where to improve • Performed annually or as needed to enable discussion and improvement • Best practices will be noted
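
The slides do not show an implementation, but a minimal sketch in Python (hypothetical class and field names, not part of the MET itself) illustrates how a site might record the outcome of one annual or as-needed evaluation: a maturity level per evaluated element, the subjective basis for the rating, and any noted best practices or improvement actions.

```python
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List


class MaturityLevel(IntEnum):
    """The two maturity levels defined by the MET evaluation process."""
    MEETS_REQUIREMENTS = 1   # Level 1 - Meets Requirements
    ENHANCED = 2             # Level 2 - Enhanced


@dataclass
class ElementEvaluation:
    """Result of evaluating one CAS element during an annual (or as-needed) MET review."""
    element: str                              # e.g. "Structured Issue Management Process"
    level: MaturityLevel
    basis: str = ""                           # subjective rationale behind the rating
    best_practices: List[str] = field(default_factory=list)       # noted best practices
    improvement_actions: List[str] = field(default_factory=list)  # where to improve


# Example record from one evaluation cycle (values invented for illustration).
result = ElementEvaluation(
    element="Self-Assessment and Feedback Process",
    level=MaturityLevel.MEETS_REQUIREMENTS,
    basis="Assessment program is risk-informed and documented; enhancements identified.",
    improvement_actions=["Benchmark assessment grading with a peer site"],
)
print(f"{result.element}: Level {int(result.level)}")
```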

  4. MET Categories DOE 226.1B Contractor Requirements: • CAS Management and Scope (Att.1, Section 2.a) • CAS Components must include (Att.1, Section 2.b) • CAS Effectiveness Validation • Self-Assessment and Feedback Process • Structured Issue Management Process • Feedback and Improvement • Timely Communication to DOE CO • Metrics • CAS Program Implementation and Monitoring (Att.1, Section 2.c and 2.d)

  5. CAS Evaluation Structure - each 226.1B CRD element is decomposed into sub-elements (e.g., performance metrics, central/line organization, management reviews, external audits, certification, etc.), and each sub-element is assessed against evaluation criteria.
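
As a rough illustration of that element / sub-element / evaluation-criteria hierarchy (the names and criteria strings below are placeholders, not the tool's actual content):

```python
# Hypothetical sketch of the MET hierarchy: CRD element -> sub-elements -> criteria.
cas_structure = {
    "CAS Effectiveness Validation (2.b(1))": {
        "Performance metrics": [
            "Metrics cover key functional areas",
            "Metrics are reviewed by management on a defined frequency",
        ],
        "Management reviews (central/line organizations)": [
            "Line and central organizations review assurance results",
        ],
        "External audits / certification": [
            "Third-party audits or certifications complement internal assurance",
        ],
    },
}

for element, sub_elements in cas_structure.items():
    print(element)
    for sub_element, criteria in sub_elements.items():
        print(f"  {sub_element}: {len(criteria)} evaluation criteria")
```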

  6. CAS Management & Scope / Effectiveness Validation • CAS Management and Scope - Source Requirement (2.a) • “The contractor must establish an assurance system that includes assignment of management responsibilities and accountabilities and provides evidence to assure both the Department of Energy’s (DOE) and the contractor’s managements that work is being performed safely, securely, and in compliance with all requirements; risks are being identified and managed; and that the systems of control are effective and efficient.” • CAS Effectiveness Validation - Source Requirement (2.b) • “The contractor assurance system, at a minimum, must include the following: (2.b(1)) “A method for validating the effectiveness of assurance system processes. Third party audits, peer reviews, independent assessments, and external certification may be used and integrated into the contractor’s assurance system to complement, but not replace, internal assurance systems.”

  7. CAS Mgmt & Scope / Effectiveness Validation Evaluation Criteria

  8. SRR Nuclear Safety Culture Dashboard July 2016 – September 2016 October 11, 2016 • Management Leadership • Training Hours in Nuclear Safety (Ops/Maintenance) • Senior Management MFO Performance • Management Field Observations • Organizational Learning • Number of Alleged Retaliation Employee Concerns • Timeliness of Employee Concern Evaluations • Conops/Disciplined Operations Events (ORPS Reported) • Timeliness of ORPS Characterization • Integrated Assessment Plan vs. Schedule • Assessment Quality Evaluation Complete • Self-Identified versus Event Response Corrective Actions • Corrective Action Program Timeliness • Organizational Learning • # of Employee Concerns • Timeliness of Employee Concern Evaluation • # of Disciplined Operations Events (ORPS Reported) • Timeliness of ORPS Characterization • Integrated Assessment Plan vs. Schedule • Assessment Quality Evaluation • Self-Identified versus Event Response Corrective Actions • Corrective Action Program Timeliness UPDATE NEEDED • Employee Engagement • New Issue Actions Identified (STAR CTS) • Safety Meeting Attendance • Behavior Based Safety Observations Key: Good / Marginal / Unsatisfactory / No Data
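
The dashboard key implies a simple thresholding rule for each metric. A minimal sketch of one way such a status could be computed, assuming per-metric goal and marginal thresholds (the thresholds and example values below are invented, not SRR's actual goals):

```python
from typing import Optional


def dashboard_status(value: Optional[float], good_at: float, marginal_at: float,
                     higher_is_better: bool = True) -> str:
    """Map a metric value onto the dashboard key: Good / Marginal / Unsatisfactory / No Data.

    Thresholds here are illustrative; each dashboard metric would carry its own goal.
    """
    if value is None:
        return "No Data"
    if not higher_is_better:
        # Flip the comparison for metrics where lower is better (e.g. event counts).
        value, good_at, marginal_at = -value, -good_at, -marginal_at
    if value >= good_at:
        return "Good"
    if value >= marginal_at:
        return "Marginal"
    return "Unsatisfactory"


# Example: Integrated Assessment Plan vs. Schedule at 92% against invented 95%/85% thresholds.
print(dashboard_status(92.0, good_at=95.0, marginal_at=85.0))  # -> Marginal
```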

  9. Self-Assessment & Feedback Self-Assessment and Feedback - Source Requirement (2.b(2)) “Rigorous, risk-informed, and credible self-assessment and feedback and improvement activities. Assessment programs must be risk-informed, formally described and documented, and appropriately cover potentially high consequence activities.”

  10. Self-Assessment and Feedback Process Evaluation Criteria

  11. Integrated Assessment Plan Performance

  12. Self-Assessment Grading

  13. MFO Performance

  14. Issues Management Issues Management - Source Requirements (2.b(3), 2.b(3)(a)) • (2.b(3)) “A structured issues management system that is formally described and documented and that:” • (2.b(3)(a)) “Captures program and performance deficiencies (individually and collectively) in systems that provide for timely reporting, and taking compensatory corrective actions when needed.” Issues Management - Source Requirement (2.b(3)(b)) “Contains an issues management process that is capable of categorizing the significance of findings based on risk and priority and other appropriate factors that enables contractor management to ensure that problems are evaluated and corrected on a timely basis. For issues categorized as higher significance findings, contractor management must ensure the following activities are completed and documented:” • (2.b(3)(b)(1)) A thorough analysis of the underlying causal factors is completed • (2.b(3)(b)(2)) Timely corrective actions • (2.b(3)(b)(3)) Effectiveness review • (2.b(3)(b)(4)) Documentation of the analysis process and results • (2.b(3)(b)(5)) Communicated to senior management (not included here; see Performance Analysis)
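
A minimal sketch (hypothetical field names, and assuming Significance Categories 1 and 2 are the higher-significance findings referenced on the later corrective-action slides) of an issue record that tracks the five 2.b(3)(b) activities before closure:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Issue:
    """Illustrative issue record tracking the DOE O 226.1B 2.b(3)(b) activities."""
    title: str
    significance: str                        # e.g. "1", "2", "3", or "T" per site grading
    causal_analysis: Optional[str] = None    # 2.b(3)(b)(1) underlying causal factors
    corrective_actions: List[str] = field(default_factory=list)   # 2.b(3)(b)(2)
    effectiveness_review_done: bool = False  # 2.b(3)(b)(3)
    analysis_documented: bool = False        # 2.b(3)(b)(4) analysis process and results
    reported_to_senior_mgmt: bool = False    # 2.b(3)(b)(5)

    def ready_to_close(self) -> bool:
        """Assumes categories 1 and 2 are higher-significance findings, which need
        all five activities complete before closure."""
        if self.significance in ("1", "2"):
            return (self.causal_analysis is not None
                    and bool(self.corrective_actions)
                    and self.effectiveness_review_done
                    and self.analysis_documented
                    and self.reported_to_senior_mgmt)
        # Lower-significance issues follow the site's graded approach (not modeled here).
        return bool(self.corrective_actions)
```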

  15. Issue Management Process Evaluation Criteria

  16. Issue Management Process Evaluation Criteria

  17. Corrective Action Closure - Significance Category 1,2 Issues

  18. Corrective Action Closure – Significance Category 3 Issues

  19. Corrective Action Closure – Significance Category T Issues

  20. Feedback & Improvement/ Timely Communication Feedback & Improvement – Source Requirement (2.b(5)) “Continuous feedback and improvement, including worker feedback mechanisms (e.g., employee concerns programs, telephone hotlines, employee suggestions forms, labor organization input), improvements in work planning and hazard identification activities, and lessons learned programs.” Timely Communications – Source Requirement (2.b(4)) “Timely and appropriate communication to the Contracting Officer, including electronic access of assurance-related information.”

  21. Feedback & Improvement Evaluation Criteria

  22. Findings / Improvements from MFOs UPDATE NEEDED

  23. Performance Analysis & Metrics Performance Analysis (Communication and Use of Metrics) - Source Requirements (2.b(3)(b)(5) and 2.b(6)) • (2.b(3)(b)(5)) “Communicates issues and performance trends or analysis results up the contractor management chain to senior management using a graded approach that considers hazards and risks, and provides sufficient technical basis to allow managers to make informed decisions and correct negative performance/compliance trends before they become significant issues.” • (2.b(6)) “Metrics and targets to assess the effectiveness of performance, including benchmarking of key functional areas with other DOE contractors, industry, and research institutions.”
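
As a rough illustration of 2.b(6), each metric can carry both a target and a benchmark value so that negative trends are escalated per 2.b(3)(b)(5) before they become significant issues (the metric name and numbers below are invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class Metric:
    """Illustrative performance metric carrying a target and an external benchmark."""
    name: str
    value: float
    target: float
    benchmark: float   # e.g. a peer DOE contractor or industry value

    def meets_target(self) -> bool:
        return self.value >= self.target

    def vs_benchmark(self) -> float:
        """Positive means performing above the benchmark."""
        return self.value - self.benchmark


m = Metric(name="Corrective Action Program Timeliness (%)",
           value=88.0, target=90.0, benchmark=85.0)
if not m.meets_target():
    # 2.b(3)(b)(5): communicate negative trends up the management chain before they
    # become significant issues.
    print(f"{m.name}: below target ({m.value} < {m.target}); "
          f"benchmark delta {m.vs_benchmark():+.1f}")
```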

  24. Performance Analysis & Metrics Evaluation Criteria

  25. FY 2016 POMC Dashboard (fiscal year to date performance: 10/1/2015 through 8/31/2016) - panels: Safety Index (FY16 target goal >80%, FYTD xx%); Assessments Graded (FY16 goal grade >25% per month, FYTD xx%); NSC Dashboard (FY16 goal YTD average >8.0, current average x.x); Cumulative Dose (FY16 goal <12.5% variance from ALARA target goal, FYTD <xx%); Radiological Control Severity Index (FY16 goal <75% of goal, FYTD xx%); Environmental Compliance Index (FY16 goal <1.0/month, FYTD 0.0)

  26. Program Implementation & Monitoring Program Implementation - Source Requirement (2.c) • “The contractor must submit an initial contractor assurance system description to the Contracting Officer for DOE review and approval. That description must clearly define processes, key activities, and accountabilities. An implementation plan that considers and mitigates risks should also be submitted if needed and should encompass all facilities, systems, and organization elements. Once the description is approved, timely notification must be made to the Contracting Officer of significant assurance system changes prior to the changes being made.” Program Monitoring - Source Requirement (2.d) • “To facilitate appropriate oversight, contractor assurance system data must be documented and readily available to DOE. Results of assurance processes must be analyzed, compiled, and reported to DOE as requested by the Contracting Officer (e.g., in support of contractor evaluation or to support review/approval of corrective action plans).”

  27. MET - Program Implementation & Monitoring

  28. ORPS Normalized Score UPDATE NEEDED

  29. Management Concern ORPS Events UPDATE NEEDED

  30. Periodic Reviews with Corporate and DOE Management
