
Performance Based Logistics Metrics Workshop SOLE Conference 15-16 May 2007






Presentation Transcript


  1. Performance Based Logistics Metrics Workshop — SOLE Conference, 15-16 May 2007. Terri Schwierling, Chief, PBL & Analysis Branch, Maint Spt Div, Maint Dir, IMMC, AMCOM

  2. Performance Based Logistics Metric Workshop — Topics of Discussion • Metric Policy and Guidance • Metrics Relevant to Warfighter Requirements • Metric Selection • Metric Assessment • Metric Weights • Summary

  3. Life Cycle Sustainment Outcome Metrics*
  • Materiel Availability (Key Performance Parameter): % of Entire Population Operational
  • Materiel Reliability (Key System Attribute): Mean Time Between Failure
  • Ownership Cost (Key System Attribute): O&S Costs Associated With Materiel Readiness
  • Mean Down Time: Average Total Downtime Required to Restore Asset to Operational Condition
  * JROC-Established Materiel Readiness/Sustainment Key Performance Parameter & Key System Attributes (DUSD (L&MR) Memorandum dated 10 Mar 07)
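
  These four outcome metrics are simple ratios; a minimal sketch of the arithmetic, using purely hypothetical fleet numbers:

```python
# Minimal sketch of the sustainment outcome metrics; every count below
# is an illustrative, hypothetical fleet figure.
operational = 425          # end items mission-capable right now
population = 500           # entire fielded population
operating_hours = 12_000   # cumulative operating hours this period
failures = 60              # failures in the same period
total_downtime = 1_800     # downtime hours across all restoration actions
repairs = 60               # restoration actions completed

materiel_availability = operational / population   # KPP: % of population operational
mtbf = operating_hours / failures                  # KSA: Mean Time Between Failure
mean_down_time = total_downtime / repairs          # avg downtime to restore an asset

print(f"Materiel Availability: {materiel_availability:.1%}")                 # 85.0%
print(f"MTBF: {mtbf:.0f} hrs, Mean Down Time: {mean_down_time:.0f} hrs")     # 200 hrs, 30 hrs
```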

  4. PBL Policy and Guidance — Performance Based Metrics and Reporting
  • Overarching Metrics: Operational Availability; Mission Reliability; Cost Per Unit Usage; Logistics Footprint; Logistics Response Time
  • Most Frequently Used Metrics: Operational Availability; Operational Readiness; Fill Rate; Customer Wait Time; Backorder Reduction; Inherent Only: Reliability, Obsolescence
  • Reporting: Semiannual Report; DA and AMC; Automated System TBD
  • Areas to Consider: Measurable Metric; Each Metric Does Not Require A Separate IPT; Data Collection; Automated Reporting Data Base; Impact to Resources; Multiple Sub-Metric Options

  5. Define the process: Determine customers, inputs, outputs, value-added; use a walkthrough to achieve common understanding. Measure process performance: Define metrics and identify data; determine baseline performance; diagnose performance drivers; provide reports and feedback. Improve the process: Establish goals; develop improved process designs; implement changes. Iterate for continuous improvement.
  UAVS PMO PBL Metrics (Metric / Max Rating):
  • System Status Readiness (SSR): 85% or Higher
  • Customer Wait Time (CWT): 90% or Higher
  • Logistics Maintenance Ratio (LMR): 12:1 or Higher
  • Field Service Representative Performance (FSR): Satisfactory
  Performance Metrics Pyramid ("What are we buying?"): Ao, OR, and MTBSA at the top; SSR, CWT, and FSR in the middle; MTBF, MTBEFF, LMR, and MTTR at the base.

  6. Intelligent Workload Allocation: Flexibility & Unity of Effort
  • PSI Workscope: Contractor Managed Supply Support; Contractor Managed Maintenance Support; Field Support Representatives; Sustainment Engineering; Brigade Integration Team (BIT) Training; RESET/PRE-SET Efforts; Deployment Support (CONUS/OCONUS Trng Exercises, OIF/OEF, etc.)
  • Performance Requirement: Operational Readiness and Availability Metrics
  Scope has to match the desired outcomes.

  7. SAMPLE PBL Metrics to ORD Crosswalk — Performance Based Product Support Metrics
  PBL Metrics Crosswalk to ORD:
  • Ao = Availability of 85% → SSR and CWT
  • Mean Time Between System Abort: 20 Hrs Threshold – 57 Hrs Objective → SSR and LMR
  • Mean Time to Repair (MTTR): .5 hrs (AVUM) / 2.0 hrs (AVIM) → SSR, CWT, and FSR
  • Operational Readiness (non-ORD) of 90% → SSR, CWT, and FSR
  Metric Definitions:
  • SSR – System Status Readiness = (Total Time – Down Time (at subsystem level)) / Total Time
  • CWT – Customer Wait Time = (Total Req's – # of Unsuccessfully Filled Req's) / Total Req's
  • FSR – Field Service Representative Quotient: Customer Satisfaction quotients evaluated via the CSAP Report
  • LMR – Logistics Maintenance Ratio = Total Operating Hours / # of Unscheduled Maintenance Actions
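
  The three ratio metrics above lend themselves to direct computation. A minimal sketch under the slide's definitions, with all quarterly counts hypothetical (FSR is omitted because it is a subjective CSAP rating rather than a ratio):

```python
def ssr(total_time: float, down_time: float) -> float:
    """System Status Readiness = (TT - DT) / TT."""
    return (total_time - down_time) / total_time

def cwt(total_reqs: int, unfilled: int) -> float:
    """CWT score = (total req's - unsuccessfully filled) / total req's."""
    return (total_reqs - unfilled) / total_reqs

def lmr(operating_hours: float, unscheduled_actions: int) -> float:
    """Logistics Maintenance Ratio = operating hours per unscheduled action."""
    return operating_hours / unscheduled_actions

# Hypothetical quarter: 2,160 hrs total with 260 hrs down; 400 requisitions
# with 32 unsuccessfully filled; 1,100 flight hours over 110 unscheduled actions.
print(f"SSR = {ssr(2160, 260):.1%}")    # 88.0% -> meets the 85% goal
print(f"CWT = {cwt(400, 32):.1%}")      # 92.0% -> meets the 90% goal
print(f"LMR = {lmr(1100, 110):.0f}:1")  # 10:1  -> within the 8-12:1 band
```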

  8. Scoring Methodology • Performance Metrics are weighted as follows: • SSR 50% • CWT 25% • FSR 20% • LMR 5% • A Wrap up/Composite Score is calculated to determine incentive award (Fee), on a quarterly basis, as follows: • (SSR x .50) + (CWT x .25) + (FSR x .20) + (LMR x .05)
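
  The weighted sum on slide 8 is straightforward. A sketch, assuming each raw metric has already been expressed as a fraction (normalizing FSR's five-point rating and LMR's 8-12:1 band to 0-1 scores is our assumption, not stated on the slide):

```python
# Composite (wrap-up) score per slide 8's weighting; inputs hypothetical.
WEIGHTS = {"SSR": 0.50, "CWT": 0.25, "FSR": 0.20, "LMR": 0.05}

def composite(scores: dict[str, float], weights: dict[str, float] = WEIGHTS) -> float:
    """Weighted sum: (SSR x .50) + (CWT x .25) + (FSR x .20) + (LMR x .05)."""
    return sum(scores[m] * w for m, w in weights.items())

quarter = {"SSR": 0.879, "CWT": 0.92, "FSR": 0.80, "LMR": 1.0}  # normalized, hypothetical
print(f"Composite = {composite(quarter):.1%}")  # ~87.9%
```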

  9. Incentive Model versus Minimum Acceptable Performance • The Composite Wrap Up Score is calculated and the Fee is then determined via the table found in Section H of the PBL Contract. • As an example, the PSI achieved a Composite Wrap Up Score of 91% (Score .91); the incentive would be 3% (MIN) + 7.2% (INC) = 10.2%. • In a second example, the PSI missed the SSR and CWT goals, scoring a Composite Wrap Up Score of 84% (Score .84); the incentive would be 8.0% due to missing the PBL metric requirements.
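
  The actual fee schedule lives in Section H of the contract and is not reproduced in the deck. The lookup below is a hypothetical reconstruction that pins only the two example points quoted above; every other band is invented purely for illustration:

```python
# Hypothetical fee schedule: only the 0.91 -> 10.2% and 0.84 -> 8.0% points
# come from the slide's examples; the remaining bands are invented.
FEE_BANDS = [  # (minimum composite score, total fee %), highest band first
    (0.95, 12.0),  # hypothetical
    (0.91, 10.2),  # slide example: 3.0% MIN + 7.2% INC
    (0.85, 9.0),   # hypothetical
    (0.84, 8.0),   # slide example: SSR and CWT goals missed
    (0.00, 3.0),   # hypothetical floor (minimum fee)
]

def fee(score: float) -> float:
    for floor, pct in FEE_BANDS:
        if score >= floor:
            return pct
    return 0.0

print(fee(0.91), fee(0.84))  # 10.2 8.0
```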

  10. Effective Metrics, Performance Management, and Continuous Improvement Culture • PBL Process Review – 1st QTR FY 06 • Focused on the processes required to achieve the PBL Metrics: • Scrutinized the processes used to maintain PBL Shop Stock. • Determined quantities of items in the ROR (Repair of Repairables) cycle and identified the processes currently used to move items expeditiously through the depot's ROR line. • Reviewed all aspects of FSR Utilization (Training, Management, Technical Documentation, etc.) • Reviewed the current processes used to reduce the Mishap Rate and identified ways to enhance these efforts. • Identified and quantified Integrated Material Management Activities as they relate to all aspects of the PBL Effort. • Developed new Metrics that can reduce the Mishap Rate and improve process efficiency! It's really not about what you say. It's about what you do.

  11. System Status Readiness (No change to current Metric)
  SSR = (Total Time (TT) – Down Time (DT)) / Total Time (TT)
  • Sub-systems that count against NMC criteria include: 3 of the 4 AVs; 2 GCSs; 2 GDTs; 1 PGCS; 1 PGDT; 2 TALS; 1 LAU; 2 of the 4 RVTs
  • A fielded system consists of 4 AVs. The fourth AV is considered a spare AV and shall not be included in SSR DT calculations for the first 30 days after the fourth AV, as a spare, is used to replace an NMC AV. After 30 days, if an additional AV becomes NMC, that AV will be included in DT calculations unless the PSI has added back a fourth spare AV to the unit.
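
  The spare-AV rule is the subtle part of the SSR downtime bookkeeping. A sketch of that test, with hypothetical dates and a simplified reading of the 30-day grace period:

```python
from datetime import date, timedelta

GRACE = timedelta(days=30)  # spare AV shields DT for 30 days after substitution

def counts_against_dt(spare_used_on: date, av_nmc_on: date, spare_restored: bool) -> bool:
    """True if an additional NMC AV counts against SSR down time.

    Simplified reading of the slide: for 30 days after the spare replaces an
    NMC AV, additional AV downtime is excluded; after that it counts unless
    the PSI has added a fourth spare AV back to the unit.
    """
    if spare_restored:
        return False                              # spare replenished: shielded again
    return av_nmc_on > spare_used_on + GRACE      # past the 30-day grace period?

# Hypothetical: spare used 1 Jun; another AV goes NMC 15 Jul, no spare added back.
print(counts_against_dt(date(2007, 6, 1), date(2007, 7, 15), spare_restored=False))  # True
```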

  12. Current Metric: Customer Wait Time (CWT) — Goal 90%, Weighted at 25%
  CWT = (Total Number of Soldier Requisitions – Late Soldier Requisitions (LR)) / Total Number of Soldier Requisitions
  Terms and Conditions: The contractor shall fill 90% of unit requisitions per the following CWT:
  • Priority Designator (PD) 01 thru 03: CONUS 3 days, OCONUS 7 days
  • PD 04 thru 08: CONUS 7 days, OCONUS 10 days
  • PD 09 thru 15: CONUS 10 days, OCONUS 15 days
  CWT begins on the day the requisition is submitted by a soldier, either through the FSR or the Standard Army Retail Supply System (SARSS) when it is implemented, as evidenced by the document number. CWT calculations end when Government personnel receive the part, as evidenced by the carrier's shipping document. An LR is a soldier requisition that is not filled within the number of days delineated in the table, based upon the unit location (CONUS/OCONUS).
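
  The late-requisition (LR) test follows directly from the PD table above. A sketch using the slide's thresholds:

```python
# Late-requisition check per the CWT terms; thresholds come straight
# from the slide's Priority Designator table.
ALLOWED_DAYS = {            # PD range -> (CONUS days, OCONUS days)
    range(1, 4):  (3, 7),
    range(4, 9):  (7, 10),
    range(9, 16): (10, 15),
}

def is_late(pd: int, days_to_fill: int, oconus: bool) -> bool:
    for pds, (conus_limit, oconus_limit) in ALLOWED_DAYS.items():
        if pd in pds:
            return days_to_fill > (oconus_limit if oconus else conus_limit)
    raise ValueError(f"unsupported priority designator: {pd}")

print(is_late(pd=2, days_to_fill=5, oconus=False))  # True: PD 01-03 CONUS allows 3 days
print(is_late(pd=6, days_to_fill=9, oconus=True))   # False: PD 04-08 OCONUS allows 10
```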

  13. Customer Wait Time (CWT)
  CWT = (Total Number of Soldier Requisitions – Late Soldier Requisitions (LR)) / Total Number of Soldier Requisitions
  PROS: • Desired Result – Responsiveness • Measurable • Data Collection Process In Place • Known Outcome
  CONS: • Unit Relies on FSR to Submit Requisition • FSR Relies on Depot to Submit Requisition • Delay with FSR Closing Out Requisition • Misuse of Priority Designator • Does Not Take Into Account Demands Filled thru Repair Process • Government Transportation

  14. New Metric Consideration: Customer Response Time (CRT)
  CRT = Combination of Requisition Processing and Repair of Repairables (ROR)
  • Considerations: Goals (Meet Customer Requirements; Improve Inventory Control Process; Improve Repair of Repairables Process); Measurable – Yes; Data Collection Process In Place – Yes; Achievable – Yes; Weight Factor Adjustment
  • Impacts: Improve System Status Readiness; Cost (Funding for Long Lead Items; Competing Resources with Production)
  • Recommendations for Supply and ROR: Count all Requisitions (User, FSR, Depot); Implement Automated Inventory Control Process; Update LMI Data Base with Provisioning-Type Data; Address all Lead Times and Recoverability Codes; Refresher Training on UPAS; Reduce all ROR Lead Times by 20%; Track all ROR Actions in UPAS; Implement Tracking Process for Unserviceables; Identify Direct Exchange Items; Subcontractor/Vendor Support Agreements (Flow Metrics to Subs; Conduct Subcontractor/Vendor Conference)

  15. New Metric Consideration: Customer Response Time (CRT) – What we think it is!
  CRT = Combination of Requisition Processing and Repair of Repairables (ROR)
  • Requisition Processing
  • Include only requisitions for LRU/SRUs (short list – few consumables). Decision made not to use LRU/SRU or RPSTL; recommendation is to include all field requisitions.
  • Include only requisitions coming from the 'field'; do not count requisitions coming from the OEM Depot.
  • Include all other Priority 01 thru 03. Decision made to include all priorities.
  • Notes: Clock starts at the date/time of requisition. Clock stops when the shipment tracking number is entered (which means customs clearance is received). Success is 90% completion fill on time.
  • ROR Completions
  • Determine Top 15 ROR Items (by Quantity/Cost). OEM action to provide list NLT Wed at 0830; Gov't will review the list during the telecon. Recommend a short list of the top 15 items to put a metric against.
  • Establish Standard Repair Time. OEM to provide a standard repair time for each item on the ROR list above; Gov't will review during the telecon and determine concurrence or non-concurrence.
  • Track these ROR actions in UPAS using Standard Repair Time for the due-out date.
  • Notes: Clock starts when the item transfers into the ROR warehouse (UPAS) and is assigned a PO or MO. The Standard Repair Time completion date is entered into the UPAS data base at the same time the item is transferred to the ROR warehouse. Success is 90% completion of repair within the established standard repair time.
  CRT = (Total Number of Requisitions and ROR's – Late Requisitions and ROR's) / Total Number of Requisitions and ROR's
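
  As described, CRT runs two separate clocks: requisition submission to tracking-number entry, and ROR warehouse transfer to repair completion against the standard repair time. A sketch of that bookkeeping; the record field names are invented for illustration, not UPAS fields:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Requisition:              # clock: submission -> shipment tracking number entered
    submitted: datetime
    tracking_entered: datetime
    allowed: timedelta          # from the CWT Priority Designator table

@dataclass
class RorAction:                # clock: transfer into ROR warehouse -> repair complete
    into_warehouse: datetime
    completed: datetime
    standard_repair: timedelta  # OEM-provided standard repair time

def crt(reqs: list[Requisition], rors: list[RorAction]) -> float:
    """CRT = (total reqs and RORs - late reqs and RORs) / (total reqs and RORs)."""
    late = sum(r.tracking_entered - r.submitted > r.allowed for r in reqs)
    late += sum(a.completed - a.into_warehouse > a.standard_repair for a in rors)
    total = len(reqs) + len(rors)
    return (total - late) / total   # success criterion on the slide: >= 90%
```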

  16. Result of Customer Response Time (CRT) Discussions — 2 Metrics: Customer Wait Time and Depot Maintenance Ratio
  • Customer Wait Time (CWT) = (Total Number of Requisitions – Late Requisitions) / Total Number of Requisitions
  • Includes all field requisitions • Includes all priorities • Success is 90% completion fill on time • Incorporate a data field to track complete requisition TAT
  • Depot Maintenance Ratio (DMR) = Total Hours / # of Open Maintenance Actions
  • Maintenance actions per flight hour • Improve ROR TAT and reduce backlog • Includes all depot-level maintenance actions, not a specific list
  • Actions: Establish ratio; Starting point (snapshot of data), which should include all depot ROR backlog; Run data analysis

  17. Current Metric: Logistics Maintenance Ratio — Goal 8 thru 12:1, Weighted at 5%
  LMR = Total Hours / # of Unscheduled Maintenance Actions (UMA)
  Terms and Conditions: The contractor shall maintain an LMR of 8 through 12:1. TH = Total System Flight Hours; all fielded system TH shall be summed into a quarterly total. UMA = maintenance actions that are required to return the system or subsystem to a mission capable status due to component failure, beyond the routine preventive maintenance actions defined in the UAVS TMs. UMAs include stand-alone UMAs; in some cases of parent/child MAs, only the subordinate UMAs will be counted. The LMR shall be calculated using both actual data and data obtained by analysis when recommended engineering changes have not been funded for implementation.

  18. Logistics Maintenance Ratio
  LMR = Total Hours / # of Unscheduled Maintenance Actions (UMA)
  PROS: • Measurable • Data Collection Process In Place
  CONS: • Engineering Services Currently Not Included in PBL Contract • Requires Up-Front Investment

  19. New Metric Consideration: Logistics Maintenance Ratio (LMR)
  LMR = Total Hours / # of Unscheduled Maintenance Actions (UMA)
  • Considerations: Goals (Reduce Mishaps; Reduce Unscheduled Maintenance Actions; Improve Reliability; Accountability; Improve Maintainability); Measurable; Data Collection Process; Achievable; Adjust Weight Factor
  • Impact: Cost (Cost Sharing Associated with Mechanical Failures); System Status Readiness
  • Recommendations: Include Engineering Services in PBL; Cost Study/Trade Analysis; Include Mishap Investigation in PBL

  20. Current Metric: Field Service Representative (FSR) — Goal: Satisfactory, Weighted at 20%
  FSR = Field Service Representative Quotient, defined as Customer Satisfaction Quotient
  Terms and Conditions: The parties agree that the contractor shall achieve a minimum Satisfactory FSR rating. The FSR Customer Assessment Report (FCA) will be used to determine the FSR quarterly rating. All FCA reports will be summed, and a straight average will be used to determine the FSR rating. Ratings on the FCA report are: 5 = Outstanding, 4 = Good, 3 = Satisfactory, 2 = Below Average, 1 = Poor. The contractor shall only retain FSR Teams that maintain a performance rating of "Satisfactory" or better on the FCA appraisal.
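
  The quarterly FSR rating is just a straight average of FCA reports against the five-point scale. A sketch with hypothetical scores:

```python
# Quarterly FSR rating per the terms: straight average of all FCA reports;
# the scores below are hypothetical.
RATINGS = {5: "Outstanding", 4: "Good", 3: "Satisfactory", 2: "Below Average", 1: "Poor"}

def quarterly_fsr(fca_scores: list[int]) -> tuple[float, bool]:
    """Return (average rating, whether it meets the Satisfactory minimum)."""
    avg = sum(fca_scores) / len(fca_scores)
    return avg, avg >= 3    # 3 = Satisfactory, the minimum acceptable rating

avg, ok = quarterly_fsr([4, 3, 5, 3, 2])
print(f"average {avg:.2f} -> {'meets' if ok else 'misses'} the Satisfactory minimum")
```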

  21. Field Service Representative (FSR)
  FSR = Field Service Representative Quotient, defined as Customer Satisfaction Quotient evaluated via the CSAP Report
  PROS: • Customer Feedback • Measurable • Data Collection Process In Place • FSRs Represent OEM and Gov't PMO
  CONS: • Subjective, Not Quantitative • Limited Feedback from OIF Units • Minimum Measurement of FSR Performance • FSRs Have Limited Control of Unit Operations

  22. New Metric Consideration: Field Service Representative (FSR) Accountability
  FSR Performance = Mishap Reduction
  • Considerations: Goals (FSR Accountability; Reduce FSR Footprint; Minimize and Reduce Mishaps; Mentor Unit; Increase Data Collection via UPAS); Measurable; Data Collection Process; Achievable; Ability to Influence Unit/Unit Leadership; Incentivize FSRs That Take On a Challenging Unit
  • Impacts: Cost; System Status Readiness; FSR Personnel Turnover/Transfer to Another Unit
  • Recommendations: Enhance FSR Training; Measure FSR by UPAS Transactions; Measure FSR by Mishaps per Flight Hour; Improve MOA with Gaining Units; Reduce FSR Footprint; Collocated Unit/Division FSR Support

  23. Reliability Growth Rate (RGR) • Identify the Reliability Growth Rate slope over a two-year period, with quarterly goals • Government would like to see a cost share on AV mishaps when failures exceed the RGR goal • OEM interested in a cost share against the incentives • Actions: • OEM RGR slope under review by Gov't PMO engineering staff; need final agreement on the RGR • Establish the desired weight of this metric and the cost-share approach
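
  Since the RGR slope is still under Government review, the goal values below are placeholders; the sketch only illustrates tracking observed reliability against quarterly growth goals:

```python
# Tracking performance against a reliability growth curve with quarterly
# goals over two years; all goal and observed values are hypothetical,
# pending final agreement on the actual RGR slope.
quarterly_goals = [20, 22, 24, 27, 30, 33, 36, 40]   # MTBSA goals (hrs), 8 quarters
observed =        [19, 23, 25, 26]                   # observed MTBSA so far

for q, (goal, actual) in enumerate(zip(quarterly_goals, observed), start=1):
    status = "meets" if actual >= goal else "misses"
    print(f"Q{q}: observed {actual} hrs vs goal {goal} hrs -> {status} the RGR goal")
```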

  24. Cost Sharing Clause

  25. Alignment to Known Warfighter Requirements — ORD Requirements and Corresponding PBL Metrics
  ORD Requirements and corresponding UAVS PBL Metrics:
  • Ao = Availability of 85% → SSR, CWT, and RGR
  • Mean Time Between System Abort: 20 Hrs Threshold – 57 Hrs Objective → SSR and RGR
  • Mean Time to Repair (MTTR): .5 hrs (AVUM) / 2.0 hrs (AVIM) → SSR and CWT
  • Operational Readiness (non-ORD) of 90% → SSR, CWT, and DMR
  Definitions:
  • System Status Readiness (SSR) = (Total Time – Down Time (at subsystem level)) / Total Time
  • Customer Wait Time (CWT) = (Total Req's – # of Unsuccessfully Filled Req's) / Total Req's
  • Depot Maintenance Ratio (DMR) = Total Flight Hours (current QTR) / # of Open Depot Maintenance Actions
  • Reliability Growth Rate (RGR): performance against a Reliability Growth Curve
  Colors (in the original chart) show where PBL metrics support ORD requirements!

  26. Metric Weights (Current → Recommended?)
  • System Status Readiness (SSR): 50% → 50%
  • Customer Wait Time (CWT): 25% → 10%
  • Depot Maintenance Ratio (DMR): – → 10%
  • Reliability Growth Rate (RGR): 25% (as FSR & LMR) → 30%

  27. Scoring Methodology • Performance Metrics are weighted as follows: • SSR 45% • RGR 30% • CWT 5% • DMR 20% • An Incentive Score is calculated to determine the incentive award (Fee), on a quarterly basis, as follows: • (SSR x .45) + (RGR x .30) + (CWT x .05) + (DMR x .20) To facilitate the change from Cost contracts to Fixed/Incentive Fee type contracts, the FY 06 efforts will review collected actual costs during each scoring meeting to determine how much cost can be quantified and compared to the definitized cost target proposed.
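
  The same weighted-sum sketch from slide 8 applies; only the weights change. With hypothetical raw scores:

```python
# Revised weighting from slide 27; the metric scores are hypothetical
# raw fractions, as in the slide 8 sketch.
REVISED_WEIGHTS = {"SSR": 0.45, "RGR": 0.30, "CWT": 0.05, "DMR": 0.20}
scores = {"SSR": 0.90, "RGR": 0.88, "CWT": 0.95, "DMR": 0.80}
incentive_score = sum(scores[m] * w for m, w in REVISED_WEIGHTS.items())
print(f"Incentive Score = {incentive_score:.1%}")  # ~87.7%
```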

  28. Sample RAW Metric Performance Score Table

  29. Determination of Incentive Based on Performance • The Composite Incentive Score (IS) is calculated, and the Fee is then determined via the table found in Section H of the PBL Contract. • Example 1: The PSI exceeded the metrics (IS = 93); therefore, the incentive would be 8.0%. • Example 2: The PSI achieved all metrics (IS = 85); the incentive would be 5.2%.

  30. Summary • Metric selection must support the scope of the PBL program and the desired outcome • Specific details in the metric calculations must be clearly defined and measurable • A continuous improvement process is needed to achieve performance outcomes
