Defense Sustainment Consortium
Project 6: Common Processes
Phase 1 – Test Bed Demonstrations
Program Review
February 2, 2005, San Antonio, TX
Common Processes Agenda
Description / Presenter:
Introduction – Objectives: Holcomb
Metrics/Common Tasks: VanderBok
ADP Test Bed Demonstration: Major
AFDRAS Test Bed Demonstration: Miller
Project Financials: Holcomb
Problem - Solution • Future Logistics Enterprise (FLE) is DoD’s vision to enhance support to the warfighter • FLE includes CBM+ • Enhanced prognostics / diagnostics • Failure trend analysis • Point of maintenance aids • Serial item tracking • Data-driven interactive maintenance training • Distance support • Integrated maintenance with other logistics functions
Army & Navy Targeted Solution • A practical system that can be integrated into any weapons platform. • A system that uses Commercial-Off-The-Shelf (COTS) applications. • Minimal modifications to existing hardware.
WBS 2.5.2 Common Processes – Phase 1
WBS 2.5.2.1 Common Task Activities
WBS 2.5.2.2 ADP Test Bed Demonstration
WBS 2.5.2.3 AFDRAS Test Bed Demonstration
WBS 2.5.2.4 Project Management
Common Processes Team
• CVN Demonstration
NGC: John Major, john.major@ngc.com
NSWC: Walt Kostyk, kostykwj@nswccd.navy.mil
• M88 Demonstration
UDLP: Andy Miller, andrew.miller@udlp.com
PM M88: Dave Boster, bosterd@tacom.army.mil
• Integration & Project Management
ATI: Curtis Holcomb, holcomb@aticorp.org
Altarum: Ray VanderBok, ray.vanderbok@altarum.org
Common Processes Agenda
Description / Presenter:
Introduction – Objectives: Holcomb
Metrics/Common Tasks: VanderBok
ADP Test Bed Demonstration: Major
AFDRAS Test Bed Demonstration: Miller
Project Financials: Holcomb
WBS 2.5.2.1: Common Task Activities • WBS 2.5.2.1.1 Update common decision process • Common decision process documented in Phase 0 • Evaluate appropriate applications • Will work with OEMs and PMs to refine the process • WBS 2.5.2.1.2 Promote common evaluation metrics • Primary role is to support continued development • Build metrics from the Phase 0 starting point • Share metrics and rationale across demonstrations
WBS 2.5.2.1: Common Task Activities • WBS 2.5.2.1.3 Share metrics and lessons learned • Proactively share information across programs • Recruit DSC members to engage in program reviews • WBS 2.5.2.1.4 Plan for broad deployment • Work with OEMs and PM offices to identify the deployment path • Support deployment justification • WBS 2.5.2.1.5 Final report • System descriptions • Application description • System performance • Business impact (business & government) • Deployment plans
WBS 2.5.2.1: Common Task Activities • Accomplishments • Assisted in capturing the core business cases for deployment • Developing metrics to support business cases • Next Quarter • Validate business cases and metrics with program offices • Status: 50% complete • Deliverables: • Updated Phase 0 Report – Dec 05 • Consolidated Final Report – Dec 05
Common Metrics (diagram): pilot metrics feed the business case, which drives the implementation decision; these are the metrics that support the business case for deployment.
Business Case Overview (diagram): pilot metrics and feasibility, combined with weapon system characteristics, will demonstrate the total value of the CBM technologies. The deployment decision weighs CBM+ benefits and the DSC pilot demonstrations against these factors: consistent with other initiatives, robust for the environment, mature technology, fits the architecture, extends to many applications, supportable, high technical performance, and manageable risk.
Core Business Case • Benefits • Meets the DoD policy that "Condition Based Maintenance be implemented to improve maintenance agility and responsiveness, increase operational availability, and reduce life cycle total ownership costs" • In line with the DoD CBM+ Plan of Action and Milestones • Cost • Common approach that applies to multiple systems • Leverages existing hardware investments • Supports • OSD-AT&L CBM+ Policy and Guidance • Navy's CBM+ Initiatives & Integrated Condition Assessment System (ICAS) • Army's CBM+ Plan
Common Task Activities • Next Steps • Briefings in support of DSC CBM+ efforts • OSD, NAVSEA, NAVSUP, FCS... • Broad exposure in the Navy and Army • Identify a weapon system to continue Common Processes efforts (future) • Identify specific metrics for pilot demonstrations
Common Processes Agenda
Description / Presenter:
Introduction – Objectives: Holcomb
Metrics/Common Tasks: VanderBok
ADP Test Bed Demonstration: Major
AFDRAS Test Bed Demonstration: Miller
Project Financials: Holcomb
WBS 2.5.2.2: Automated Diagnostic / Prognostic (ADP) System - Test Bed Demonstration • Goals: • Install, test, and demonstrate a commercial ADP system connected to ICAS through the existing ship network. • The ADP will communicate to ICAS through an Open System Architecture – Condition Based Maintenance (OSA-CBM) commercial standard • The integrated ADP/ICAS/Network will align with CBM+ • The integrated ADP/ICAS/Network will align with OPNAV Instruction 4700.7k (11 JUL 2003)
WBS 2.5.2.2: ADP Test Bed Demonstration • Expected Tangible Benefits: • Reusable "core" can be modified for use on many different ship systems • Represents the "how-to" link between a commercial vendor and ICAS • Allows embedded diagnostic and maintenance expertise from the vendor/manufacturer to be applied • Shows how vendors can create individual ADPs that send CBM information to ICAS • Reduced workload on NSWCCD to develop and maintain: • the equipment system databases • equipment expert systems • equipment prognostic systems • ADP interfaces (for each individual ADP/ICAS interface) • Reduced maintenance workload on the Ship's Force • Measurable "availability" of ship's systems • Maintenance performed only "as required" • Faults detected and diagnosed, with the operator told what action to take
WBS 2.5.2.2: ADP Test Bed Demonstration • Process for Deployment of Results • Hold Symposium for ADP and ICAS Development and Integration • Demonstrate ADP / ICAS to OSD (CBM+) • Demonstrate ADP / ICAS capabilities to NAVSEA • Demonstrate ADP / ICAS capabilities to PEO Carriers • Demonstrate ADP / ICAS capabilities to Army • Present Technical Papers at ASNE and MFPT Conferences
WBS 2.5.2.2: ADP Test Bed Demonstration • Progress Relative to Schedule • Progress is noted in each WBS section in the following slides. • Regrouping after NSWC was brought back online. • Project will be back on schedule by March '05. • No problems are foreseen in meeting deliverables after that date.
ADP/Network/ICAS Diagram: four vent fans with 80 sensors (total) feed the ADP's OSA-CBM layer stack (Sensor Layer, Data Acquisition, Signal Processing, Condition Monitoring, Health Assessment, Prognostics, and Decision Support), with 44 parameters passed between layers as XML. The ADP serves an HTML page to ICAS over the ship network via Internet-based interfaces.
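To make the layer stack concrete, here is a small, purely illustrative sketch of one sensed parameter moving up the OSA-CBM layers; the function names, scaling, thresholds, and messages are hypothetical and are not the ADP vendor's implementation.

```python
# Purely illustrative sketch of one sensed parameter moving up the OSA-CBM layer
# stack shown above (sensor -> data acquisition -> signal processing -> condition
# monitoring -> health assessment -> prognostics -> decision support).
# Function names, scaling, and thresholds are hypothetical, not the vendor's code.

def acquire(raw_counts):                      # data acquisition layer
    return raw_counts * 0.1                   # convert counts to engineering units (assumed scale)

def signal_process(samples):                  # signal processing layer
    return sum(samples) / len(samples)        # e.g. a simple average of recent samples

def monitor_condition(value, limit=2.5):      # condition monitoring layer
    return "ALERT" if value > limit else "NORMAL"

def assess_health(state):                     # health assessment layer
    return {"NORMAL": "healthy", "ALERT": "degraded bearing (example diagnosis)"}[state]

def prognose(health):                         # prognostics layer
    return None if health == "healthy" else "estimated 200 run-hours remaining (example)"

def decide(health, prognosis):                # decision support layer
    if prognosis is None:
        return "No action required"
    return f"Schedule maintenance: {health}; {prognosis}"

# Example: one vibration parameter from a vent fan sensor (values are made up).
samples = [acquire(c) for c in (24, 26, 31)]
state = monitor_condition(signal_process(samples))
health = assess_health(state)
print(decide(health, prognose(health)))
```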
Responsibilities of ICAS and ADP in Relation to the DSC Project • ICAS will do the following to display data, diagnostic, and prognostic information from the vent fans: • Notify the operator that a non-normal condition exists on one or multiple vent fans. • Open an Internet Web Browser. • Select the proper URL/IP address for the vent fans. • Open and process the XML files every minute for HTML page updates. • ADP will do the following to send displays for data, diagnostic, and prognostic information from the vent fans: • Pass an integer to ICAS notifying that a non-normal condition exists on one or multiple fans. • Provide the Internet Web Server that hosts the pages to be displayed. • Provide the proper URL/IP address for the vent fans. • Update the XML files every minute for HTML page updates.
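The polling cycle described above (ICAS opening and processing the ADP's XML files every minute) could look roughly like the following sketch. The file name, XML tags, and the convention that a non-zero integer flags a non-normal condition are assumptions for illustration; the actual exchange follows the OSA-CBM interface between the ADP and ICAS.

```python
# Minimal sketch of the ICAS-side polling cycle described above.
# The file name, XML tags, and status-integer convention are hypothetical;
# the real interface follows the OSA-CBM exchange between ADP and ICAS.
import time
import xml.etree.ElementTree as ET

ADP_STATUS_FILE = "vent_fan_status.xml"   # assumed name of the XML file served by the ADP
POLL_INTERVAL_S = 60                      # ICAS processes the XML files every minute

def read_fan_status(path):
    """Parse the ADP status file and return a list of (fan_id, status) pairs."""
    root = ET.parse(path).getroot()
    return [(fan.get("id"), int(fan.findtext("status", default="0")))
            for fan in root.iter("fan")]

def notify_operator(fan_id, status):
    # Placeholder for the ICAS alert icon / web-browser hand-off described above.
    print(f"Fan {fan_id}: non-normal condition (code {status}) - open ADP web page")

def poll_once():
    for fan_id, status in read_fan_status(ADP_STATUS_FILE):
        if status != 0:                   # non-zero integer signals a non-normal condition
            notify_operator(fan_id, status)

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(POLL_INTERVAL_S)
```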
ICAS Progression of Non-normal Identification Selection on the ICAS Workstation
Screen captures (in order): normal ICAS screen with alert icon; ICAS after URL/IP selection; ICAS after selecting more information; Decision Support, which describes the problem(s) and what to do next.
• Normal ICAS screen.
• Alert icon on the normal ICAS screen.
• Operator "clicks" on the alert icon, opening a Web Browser.
• Display of the fault area shown on ICAS via the Web Browser.
• Operator can obtain more information by clicking on icons.
• Processed sensor data, diagnostics, prognostics, and decision support, provided by the ADP, can be accessed via this Web Browser.
• Operator can terminate at any time and return to the normal ICAS screens.
WBS 2.5.2.2 ADP Test Bed Demonstration
WBS 2.5.2.2.1 Selected System (chart: General Maintenance and Time Frequency Analysis)
Objective: Review maintenance reports and determine the potential impact of ADP on maintenance cost for ventilation systems. • Accomplishments • Target Demonstration Platform • Evaluated CVN 73 existing network and ICAS infrastructures • Performed a ship check on CVN 73 • Located four ventilation fans for test • Located area for the ADP system • Determined that we will use temporary shore power • Failure Mode Analysis • NSWC has completed the Fan Failure Analysis Report • Targeted ventilation systems on the O3 level at the forward end of the ship • Reviewing the NSWC Fan Failure Analysis Report • Next Quarter • Prepare a Test and Evaluation (T&E) Report for AIRLANT and obtain necessary approvals • Preparing initial drawings to submit to AIRLANT for approval and insertion into the Availability Work Package • Status: 75% complete • Deliverable: Fan Failure Analysis Report – Jan 05
WBS 2.5.2.2 ADP Test Bed Demonstration
WBS 2.5.2.2.2 CBM System Development
Objective: Develop the ADP for the ventilation system onboard a CVN 68-class aircraft carrier. • Accomplishments • Developed the interface between the ADP and ICAS using the MIMOSA OSA-CBM standard • Integrated the ADP system with the NSWCCD-developed Integrated Condition Assessment System (ICAS) via an open systems interface standard on January 19, 2005 • Completed and tested the integration of the ADP with ICAS • Next Quarter • Adapting the NGNN ADP system for the target systems based on the NSWCCD Fan Failure Analysis Report • Developing the Test Plan and Procedure for the ADP system • Complete testing of the ADP system • Status: 80% • Deliverable: ADP System Documentation – March 05
WBS 2.5.2.2 ADP Test Bed Subtask
WBS 2.5.2.2.2 CBM System Development (Cont)
Integration of ADP/Network/ICAS (diagram): the new installation adds an ADP server, HMI server, HMI console, and connections to the vent fans; these tie into the typical core shipboard network of hubs (HUB 1 through HUB 6), PLCs, and I/O, using both direct and PLC network connections, with ICAS reached over that network.
WBS 2.5.2.2 ADP Test Bed
WBS 2.5.2.2.3 Laboratory Testing
Lab Testing Description (Sept '04 – Mar '05) • Accomplishments • Target Demonstration Platform • Identified, procured, and tested fan sensors and data acquisition software to ensure reliability • Developed test plans and procedures and tested the ADP sensors and support equipment • Developed test plans and procedures and tested the integration of the ADP with ICAS and the network infrastructure • Next Quarter • Implementing the findings of the NSWC Fan Failure Analysis Report in the ADP code • Demonstrate / test the ADP diagnostics/prognostics capabilities
WBS 2.5.2.2 ADP Test Bed WBS 2.5.2.2.3 Laboratory Testing Lab Testing Description (continued) • Next Quarter • Documentation of test results • Status: 75% complete • Deliverable: ADP Lab Demonstration – March 05
WBS 2.5.2.2 ADP Test Bed
WBS 2.5.2.2.4 Ship Installation and Testing
• Shipboard Installation and Initial Testing (Jan '05 – Jul '05)
• Overview: The ADP will acquire data from four representative vane-axial fans in relatively close proximity to each other to minimize installation costs for the project. Installation will be accomplished on CVN 73 during the January–December scheduled availability. The ADP will be connected via the shipboard network to ICAS, where diagnostic and prognostic vent fan information will be displayed to the operator. We are expecting installation to be completed in the June–July time frame.
WBS 2.5.2.2 ADP Test Bed
WBS 2.5.2.2.4 Ship Installation and Testing
Shipboard Installation and Initial Testing (continued) • Accomplishments • Performed a successful ship check on 20 Jan 05 • Located four ventilation fans for test • Located area for the ADP system • Determined cable power runs • Determined power requirements • Determined to use temporary shore power • Started ship installation drawings • Next Quarter • Development of shipboard test plans and procedures for accomplishing shipboard testing of the ADP sensors, support equipment, and ADP system. • Start of the installation of ventilation sensors, support equipment, and ADP. • Development of shipboard test plans and procedures for the integration test of the ADP with ICAS and the shipboard network infrastructure. • Start of the integration of the ADP with ICAS via the shipboard network infrastructure. • Status: 40% • Deliverable: ADP Ship System Documentation – July 05
WBS 2.5.2.2 ADP Test Bed
WBS 2.5.2.2.5 Demonstration
Shipboard Demonstration (June 05 – Dec 05) • Monitor the ability of the system to pass any maintenance conditions to ICAS. • After 3 months, visit the ship and examine the ADP/ICAS. • After the 6-month period: • review data to confirm the existence or non-existence of failures or problems • determine whether fan parts usage aligns with ADP fault identification • In the event of no failures or problems with the selected fans, implement failure simulation test procedures • Simulation Plan • Development of test plans and procedures for accomplishing shipboard simulation testing of the ADP sensors, support equipment, and ADP system. • Stimulate maintenance faults/problems through sensor stimulation and/or through software, and determine the ability of the system to pass these maintenance conditions to ICAS. • Document demonstration results. • Status: 0% (not started)
WBS 2.5.2.2: ADP Test Bed Timeline (Gantt chart). Project start October '03; milestone dates Mar '04, Oct '04, Feb '05, Jun '05, and Dec '05. Activities: identify demo platform; failure modes review/analysis; CBM system development; drawing development; ICAS integration; test procedures; lab testing of ADP; lab test of ADP/ICAS; ship installation; ship S/W integration; shipboard testing; training; shipboard demonstration; simulation plan; periodic reviews; evaluation of results and final report (Dec '05).
Common Processes Agenda
Description / Presenter:
Introduction – Objectives: Holcomb
Metrics/Common Tasks: VanderBok
ADP Test Bed Demonstration: Major
AFDRAS Test Bed Demonstration: Miller
Project Financials: Holcomb
Task 3: Test Bed Demonstration 2 – Automated Fault Data Reporting Assessment System (AFDRAS) • Goals: • Construct a demonstrator to show the ability to collect, interpret, and store data acquired during weapon system (HERCULES) diagnostics and maintenance. • Show that stored data is useful for off-system assessment databases or other logistical support functions (CBM+). • Integrate data acquisition with the troubleshooting process so that data is collected during maintenance and formatted for transmission to a central database.
Common Processes AFDRAS Demonstrator Concept • Use current IETM software. • Use current Enhanced Diagnostic System (EDS) hardware and software. The EDS software will require some changes to record the necessary data. • Develop additional software to enable the maintainer to record parts information data. • Develop software to format data into an XML-tagged format for wireless transmission from the field support device (a minimal sketch of this step follows the concept diagram below).
Common Processes AFDRAS Demonstrator Concept (diagram): the M88A2 HERCULES and its hydraulic pressure transducers connect over a 1553 interface to the EDS and to the field support device (MSD or SPORT), which hosts the EDS software, M88 IETM, and failed parts data software and accepts maintainer inputs; extracted system data storage feeds an XML data file transfer queue for wireless transmission.
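A minimal sketch of the formatting step called out in the concept above: wrapping collected maintenance records as an XML document and staging it in a file transfer queue on the field support device. The element names, queue directory, and function are hypothetical, not the actual AFDRAS file layout.

```python
# Minimal sketch of formatting collected maintenance data as XML and staging it
# for wireless transmission from the field support device. Element names and the
# queue directory are hypothetical, not the actual AFDRAS layout.
import os
import time
import xml.etree.ElementTree as ET

QUEUE_DIR = "xml_transfer_queue"          # assumed staging directory on the MSD/SPORT

def stage_for_transmission(records, report_id):
    """Wrap collected records in an XML document and drop it in the transfer queue."""
    root = ET.Element("ietm-maint-report-data")
    report = ET.SubElement(root, "ietm-maint-report", {"id": report_id})
    for rec in records:                   # rec is a dict of field name -> value
        item = ET.SubElement(report, rec.get("type", "record"))
        for key, value in rec.items():
            if key != "type":
                ET.SubElement(item, key).text = str(value)
    os.makedirs(QUEUE_DIR, exist_ok=True)
    path = os.path.join(QUEUE_DIR, f"{report_id}_{int(time.time())}.xml")
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)
    return path

# Example: one failed-part record captured by the maintainer (values are made up).
stage_for_transmission(
    [{"type": "parts-ordered-info", "partno": "11671057", "qty": 1}],
    report_id="M88A2-demo")
```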
WBS 2.5.2.3 AFDRAS Test Bed Subtasks • WBS 2.5.2.3.1 Identifying and Using Data • Completed January 2004 • WBS 2.5.2.3.2 Identifying Hardware/Software to Enable Data Collection • Completed October 2004 • Submitted deliverable on October 29, 2004 • WBS 2.5.2.3.3 Collecting Data through an IETM • Task is 90% Complete • WBS 2.5.2.3.4 Extracting Data from the IETM • Task is 70% Complete
WBS 2.5.2.3 AFDRAS Test Bed Subtasks • WBS 2.5.2.3.5 Automatic Database Population • Task is 80% Complete. • Documenting the hardware solution for the wireless data transfer is the primary remaining effort for this WBS. • WBS 2.5.2.3.6 Automating Data Collection and Database Population, Demonstrator Hardware/Software Acquisition, and Assembly • Began late January 2005 • WBS 2.5.2.3.7 Demonstration • Scheduled to begin April 2005 (begin trials for demo prep at UDLP) • End user evaluations are expected in July–August; UDLP will coordinate with the HERCULES PMO. • WBS 2.5.2.3.8 Evaluation and Final Report • Scheduled to begin October 2005
Path Forward • WBS 2.5.2.3.3 Collecting Data through an IETM • Task is 90% Complete • UDLP will complete (in house) the programming of the Parts Information Software Module (PISM) by 28 February 2005. • WBS 2.5.2.3.4 Extracting Data from the IETM • Task is 70% Complete • UDLP will complete (in house) the first Beta copy of the DCDT software by 31 March 2005. • Additional resources will transfer to the DCDT efforts after the PISM is complete.
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM (data collection diagram): the IETM issues display & record commands and produces a troubleshooting data SGML/XML file; the parts information function produces a failed parts data SGML/XML file; the EDS software, also driven by display & record commands, produces a pressure data SGML/XML file.
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM • Continued developing the methodology for IETM software enhancements that provide the needed parts information and record the desired data during each maintenance action. • Parts information modules use the HERCULES electronic Repair Parts and Special Tools List (RPSTL) figures and data files from the IETM software. • Defined how the user selects the parts display, with the selected parts information stored in an MS Access database; within the database software, the data is held in an array structure. • Began the logical retrieval of the parts information from the array structure into XML data format.
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM
FAILED PARTS DISPLAY (screenshots)
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM
FAILED PARTS DATA XML FILE
<ietm-maint-report-data>
  <ietm-maint-report>
    <ietm-parts-data>
      <parts-ordered-info>
        <partno>11671057</partno>
        <nomen>PLATE, INSTRUCTION</nomen>
        <smr>PAOZZ</smr>
        <cageno>19207</cageno>
        <nsn>5310-0-637-9541</nsn>
        <qty>1</qty>
        <date>01-20-05</date>
        <time>13:25:35</time>
      </parts-ordered-info>
      <parts-received-info>
        <partno>n/a</partno>
        <nomen>n/a</nomen>
        <cageno>n/a</cageno>
        <qty>n/a</qty>
        <date>01-20-05</date>
        <time>13:25:35</time>
      </parts-received-info>
    </ietm-parts-data>
  </ietm-maint-report>
</ietm-maint-report-data>
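For illustration, a short sketch of reading a failed-parts file like the one above back into plain records, e.g. for populating an off-system assessment database. The tag names follow the example file; the function and variable names are illustrative only.

```python
# Sketch of reading the failed-parts XML shown above into plain dictionaries,
# e.g. for loading into an off-system assessment database. Tag names follow the
# example file; function and variable names are illustrative only.
import xml.etree.ElementTree as ET

def load_parts_records(path):
    """Return a list of dicts, one per parts-ordered/parts-received element."""
    root = ET.parse(path).getroot()
    records = []
    for report in root.iter("ietm-maint-report"):
        for parts in report.iter("ietm-parts-data"):
            for elem in parts:            # parts-ordered-info, parts-received-info, ...
                record = {"record_type": elem.tag}
                record.update({child.tag: child.text for child in elem})
                records.append(record)
    return records

# Usage: load_parts_records("failed_parts.xml")
# -> [{'record_type': 'parts-ordered-info', 'partno': '11671057', ...}, ...]
```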
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM • Completed and submitted software specification for failed parts module. • Completed software modifications to the IETM EDS to record data that is currently displayed and used by the maintainer for diagnostics.
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM
EDS PRESSURE DATA DISPLAY (screenshot showing test point readings of 253 and 251, RT On)
WBS 2.5.2.3.3 Subtask 3: Collecting Data Through an IETM
EDS PRESSURE DATA XML FILE
<PRESSURE>
  <DATE> 2005-01-20 </DATE>
  <TIME> 13:30:21.25 </TIME>
  <TESTPOINT> TV_PA </TESTPOINT>
  <VALUE> 253 </VALUE>
  <TESTPOINT> TV_PC </TESTPOINT>
  <VALUE> 251 </VALUE>
  <DATE> 2005-01-20 </DATE>
  <TIME> 13:30:21.75 </TIME>
  <TESTPOINT> TV_PA </TESTPOINT>
  <VALUE> 254 </VALUE>
  <TESTPOINT> TV_PC </TESTPOINT>
  <VALUE> 251 </VALUE>
  ...
  <DATE> 2005-01-20 </DATE>
  <TIME> 13:30:25.25 </TIME>
  <TESTPOINT> TV_PA </TESTPOINT>
  <VALUE> 253 </VALUE>
  <TESTPOINT> TV_PC </TESTPOINT>
  <VALUE> 251 </VALUE>
  <DATE> 2005-01-20 </DATE>
  <TIME> 13:30:25.75 </TIME>
  <TESTPOINT> TV_PA </TESTPOINT>
  <VALUE> 253 </VALUE>
  <TESTPOINT> TV_PC </TESTPOINT>
  <VALUE> 251 </VALUE>
  <DATE> 2005-01-20 </DATE>
  <TIME> 13:30:26.25 </TIME>
  <TESTPOINT> TV_PA </TESTPOINT>
  <VALUE> 253 </VALUE>
  <TESTPOINT> TV_PC </TESTPOINT>
  <VALUE> 251 </VALUE>
</PRESSURE>
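The flat DATE/TIME/TESTPOINT/VALUE sequence above can be regrouped into time-stamped samples; a sketch follows. Only the tag names come from the file shown; the record layout and function name are illustrative.

```python
# Sketch of regrouping the flat <DATE>/<TIME>/<TESTPOINT>/<VALUE> sequence above
# into time-stamped pressure samples. Tag names come from the file shown; the
# function and record layout are illustrative only.
import xml.etree.ElementTree as ET

def parse_pressure_file(path):
    """Return samples as {'date': ..., 'time': ..., 'readings': {testpoint: value}}."""
    root = ET.parse(path).getroot()       # the <PRESSURE> element
    samples, current, testpoint = [], None, None
    for elem in root:
        text = (elem.text or "").strip()
        if elem.tag == "DATE":            # a DATE element starts a new sample
            current = {"date": text, "time": None, "readings": {}}
            samples.append(current)
        elif elem.tag == "TIME":
            current["time"] = text
        elif elem.tag == "TESTPOINT":
            testpoint = text
        elif elem.tag == "VALUE":
            current["readings"][testpoint] = float(text)
    return samples

# Usage: parse_pressure_file("eds_pressure.xml")
# -> [{'date': '2005-01-20', 'time': '13:30:21.25',
#      'readings': {'TV_PA': 253.0, 'TV_PC': 251.0}}, ...]
```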
WBS 2.5.2.3.4 Subtask 4: Accessing (Extracting) Data Through an IETM • Completed software development that extracts troubleshooting and maintenance action information using the SGML file produced by the IETM. • Completed the specification for the DCDT software, which was submitted as part of the UDLP Common Processes Phase 1 Data Collection System Documentation on October 29, 2004.
WBS 2.5.2.3.4 Subtask 4: Accessing (Extracting) Data Through an IETM
TROUBLESHOOTING TREE DATA FILE
</Configuration>
<Environment Temperature = "0" Temperature-Scale = "FAHRENHEIT" Humidity = "0" Terrain = "FLAT" Altitude = "0" Altitude-Units = "FEET">
</Environment>
<Equip-Data Admin-Num = "" Equip-Serial = "" Equip-Model = "" Regis-Num = "" Equip-Noun = "" Equip-NSN = "">
<Usage Reading = "0" Units = "MILES">
</Equip-Data>
<Date Month = "JAN" Day = "12" Year = "2005">
<Time Hour = "12" Minute = "53" Second = "37" Millisec = "80">
</Collection-Info>
<IETM-DATA>
<Maintenance-Requested Priority = "Normal">
<Classify>
</Maintenance-Requested>
<TShoot System = "" Symptom = "Engine Has Excessive White Smoke." SubDoc = "F08" Element = "F08">
<Date Month = "JAN" Day = "12" Year = "2005">
<Time Hour = "12" Minute = "53" Second = "37" Millisec = "156">
<Visual Question = "Is excessive white smoke still coming from exhaust?" Reason = "This test determines if the manual fuel valve to the smoke gener" Engine-State = "CRANKING">
<Button Caption = "YES">
<HyperProc TMIdNo = "M88A2" Subdoc = "F08" Element = "TP02" Linkage = "No-Return">
<Date Month = "JAN" Day = "12" Year = "2005">
<Time Hour = "12" Minute = "53" Second = "37" Millisec = "234">
</Button>
</Visual>
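Because this troubleshooting tree file is SGML-style, carries its data in attributes, and leaves many elements unclosed, a strict XML parser would reject it; a lenient, regex-based pass is one way to pull out the steps. The tag and attribute names come from the example above; everything else in this sketch is illustrative.

```python
# Sketch of pulling symptom and question text out of the SGML-style troubleshooting
# tree file shown above. The tags carry their data in attributes and are not all
# closed, so a lenient regex pass is used instead of a strict XML parser.
# Tag and attribute names come from the example; everything else is illustrative.
import re

TAG_RE = re.compile(r"<(TShoot|Visual|Button)\b([^>]*)>")
ATTR_RE = re.compile(r'(\w[\w-]*)\s*=\s*"([^"]*)"')

def extract_steps(path):
    """Return a list of (tag, attributes) tuples for the troubleshooting steps."""
    steps = []
    with open(path, encoding="utf-8") as f:
        for tag, attrs in TAG_RE.findall(f.read()):
            steps.append((tag, dict(ATTR_RE.findall(attrs))))
    return steps

# Usage: extract_steps("tshoot_tree.sgm") might yield
# [('TShoot', {'Symptom': 'Engine Has Excessive White Smoke.', ...}),
#  ('Visual', {'Question': 'Is excessive white smoke still coming from exhaust?', ...}),
#  ('Button', {'Caption': 'YES'}), ...]
```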