
Metrics Based Approach for Evaluating Air Traffic Control Automation of the Future


Presentation Transcript


  1. Metrics Based Approach for Evaluating Air Traffic Control Automation of the Future

  2. Purpose • Provide overview of air traffic control automation system metrics definition activity • Motivation • Process • Present comparison of Host Computer System (HCS) radar tracks to GPS-derived aircraft positions

  3. Definitions • Software Testing • Process used to help identify the correctness, completeness, security, and quality of developed computer software • "Testing is the process of comparing the invisible to the ambiguous, so as to avoid the unthinkable happening to the anonymous." – James Bach (contemporary author and founder of Satisfice, a test training and consulting company) • "Testing can show the presence of errors, but never their absence." – Edsger Dijkstra (renowned Dutch computer scientist) • Two Fundamental Processes • Verification – building the product right (e.g. determining that equations are implemented correctly) • Validation – building the right product (e.g. solving the right equations)

  4. Why is this important to the FAA? • En Route Automation Modernization (ERAM) • Replaces the En Route Host Computer System (HCS) and its backup • ERAM provides all of today’s functionality and: • Capabilities that enable National Airspace System evolution • Improved information security and streamlined traffic flow at our international borders • Additional flight and radar data processing, communications support, and controller display data • A fully functional backup system, precluding the need to restrict operations as a result of a primary system failure • Improved surveillance processing performance using a greater number/variety of surveillance sources (e.g. ADS-B) • Stand-alone testing and training capability

  5. ERAM Test Challenges • Limited funding • Installed and operational at 20 sites in 2008-2009 • System Requirements • 1,298 in FAA System Level Specification • 4,156+ in contractor System Segment Specifications • 21,906 B-Level “shalls” • Software: 1.2 million SLOC • COTS/NDI/Developmental mixture • Numerous potential impacts, significant changes • ATC Safety, ATC Functions, System Performance, RMA, ATC Efficiency • Replacement of 1970s legacy software that has evolved to meet today’s mission

  6. Metric Based Approach • Formation of Cross Functional Team • Members from ERAM Test, Simulation, Human Factors, System Engineering, Air Traffic Controllers, and others… • Charter • “To support the developmental and operational testing of ERAM by developing a set of metrics which quantify the effectiveness of key system functions in ERAM” • Focus extends beyond requirements-based testing, with a validation emphasis linked directly to services • Targeted system functions – Surveillance Data Processing (SDP), Flight Data Processing (FDP), Conflict Probe Tool (CPT), Display System (DS)

  7. Background • Metrics may be absolute or comparative in nature • Comparative metrics will be applied to current air traffic control automation systems (and later to ERAM) • Measure the performance of the legacy En Route automation systems in operation today to establish a benchmark • Allow direct comparison of similar functionality in ERAM • Absolute metrics would be applied to FAA standards • Provide quantifiable guidance on a particular function in ERAM • Could be used to validate a requirement • Task phases • Metrics Identification • Implementation Planning • Data Collection/Analysis

  8. Background (cont.) • Identification Phase – a list of approximately 100 metrics was mapped to the Air Traffic services and capabilities found in the Blueprint for NAS Modernization 2002 Update • Implementation Planning Phase – metrics have been prioritized to generate initial reports on a subset of these metrics • Data Collection/Analysis Phase – an iterative process

  9. Iterative Process • A series of data collection/analysis reports (“drops”) is generated in the targeted system areas • Timely reports are generated for the test group • Documentation is amended as the process iterates

  10. Example Metrics • High Priority Metric – false alert rate of Surveillance Data Processing (SDP) Safety Alert Function • Direct link to ATC Separation Assurance from NAS Blueprint • Affects several controller decisions: aircraft conflict potential, resolution, and monitoring • Directly observable by controller and impacts workload • Several ERAM requirements – e.g. “ERAM shall ensure that no more than 6 percent of the declared alerts are nuisance alerts…” (a sketch of this rate appears below) • Lockheed Martin is using it in their TPM/TPI program • Low Priority Metric – wind direction accuracy for Flight Data Processing (FDP) Aircraft Trajectory • Trajectory accuracy is already a high priority metric • Potentially affects controller decisions, but only indirectly by increasing trajectory prediction accuracy • Not directly observable by controller
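
For illustration only, the rate behind the quoted requirement is a simple ratio of nuisance alerts to declared alerts. The Python function below is a hypothetical sketch, not ERAM's actual alert classification logic:

    def nuisance_alert_rate(declared_alerts, nuisance_alerts):
        """Fraction of declared safety alerts judged to be nuisance (false)
        alerts; the requirement quoted above caps this fraction at 6 percent."""
        if declared_alerts == 0:
            return 0.0
        return nuisance_alerts / declared_alerts

    # Example: 12 nuisance alerts out of 250 declared alerts gives 0.048,
    # i.e. 4.8 percent, which would satisfy the 6 percent ceiling.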

  11. High Priority Metrics FY05/06 • Surveillance Data Processing (SDP) • Positional accuracy of surveillance tracker • Conflict prediction accuracy of Safety Alert Functions • Flight Data Processing (FDP) • User Request Evaluation Tool (URET) trajectory accuracy metrics • Comparison of route processing (HCS/URET & ERAM) • Forecast performance of auto-hand-off initiate function • Conflict Probe Tool (CPT) • URET conflict prediction accuracy metrics for strategic alerts (missed and false alert rates), working closely with development contractor (scenarios, tools, etc.)

  12. High Priority Metrics FY05/06 • Display System (DS) • By En Route Automation Group • DS Air Traffic Function Mapping to ATC Capabilities • By NAS Human Factors Group • Usage Characteristics Assessment • Tightly controlled environment, not dynamic simulation • Focused on most frequent and critical controller commands (e.g., time required to complete a flight plan amendment) • Baseline Simulation • High-fidelity ATC simulation, dynamic tasks • Focused on overall performance, efficiency, safety (e.g., number of aircraft controlled per hour)

  13. Completed Studies • “Comparison of Host Radar Tracks to Aircraft Positions from the Global Positioning Satellite System,” Dr. Hollis F. Ryan, Mike M. Paglione, August 2005, DOT/FAA/CT-TN05/30.* • “Host Radar Tracking Simulation and Performance Analysis,” Mike M. Paglione, W. Clifton Baldwin, Seth Putney, August 2005, DOT/FAA/CT-TN05/31.* • “Comparison of Converted Route Processing by Existing Versus Future En Route Automation,” W. Clifton Baldwin, August 2005, DOT/FAA/CT-TN05/29.* • “Display System Air Traffic Function Mapping to Air Traffic Control Capabilities,” Version 1, Christopher Reilly, Lawrence Rovani, Wayne Young, August 2005. • “Frequency of Use of Current En Route Air Traffic Control Automation Functions,” Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, Ben Willems, September 2005. • “An Analysis of En Route Air Traffic Control System Usage During Special Situations,” Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, November 2005. *Available at http://acy.tc.faa.gov/cpat/docs/

  14. Current Activities • Continue the baseline of system metrics • Begin comparison of ERAM performance to current system metrics

  15. Immediate Benefits to Initial Tests • Establishes legacy system performance benchmarks • Determines whether ERAM supports air traffic control with at least the same “effectiveness” as the current system • Provides data-driven scenarios, methods, and tools for comparison of the current HCS to ERAM • Leverages a broad array of SMEs to develop metrics and address ERAM testing questions

  16. Longer Term Benefits • Apply experience to future ERAM releases • Provide valid baseline, methods and measurements for future test programs • Support Next Generation Air Transportation System (www.jpdo.aero) initiatives • Contribute to the development of future requirements by defining system capabilities based on measurable performance data

  17. Study 1: Comparison of Host Computer System (HCS) Radar Tracks to Aircraft GPS-Derived Positions

  18. Background • Task: Determine the accuracy of the HCS radar tracker • Supports the test and evaluation of the FAA’s En Route Automation Modernization (ERAM) System • Provides ERAM tracking performance baseline metric • Recorded HCS radar track data available from Host Air Traffic Management Data Distribution System • GPS-derived position data available from the FAA’s Reduced Vertical Separation Minimum (RVSM) certification program • GPS data assumed to be the true aircraft positions

  19. GPS-Derived Data • RVSM certification flights • Differential GPS • Horizontal position (latitude & longitude) • Aircraft positions identified by date/call-sign/time • 265 flights, 20 Air Route Traffic Control Centers (ARTCCs), January through February 2005 • Continuous flight segments – level cruise, climbs, descents, turns

  20. HCS Radar Track Data • Recorded primarily as track positions in the Common Message Set format, archived at the Technical Center • Extracted “Flight Plan” and “Track” messages from RVSM flights • Track positions identified by date, call sign, ARTCC, and time tag (UTC)

  21. Methodology • Point-by-point comparison – HCS track position to GPS position – for the same flight at the same time • Accuracy performance metrics in nautical miles: • horizontal error – the unsigned horizontal distance between the time-coincident radar track report and the GPS position • along track error – the longitudinal orthogonal component (ahead and behind) of the horizontal error • cross track error – the lateral orthogonal component (side-to-side) of the horizontal error (see the decomposition sketch after this slide) • Distances defined in a Cartesian coordinate system • Latitude/longitude converted into Cartesian (stereographic) coordinates • Stereographic coordinate system unique to each ARTCC
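
For illustration, the error decomposition described above can be sketched in a few lines of Python. This is a hedged sketch, not the study's analysis code; the function name and the use of the GPS ground-track heading as the reference direction are assumptions.

    import math

    def decompose_error(radar_xy, gps_xy, gps_heading_deg):
        """Split the horizontal error between a radar track point and the
        time-coincident GPS position into along track and cross track parts.

        radar_xy, gps_xy -- (x, y) stereographic coordinates in nautical miles
        gps_heading_deg  -- ground-track heading of the GPS trajectory, in
                            degrees clockwise from north (an assumed convention)
        """
        # Error vector from the GPS ("truth") position to the radar position
        ex = radar_xy[0] - gps_xy[0]
        ey = radar_xy[1] - gps_xy[1]

        # Unit vector pointing along the direction of flight
        h = math.radians(gps_heading_deg)
        ux, uy = math.sin(h), math.cos(h)

        along = ex * ux + ey * uy        # ahead (+) or behind (-) the aircraft
        cross = ex * uy - ey * ux        # side-to-side offset from the path
        horizontal = math.hypot(ex, ey)  # unsigned total horizontal error
        return horizontal, along, cross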

  22. Reduction of Radar Track Data • Split flights into ARTCC segments • Convert latitude/longitude to stereographic coordinates (a projection sketch follows below) • Clean up track data • Discard data not matched to GPS data • Resample to 10 second interval & synchronize
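
A minimal Python sketch of a latitude/longitude to stereographic conversion is shown below. The spherical-earth formula and the tangent-point parameters are illustrative assumptions; the actual projection parameters used for each ARTCC are not reproduced here.

    import math

    EARTH_RADIUS_NM = 3440.065  # mean earth radius in nm (spherical model assumed)

    def to_stereographic(lat_deg, lon_deg, lat0_deg, lon0_deg):
        """Project a lat/lon point onto a plane tangent at (lat0, lon0) using
        a simple spherical stereographic projection; returns (x, y) in nm.
        Each ARTCC would supply its own tangent point."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)

        # Standard spherical stereographic projection formulas
        k = 2.0 / (1.0 + math.sin(lat0) * math.sin(lat)
                   + math.cos(lat0) * math.cos(lat) * math.cos(lon - lon0))
        x = EARTH_RADIUS_NM * k * math.cos(lat) * math.sin(lon - lon0)
        y = EARTH_RADIUS_NM * k * (math.cos(lat0) * math.sin(lat)
                                   - math.sin(lat0) * math.cos(lat) * math.cos(lon - lon0))
        return x, y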

  23. Reduction of GPS Data • Discard non-contiguous data (15% discarded) • Identify ARTCC and convert lat/lons to stereographic coordinates • Reformat to legacy format • Resample to 10 second intervals and synchronize (a resampling sketch follows below)
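
Both data sets are placed on a common 10-second time base before pairing. A minimal linear-interpolation sketch in Python follows; the study's actual resampling and synchronization method is an assumption here, not documented in the slides.

    import numpy as np

    def resample_track(times_s, xs, ys, interval_s=10.0):
        """Linearly interpolate a track onto a uniform time grid so that the
        radar and GPS tracks share common timestamps (the interpolation
        method is an assumption of this sketch)."""
        times_s = np.asarray(times_s, dtype=float)
        t0 = np.ceil(times_s[0] / interval_s) * interval_s    # first grid point inside the data
        t1 = np.floor(times_s[-1] / interval_s) * interval_s  # last grid point inside the data
        grid = np.arange(t0, t1 + interval_s / 2, interval_s)
        return grid, np.interp(grid, times_s, xs), np.interp(grid, times_s, ys)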

  24. Comparison Processing • Radar track point (x1, y1) matched to corresponding GPS point (x2, y2) • Pairs of points matched by date, call sign, and time tag • Horizontal distance = SQRT[(x1 − x2)² + (y1 − y2)²] = SQRT[(Along Track Dist.)² + (Cross Track Dist.)²] (a matching sketch follows below)
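
In code, the comparison reduces to pairing points on the shared key and applying the distance formula above. A sketch, assuming each data set has already been reduced to a mapping from (date, call sign, time tag) to stereographic coordinates:

    import math

    def horizontal_errors(radar_points, gps_points):
        """Compute the horizontal error, in nm, for every radar/GPS pair that
        shares a (date, call sign, time tag) key. Inputs are assumed to be
        dicts mapping that key to (x, y) stereographic coordinates."""
        errors = {}
        for key, (x1, y1) in radar_points.items():
            if key in gps_points:
                x2, y2 = gps_points[key]
                # math.hypot(dx, dy) == SQRT[(x1-x2)^2 + (y1-y2)^2]
                errors[key] = math.hypot(x1 - x2, y1 - y2)
        return errors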

  25. Descriptive Statistics

      Type       Sample Size   Horizontal Error (nm)   Cross Track Error (nm)   Along Track Error (nm)
                               Mean       RMS          Mean       RMS           Mean       RMS
      Signed     54170         0.69       0.78         0.00       0.16          -0.67      0.77
      Unsigned                                         0.12                      0.67
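
The signed and unsigned rows differ only in whether the sign of each error sample is kept before averaging; squaring discards the sign, so the RMS is the same under either convention. A small Python sketch (a hypothetical helper, not the study's tooling):

    import numpy as np

    def error_statistics(errors):
        """Mean and RMS of a sample of signed errors (e.g. cross track error),
        reported both signed and unsigned. The RMS is identical for the two
        conventions because squaring removes the sign."""
        e = np.asarray(errors, dtype=float)
        return {
            "signed_mean": e.mean(),
            "unsigned_mean": np.abs(e).mean(),
            "rms": np.sqrt((e ** 2).mean()),
        }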

  26. Radar Horizontal Track – Flight #1
      [Plot of the radar horizontal track for Flight #1, a Falcon Mystere business jet: Springfield – Kansas City – Wichita – Fayetteville radial – St. Louis; climb, cruise (FL350 & FL370), descend. Axes are X and Y coordinates in nautical miles.]

  27. Radar (Left) & GPS (Right) – Flight #1 – Turn (“south” heading)
      [Side-by-side plots of the radar track (left) and the GPS track (right) during a turn onto a “south” heading; axes are X and Y coordinates in nautical miles.]

  28. Radar (Right) & GPS (Left) – Flight #1 – Straight (northeast heading)
      [Side-by-side plots of the radar track (right) and the GPS track (left) on a straight northeast-heading segment; axes are X and Y coordinates in nautical miles.]

  29. Track Errors – Flight #1

      Type       Sample Size   Horizontal Error (nm)   Cross Track Error (nm)   Along Track Error (nm)
                               Mean       RMS          Mean       RMS           Mean       RMS
      Signed     374           0.80       0.89         -0.04      0.12          -0.79      0.88
      Unsigned                                          0.10                     0.79

  30. Contact the Author: mike.paglione@faa.gov 609-485-7926 Available Publications: http://acy.tc.faa.gov/cpat/docs/index.shtml
