“EVS – EXPANDED HORIZONS” FAA NEW TECHNOLOGY WORKSHOP III JANUARY 9, 2007
TOPICS • Dynamic-Range Management • EVS and LEDs • Including Advanced Runway Acquisition • Runway Infrared Range (RIRR) • New MMW Sensors, Conditioning, and Fusion Processing • Integrated Synthetic Vision • Machine Verification and Integrity Monitoring
Dynamic Range Management (a) • Patented “Multiple Fused” Camera Approach • Separate thermal background scene (LWIR or MWIR) camera and runway/approach/taxi lights camera • Lights’ camera can be SWIR and/or Visible/NIR • SWIR is best match for conventional lights (and lower solar background) • Uncooled InGaAs technology is extremely sensitive • Visible/NIR senses conventional and LED lighting • New avalanche CCD technology provides a compact LLLTV • Either can see strobes (when properly implemented) • Eliminates competing requirements in one camera • Separately optimized; eliminates dynamic-range conflict • No blooming; no reduction in thermal sensitivity • Neither function compromises the other
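As a rough illustration of the fused two-camera idea (not the actual Max-Viz processing chain), the Python sketch below overlays bright point sources from a separately optimized lights camera onto an independently gained thermal background; the threshold and gain values and function names are assumptions for illustration only.

import numpy as np

def fuse_lights_onto_thermal(thermal, lights, light_threshold=0.6, light_gain=1.0):
    # thermal, lights: registered float images scaled to [0, 1], each already
    # gained/leveled by its own camera channel.
    # Keep only the bright point sources seen by the lights camera...
    lights_only = np.where(lights > light_threshold, lights, 0.0)
    # ...and add them onto the thermal scene, so neither channel's
    # dynamic range is compromised by the other.
    return np.clip(thermal + light_gain * lights_only, 0.0, 1.0)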
[Images: Fused EVS with "incursions": B737 on runway, C-172 taxiing]
Dynamic Range Management (b) Improved Thermal Image Processing for EVS • Sky/ground contrast is very high • One or the other is saturated • Produces large changes in gain/level during banking or pitch maneuvers • Results in "jumpy" image levels and washout of contrast • Advanced approach to autogain/level control: • Separate high-frequency content from the image • Clamp amplitude of the original image (low-frequency, AC-coupled) • Optimize gain and level of the high-frequency image • Recombine the two into a composite image • Greatly expands dynamic range • Maximizes high-frequency detail under all conditions • Eliminates jumpiness in level and contrast • Optimizes for LCD displays (limited gray shades) • Enhanced fine detail in all conditions; no blooming • Optimization for HUD and head-down displays is different!
[Images: conventional AGC (alternately dominated by ground, then by sky) vs. advanced AGC]
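Below is a minimal sketch of the two-band autogain approach described above, assuming a Gaussian low-pass split and ad-hoc clamp/gain constants; the fielded implementation is not published here, so every parameter is illustrative.

import numpy as np
from scipy.ndimage import gaussian_filter

def advanced_agc(frame, sigma=8.0, clamp_fraction=0.3, detail_gain=3.0):
    # Split the frame into a low-spatial-frequency background and
    # high-spatial-frequency detail.
    frame = frame.astype(np.float64)
    background = gaussian_filter(frame, sigma)
    detail = frame - background
    # AC-couple the background and clamp its amplitude so a saturated
    # sky or ground cannot swing the overall gain/level.
    background_ac = background - np.median(background)
    limit = clamp_fraction * max(np.abs(background_ac).max(), 1e-9)
    background_clamped = np.clip(background_ac, -limit, limit)
    # Boost the fine detail separately and recombine.
    composite = background_clamped + detail_gain * detail
    # Normalize for an 8-bit display with limited gray shades.
    composite -= composite.min()
    composite /= max(composite.max(), 1e-9)
    return (composite * 255.0).astype(np.uint8)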
EVS and LEDs • New visible LEDs: no IR signature • Max-Viz architecture: fuse a third, avalanche-CCD camera • Or: "VisGaAs" camera (InGaAs, visible through SWIR wavelengths) • For combined MWIR/SWIR cameras: provide SWIR laser diodes • Future approach: advanced acquisition technique for LEDs • Cooperative lights: visible-LED and SWIR laser-diode based • Visible and eye-safe SWIR approach lights through landing • Working with Harvey Mudd College: follow-on to FAA "white LED-based light bar" prototype activity • Pursuing capability with ATO-W Navigation Service: operational concepts, technology development, capability growth path • Penetrates fog significantly beyond the ability of the human eye
EVS and LEDs (cont.) • Acquisition advantages apply in all conditions • VMC (overcome background noise) • Night-time (overcome EVS system noise) • Other potential application: very small, inexpensive aircraft beacons • Collision avoidance: robust detection by other aircraft • UAV see-and-avoid • Pulsed eye-safe laser diodes with wide angular cones can readily be detected at 5-8 miles at high sky-background levels
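The acquisition technique itself is not spelled out in these slides; purely as one hedged illustration of why a pulsed source is easy to find against a bright background, the sketch below differences frame stacks captured with the pulse on and off. The synchronization assumption, names, and SNR threshold are all placeholders.

import numpy as np

def detect_pulsed_beacon(frames_on, frames_off, snr_threshold=6.0):
    # frames_on / frames_off: (N, H, W) stacks captured while the beacon pulse
    # is present and absent (assumes the receiver knows the pulse timing).
    diff = frames_on.mean(axis=0) - frames_off.mean(axis=0)
    # Static background (sky, terrain, solar glint) cancels in the difference;
    # averaging N frames suppresses random noise by roughly sqrt(N).
    noise = max(diff.std(), 1e-9)
    peak_index = np.unravel_index(np.argmax(diff), diff.shape)
    snr = diff[peak_index] / noise
    return (peak_index, snr) if snr >= snr_threshold else (None, snr)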
Runway Infrared Range (RIRR) • Major issue for EVS landing credit in IMC: uncertainty regarding the "infrared range" at which the runway can be acquired • Proposal: measure the local LWIR/MWIR range in real time • Requires return to transmissometer(s) • Single-mast scatterometer won't work (geometric optics don't apply) • Emulate the full Allard's Law algorithm • Thermal-IR background scene PLUS: • Invoke standardized sensors for lights (TSO'd) • Include lights setting, background level • Display and eye response • Details are in new patent disclosure • Assumptions are similar to RVR • e.g., localized measurement applies to slant-path approach • May make particular sense at heavy single-user hubs
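For reference, a small sketch of the Allard's-law relationship that an RVR-style algorithm emulates: solve E(R) = I * exp(-sigma * R) / R^2 for the longest range R at which a light of intensity I still meets the detection threshold. The units (candela/lux apply to the visible channel; an IR analogue would use radiometric quantities), the bisection approach, and the parameter names are illustrative assumptions, not the patented method.

import math

def allard_range(intensity_cd, extinction_per_m, threshold_lux, r_max_m=20000.0):
    # intensity_cd: effective light intensity toward the sensor (candela)
    # extinction_per_m: atmospheric extinction coefficient (1/m)
    # threshold_lux: sensor/eye illuminance detection threshold (lux)
    def illuminance(r):
        return intensity_cd * math.exp(-extinction_per_m * r) / (r * r)

    lo, hi = 1.0, r_max_m
    if illuminance(hi) >= threshold_lux:
        return hi   # detectable beyond the search limit
    if illuminance(lo) < threshold_lux:
        return 0.0  # not detectable even up close
    for _ in range(60):  # bisection; E(R) decreases monotonically for R > 0
        mid = 0.5 * (lo + hi)
        if illuminance(mid) >= threshold_lux:
            lo = mid
        else:
            hi = mid
    return lo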
MMW Imaging Radar: New Approaches to Processing for All-Weather Sensing • More Effectively Preprocess Radar Data • Raw imagery is relatively crude • Enhance features of interest by multiple segmentation • Runway, structures, hazards • Suppress clutter, noise • Best if done "up front": before conversion to perspective display • Full digital dynamic range • Effectively Deal With Fundamental Nature of 2D Radar Imager • Range resolution is quite good (e.g., 1 meter) throughout image • Use resampling to retain that resolution in perspective-display (C-scope) features • Helps alleviate range/elevation ambiguity on non-flat objects • Appropriate handling of shadows: exploit for further range detail vs. cancel as noise • Can sense the difference between "a structure and a flock of birds"!
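Purely as a geometric illustration (not the actual processing chain), the sketch below resamples range/azimuth radar data into a perspective C-scope view under a flat-terrain assumption, so the radar's fine range resolution, rather than a coarse display grid, determines where each display row samples. All names, field-of-view values, and the flat-earth geometry are assumptions.

import numpy as np

def bscope_to_cscope(bscope, altitude_m, range_res_m, az_fov_deg,
                     out_h=480, out_w=640,
                     depression_min_deg=0.5, depression_max_deg=15.0):
    # bscope: 2D array indexed [range_bin, azimuth_bin].
    # Each output row corresponds to one depression angle; under the flat-terrain
    # assumption the slant range it maps to is altitude / sin(angle).
    n_range, n_az = bscope.shape
    out = np.zeros((out_h, out_w), dtype=bscope.dtype)
    depression = np.radians(
        np.linspace(depression_min_deg, depression_max_deg, out_h))  # top row = far range
    azimuth = np.linspace(-az_fov_deg / 2.0, az_fov_deg / 2.0, out_w)
    az_bins = np.clip(((azimuth / az_fov_deg + 0.5) * (n_az - 1)).astype(int),
                      0, n_az - 1)
    for row, angle in enumerate(depression):
        slant_range = altitude_m / np.sin(angle)
        range_bin = int(round(slant_range / range_res_m))
        if 0 <= range_bin < n_range:
            out[row, :] = bscope[range_bin, az_bins]
    return out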
MMWR (cont.) • 3D Rendering from 2D Radar • New means of successive-frame processing • Minimal latency in result • Also invoke terrain and cultural features: database correlation • Hazard detection • Upcoming flight tests: MMW with 3D processing, fusion with IR (U.S. Army HALS Program)
“Smart-Fusion” Approach for MMW • Primary theme: • Good data from each sensor is enhanced • No sensor output degrades good data from other sensors • Noise and clutter are suppressed • Automatically Select Dominant (Primary) Sensor • Based on spatial-frequency and contrast/noise content • Selection can change with scene content (e.g., fog, brownout, rain, snow) • Dynamic selection, transparent to user, damped against oscillation • Treats other sensor data as localized increment on primary sensor output • Eliminates polarity-reversal issues • Applies to multiple, diverse sensors (“All-Weather Sensor Suite”) • Recently demonstrated EO/IR with MMW imaging radar at WPAFB • Fog, rain, snow
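A hedged sketch of the dominant-sensor idea follows. The detail metric, gains, and hysteresis factor are placeholders rather than the demonstrated Max-Viz algorithm, but the structure mirrors the bullets above: automatic primary selection, damped switching, and secondary sensors contributing only localized detail increments.

import numpy as np
from scipy.ndimage import gaussian_filter

def detail_metric(img, sigma=4.0):
    # Crude spatial-frequency/contrast measure: energy in the high-pass residual.
    return float(np.mean((img - gaussian_filter(img, sigma)) ** 2))

def smart_fuse(sensors, prev_primary=None, increment_gain=0.5, hysteresis=1.2):
    # sensors: dict of name -> registered float image in [0, 1].
    metrics = {name: detail_metric(img) for name, img in sensors.items()}
    challenger = max(metrics, key=metrics.get)
    primary = challenger
    if prev_primary in metrics and metrics[challenger] < hysteresis * metrics[prev_primary]:
        primary = prev_primary  # damped selection: avoid oscillating between sensors
    fused = sensors[primary].copy()
    for name, img in sensors.items():
        if name == primary:
            continue
        # Add only localized high-frequency detail from secondary sensors;
        # using its magnitude sidesteps polarity-reversal issues.
        detail = img - gaussian_filter(img, 4.0)
        fused += increment_gain * np.abs(detail)
    return np.clip(fused, 0.0, 1.0), primary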
[Images: fusion examples in which MMW dominates, IR dominates, and an intermediate case]
Fusion of FLIR with MMW radar [Images: LWIR ("hard"), "raw" MMW radar, and the fused image]
AFRL Field Demo, Jan. 31-Feb. 10, 2006 • Variety of weather conditions • Obstacles detected on runway • [Images: IR alone and IR fused with MMW]
Integrated EVS/SVS • Critical Paradigm: SVS must be verified • Integrated display – pilot does verification • “SE-Vision”: AFRL/Rockwell Collins/Max-Viz - Flight tests on FAA Wm. J. Hughes Tech Center B727 • New integrated EVS/SVS offerings for business aircraft, GA • Automatic verification – EVS/database correlation - Powerful tool for additional capabilities, integrity
The view from the FAA 727 cockpit! Flight Test Video, New Mexico, 2005
Terrain-Database Correlation • Need for "smart verification" of SVS • GPS/nav errors and integrity issues in driving the SVS display • Database errors and obsolescence • Presence of transient hazards • Solution: automatic ("machine-based") database correlation with EVS sensors • Uses runway/taxiway geometry, structures, terrain features • Fixed-wing: runway lock-on • Rotary-wing LZ: lock-on to terrain and other features • How is it accomplished: • Multisensor data fusion • Advanced recognition technology from DLR (Germany) • Integrated into Max-Viz processor • AI-derived, hypothesis-testing/clustering algorithms for high-integrity "machine decisions" • Over 100 man-years in development
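The DLR recognition technology itself is not described in these slides; purely as an illustration of one ingredient such machine verification might use, the sketch below projects database runway corners through the current navigation solution and scores how well the projected outline lands on edges detected in the EVS image. The pinhole model, NED frame, and every name here are assumptions, not the actual algorithm.

import numpy as np

def project_points(points_ned, cam_pose, focal_px, cx, cy):
    # Project 3D points (north, east, down, meters) into pixels with a simple
    # pinhole model; cam_pose = (position_ned, rotation world -> camera).
    # Assumes all points are in front of the camera (z > 0 in camera frame).
    pos, rot = cam_pose
    pc = (rot @ (points_ned - pos).T).T  # camera frame: x right, y down, z forward
    u = focal_px * pc[:, 0] / pc[:, 2] + cx
    v = focal_px * pc[:, 1] / pc[:, 2] + cy
    return np.stack([u, v], axis=1)

def runway_match_score(edge_image, runway_corners_ned, cam_pose,
                       focal_px, cx, cy, n_samples=200):
    # Sample the projected runway outline against an edge-strength image;
    # a high score means the database outline lies on strong sensor edges.
    corners_px = project_points(runway_corners_ned, cam_pose, focal_px, cx, cy)
    h, w = edge_image.shape
    score, count = 0.0, 0
    n_edges = len(corners_px)
    for i in range(n_edges):
        a, b = corners_px[i], corners_px[(i + 1) % n_edges]
        for t in np.linspace(0.0, 1.0, max(n_samples // n_edges, 2)):
            u, v = (1.0 - t) * a + t * b
            if 0 <= int(v) < h and 0 <= int(u) < w:
                score += edge_image[int(v), int(u)]
                count += 1
    return score / max(count, 1)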
Database Correlation (cont.) • Real-Time Functions: • Verify landing area • Correct the SVS-registration (including guidance symbology) • Heading • Verify database accuracy • Detect hazards and annunciate • Separate-thread navigation signal • 3D position: range to threshold and accurate “virtual ILS” • Includes critical vertical (AGL) signal • Completely independent of GNSS • Curved-approach acquisition, nav-generation: Demonstrated during SE-Vision flights, fall of 2006 • Broad capability to be incorporated in tech-demo system: Boeing Phantomworks (spring 2007)
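As a simple worked example of turning a machine-derived 3D position relative to the threshold into a "virtual ILS" signal, the sketch below computes ILS-like deviations; the fixed 3-degree glideslope, sign conventions, and names are assumptions for illustration.

import math

def virtual_ils(range_to_threshold_m, height_agl_m, cross_track_m,
                glideslope_deg=3.0):
    # Returns (glideslope_deviation_deg, localizer_deviation_deg):
    # positive glideslope deviation = above the beam,
    # positive localizer deviation = right of the extended centerline.
    actual_gs_deg = math.degrees(math.atan2(height_agl_m, range_to_threshold_m))
    gs_dev = actual_gs_deg - glideslope_deg
    loc_dev = math.degrees(math.atan2(cross_track_m, range_to_threshold_m))
    return gs_dev, loc_dev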
Integrated EVS/Ground-Data Correlation [Images: Boeing Field, Sun Valley]
Video Clips • Split-view takeoff: new thermal image processing • MMW fusion Long Beach • MMW fusion WPAFB tower • MMW hazard: auto-annunciate • Curved-approach runway lock-on • Virtual-ILS generation from runway lock-on • SE-Vision flights