Developing Performance Predictive Tools for Supervisors of Teams and Complex Systems February 22nd 2007 Yves Boussemart (yves@mit.edu) Humans & Automation Lab MIT Aeronautics and Astronautics http://halab.mit.edu
Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for the Supervisor • TRACS • Performance Prediction • Team Environments
MIT Humans & Automation Lab • Created in 2004 • Director: Dr. Mary (Missy) Cummings • Visiting professor: Dr. Gilles Coppin • Visiting scientist: Dr. Jill Drury • Postdoctoral Associates: Dr. Stacey Scott, Dr. Jake Crandall, Dr. Mark Ashdown • Grad Students: Yves Boussemart, Sylvain Bruni, Amy Brzezinski, Hudson Graham, Jessica Marquez, Jim McGrew, Carl Nehme, Jordan Wan
Research and Current Projects • Research in the Humans and Automation Lab (HAL) focuses on the multifaceted interactions of human and computer decision-making in complex socio-technical systems. • Time-Sensitive Operations for Distributed Teams • Human Supervisory Control Issues of Multiple Unmanned Vehicles (Reduced manning) • Measurement Technologies for Human-UV Teams • Collaborative Human Computer Decision Making • Integrated Sensor Decision Support • Sponsors: Office of Naval Research, Boeing, Lincoln Labs, AFOSR, Thales
HAL Testing Equipment • Single-operator testing: ONR's Multi-modal Watch Station (MMWS) • Team testing: HAL Complex Operation Center • ONR Mobile Testing Lab
Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for the Supervisor • TRACS • Performance Prediction • Team Environments
Human Supervisory Control (HSC) [Diagram: the human operator (supervisor) works through displays and controls; a computer drives actuators and reads sensors to act on the task] • Humans on the loop vs. in the loop • Supporting knowledge-based versus skill-based tasks • Network-centric operations & cognitive saturation
Human Supervisory Control of Automated Systems [Images: manned aviation; process control; satellite operations; unmanned vehicle operations (e.g., the Shadow UAV and the Mars rover)]
Major Research Area: HSC of Unmanned Vehicles • Unmanned Aerial Vehicles (UAVs): Predator UAV, Shadow UAV • Unmanned Undersea Vehicles (UUVs): VideoRay UUV, Odyssey UUV • Unmanned Ground Vehicles (UGVs, i.e., robots): Spotter UGV, Packbot UGV
Motivation: Increasing Reliance on UAVs in Military Operations [Images: Predator ground control station; Predator UAV] • UAVs are becoming an essential part of modern military operations • Typical UAV missions include: force protection; intelligence, surveillance, and reconnaissance (ISR); combat search and rescue; strike coordination and reconnaissance (SCAR)
Inverting the Operator/Vehicle Ratio • Current UAV operations: 1 UAV : 2-5 operators • Semi-autonomous UAV operations: 2-5 UAVs : 1 operator • Future UAV teams
Current Supervisory-Level Decision Support for Teams • Developed large-screen supervisor displays that provide current and expected mission and task progress information for team assets and operator activity • Displays integrate related information and provide emergent features for time-critical data
Supervisory Information? • Individual and team performance • Stress & time pressure • Rapidly evolving situation • Excessive workload • Actions: adaptive automation; operator replacement / shifts
Towards Performance Prediction Tools • A 4-step process (a skeleton of this pipeline is sketched below): (1) track individual operator actions; (2) recognize strategy patterns and predict individual performance; (3) aggregate individual data with collaboration factors; (4) predict team-level performance • [Diagram labels: Operator — "Is the operator using 'good' strategies?"; Supervisor — "Is the team doing well?"]
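A minimal sketch of how the four-step pipeline could be wired together. All function names and the placeholder scoring logic are hypothetical; the slides do not specify an implementation.

```python
# Hypothetical skeleton of the four-step pipeline; the scoring and
# aggregation rules below are placeholders, not the lab's actual models.
from typing import List

def track_actions(operator_id: str) -> List[str]:
    """Step 1: log an operator's interface actions as a symbol sequence."""
    return []  # would read from the interface event log

def predict_operator_performance(actions: List[str]) -> float:
    """Step 2: recognize strategy patterns and score the operator."""
    return 1.0 if actions else 0.5  # placeholder score

def aggregate_team(scores: List[float], collaboration: float) -> float:
    """Step 3: combine individual scores with a collaboration factor."""
    return collaboration * sum(scores) / max(len(scores), 1)

def predict_team_performance(team: List[str]) -> float:
    """Step 4: team-level prediction shown to the supervisor."""
    scores = [predict_operator_performance(track_actions(op)) for op in team]
    return aggregate_team(scores, collaboration=1.0)
```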
Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for the Supervisor • TRACS • Performance Prediction • Team Environments
Tracking Resource Allocation Cognitive Strategies (TRACS) • 2-dimensional space: • Level of Information Detail (LOID) • Mode (action steps) • 4 quadrants: • LOID: higher vs. lower automation/information • Mode: evaluation vs. generation of solutions • Technology disclosure for patent and licensing
Example of TRACS Application • Application: Decision-Support for Tomahawk Land Attack Missile (TLAM) Strike Planning • Resource allocation task: • Match resources (missiles) with objectives (missions) • Respect Rules of Engagement • Satisfy multivariate constraints • Current system: PC-MDS, no decision-support • 3 interfaces at various levels of collaboration
Example of TRACS Representation • TRACS applied to TLAM (a toy encoding is sketched below) • LOID: higher automation (group of criteria, individual criterion) vs. lower automation (group of matches, individual match, data cluster, data item) • Mode: evaluation (evaluate, backtrack) vs. generation (browse, search, select, filter, automatch)
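As an illustration only, a logged user action could be encoded as a point in this 2-D space. The level names follow the slide; the numeric codes and type names are assumptions, not part of TRACS as disclosed.

```python
# Toy encoding of a TRACS point: each logged action becomes a
# (LOID, mode) coordinate in the 2-D strategy space.
from enum import IntEnum
from typing import NamedTuple

class LOID(IntEnum):  # Level of Information Detail (low -> high automation)
    DATA_ITEM = 1
    DATA_CLUSTER = 2
    INDIVIDUAL_MATCH = 3
    GROUP_OF_MATCHES = 4
    INDIVIDUAL_CRITERION = 5
    GROUP_OF_CRITERIA = 6

class Mode(IntEnum):  # action steps (generation vs. evaluation)
    BROWSE = 1
    SEARCH = 2
    SELECT = 3
    FILTER = 4
    AUTOMATCH = 5
    EVALUATE = 6
    BACKTRACK = 7

class TracsPoint(NamedTuple):
    loid: LOID
    mode: Mode

# A session is then a trajectory of points, e.g.:
session = [TracsPoint(LOID.DATA_ITEM, Mode.BROWSE),
           TracsPoint(LOID.GROUP_OF_CRITERIA, Mode.AUTOMATCH),
           TracsPoint(LOID.INDIVIDUAL_MATCH, Mode.EVALUATE)]
```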
Example of TRACS Results • [TRACS traces for three conditions: mostly manual (Interface 1), combination (Interface 2), mostly automation (Interface 3)] • Cognitive strategies emerge as visible patterns
Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for the Supervisor • TRACS • Performance Prediction • Team Environments
Performance Prediction with TRACS • TRACS points serve as the observable data of a Hidden Markov Model for individual users • Compute the decision-transition matrices from empirical data • Bayesian prediction based on Markov chains (a minimal sketch follows)
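A minimal numpy sketch of the Markov-chain step, assuming TRACS cells are enumerated as integer states; the Laplace/Dirichlet smoothing term stands in for the Bayesian prior mentioned on the slide.

```python
# Sketch: estimate a decision-transition matrix from logged TRACS state
# sequences, then predict the distribution over the next decision state.
import numpy as np

def transition_matrix(sequences, n_states, alpha=1.0):
    """Row-stochastic matrix of empirical transition frequencies.
    alpha is a Dirichlet (Laplace) smoothing prior on each transition."""
    counts = np.full((n_states, n_states), alpha)
    for seq in sequences:
        for s, s_next in zip(seq[:-1], seq[1:]):
            counts[s, s_next] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def predict_next(T, current_state):
    """Posterior predictive distribution over the next decision state."""
    return T[current_state]

# Example with 3 toy states:
T = transition_matrix([[0, 1, 2, 1], [0, 2, 2, 1]], n_states=3)
print(predict_next(T, current_state=2))
```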
Performance Prediction with TRACS • TRACS + neural networks: detect patterns in the traces, i.e., cognitive strategies (a toy classifier is sketched below) • Alert the supervisor when behavior degrades • Open question: can poor performance be robustly predicted in advance? • [Annotated trace regions: manual browsing; manual-to-automatch transition; automatch loop]
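A toy sketch of the neural-network alerting idea using scikit-learn; the features, labels, and threshold are invented for illustration and are not the lab's model.

```python
# Train a small neural network to flag action windows that resemble
# historically poor strategies, so a supervisor can be alerted early.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical features: counts of each action type within a sliding window.
X = rng.random((200, 7))                # 200 windows, 7 action-type counts
y = (X[:, 4] > 0.7).astype(int)         # toy label: heavy automatch looping

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X, y)

def alert_supervisor(window_features, threshold=0.8):
    """Raise an alert when the predicted risk of a 'bad' strategy is high."""
    risk = clf.predict_proba(window_features.reshape(1, -1))[0, 1]
    return risk > threshold

print(alert_supervisor(X[0]))
```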
Outline • Lab Overview • Human Supervisory Control • Single vs. Multiple UVs • Supervising teams of UV operators • Tools for the Supervisor • TRACS • Performance Prediction • Team Environments
Individual: tracking of cognitive strategies; performance predictions; relatively simple metrics • Team: team dynamics; intra-team communication (verbal & non-verbal); performance metrics; group awareness (situation and activity awareness); distributed cognition; group interaction theories; collaboration factors • Open research questions
Critical Questions to Consider • What metrics can we use to gauge team performance? • Which factors drive those metrics? • How does time pressure affect the decision process? • How much information does a supervisor need? • Direct observation of operators' behavior • Synthetic data only (TRACS)? • Both?
Summary • Focus shifted from the individual UAV operator to the supervisor of teams of UAV operators • Proposed a performance-prediction tool • Next step: extend the predictions to team environments
Research supervised by Prof. M. L. Cummings • Research effort sponsored by Boeing/Boeing Phantom Works • Contacts: yves@mit.edu, missyc@mit.edu • Web: http://halab.mit.edu • TRACS demo: • http://web.mit.edu/aeroastro/www/labs/halab/media.html • http://tinyurl.com/ybafp2
Interface 1 - Manual • LOA 2: manual matching • Basic support: filtering, sorting, warning, and summarizing
Interface 2 - Collaborative • LOA 3: collaborative matching • Manual matching plus "automatch", a customizable heuristic search algorithm (a simplified greedy sketch follows) • Graphical summaries of constraint satisfaction • Option to save solutions for comparison purposes • Advanced features for interactive search of a solution
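The slides describe "automatch" only as a customizable heuristic search; a much-simplified greedy stand-in might look like this. All names and the constraint model are hypothetical.

```python
# Illustrative greedy stand-in for an 'automatch'-style heuristic:
# assign each mission the first available missile that satisfies its
# constraints, scanning missions in priority order.
def automatch(missions, missiles, satisfies):
    """missions: list sorted by priority; satisfies(missile, mission) -> bool."""
    assignment, free = {}, set(range(len(missiles)))
    for m, mission in enumerate(missions):
        for i in sorted(free):
            if satisfies(missiles[i], mission):
                assignment[m] = i
                free.remove(i)
                break
    return assignment  # unmatched missions are simply absent

# Toy usage with a single range constraint:
missions = [{"range": 800}, {"range": 300}]
missiles = [{"max_range": 500}, {"max_range": 1000}]
print(automatch(missions, missiles,
                lambda mis, msn: mis["max_range"] >= msn["range"]))
```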
Interface 3 - Configural • LOA 4: automated matching with a configural display • Possibility to tweak the solution or to force assignments • No access to raw data; aggregated information only • High-level, constrained solution search
Tomahawk Mission Planning • Performance on the incomplete scenario: performance decreased as LOA increased in the single-interface setup • Best: interface 1 and interfaces 2&3; worst: interfaces 1&3 • No deviation on interface 3 • Interface 1: P = 69.7; Interface 3: P = 68.5
TRACS 3D • Problems with a 3D visualization • Loss of granularity and clutter • Occlusion effect (loss of 2D information) • Parallax effect (detrimental perspective) • Difficult to manipulate (high cognitive load) • Difficult to orient oneself (loss of SA) • Lack of emergent temporal analysis feature
From TRACS 3D to TRACS 2.5D • Temporal data: TRACS 3D used an orthogonal time axis; TRACS 2.5D uses an interactive timeline (a toy bucketing sketch follows) • Advantages: not 3D, so the occlusion, parallax, and orientation problems are addressed; familiar manipulation; clear grouping of temporal features (granularity, clutter, emergent properties)
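One way the timeline grouping could work, as a sketch under assumed data: time-stamped TRACS points are binned into buckets that an interactive slider could step through. The bucket width is an arbitrary choice here.

```python
# Bin time-stamped TRACS points into timeline buckets for a 2.5-D view.
from collections import defaultdict

def timeline_buckets(events, width_s=30.0):
    """events: (timestamp_s, point) pairs -> {bucket_index: [points]}."""
    buckets = defaultdict(list)
    for t, point in events:
        buckets[int(t // width_s)].append(point)
    return dict(buckets)

# Toy usage: three events fall into two 30-second buckets.
print(timeline_buckets([(3.0, "A"), (12.0, "B"), (65.0, "C")]))
```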
Mobile Advanced Command and Control Station Humans and Automation Laboratory