USARsim & HRI Research
Michael Lewis
Background
USARsim was developed as a research tool for an NSF project to study Robot, Agent, Person Teams in Urban Search & Rescue:
• Katia Sycara (CMU): multi-agent systems
• Illah Nourbakhsh (CMU/NASA): field robotics
• Mike Lewis (Pitt): human control of robotic teams
USARSim Design Objective
Leverage technology developed by the $30 billion/year game industry so that effort can focus on building and validating high-fidelity models of robots. Piggyback on the rapidly evolving technology of game engines:
• Photorealistic graphics to allow study of human-robot interaction and machine vision
• Best available physics engines to replicate the control/mobility challenges of real environments
• Modeling tools to create realistically complex environments in a reasonable time
• Compatibility with robotics software such as Player or Pyro and game content such as America's Army
Graphics and Vision
The Unreal Engine supports the rapidly evolving graphics acceleration on new video cards. USARSim's image server captures in-memory video so that images can be made available for:
• Machine vision algorithms
• Addition of realistic noise and distortion
We are engaged in an ongoing validation effort to identify which aspects/algorithms are, and are not, accurately modeled by the simulation.
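As an illustration of the image-server workflow, here is a minimal Python sketch of a client that reads frames and injects synthetic noise before handing them to a vision algorithm. The port number and the length-prefixed JPEG framing are assumptions made for this sketch; the actual wire protocol is defined in the USARSim documentation.

```python
# Minimal sketch of an image-server client (the framing and port below are
# assumed for illustration, not the documented USARSim protocol).
import socket
import struct

import cv2
import numpy as np

HOST, PORT = "localhost", 5003  # hypothetical image-server address


def recv_exact(sock, n):
    """Receive exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("image server closed the stream")
        buf += chunk
    return buf


def read_frame(sock):
    """Read one length-prefixed JPEG frame and decode it to a BGR array."""
    (size,) = struct.unpack(">I", recv_exact(sock, 4))
    data = np.frombuffer(recv_exact(sock, size), np.uint8)
    return cv2.imdecode(data, cv2.IMREAD_COLOR)


with socket.create_connection((HOST, PORT)) as sock:
    frame = read_frame(sock)
    # Add Gaussian sensor noise before passing the frame to a vision algorithm
    noise = np.random.normal(0.0, 8.0, frame.shape)
    noisy = np.clip(frame.astype(np.float64) + noise, 0, 255).astype(np.uint8)
```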
Brief History
• 2002-2003: Developed the USARsim simulation; limited to our own robots and our own (RETSINA) control architecture
• 2003-2004: Extended the simulator for general access; added robots widely used in RoboCup USAR and APIs for Player & Pyro
• 2004-2005: Began cooperative development; involved NIST in maintenance; demo competition at RoboCup in Osaka
• 2005-2006: Simulation matured; Virtual Robots USAR competition in Bremen; rationalization of units, modularization of classes, etc.
www.sourceforge.net/projects/usarsim
• Used for the Virtual Robots Competition in the USAR League
• Maintained by NIST
Fixed-Camera Illusion
Can a gravity-referenced view (GRV) help operators maintain awareness of the robot's attitude? With the GRV, operators showed:
• Less time on task
• Less backtracking
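One simple way to render a gravity-referenced view is to counter-rotate the camera frame by the robot's roll angle so the displayed horizon stays level. A minimal sketch, assuming roll is available from the (simulated) inertial sensor:

```python
# Sketch of a gravity-referenced view (GRV): counter-rotate the camera
# frame by the robot's roll so the displayed horizon stays level.
import cv2


def gravity_referenced(frame, roll_deg):
    """Rotate the frame to cancel the robot's roll.

    The sign of the compensation depends on the IMU's convention.
    """
    h, w = frame.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), -roll_deg, 1.0)
    return cv2.warpAffine(frame, M, (w, h))
```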
Camera Control Experiment
The video feed is the operator's strongest perceptual link to the remote environment. Problems associated with the video link include:
• Disorientation
• Failure to take precautions against hazards
• Non-detection of mission-critical information
Camera Control
Conditions:
• Fixed camera
• PTZ camera
• Dual PTZ cameras
Results:
• More targets found with the PTZ & dual-camera conditions
• The dual cameras were twice as likely to be "disjoint"
Multi-Robot Control
Conditions:
• Fully autonomous cooperation (Machinetta)
• Cooperative
• Manual
Multi-Robot Results
• More complete searches in the autonomous & cooperative conditions
• More victims found in the cooperative condition (followed by manual)
• Cooperative participants switched more frequently between robots, and frequency of switching correlated with finding victims
Validation
All robots in USARsim model real robots and so are candidates for validation. Validation:
• Gives an indication of the degree to which experimental results might generalize
• Provides a common reference for comparing experimental results
• Provides a mechanism for sharing advances in control code and interfaces among researchers
• Provides reassurance that software developed in simulation can be ported to hardware
Sensor Validation for Vision
• Conventional wisdom holds that synthetic images are NOT useful for machine-vision work because of their intrinsic regularities, etc.
• The argument is similar to those once made about congruential random number generators
• The question should be settled empirically, not by theory
Feature Extraction Algorithms
• Edge detection
• Template matching
• Snakes
• OCR
Tested under three conditions:
• Real camera, well lit
• Real camera, dimly lit
• Simulation, well lit
Canny Edge Detection
• A Gaussian filter removes noise from the original image
• The Sobel operator isolates regions of high horizontal/vertical gradient
• The Canny operator applies hysteresis thresholding to extract the final edges
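The same pipeline can be reproduced in a few lines with OpenCV; the file name and threshold values below are illustrative choices, not the parameters used in the study.

```python
# Sketch of the edge-detection pipeline above, using OpenCV.
import cv2

img = cv2.imread("arena.png", cv2.IMREAD_GRAYSCALE)  # hypothetical test image

# 1. Gaussian filter to remove noise
blurred = cv2.GaussianBlur(img, (5, 5), sigmaX=1.4)

# 2. Sobel operator: strong response in high horizontal/vertical gradient regions
gx = cv2.Sobel(blurred, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(blurred, cv2.CV_64F, 0, 1, ksize=3)

# 3. Canny operator with hysteresis thresholding
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)
```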
Template Matching
The template is matched to image features by correlation: the correlation surface is the inverse Fourier transform of the Fourier transform of the image multiplied by the (conjugated) Fourier transform of the template.
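A minimal NumPy sketch of this frequency-domain correlation (the function name is ours; the study's own implementation is not shown in the slides):

```python
# FFT-based template matching: correlation is the inverse Fourier transform
# of FFT(image) times the complex conjugate of FFT(template).
import numpy as np


def correlate(image, template):
    F_img = np.fft.fft2(image)
    F_tpl = np.fft.fft2(template, s=image.shape)  # zero-pad template to image size
    corr = np.fft.ifft2(F_img * np.conj(F_tpl)).real
    return np.unravel_index(np.argmax(corr), corr.shape)  # best-match offset
```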
[Figure: template convolution results for real camera images vs. simulation]
[Figure: template convolution, distance in pixels between the estimated and target feature]
Pitt/CMU Validation
• Participants controlled either the robot or the simulation from a lab at Pitt
• Robot testing was conducted in a replica of NIST's Orange Arena at CMU
• Control was either direct teleoperation or a command mode in which the operator specified waypoints
• Two robot types were tested: the experimental Personal Exploration Rover (PER) and the Pioneer P3-AT (simulated as a P2-AT)
[Figure: simple & complex navigation task courses, marked in 1-meter and 3-meter segments]
And Now: Accelerated Physics
The next engine release will support Ageia PhysX:
• Continually improving simulation quality (e.g., three orders of magnitude improvement in physics with hardware acceleration) & validation
• Lets us simulate tracked robots, collapsing buildings, etc.