MCP II: Student Robot for 2007 IGVC Competition
Dr. Fred G. Martin
Assistant Professor, Computer Science
University of Massachusetts Lowell
Presentation Overview
• The Intelligent Ground Vehicle Competition (IGVC)
• RIA equipment donation
• Our robot: The MCP
• IGVC Navigation Challenge: performance video & our solution
• IGVC Autonomous Challenge: performance video & our solution
• Probabilistic Hough Algorithm
• Results
• Lessons Learned
Intelligent Ground Vehicle Competition (igvc.org)
• Organized by the Association for Unmanned Vehicle Systems International and Oakland University (Rochester, MI) since 1993
• Something of a mini “DARPA Grand Challenge,” and in fact a precursor to it
• Two robot performance events: a 1/10-mile-circuit “Autonomous Challenge,” and a GPS-based “Navigation Challenge” run in an open field
[Photos: the “Autonomous Challenge” and “Navigation Challenge” courses]
2005 RIA Equipment Donation
• SICK LMS-200 Laser Ranger
• FOculus FO124C Firewire camera
• Mini-ITX mainboard & Pentium M processor
• Lead-acid battery charger & batteries
The MCP II
• Welded frame of box-steel members
• Superstructure built from extruded-aluminum 80/20 struts
• 36 V main drive system using two hub motors and a Courtney Electronics “Dual 30” PWM controller
• Underneath the hood: three 12 V lead-acid batteries, the mini-ITX motherboard, a Blackfin Handy Board controller, and power-supply circuits
• Above the hood: the SICK ranger, camera, and LCD monitor
• MCP I’s grille provides a touch of whimsy
IGVC Navigation Challenge
• The field is about 80 × 90 meters.
• Obstacles are placed around the course.
• The task is to visit the 9 given waypoints as quickly as possible.
Navigation Task Solution
• There are two behaviors:
  • Turn the robot toward the next GPS waypoint;
  • Back up and turn away from obstacles.
• Move-to-waypoint is the default behavior.
• If an obstacle is too close, obstacle avoidance takes precedence (see the sketch below).
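As an illustration only, the sketch below shows one way this kind of two-behavior arbitration can be wired up. The function and threshold names (navigation_step, OBSTACLE_THRESHOLD_M, the robot/waypoint/laser-scan objects) are hypothetical and are not taken from the MCP’s actual code.

```python
# Hypothetical sketch of the two-behavior navigation arbiter described above.
# Names, thresholds, and sensor interfaces are illustrative, not the MCP's code.
import math

OBSTACLE_THRESHOLD_M = 1.0   # assumed clearance below which avoidance takes over

def bearing_to_waypoint(robot, waypoint):
    """Bearing (radians) from the robot's pose to the next waypoint, in local x/y meters."""
    return math.atan2(waypoint.y - robot.y, waypoint.x - robot.x)

def move_to_waypoint(robot, waypoint):
    """Default behavior: turn toward the waypoint while driving forward."""
    error = bearing_to_waypoint(robot, waypoint) - robot.heading
    error = math.atan2(math.sin(error), math.cos(error))  # wrap to [-pi, pi]
    return {"forward": 0.5, "turn": 0.8 * error}

def avoid_obstacle(laser_scan):
    """Override behavior: back up and turn away from the nearest laser return."""
    nearest = min(laser_scan, key=lambda r: r.range)
    turn = -1.0 if nearest.angle > 0 else 1.0   # turn away from the obstacle's side
    return {"forward": -0.3, "turn": turn}

def navigation_step(robot, waypoint, laser_scan):
    """Obstacle avoidance takes precedence; otherwise drive toward the waypoint."""
    if min(r.range for r in laser_scan) < OBSTACLE_THRESHOLD_M:
        return avoid_obstacle(laser_scan)
    return move_to_waypoint(robot, waypoint)
```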
MCP Navigation Video
• UML Competes in 2007 IGVC Navigation Challenge (YouTube link)
IGVC Autonomous Challenge
• A 1/10-mile circuit marked with painted white lines and various obstacles
• The robot must stay within the white lines as it navigates the course
• It is primarily a vision problem; the SICK laser is used mainly for obstacle detection
Autonomous Task Solution
• The probabilistic Hough transform is used to reduce the visual field to a collection of line segments.
• Two regions (the lower left and lower right of the image) are searched for candidate lines, and a weighted average line slope is computed for each region.
• Based on the slopes of the lines seen on either side, the robot decides to turn left, turn right, or go straight.
• Obstacle avoidance is layered on top of this (see the sketch below).
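As a rough illustration of this pipeline, the sketch below uses OpenCV’s cv2.HoughLinesP (the probabilistic Hough transform) on the two lower image regions. All thresholds, the length-based weighting, the region boundaries, and the final turn rule are assumptions made for illustration, not the team’s tuned values or actual decision logic.

```python
# Illustrative sketch of probabilistic-Hough line detection and a steering
# decision over two lower image regions; all parameters are assumptions.
import cv2
import numpy as np

def detect_lines(roi):
    """Run the probabilistic Hough transform on one image region."""
    return cv2.HoughLinesP(np.ascontiguousarray(roi), rho=1, theta=np.pi / 180,
                           threshold=30, minLineLength=20, maxLineGap=10)

def mean_slope(lines):
    """Length-weighted average slope of the detected segments, or None if none."""
    if lines is None or len(lines) == 0:
        return None
    slopes, weights = [], []
    for x1, y1, x2, y2 in (l[0] for l in lines):
        if x2 == x1:
            continue  # skip vertical segments to avoid division by zero
        slopes.append((y2 - y1) / (x2 - x1))
        weights.append(np.hypot(x2 - x1, y2 - y1))
    return np.average(slopes, weights=weights) if slopes else None

def steer_from_image(bgr_image):
    """Return 'left', 'right', or 'straight' from the painted course lines."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    h, w = edges.shape
    left_slope = mean_slope(detect_lines(edges[h // 2:, : w // 2]))
    right_slope = mean_slope(detect_lines(edges[h // 2:, w // 2:]))
    # One plausible decision rule (an assumption, not the team's actual logic):
    # bear away from the region whose lines are steeper, i.e. more nearly
    # aligned with the robot's direction of travel.
    if left_slope is None and right_slope is None:
        return "straight"
    left_mag = abs(left_slope) if left_slope is not None else 0.0
    right_mag = abs(right_slope) if right_slope is not None else 0.0
    if left_mag > right_mag:
        return "right"
    if right_mag > left_mag:
        return "left"
    return "straight"
```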
MCP Autonomous Video
• MCP robot attempts Autonomous Challenge at 2007 IGVC (YouTube link)
Behavior-Based Robotics
• Behaviors use a subset of the robot’s sensors and implement a specific piece of functionality; e.g., drive toward a waypoint, avoid obstacles, stay within the lines
• Behaviors are composed with some kind of weighting, suppression, or choice system to build the overall robot performance (a blending sketch follows)
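To make the “weighting” option concrete, here is a generic blending sketch. The blend function, the command-dictionary format, and the weights are all hypothetical illustrations rather than anything taken from the MCP codebase.

```python
# Generic illustration of composing behaviors by weighted blending; all names
# and weights are hypothetical, not drawn from the MCP's software.

def blend(behavior_outputs):
    """Combine (weight, command) pairs into a single motor command.

    Each command is a dict with 'forward' and 'turn' terms; behaviors that
    are inactive this cycle return None and are simply skipped.
    """
    active = [(w, cmd) for w, cmd in behavior_outputs if cmd is not None]
    if not active:
        return {"forward": 0.0, "turn": 0.0}
    total = sum(w for w, _ in active)
    return {
        "forward": sum(w * cmd["forward"] for w, cmd in active) / total,
        "turn": sum(w * cmd["turn"] for w, cmd in active) / total,
    }

# Example cycle: obstacle avoidance weighted heavily, line following lightly.
command = blend([
    (3.0, {"forward": -0.2, "turn": 0.5}),   # avoid-obstacle output
    (1.0, {"forward": 0.6, "turn": 0.0}),    # stay-within-lines output
])
```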
Industrial Robotics vs. Behavior-Based Robotics
• Industrial robotics: a defined and constrained environment (e.g., the articles to be manipulated, the lighting conditions); extremely high speeds are necessary, and the same actions are repeated over and over
• Behavior-based robotics: unstructured (or poorly structured) “real world” environments, different tasks at different times, and a more relaxed pace
Behavior-Based Robotics vs. World Model
• Composing behaviors and assigning priority is hard; you want to be able to combine behaviors in meaningful ways (other than pure priority)
• Behaviors need to be able to introspect each other, that is, use each other’s internal state as sensory input (for example, avoiding obstacles while following lines; see the sketch below)
• A global world model seems like an attractive solution, but it fails to modularize understanding, and key information lives in the unfolding task performance of the individual behaviors
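As a purely illustrative example of one behavior reading another’s state (the class names, fields, and sign conventions are invented for this sketch, not drawn from the MCP code), an obstacle avoider can ask the line follower which side it last saw a course line on and swerve the other way:

```python
# Hypothetical illustration of behavior introspection: the obstacle avoider
# reads the line follower's internal state (which side the course line was
# last seen on) and swerves toward the open side instead of across the line.

class LineFollower:
    def __init__(self):
        self.line_side = None  # "left" or "right": side a line was last seen on

    def update(self, left_slope, right_slope):
        """Track which half of the image currently shows a course line."""
        if left_slope is not None:
            self.line_side = "left"
        elif right_slope is not None:
            self.line_side = "right"
        return {"forward": 0.5, "turn": 0.0}

class ObstacleAvoider:
    def __init__(self, line_follower):
        self.line_follower = line_follower  # the behavior being introspected

    def update(self, nearest_obstacle_range_m):
        """Return None when inactive; otherwise back up and swerve away from the line."""
        if nearest_obstacle_range_m > 1.0:
            return None
        # Positive turn means "turn left" in this sketch: swerve away from the
        # side where the introspected line follower last saw a line.
        turn = -0.8 if self.line_follower.line_side == "left" else 0.8
        return {"forward": -0.2, "turn": turn}
```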
Software Environments Need Work
• The open-source Player/Stage and the new Microsoft Robotics Studio provide helpful simulation environments that let you run your simulated control program on the real robot
• But typical robot programming is still done in C/C++ and other primitive languages, where the basic cycle is: interrupt execution, write and debug code, compile, and rerun. The robot’s brain is restarted over and over
• We need environments that combine the operating system with live objects and code: an ecosystem of behaviors that can be flexibly recombined, inspected, and modified without restarting everything
It’s a Software Problem Now
• Behavior-based approaches modularize intentionality and provide fail-safe behavior
• Service robots require an approach that can deal with uncertainty and dynamic environments (e.g., HelpMate)
• Software development environments can be vastly improved to encourage the development of robots that are more like creatures than machines
The MCP II Student Team
• Ken Dillon
• Amr Elbasiony (vision guru)
• Chris King
• Joel Michel
• John O’Fallon (team leader)
• Nathan Palmer
• Haiyang Zhang
with
• Brian Bailey (mechanical engineer)
• Kyewook Lee, Yan Tran (members emeritus)
• Matt Bailey, Andrew Chanler (contributors)
Contact Information
Dr. Fred Martin
Assistant Professor, Computer Science
University of Massachusetts Lowell
1 University Avenue
Lowell, MA 01854
(978) 934-1964
fredm@cs.uml.edu
www.cs.uml.edu/~fredm