AILA – Design of an autonomous mobile dual-arm robot
DFKI Bremen & University of Bremen, Robotics Innovation Center
Director: Prof. Dr. Frank Kirchner
www.dfki.de/robotics | robotics@dfki.de
Johannes Lemburg
German Research Center for Artificial Intelligence (DFKI GmbH) is a Joint Venture
[Map: DFKI locations in Saarland, Rhineland-Palatinate, and Bremen; shareholder logos]
DFKI sites: Bremen, Berlin, Kaiserslautern, Saarbrücken
DFKI = Non-Profit Company
• DFKI funding is based on:
• 100% reimbursement of project costs from industry and government
• Support from universities and shareholders:
• 8 full professors (75%)
• Research staff from shareholders
• Free research space in two buildings on the campus
• Joint use of infrastructure (Internet, libraries, catering)
Project
IKT 2020 research programme: innovation alliance "Digitales Produktgedächtnis" (Digital Product Memory)
• IKT 2020
• Grant ID 01IA08002
• €16.46 million
• 02/2008 – 01/2011
Semantic Product Memory – "a product's diary"
[Figure labels: Smart Label, product, environment]
Developed Systems
• Development of a mobile dual-arm robotic system capable of flexible manipulation of varying products
[Figure labels: RFID reader/writer, AILA, 1st system prototype, environment]
AILA design goals
• Arms
  • Joints based upon a previous development
  • Payload-to-weight ratio > 1
  • Low weight and moment of inertia
  • Stiff structure
• Mobile base
  • Holonomic
  • Indoor and slightly rough terrain
  • Synergy with a space-related project
• Torso
  • Height adjustment of the arms
• Overall
  • Anthropomorphic
  • Nice appearance
  • One-year timeframe
Concept Options and Decision Tree
Features
• Ease of maintenance by modular disassembly
• Thermo-formed, scratch-resistant rip-off shell
[Figure labels: head, chassis, cooling air, debug cabling]
Structure Aluminium + CFRP
Structure Sheet metal Machining
Detailing and Integration Mechanical and Electronics
AILA – Towards Autonomous Mobile Manipulation
José de Gea Fernández
DFKI Bremen, Robotics Innovation Center
Director: Prof. Dr. Frank Kirchner
http://www.dfki.de/robotics | robotics@dfki.de
Autonomous Mobile Manipulation
Autonomous mobile manipulators (AMMs) are capable of moving around and performing work in unstructured environments without (continuous) intervention of human operators.
Possible deployment in:
• Health-care industry
• Security and disaster relief
• Hazardous environments / handling of dangerous goods
• Military applications
• Space
• Logistics
Autonomous Mobile Manipulation
In the context of AMMs:
• Mobility entails large-scale environments with a variety of objects, tasks, and environmental conditions.
• Manipulation entails mechanical work to modify the arrangement of objects in the world.
• Autonomy is required so that tasks can be performed without continuous human intervention:
  • Extracting information from the environment, recovering from failures, modifying plans at run time, …
  • Learning and adaptation
SemProM Project
• Second Scenario (2010)
• Common demonstration with SAP, Siemens, and SmartFactory at the Hannover Fair
• Task: the robot manipulates objects on a shelf and, upon request, places the required object on a table
• Detailed robot tasks:
  • Navigation towards the shelf
  • Shelf recognition and pose estimation
  • Object recognition and pose estimation
  • Object grasping, transport and release
[Figure labels: NAVIGATION, SCENE RECOGNITION, OBJECT RECOGNITION, DUAL-ARM MANIPULATION, PLANNING, FORCE CONTROL, RFID]
Manipulation Areas
• Manipulation entails working in different areas:
  • Shelf/object recognition and pose estimation
  • One- and two-arm manipulation (trajectory) planning
  • Force operations
  • Real-time motion control
  • RFID-based task planning
• Software architecture:
  • Interprocess communication
  • Task coordination and execution
Vision
• Object recognition based on SIFT feature matching (a minimal sketch follows below)
• Marker and shelf recognition
• 3D object pose estimation
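As a rough illustration of the SIFT-matching step, the following sketch uses OpenCV's SIFT detector with Lowe's ratio test and a RANSAC homography. The function name, the ratio threshold, and the minimum-match count are assumptions for illustration, not the project's actual vision code.

```python
import cv2
import numpy as np

def locate_object(model_img, scene_img, min_matches=10, ratio=0.75):
    """Return a homography locating the model in the scene, or None if not found."""
    sift = cv2.SIFT_create()
    kp_m, des_m = sift.detectAndCompute(model_img, None)
    kp_s, des_s = sift.detectAndCompute(scene_img, None)
    if des_m is None or des_s is None:
        return None

    # Brute-force matching with Lowe's ratio test to keep only distinctive matches
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des_m, des_s, k=2)
            if m.distance < ratio * n.distance]
    if len(good) < min_matches:
        return None

    # Robustly estimate a homography to localize the (roughly planar) marker/shelf face
    src = np.float32([kp_m[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```

Such a homography only localizes a planar pattern in the image; the 3D pose estimation mentioned on the slide would additionally require the camera calibration, e.g. through a PnP solve.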
Dual-Arm Trajectory Planning
• Simultaneous dual-arm trajectory planning
• Free and cluttered environments
Dual-Arm Trajectory Planning
• Simultaneous dual-arm trajectory planning
• Successful tests on Schunk arms
• The planner can receive new objects from the vision system and add them dynamically to the environment (see the sketch below)
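A minimal sketch of how a newly detected object could be inserted into the planner's world model at run time, assuming the openRAVE Python bindings. The function name, the box extents, and representing products as boxes are illustrative assumptions, not AILA's actual interface.

```python
import numpy as np
from openravepy import Environment, RaveCreateKinBody

def add_detected_box(env, name, pose, extents=(0.05, 0.05, 0.10)):
    """Insert an axis-aligned box (e.g. a product reported by the vision node) as a KinBody."""
    ex, ey, ez = extents
    with env:  # lock the environment while it is being modified
        body = RaveCreateKinBody(env, '')
        body.SetName(name)
        # one box centered at the body origin; half-extents in meters
        body.InitFromBoxes(np.array([[0.0, 0.0, 0.0, ex, ey, ez]]), True)
        env.AddKinBody(body)
        body.SetTransform(pose)  # 4x4 homogeneous pose reported by the vision system
    return body

# Hypothetical usage:
# env = Environment(); env.Load('scene.env.xml')
# add_detected_box(env, 'product_0', pose_from_vision)
```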
Dual-Arm Trajectory Planning • Planning including 4 torso DOFs (total of 18 DOFs)
Robot Self-Collision
• Real-time self-collision software checks that no collision occurs between body parts
• Takes into account deceleration and braking time (illustrated below)
• Submitted as a DFKI patent
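The idea of accounting for deceleration can be illustrated with a much simplified check between bounding spheres, where the safety margin grows with the distance each link could still travel while braking. This is only a conceptual sketch, not the patented DFKI method; the sphere model, speeds, and braking parameters are assumptions.

```python
import itertools
import numpy as np

def braking_distance(speed, reaction_time, decel):
    """Worst-case distance covered before a link comes to rest."""
    return speed * reaction_time + speed ** 2 / (2.0 * decel)

def self_collision_imminent(spheres, speeds, reaction_time=0.05, decel=2.0):
    """spheres: list of (center_xyz, radius) per link; speeds: Cartesian speed per link [m/s]."""
    for (i, (ci, ri)), (j, (cj, rj)) in itertools.combinations(enumerate(spheres), 2):
        if abs(i - j) == 1:
            continue  # adjacent links are allowed to touch
        # inflate the clearance by the stopping distance of both links
        margin = braking_distance(speeds[i], reaction_time, decel) + \
                 braking_distance(speeds[j], reaction_time, decel)
        if np.linalg.norm(np.asarray(ci) - np.asarray(cj)) < ri + rj + margin:
            return True  # stopping now may not prevent contact -> trigger the brakes
    return False
```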
Hand-Eye Calibration
• Offline robot DH parameter calibration:
  • Compute the arm pose with respect to the camera
  • Measure the robot's pose visually
  • Adjust the DH parameters to minimize the error between the visually measured and the proprioceptively computed pose
  • More weight is given to the proprioceptive information
  • Minimize the error using the Levenberg-Marquardt algorithm (sketched below)
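A hedged sketch of the calibration loop: collect joint configurations with matching camera measurements, then let a Levenberg-Marquardt solver adjust the DH offsets so the pose predicted from the encoders agrees with the visually measured pose. forward_kinematics() and the data arrays are placeholders, and the proprioceptive weighting mentioned on the slide is not modeled here.

```python
import numpy as np
from scipy.optimize import least_squares

def pose_residuals(dh_offsets, joint_samples, visual_poses, forward_kinematics):
    """Stack the 6D pose errors over all calibration samples."""
    errors = []
    for q, visual_pose in zip(joint_samples, visual_poses):
        predicted = forward_kinematics(q, dh_offsets)  # 6D pose from nominal DH + offsets
        errors.append(predicted - visual_pose)
    return np.concatenate(errors)

# method='lm' selects a Levenberg-Marquardt minimization of the stacked residuals:
# result = least_squares(pose_residuals, x0=np.zeros(n_dh_params),
#                        args=(joint_samples, visual_poses, forward_kinematics),
#                        method='lm')
```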
Arm Kinematics
• Jacobian-based IK for the whole arm (a generic sketch follows below)
• openRAVE uses the 3-2-1 arm assumption for solving IK
• Jacobian-based IK corrects for offsets due to AILA's non-intersecting wrist axes
[Figure: non-intersecting wrist axes]
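A generic damped-least-squares iteration of the kind the slide alludes to, shown only to make the idea concrete: the numeric scheme follows the full Jacobian and can therefore absorb the wrist offsets that break a closed-form 3-2-1 solution. fk() and jacobian() are placeholders for the robot model, and the gains and tolerances are arbitrary.

```python
import numpy as np

def ik_step(q, target_pose, fk, jacobian, damping=0.05, gain=0.5):
    """One iteration: move the joints to reduce the 6D pose error of the end effector."""
    error = target_pose - fk(q)            # 6D pose/twist-like error vector
    J = jacobian(q)                        # 6 x n_joints Jacobian at q
    # damped pseudo-inverse keeps the step well-behaved near singularities
    dq = J.T @ np.linalg.solve(J @ J.T + (damping ** 2) * np.eye(6), error)
    return q + gain * dq

def solve_ik(q0, target_pose, fk, jacobian, tol=1e-4, max_iters=200):
    q = np.array(q0, dtype=float)
    for _ in range(max_iters):
        if np.linalg.norm(target_pose - fk(q)) < tol:
            break
        q = ik_step(q, target_pose, fk, jacobian)
    return q
```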
Wrist Kinematics
• Forward and inverse kinematics
• Closed-form solution for the inverse kinematics
• Numerical solution using Newton-Raphson for the forward kinematics (see the sketch below)
• Tests with a model in Matlab/SimMechanics yield accurate and fast results
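If the wrist pose is only available implicitly, for example through its closed-form inverse map, a Newton-Raphson iteration of the following generic form can recover the forward kinematics numerically. inverse_kinematics() is a placeholder, the sketch assumes a square system (as many pose parameters as wrist joints), and it is not the SimMechanics model from the slide.

```python
import numpy as np

def wrist_forward_kinematics(q_measured, inverse_kinematics, x0, tol=1e-8, max_iters=50):
    """Find the wrist pose x such that inverse_kinematics(x) matches the measured joints."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iters):
        f = inverse_kinematics(x) - q_measured        # residual in joint space
        if np.linalg.norm(f) < tol:
            break
        # numerical Jacobian of the residual via central differences
        eps = 1e-6
        J = np.zeros((len(f), len(x)))
        for i in range(len(x)):
            dx = np.zeros_like(x)
            dx[i] = eps
            J[:, i] = (inverse_kinematics(x + dx) - inverse_kinematics(x - dx)) / (2.0 * eps)
        x = x - np.linalg.solve(J, f)                 # Newton-Raphson update (square system assumed)
    return x
```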
Software Architecture
• Developed under the ROS (Robot Operating System) software framework
• openRAVE as motion-planner node
• Object recognition node
• Self-collision avoidance node
• Scene recognition node
• Image acquisition node
• Coordinator node (a minimal example follows below)
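A minimal rospy sketch of how a coordinator node could glue the other nodes together. The topic names ('/object_recognition/pose', '/planner/goal') and the use of PoseStamped are assumptions for illustration and do not reflect the project's actual interfaces.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import PoseStamped

class Coordinator(object):
    def __init__(self):
        rospy.init_node('coordinator')
        # forward recognized object poses to the motion-planner node
        self.goal_pub = rospy.Publisher('/planner/goal', PoseStamped, queue_size=10)
        rospy.Subscriber('/object_recognition/pose', PoseStamped, self.on_object_pose)

    def on_object_pose(self, pose_msg):
        rospy.loginfo('Object detected, requesting a dual-arm plan')
        self.goal_pub.publish(pose_msg)

if __name__ == '__main__':
    Coordinator()
    rospy.spin()
```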
Video – First Tests with Schunk Arms
• Complete demo under ROS:
  • Shelf recognition
  • Object recognition
  • Pose estimation
  • Dual-arm motion planning
  • Schunk motor control
  • Communication
Video – Last Week's Test with AILA
• Complete demo under ROS:
  • Object recognition
  • Pose estimation
  • Dual-arm motion planning
  • Motor control
  • Interprocess communication
Navigation
What do we need for autonomous, INTELLIGENT navigation in human environments?
• Perception and reconstruction of the scene
• Model building and model fitting
• Semantics and scene interpretation
• Ontologies and spatial reasoning
• Classification of objects and areas
• Task planning
Navigation
• Navigation task:
  • WP 1 (212.113, 140.122, 0.213), WP 2 (122.213, 123.122, 1.2333)
  • WP 3 (422.513, 123.122, 2.13), WP 4 (522.343, 153.162, 0.1230)
  • Patrol waypoints in order 2, 1, 3, 4
• Or: "Take a bottle of beer from the fridge", solved by reasoning (a toy sketch follows below):
  • Beer -> is in "Fridge"
  • Fridge -> is in "Kitchen"
  • Kitchen -> connected to "Corridor"
  • Corridor -> connected to "Living Room"
  • Robot -> is in "Living Room"
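The reasoning chain on the slide can be illustrated with a toy knowledge base and a breadth-first search over room connectivity. The facts below merely restate the slide's example; the code is not meant to represent the actual planner.

```python
from collections import deque

located_in = {'beer': 'fridge', 'fridge': 'kitchen'}
connected = {'kitchen': ['corridor'],
             'corridor': ['kitchen', 'living room'],
             'living room': ['corridor']}

def room_of(obj):
    """Follow 'is in' facts until a room of the connectivity graph is reached."""
    place = obj
    while place not in connected:
        place = located_in[place]
    return place

def route(start, goal):
    """Breadth-first search over room connectivity."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in connected[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])

print(route('living room', room_of('beer')))   # ['living room', 'corridor', 'kitchen']
```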
Navigation
[Pipeline diagram with elements: Vision, Point Cloud, Geometric Scene Reconstruction, 6D SLAM Map, Object and Scene Understanding, Semantic Map, Semantic Reasoning, Task Planning; annotations: "pretty much covered in research", "not solved yet"]
Outlook
• Sensorimotor acquisition and learning of new objects
• Scene segmentation to create hypotheses about the environment and find unknown objects
• Manipulation of the objects and use of sensorimotor information for:
  • Acquisition of new views and dynamic object information
  • Confirmation of the hypotheses about the object
  • Storing visual features and geometric information about the object
• Autonomous multi-fingered grasp planning
• Active impedance control
• Object-centered dual-arm internal impedance control
Official AILA Video
[Video]