Planning for mobile robots A look at work done in AASS. Lars Karlsson Mathias Broxvall Silvia Coradeschi Alessandro Saffiotti Robert Lundh Abdelbaki Bouguerra Amy Loutfi Center for Applied Autonomous Sensor Systems University of Örebro, Sweden www.aass.oru.se.
Mobile robotics lab at AASS • Symbolic reasoning for perception (e-nose) • Perceptual anchoring • Cooperative anchoring • Sensor-based planning • Robot team-work • Ecologies of physically embedded intelligent systems (PEIS) • Automatic loaders for mining • Hybrid and semantic maps
Contents • Sensor-based planning for recovery from ambiguous anchoring situations • Anchoring failures • Recovery planning • Experiments • Plan-based configuration of robotic teams • What is a configuration? • Generating configurations • Experiments
Anchoring failures • Context: • Autonomous execution of high-level tasks Eg: Inspect(bag-32), where bag-32 is {large, black, yellow label} • Perceptual anchoring: • Needed to execute high-level, symbolic actions Eg: anchor “bag-32” to the right regions in the image stream Eg: use these regions to servo the robot • Problem: • Failures in perceptual anchoring Eg: no matching object found for “bag-32” Eg: many matching objects found for “bag-32” Eg: not sure if the object is a good match for “bag-32”
Anchoring failures • Context: Autonomous execution of high-level tasks • Perceptual anchoring: Needed to execute high-level, symbolic actions • Problem: Failures in perceptual anchoring • Goal: • General approach to recover from these failures autonomously • Based on AI planning
Anchoring: reminder “Anchoring is the process of creating and maintaining the correct correspondence between symbols and sensor data that refer to the same physical objects” Anchoring connects sensor data (green pixels) with symbolic names and properties (cup22)
Anchoring: percept matching • Anchor • A function from time to triples ⟨symbol, percept, signature⟩ • Functionalities • Find: find a percept to create a new anchor • Track: update an existing anchor with a new percept • Reacquire: find a percept to update an old anchor • Core operation • Select the percept that matches the desired / expected properties • Matching can be full or partial (eg, too far to see the label) • Ideally, we should have exactly one fully matching percept
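The full/partial matching step above can be sketched as follows. This is an illustrative sketch, not the AASS implementation: the property-dictionary representation and the `match`/`find` names are assumptions.

```python
# Sketch of the core anchoring operation: classify each percept against a
# symbolic description as a full match, a partial match (some desired
# property simply not observed, eg too far to see the label), or no match.

def match(description, percept):
    """Return 'full', 'partial', or 'none' for a percept vs a description."""
    unobserved = False
    for prop, value in description.items():
        if prop not in percept:        # desired property not observed
            unobserved = True
        elif percept[prop] != value:   # observed and contradicting
            return "none"
    return "partial" if unobserved else "full"

def find(description, percepts):
    """Find: succeed only with exactly one fully matching percept."""
    full = [p for p in percepts if match(description, p) == "full"]
    partial = [p for p in percepts if match(description, p) == "partial"]
    if len(full) == 1 and not partial:
        return full[0]                 # unambiguous: create the anchor
    return None                        # anchoring failure: trigger recovery

bag32 = {"size": "large", "color": "black", "label": "yellow"}
percepts = [{"size": "large", "color": "black", "label": "yellow"},
            {"size": "small", "color": "red"}]
print(find(bag32, percepts))           # the first percept matches fully
```

Returning `None` for zero, several, or only-partial matches corresponds to the "wrong number of (partially) matching percepts" failure cases on the next slide.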
Anchoring: what can go wrong • Wrong number of percepts that (partially) match the desired / expected properties
Anchoring: our goal • Generate automatically the right recovery plan
Contents: recovery from ambiguity • Anchoring failures • Recovery planning • Experiments
Recovery planning: approach [Diagram: the anchoring process is observed; the perceptual situation, goal, and available actions feed the Planner, which outputs a plan] 1 Detect and classify the problem • interrupt execution of top-level task 2 Model it in a planning domain • perceptual situation • perceptual goal • available actions 3 Generate a recovery plan from this model 4 Execute the recovery plan • resume execution of top-level task
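The four-step loop above can be sketched in a few lines. This is a minimal sketch under assumed interfaces: `classify_failure`, `model_in_planning_domain`, and `generate_plan` are placeholders standing in for the failure classifier, the planning-domain encoder, and the planner (PTL-plan in the actual system).

```python
# Sketch of the recovery-planning loop: on an action failure, interrupt the
# top-level task, model the failure as a planning problem, plan, execute the
# recovery plan, and resume.

def run_with_recovery(top_level_plan, execute, classify_failure,
                      model_in_planning_domain, generate_plan):
    for action in top_level_plan:
        while not execute(action):
            # 1. detect and classify the problem; top-level task interrupted
            failure = classify_failure(action)
            # 2. model it: perceptual situation, perceptual goal, actions
            situation, goal, operators = model_in_planning_domain(failure)
            # 3. generate a recovery plan from this model
            recovery_plan = generate_plan(situation, goal, operators)
            # 4. execute the recovery plan, then retry / resume
            for step in recovery_plan:
                execute(step)
```

In the gas-bottle scenario below, the failing action would be `(touch gb-b)` and the recovery plan a sequence of observation actions.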
A concrete example • The task • Search for dangerous gas-bottles in building after a fire • Rescue robot must find gas-bottles with yellow labels and approach them (eg, to measure temperature) • The scenario • gb-a, gb-b seen in room R1, gb-b has yellow label • Robot later returns to R1 to inspect gb-b • Plan: ((goto R1) (touch gb-b)) • Problem • Robot cannot see the yellow label • Hence cannot anchor gb-b (case 2)
Modeling the problem • Situation: • two observed candidate anchors • G1, G2 • four anchoring hypotheses • ((anchor gb-b = G1) (mark G1 = t) (mark G2 = f)) • ((anchor gb-b = G2) (mark G1 = f) (mark G2 = t)) • ((anchor gb-b = null) (mark G1 = f) (mark G2 = f)) • ((anchor gb-b = null) (mark G1 = t) (mark G2 = t)) • the mark must be visible from some viewpoint • Goal: • one and only one of {G1, G2} is equal to gb-b • (NEC (anchor gb-b = ?x))
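The belief state above can be made concrete with a small enumeration. This is an illustrative encoding, not the PTL-plan representation: the dictionary form of hypotheses and the `goal_reached` test are assumptions mirroring the slide.

```python
# Sketch of the ambiguity model: the belief state is a set of hypotheses
# over which candidate (if any) anchors gb-b and which candidates carry the
# yellow mark; observations filter this set.

from itertools import product

candidates = ["G1", "G2"]

def hypotheses():
    """Enumerate all consistent (anchor, marks) combinations."""
    hyps = []
    for marks in product([True, False], repeat=len(candidates)):
        marked = [c for c, m in zip(candidates, marks) if m]
        # exactly one marked candidate identifies gb-b; zero or two marked
        # candidates leave gb-b unanchored (null)
        anchor = marked[0] if len(marked) == 1 else None
        hyps.append({"anchor": anchor,
                     "marks": dict(zip(candidates, marks))})
    return hyps

def goal_reached(belief):
    """Epistemic goal: all remaining hypotheses agree on gb-b's anchor."""
    return len({h["anchor"] for h in belief}) == 1

belief = hypotheses()
print(len(belief))   # 4 hypotheses, as on the slide
```

Observing (from a suitable viewpoint) whether each candidate carries the mark filters `belief`; the conditional plan succeeds once `goal_reached` holds.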
Generated recovery plan [Diagram: conditional plan tree, branches leading to Success] • Planning: • Generate a conditional plan that succeeds if G1 or G2 is marked • Strategy is to observe G1 and G2 from different viewpoints • Expected result: • Anchor gb-b to either G1 or G2 • Resume top-level plan • Touch gb-b
Contents: recovery from ambiguity • Anchoring failures • Recovery planning • Experiments
Experiments • Environment • Indoor, not (yet) burnt ... • Robot • Magellan Pro • Sonars and color camera • Control architecture • Navigation: Thinking Cap • Vision: color and shape • Planner: PTL-plan
A few more experiments ... • Recovery using complementary sensors • Partial matches due to a non-observed property: smell • Uses an electronic nose to disambiguate the situation • Related object(s) not found • Description: “the green garbage can near a red ball” • Search for secondary objects near the primary object • A bad day at work ... • Original goal: inspect 3 objects in some corridors • Planner creates the top-level plan • Different types of errors occurring sequentially • Recovery planning solves each problem, then resumes the original plan • New objects discovered • Robot detects new candidate objects during the recovery phase and replans
The bottom line • Key to robustness: • we cannot build an infallible system … • … but we can build one that automatically detects its failures, and recovers from them • Our work: • one type of perceptual failure: anchoring failures • use of planning as a general means of recovery • complex/cascading failures partially solved • Related work • Some previous work on recovery planning, none on perceptual failures
Open problems • More complex issues • More about composite/cascading failures • Interactions between recovery plan and top-level plan
Part II: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configurations • Experiments
Introduction • Society of autonomous robotic systems • Can be seen as one big distributed system • A robot can use resources and capabilities of other robots
Introduction Example • Robot pushing a box through a door.
Introduction “Borrow” functionalities from each other
Objectives • Define functional configurations of a robot society • who provides which functionalities to whom, and how • Use planning to synthesize configurations • given the environment, the task, the available resources • Detect the need for a configuration change • in response to changes in the environment, tasks, resources
Related fields • Multi-Agent Systems • Task Allocation, Role Assignment, Coalition Formation, Capability Management ... • Cooperative Robotics • Adapt MAS-techniques, Coordinated Motion, Formation Coordination, Cooperative Perception ... • Program Supervision
Contents: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configurations • Experiments
Functionality A functionality is an operator that uses information to produce additional information. A functionality consists of: • a specification of inputs, outputs, and the relations between them. • a set of causal pre- and post-conditions. • a specification of costs (e.g., computation).
Functionality example [Diagram: an image feeds a "measure pos + orientation of door" functionality, which outputs the door's pos + orient; precondition: door visible in image]
Resource A resource is a special case of a functionality • Sensing resource • no input • gives information about the surrounding environment or the robot platform • Action resource • no output • post-conditions that affect the environment
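The functionality and resource definitions above can be captured in one small data structure. This is a sketch under assumed field names (not the AASS representation): a resource is just a functionality with an empty input list (sensing) or an empty output list (action).

```python
# Sketch of the functionality/resource data structure: inputs, outputs,
# causal pre- and post-conditions, and a cost (eg, computation).

from dataclasses import dataclass, field

@dataclass
class Functionality:
    name: str
    inputs: list                                   # information consumed
    outputs: list                                  # information produced
    preconds: list = field(default_factory=list)   # causal preconditions
    postconds: list = field(default_factory=list)  # causal postconditions
    cost: float = 0.0                              # eg, computation cost

# sensing resource: no input, provides information about the environment
camera = Functionality("camera", inputs=[], outputs=["image"],
                       preconds=["camera-on"])

# ordinary functionality: turns an image into the door's pose
measure_door = Functionality("measure-door", inputs=["image"],
                             outputs=["position", "orientation"],
                             preconds=["door-visible"])
```

An action resource would be the mirror image of `camera`: some inputs, no outputs, and post-conditions that affect the environment.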
Resource example [Diagram: a camera resource with no input, outputting an image]
Channel A channel transfers data from one functionality to another. • Inter- and intra-robot • Requirements on bandwidth, speed, reliability, etc
Configuration A configuration is a set of functionalities and a set of channels that connect them. [Diagram: an admissible configuration: the camera on A feeds an image to "measure pos. + orient. of door", whose pos. + orient. of door output feeds "cross door"]
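Admissibility can be sketched as a simple consistency check over functionalities and channels. The dictionary representation and `(src, dst, data)` channel triples below are assumptions for illustration; the real channels also carry requirements (bandwidth, reliability, etc) that this sketch ignores.

```python
# Sketch of an admissibility check: every channel must carry data that its
# producer outputs and its consumer needs, and every input of every
# functionality must be fed by some channel.

def admissible(functionalities, channels):
    outputs = {n: set(f["outputs"]) for n, f in functionalities.items()}
    needed = {n: set(f["inputs"]) for n, f in functionalities.items()}
    fed = {n: set() for n in functionalities}
    for src, dst, data in channels:
        if data not in outputs[src] or data not in needed[dst]:
            return False               # channel carries the wrong data
        fed[dst].add(data)
    # every required input must be provided by some channel
    return all(fed[n] == needed[n] for n in functionalities)

funcs = {
    "camera-A":     {"inputs": [], "outputs": ["image"]},
    "measure-door": {"inputs": ["image"], "outputs": ["door-pose"]},
    "cross-door":   {"inputs": ["door-pose"], "outputs": []},
}
chans = [("camera-A", "measure-door", "image"),
         ("measure-door", "cross-door", "door-pose")]
print(admissible(funcs, chans))        # the "do it yourself" configuration
```

Dropping either channel makes the configuration inadmissible, since `measure-door` or `cross-door` would be left with an unfed input.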
Examples of configurations All configurations • are admissible • provide the information required for crossing the door
Examples of configurations • Two robots and a door • Functionalities • Measure robot position • Measure angle to robot • Measure door position + orientation • Resources • Camera • Compass
The “do it yourself” robot [Diagram: on a single robot, the camera on A feeds an image to "measure pos. + orient. of door", which feeds pos. + orient. of door to "cross door"]
Looking at each other [Diagram: green = robot A, blue = robot B. From A's camera image, "measure angle to robot B" yields the orientation of A w.r.t. B. From B's camera image, "measure angle to robot A" and "measure pos. of robot A" yield the position of A w.r.t. B, and "measure pos. + orient. of door" yields the door pose w.r.t. B. A coordinate transformation on B combines these into the pos. + orient. of the door w.r.t. A, which feeds A's "cross door"]
Work schema [Diagram: define a set of functionalities; generate a description of state and goal; the Planner, given operator descriptions, produces a configuration description; the configuration is translated into executable code; TC runs on the robots under the given configuration; execution is monitored to detect the need for reconfiguration]
Configuration generation • Inspired by standard planning techniques (hierarchical planning) • Differences • Parallel versus Sequential • Data flow versus Causal flow
Configuration generation [Diagram: camera outputs an image; "measure pos + orientation of door" takes the image and outputs pos + orient]
(functionality
  name: camera(r)
  input: -
  output: image(r)
  precond: camera-on
  postcond: -)
(functionality
  name: measure-door(r, d)
  input: image(r)
  output: position(r, d), orientation(r, d)
  precond: visible(r, d)
  postcond: -)
Configuration generation [Diagram: camera feeds an image to "measure pos + orientation of door", which outputs pos + orient]
(config-method
  name: get-door-info(r, d)
  precond: (camera(r), in(r, room), in(d, room), robot(r), door(d))
  out: f2: position(r, d)
       f2: orientation(r, d)
  channels: (local(r), f1, f2, image(r))
  body: f1: camera(r)
        f2: measure-door(r, d))
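The expansion performed by config-methods like `get-door-info` can be sketched as backward chaining over data rather than over causal effects. The tables and names below are illustrative assumptions, not the AASS planner: each piece of data is mapped to a producing functionality, and the generator recursively satisfies that producer's own inputs, collecting channels along the way.

```python
# Sketch of configuration generation by method expansion: backward-chain
# from the requested data to a set of functionalities plus the channels
# connecting them (data flow, not causal flow).

functionalities = {
    "camera":       {"inputs": [], "outputs": ["image"]},
    "measure-door": {"inputs": ["image"], "outputs": ["door-pose"]},
}

def generate(goal_data, producers):
    """Return (functionalities, channels) that provide goal_data."""
    funcs, channels = [], []
    def expand(data, consumer):
        f = producers[data]                 # functionality producing data
        if f not in funcs:
            funcs.append(f)
            for needed in functionalities[f]["inputs"]:
                expand(needed, f)           # satisfy the producer's inputs
        if consumer is not None:
            channels.append((f, consumer, data))
    expand(goal_data, None)
    return funcs, channels

producers = {"image": "camera", "door-pose": "measure-door"}
print(generate("door-pose", producers))
```

Unlike a sequential plan, the result is a set of concurrently running functionalities wired by channels, which matches the "parallel versus sequential" distinction noted above.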
Configuration description
(configuration
  :functionalities
    f-31 cross-door(robota, door1)
    f-42 transform-info(door1, robota)
    f-47 measure-robot(door1, robota)
    f-46 camera(door1, robota)
  :channels
    local(door1, f-46, f-47, image(door1, robota))
    local(door1, f-47, f-42, pos(door1, robota), orient(door1, robota))
    global(door1, robota, f-42, f-31, pos(robota, door1), orient(robota, door1)))
Executable configuration
(door1
  (LET ((CH-1 (make-local-channel 'door1 'f-46 'f-47 '(image door1 robota)))
        (CH-2 (make-local-channel 'door1 'f-47 'f-42 '(pos door1 robota)))
        (CH-3 (make-local-channel 'door1 'f-47 'f-42 '(orient door1 robota)))
        (CH-4 (make-global-channel 'robota 'f-42 'f-31 '(pos robota door1)))
        (CH-5 (make-global-channel 'robota 'f-42 'f-31 '(orient robota door1))))
    (LAMBDA ()
      (PROGN
        (camera (LIST CH-1))
        (measure-robot CH-1 (LIST CH-2) (LIST CH-3))
        (transform-info CH-2 CH-3 (LIST CH-4) (LIST CH-5))))))
Executable configuration
(robota
  (LET ((CH-1 (make-global-channel 'door1 'f-42 'f-31 '(pos robota door1)))
        (CH-2 (make-global-channel 'door1 'f-42 'f-31 '(orient robota door1))))
    (LAMBDA ()
      (PROGN
        (cross-door CH-1 CH-2)))))
Contents: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configurations • Experiments
Experimental system • Configurations implemented on two indoor robots • Resources: camera and compass • Configuration generation done by the configuration planner • Configuration switching done manually
A simple experiment (robots Emil and Pippi) • Pippi wants to cross the door • Emil gets in position • Emil guides Pippi (looking at each other) • Pippi: goal achieved • Emil wants to cross the door • Pippi guides Emil (orientations from compasses) • Emil: goal achieved