
  1. Planning for mobile robots A look at work done in AASS Lars Karlsson Mathias Broxvall Silvia Coradeschi Alessandro Saffiotti Robert Lundh Abdelbaki Bouguerra Amy Loutfi Center for Applied Autonomous Sensor Systems University of Örebro, Sweden www.aass.oru.se

  2. Mobile robotics lab at AASS • Symbolic reasoning for perception (e-nose) • Perceptual anchoring • Cooperative anchoring • Sensor-based planning • Robot team-work • Ecologies of physically embedded intelligent systems (PEIS) • Automatic loaders for mining • Hybrid and semantic maps

  3. Contents • Sensor-based planning for recovery from ambiguous anchoring situations • Anchoring failures • Recovery planning • Experiments • Plan-based configuration of robotic teams • What is a configuration? • Generating configuration • Experiments

  4. Anchoring failures • Context: • Autonomous execution of high-level tasks • Eg: Inspect(bag-32), where bag-32 is {large, black, yellow label}

  5. Anchoring failures • Context: • Autonomous execution of high-level tasks Eg: Inspect(bag-32), where bag-32 is {large, black, yellow label} • Perceptual anchoring: • Needed to execute high-level, symbolic actions Eg: anchor “bag-32” to the right regions in the image stream Eg: use these regions to servo the robot

  6. Anchoring failures • Context: • Autonomous execution of high-level tasks Eg: Inspect(bag-32), where bag-32 is {large, black, yellow label} • Perceptual anchoring: • Needed to execute high-level, symbolic actions Eg: anchor “bag-32” to the right regions in the image stream Eg: use these regions to servo the robot • Problem: • Failures in perceptual anchoring Eg: no matching object found for “bag-32” Eg: many matching objects found for “bag-32” Eg: not sure if the object is a good match for “bag-32”

  7. Anchoring failures • Context: Autonomous execution of high-level tasks • Perceptual anchoring: Needed to execute high-level, symbolic actions • Problem: Failures in perceptual anchoring • Goal: • General approach to recover from these failures autonomously • Based on AI planning

  8. Anchoring: reminder “Anchoring is the process to create and maintain the right correspondence between symbols and sensor data that refer to the same physical objects” Anchoring connects sensor data (green pixels) with symbolic names and properties (cup22)

  9. Anchoring: percept matching • Anchor • Function: Time → ⟨symbol, percept, signature⟩ • Functionalities • Find: find percept to create new anchor • Track: update existing anchor with new percept • Reacquire: find percept to update old anchor • Core operation • Select percept that matches desired / expected properties • Matching can be full or partial (eg, too far to see the label) • Ideally, we should have exactly one fully matching percept
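The full/partial matching idea on this slide can be sketched in a few lines. This is an illustrative Python sketch, not the AASS implementation; the function name `match` and the dictionary representation of percepts are assumptions.

```python
def match(percept, desired):
    # Classify how a percept's observed properties fit a symbol's
    # desired properties (illustrative sketch, not the AASS code).
    full = True
    for prop, value in desired.items():
        if prop not in percept:        # property not observed (eg, too far to see)
            full = False
        elif percept[prop] != value:   # observed and contradicts the description
            return "no-match"
    return "full" if full else "partial"

# bag-32 is described as {large, black, yellow label}
desired = {"size": "large", "color": "black", "label": "yellow"}
near = {"size": "large", "color": "black", "label": "yellow"}
far = {"size": "large", "color": "black"}   # label not visible from here
assert match(near, desired) == "full"
assert match(far, desired) == "partial"
```

A "partial" result is exactly the ambiguous situation that triggers recovery planning in the following slides.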

  10. Anchoring: what can go wrong • Wrong number of percepts that (partially) match the desired / expected properties

  11. Anchoring: our goal • Automatically generate the right recovery plan

  12. Contents: recovery from ambiguity • Anchoring failures • Recovery planning • Experiments

  13. Recovery planning: approach [Diagram: the anchoring process is observed; the resulting situation, goal, and actions are given to the Planner, which produces a plan] 1. Detect and classify the problem • interrupt execution of top-level task 2. Model it in a planning domain • perceptual situation • perceptual goal • available actions 3. Generate a recovery plan from this model 4. Execute the recovery plan • resume execution of top-level task
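The four steps above form a simple pipeline. The sketch below shows that control flow in Python; the callable names (`classify_failure`, `build_model`, etc.) are assumptions for illustration, not the AASS API, which uses PTL-plan and the Thinking Cap architecture.

```python
def recovery_cycle(classify_failure, build_model, make_plan, execute):
    # The four recovery steps as a pipeline (illustrative sketch).
    failure = classify_failure()                      # 1. detect and classify; top-level task is interrupted
    situation, goal, actions = build_model(failure)   # 2. model it as a planning domain
    plan = make_plan(situation, goal, actions)        # 3. generate a recovery plan
    return execute(plan)                              # 4. execute it, then resume the top-level task

# Toy usage with stand-in callables:
plan = recovery_cycle(
    lambda: "too-many-matches",
    lambda f: ("situation", "goal", ["observe"]),
    lambda s, g, a: ["look-left", "look-right"],
    lambda p: p)
```

Keeping the steps as separate callables mirrors the slide's separation between failure detection, domain modeling, planning, and execution.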

  14. A concrete example • The task • Search for dangerous gas-bottles in building after a fire • Rescue robot must find gas-bottles with yellow labels and approach them (eg, to measure temperature) • The scenario • gb-a, gb-b seen in room R1, gb-b has yellow label • Robot later returns to R1 to inspect gb-b • Plan: ((goto R1) (touch gb-b)) • Problem • Robot cannot see the yellow label • Hence cannot anchor gb-b (case 2)

  15. Modeling the problem • Situation: • two observed candidate anchors • G1, G2 • four anchoring hypotheses • ((anchor gb-b = G1) (mark G1 = t) (mark G2 = f)) • ((anchor gb-b = G2) (mark G1 = f) (mark G2 = t)) • ((anchor gb-b = null) (mark G1 = f) (mark G2 = f)) • ((anchor gb-b = null) (mark G1 = t) (mark G2 = t)) • the mark must be visible from some viewpoint • Goal: • one and only one of {G1, G2} is equal to gb-b • (NEC (anchor gb-b = ?x))
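The hypothesis space on this slide can be enumerated mechanically: each candidate either carries the distinguishing mark or not, and the symbol anchors to a candidate only when it is the unique marked one. A hedged Python sketch (function name and dictionary encoding are assumptions, not the PTL-plan representation):

```python
from itertools import product

def anchoring_hypotheses(candidates):
    # Enumerate possible worlds for an ambiguous anchoring situation:
    # every assignment of the mark to the candidates, with the anchor
    # defined only when exactly one candidate is marked (sketch).
    hyps = []
    for marks in product([True, False], repeat=len(candidates)):
        marked = [c for c, m in zip(candidates, marks) if m]
        anchor = marked[0] if len(marked) == 1 else None
        hyps.append({"anchor": anchor, "marks": dict(zip(candidates, marks))})
    return hyps

hyps = anchoring_hypotheses(["G1", "G2"])
assert len(hyps) == 4   # matches the four hypotheses listed above
assert {"anchor": "G1", "marks": {"G1": True, "G2": False}} in hyps
assert {"anchor": None, "marks": {"G1": False, "G2": False}} in hyps
```

The recovery plan's job is then to gather observations that rule out all but one of these worlds.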

  16. Generated recovery plan [Plan diagram: conditional branches ending in Success] • Planning: • Generate conditional plan that succeeds if G1 or G2 marked • Strategy is to observe G1 and G2 from different viewpoints • Expected result: • Anchor gb-b to either G1 or G2 • Resume top-level plan • Touch gb-b

  17. Contents: recovery from ambiguity • Anchoring failures • Recovery planning • Experiments

  18. Experiments • Environment • Indoor, not (yet) burnt ... • Robot • Magellan Pro • Sonars and color camera • Control architecture • Navigation: Thinking Cap • Vision: color and shape • Planner: PTL-plan

  19. Experiments

  20. A few more experiments ... • Recovery using complementary sensors - Partial matches due to a non-observed property, smell. - Uses an electronic nose to disambiguate the situation. • Related object(s) not found - Description: “the green garbage can near a red ball” - Search for secondary objects near the primary object • A bad day at work ... - Original goal: inspect 3 objects in some corridors. - Planner creates top-level plan. - Different types of errors occurring sequentially. - Recovery planning solves each problem, resumes original plan. • New objects discovered - Robot detects new candidate objects during recovery phase and replans

  21. The bottom line • Key to robustness: • we cannot build an infallible system … • … but we can build one that automatically detects its failures, and recovers from them • Our work: • one type of perceptual failure: anchoring failures • use of planning as a general means of recovering • complex/cascading failures partially solved • Related work • Some previous work on recovery planning, none on perceptual failures

  22. The ultimate perceptual failure

  23. Open problems • More complex issues • More about composite/cascading failures • Interactions between recovery plan and top-level plan

  24. Part II: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configuration • Experiments

  25. Introduction • Society of autonomous robotic systems • Can be seen as one big distributed system • A robot can use resources and capabilities of other robots

  26. Introduction Example • Robot pushing a box through a door.

  27. Introduction “Borrow” functionalities from each other

  28. Objectives • Define functional configurations of a robot society • who provides which functionalities to whom, and how • Use planning to synthesize configurations • given the environment, the task, the available resources • Detect the need for a configuration change • in response to changes in the environment, tasks, resources

  29. Related fields • Multi-Agent Systems • Task Allocation, Role Assignment, Coalition Formation, Capability Management ... • Cooperative Robotics • Adapt MAS-techniques, Coordinated Motion, Formation Coordination, Cooperative Perception ... • Program Supervision

  30. Contents: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configuration • Experiments

  31. Functionality A functionality is an operator that uses information to produce additional information. A functionality consists of: • a specification of inputs, outputs, and the relations between them. • a set of causal pre- and post-conditions. • a specification of costs (e.g., computation).
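The three parts of a functionality can be captured in a small record type. This is a hedged Python sketch of the definition above; the class and field names are assumptions, not the AASS implementation (whose snippets later in the talk are Lisp).

```python
from dataclasses import dataclass

@dataclass
class Functionality:
    # A functionality as defined on slide 31 (illustrative sketch).
    name: str
    inputs: list       # information consumed
    outputs: list      # information produced
    preconds: list     # causal preconditions
    postconds: list    # causal postconditions
    cost: float = 0.0  # e.g. computation cost

# The measure-door functionality from slide 43, in this encoding:
measure_door = Functionality(
    name="measure-door",
    inputs=["image(r)"],
    outputs=["position(r, d)", "orientation(r, d)"],
    preconds=["visible(r, d)"],
    postconds=[],
    cost=1.0)
assert measure_door.outputs == ["position(r, d)", "orientation(r, d)"]
```

In this encoding a sensing resource is simply a functionality with empty `inputs`, and an action resource one with empty `outputs`, matching slide 33.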

  32. Functionality example [Diagram: an image (precondition: door visible in image) feeds the “Measure pos + orientation of door” functionality, which outputs pos + orient]

  33. Resource A resource is a special case of a functionality • Sensing resource • no input • gives information about the surrounding environment or the robot platform • Action resource • no output • post-conditions that affect the environment

  34. Resource example [Diagram: a Camera resource outputs an image]

  35. Channel A channel transfers data from one functionality to another. • Inter- and intra-robot • Requirements on bandwidth, speed, reliability, etc

  36. Configuration A configuration is a set of functionalities and a set of channels that connect them. [Diagram: an admissible configuration: Camera on A sends an image to “Measure pos. + orient. of door”, which sends the pos. + orient. of door to Cross door]
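The notion of admissibility can be made concrete: every input of every functionality must be fed by exactly one channel, and each channel's source must actually produce the datum it carries. A simplified Python sketch (the dictionary encoding and function name are assumptions; the real admissibility check also covers preconditions, costs, and channel requirements):

```python
def admissible(functionalities, channels):
    # Check a configuration: every input of every functionality must be
    # fed by exactly one channel carrying that datum from a functionality
    # that outputs it (simplified sketch of the admissibility notion).
    outputs = {f["name"]: set(f["outputs"]) for f in functionalities}
    fed = {}  # (consumer, datum) -> number of feeding channels
    for src, dst, datum in channels:
        if datum not in outputs.get(src, set()):
            return False                 # source does not produce the datum
        fed[(dst, datum)] = fed.get((dst, datum), 0) + 1
    return all(fed.get((f["name"], i), 0) == 1
               for f in functionalities for i in f["inputs"])

# The single-robot door-crossing configuration, in this encoding:
camera = {"name": "camera", "inputs": [], "outputs": ["image"]}
measure = {"name": "measure-door", "inputs": ["image"],
           "outputs": ["pos+orient"]}
assert admissible([camera, measure], [("camera", "measure-door", "image")])
assert not admissible([camera, measure], [])   # measure-door input unfed
```

The configuration planner described later searches for configurations that pass exactly this kind of check while also providing the information the task needs.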

  37. Examples of configurations All configurations • are admissible • provide the information required for crossing the door

  38. Examples of configurations • Two robots and a door • Functionalities • Measure robot position • Measure angle to robot • Measure door position + orientation • Resources • Camera • Compass

  39. The “do it yourself” robot [Diagram: single-robot configuration: Camera on A produces an image for “Measure pos. + orient. of door”, whose pos. + orient. of door feeds Cross door]

  40. Looking at each other [Diagram, green = Robot A, blue = Robot B: Camera on A feeds “Measure angle to robot B”; Camera on B feeds “Measure angle to robot A”, “Measure pos. of robot A”, and “Measure pos. + orient. of door”; a Coordinate transformation B → A combines the angles, the pos. of A wrt B, the orient. of A wrt B, and the pos. + orient. of door wrt B into the pos. + orient. of door wrt A, which Cross door on A consumes]

  41. Work schema [Diagram: define a set of functionalities and operator descriptions; generate a description of state and goal; the Planner produces a configuration description; translate the configuration into executable code; the robots execute it (TC run under the given configuration); detect the need for reconfiguration and repeat]

  42. Configuration generation • Inspired by standard planning techniques (hierarchical planning) • Differences • Parallel versus Sequential • Data flow versus Causal flow

  43. Configuration generation [Diagram: Camera produces an image for “Measure pos + orientation of door”, which outputs pos + orient]
(functionality
  name: camera(r)
  input: -
  output: image(r)
  precond: camera-on
  postcond: - )
(functionality
  name: measure-door(r, d)
  input: image(r)
  output: position(r, d), orientation(r, d)
  precond: visible(r, d)
  postcond: - )

  44. Configuration generation [Diagram: Camera produces an image for “Measure pos + orientation of door”, which outputs pos + orient]
(config-method
  name: get-door-info(r, d)
  precond: (camera(r), in(r, room), in(d, room), robot(r), door(d))
  out: f2: position(r, d)
       f2: orientation(r, d)
  channels: (local(r), f1, f2, image(r))
  body: f1: camera(r, d)
        f2: measure-door(r, d) )
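The core of hierarchical configuration generation is: given a requested piece of information, find a config-method whose outputs cover it and instantiate its body and channels. A miniature Python sketch (the real planner also checks preconditions, binds variables, and recurses into sub-methods; the encoding here is an assumption):

```python
def expand(goal_output, methods, functionalities):
    # Find a config-method whose outputs cover the requested datum,
    # then instantiate its body functionalities and channels (sketch).
    for m in methods:
        if goal_output in m["out"]:
            funcs = [functionalities[name] for name in m["body"]]
            return funcs, m["channels"]
    raise ValueError("no method produces " + goal_output)

# The get-door-info method from slide 44, in this encoding:
functionalities = {
    "camera": {"inputs": [], "outputs": ["image(r)"]},
    "measure-door": {"inputs": ["image(r)"],
                     "outputs": ["position(r, d)", "orientation(r, d)"]},
}
methods = [{
    "name": "get-door-info",
    "out": ["position(r, d)", "orientation(r, d)"],
    "body": ["camera", "measure-door"],
    "channels": [("camera", "measure-door", "image(r)")],
}]

funcs, channels = expand("position(r, d)", methods, functionalities)
assert channels == [("camera", "measure-door", "image(r)")]
```

Because channels carry data flow rather than causal ordering, the instantiated functionalities run in parallel, which is the key difference from sequential action planning noted on slide 42.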

  45. Configuration description
(configuration
  :functionalities
    f-31 cross-door(robota, door1)
    f-42 transform-info(door1, robota)
    f-47 measure-robot(door1, robota)
    f-46 camera(door1, robota)
  :channels
    local(door1, f-46, f-47, image(door1, robota))
    local(door1, f-47, f-42, pos(door1, robota), orient(door1, robota))
    global(door1, robota, f-42, f-31, pos(robota, door1), orient(robota, door1)) )

  46. Executable configuration
(door1
  LET ((CH-1 (make-local-channel 'door1 'f-46 'f-47 '(image door1 robota)))
       (CH-2 (make-local-channel 'door1 'f-47 'f-42 '(pos door1 robota)))
       (CH-3 (make-local-channel 'door1 'f-47 'f-42 '(orient door1 robota)))
       (CH-4 (make-global-channel 'robota 'f-42 'f-31 '(pos robota door1)))
       (CH-5 (make-global-channel 'robota 'f-42 'f-31 '(orient robota door1))))
  (LAMBDA ()
    (PROGN
      (camera (LIST CH-1))
      (measure-robot CH-1 (LIST CH-2) (LIST CH-3))
      (transform-info CH-2 CH-3 (LIST CH-4) (LIST CH-5)))))
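For readers unfamiliar with the Lisp-style notation, the same wiring idea, functionalities communicating only through explicitly created channels, can be sketched in Python. The `Channel` class and the stand-in functionality bodies below are hypothetical, not the AASS code:

```python
class Channel:
    # A minimal data channel: last value written is the value read (sketch).
    def __init__(self, label):
        self.label = label
        self.value = None
    def write(self, v):
        self.value = v
    def read(self):
        return self.value

# Wire camera -> measure-robot, as in the executable configuration above:
ch_image = Channel("image door1 robota")
ch_pos = Channel("pos door1 robota")
ch_orient = Channel("orient door1 robota")

def camera(outs):
    # Stand-in sensing resource: no input, writes an image to its outputs.
    for ch in outs:
        ch.write("image-data")

def measure_robot(inp, pos_outs, orient_outs):
    # Stand-in functionality: reads the image, writes pos and orient.
    img = inp.read()
    for ch in pos_outs:
        ch.write(("pos-from", img))
    for ch in orient_outs:
        ch.write(("orient-from", img))

camera([ch_image])
measure_robot(ch_image, [ch_pos], [ch_orient])
assert ch_pos.read() == ("pos-from", "image-data")
```

Keeping all communication in channel objects is what lets the same functionality code be re-wired across robots when the configuration changes.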

  47. Executable configuration
(robota
  LET ((CH-1 (make-global-channel 'door1 'f-42 'f-31 '(pos robota door1)))
       (CH-2 (make-global-channel 'door1 'f-42 'f-31 '(orient robota door1))))
  (LAMBDA ()
    (PROGN
      (cross-door CH-1 CH-2))))

  48. Contents: Plan-based configuration of robotic teams • What is a functional configuration? • Generating configuration • Experiments

  49. Experimental system • Configurations implemented on two indoor robots • Resources: camera and compass • Configuration generation done by the configuration planner • Configuration switching done manually

  50. A simple experiment [Storyboard with robots Emil and Pippi: Pippi wants to cross the door; Emil gets in position; Emil guides Pippi (looking at each other); Pippi: goal achieved; Emil wants to cross the door; Pippi guides Emil (orient. from compasses); Emil: goal achieved]
