This talk explores the use of the ICARUS cognitive architecture to represent social cognition in human-like intelligent systems: how ICARUS is structured, and how it handles social scenarios and decision-making.
Modeling Social Cognition in a Unified Cognitive Architecture

Pat Langley
School of Computing and Informatics
Arizona State University
Tempe, Arizona USA

Thanks to D. Choi, T. Konik, N. Li, D. Shapiro, and D. Stracuzzi for their contributions. This talk reports research partly funded by grants from DARPA IPTO, which is not responsible for its contents.
Cognitive Architectures

A cognitive architecture (Newell, 1990) is the infrastructure for an intelligent system that is constant across domains:
• the memories that store domain-specific content
• the system's representation and organization of knowledge
• the mechanisms that use this knowledge in performance
• the processes that learn this knowledge from experience

An architecture typically comes with a programming language that eases construction of knowledge-based systems. Research in this area incorporates many ideas from psychology about the nature of human thinking.
The ICARUS Architecture

ICARUS (Langley, 2006) is a computational theory of the human cognitive architecture that posits:
• Short-term memories are distinct from long-term stores
• Memories contain modular elements cast as symbolic structures
• Long-term structures are accessed through pattern matching
• Cognition occurs in retrieval/selection/action cycles
• Learning involves monotonic addition of elements to memory
• Learning is incremental and interleaved with performance

It shares these assumptions with other cognitive architectures like Soar (Laird et al., 1987) and ACT-R (Anderson, 1993).
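The retrieval/selection/action cycle can be sketched in a few lines. Everything below — the Rule class, belief sets as Python sets, the first-match selection policy — is an illustrative toy, not ICARUS' actual representation:

```python
# A minimal sketch of the retrieval/selection/action cycle shared by
# architectures like ICARUS, Soar, and ACT-R.  All names here are
# invented for illustration.

class Rule:
    """A modular long-term element cast as a symbolic structure."""
    def __init__(self, name, condition, effect):
        self.name = name
        self.condition = condition   # beliefs required for a match
        self.effect = effect         # belief added when the rule acts

    def matches(self, beliefs):
        return self.condition <= beliefs

def cognitive_cycle(long_term, beliefs):
    """One retrieval/selection/action cycle over symbolic structures."""
    # Retrieval: pattern-match long-term elements against short-term beliefs.
    candidates = [r for r in long_term if r.matches(beliefs)]
    if not candidates:
        return beliefs
    # Selection: pick one candidate (here, simply the first match).
    chosen = candidates[0]
    # Action: apply it; learning would add new elements monotonically.
    return beliefs | {chosen.effect}

rules = [Rule("stop", {"light-red"}, "braking")]
state = cognitive_cycle(rules, {"light-red", "in-lane"})
```

Running the cycle once with a red light in view adds the belief "braking" while leaving the existing beliefs untouched, reflecting the monotonic-addition assumption.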
Cascaded Integration in ICARUS

Like other unified cognitive architectures, ICARUS incorporates a number of distinct modules: conceptual inference, skill execution, problem solving, and learning. ICARUS adopts a cascaded approach to integration in which lower-level modules produce results for higher-level ones.
An ICARUS Agent for Urban Driving

Consider driving a vehicle in a city, which requires:
• selecting routes
• obeying traffic lights
• avoiding collisions
• being polite to others
• finding addresses
• staying in the lane
• parking safely
• stopping for pedestrians
• following other vehicles
• delivering packages

These tasks range from low-level execution to high-level reasoning.
Structure and Use of Conceptual Memory

ICARUS organizes conceptual memory in a hierarchical manner. Conceptual inference occurs from the bottom up, starting from percepts to produce high-level beliefs about the current state.
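Bottom-up inference over such a hierarchy can be sketched as repeated matching of concept definitions against the current belief set. The concept names and hierarchy below are invented for a driving-flavored example:

```python
# Sketch of bottom-up conceptual inference: percepts match low-level
# concepts, whose instances in turn match higher-level concept
# definitions, yielding beliefs about the current state.

CONCEPTS = {
    # concept -> the lower-level concepts/percepts it is defined from
    "close-ahead": {"ahead", "near"},
    "same-lane":   {"ahead", "aligned"},
    "tailgating":  {"close-ahead", "same-lane"},
}

def infer_beliefs(percepts):
    """Repeatedly match concept definitions until no new beliefs appear."""
    beliefs = set(percepts)
    changed = True
    while changed:
        changed = False
        for concept, parts in CONCEPTS.items():
            if concept not in beliefs and parts <= beliefs:
                beliefs.add(concept)   # a definition matched: add belief
                changed = True
    return beliefs

beliefs = infer_beliefs({"ahead", "near", "aligned"})
```

From three percepts the loop first derives the intermediate beliefs and then the high-level belief "tailgating", mirroring the percepts-to-beliefs direction of inference described above.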
ICARUS Skills Build on Concepts

ICARUS stores skills in a hierarchical manner that links to concepts. Each concept is defined in terms of other concepts and/or percepts. Each skill is defined in terms of other skills, concepts, and percepts.
Skill Execution in ICARUS

Skill execution occurs from the top down, starting from goals to find applicable paths through the skill hierarchy. This process repeats on each cycle to produce goal-directed but reactive behavior, biased toward continuing initiated skills.
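Finding an applicable path through the skill hierarchy can be sketched as a recursive descent from a goal to a primitive skill. The skill names, conditions, and depth-first selection rule here are all invented:

```python
# Sketch of top-down skill execution: starting from a goal, descend
# through the skill hierarchy to find an applicable path ending in a
# primitive skill.

SKILLS = {
    # skill -> (required beliefs, ordered subskills; [] means primitive)
    "deliver-package":  (set(),         ["drive-to-address"]),
    "drive-to-address": ({"on-route"},  ["stay-in-lane"]),
    "stay-in-lane":     ({"in-lane"},   []),
}

def find_path(goal, beliefs, path=()):
    """Return a goal-to-primitive path whose conditions all hold."""
    required, subskills = SKILLS[goal]
    if not required <= beliefs:
        return None                    # skill not applicable here
    if not subskills:
        return path + (goal,)          # reached a primitive skill
    for sub in subskills:
        found = find_path(sub, beliefs, path + (goal,))
        if found:
            return found
    return None

path = find_path("deliver-package", {"on-route", "in-lane"})
```

On each cycle a search like this would be repeated against the fresh belief state, which is what makes the behavior goal-directed yet reactive.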
Execution and Problem Solving in ICARUS

[Diagram: a problem is first handled by reactive execution over the skill hierarchy; when an impasse arises, control passes to problem solving, which yields an executed plan in terms of primitive skills.]

Problem solving involves means-ends analysis that chains backward over skills and concept definitions, executing skills whenever they become applicable.
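Means-ends analysis in this sense can be sketched as backward chaining from a goal over skill definitions, with each skill executed as soon as its preconditions hold. The skills and state encoding below are a minimal invented example, not ICARUS' planner:

```python
# Sketch of means-ends analysis: chain backward from a goal to a skill
# that achieves it, recurse on unmet preconditions (impasses), and
# execute each skill whenever it becomes applicable.

SKILLS = {
    # skill -> (preconditions, effect it achieves)
    "park":      ({"at-curb"}, "parked"),
    "pull-over": ({"in-lane"}, "at-curb"),
}

def achieve(goal, state, trace=None):
    """Backward-chain on the goal, returning the final state and the
    order in which skills were executed."""
    trace = [] if trace is None else trace
    if goal in state:
        return state, trace
    # Find a skill whose effect matches the goal.
    skill = next(s for s, (_, eff) in SKILLS.items() if eff == goal)
    pre, effect = SKILLS[skill]
    for p in pre - state:              # impasse: a precondition is unmet
        state, trace = achieve(p, state, trace)
    trace.append(skill)                # skill now applicable: execute it
    return state | {effect}, trace

state, plan = achieve("parked", {"in-lane"})
```

Chaining backward from "parked" forces "pull-over" to execute first, so the executed plan comes out in the forward order even though the reasoning ran in reverse.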
ICARUS Learns Skills from Problem Solving

[Diagram: the same execution/problem-solving loop, extended with a skill learning module that stores the results of successful problem solving back into the skill hierarchy.]
Challenge: Thinking about Others

We designed ICARUS to model intelligent behavior in embodied agents, but our work to date has treated them in isolation. The framework can deal with other independent agents, but only by viewing them as other objects in the environment. People, however, can reason more deeply about the goals and actions of others, then use their inferences to make decisions. Adding this ability to ICARUS will require knowledge, but it may also demand extensions to the architecture.
An Urban Driving Example

You are driving in a city behind another vehicle when a dog suddenly runs across the road ahead of it. You do not want to hit the dog, and you are in no danger of doing so, but you guess that the other driver shares this goal. You reason that, if you were in his situation, you would swerve or step on the brakes to avoid hitting the dog. This leads you to predict that the other car may soon slow down very rapidly. Since you have another goal – to avoid collisions – you slow down in case that event happens.
Social Cognition in ICARUS

For ICARUS to handle social cognition of this sort, it should:
• Imagine itself in another agent's physical/social situation;
• Infer the other agent's goals either by default reasoning or based on its behavior;
• Carry out mental simulation of the other agent's plausible actions and their effects on the world;
• Take high-probability trajectories into account in selecting which actions to execute itself.

Each of these abilities requires changes to the architecture of ICARUS, not just its knowledge base.
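The four steps above can be sketched for the driving example. The rules, probabilities, and action names below are entirely invented; a real ICARUS agent would derive them from its own skills applied in the other driver's imagined situation:

```python
# Sketch of the four social-cognition steps: adopt the other agent's
# situation, assume an inferred goal, mentally simulate its plausible
# actions, and pick a response that covers the likely trajectory.

def others_actions(situation, goal):
    """Mentally simulate the other agent's plausible actions, with
    hand-invented probabilities standing in for learned estimates."""
    if goal == "avoid-dog" and "dog-ahead" in situation:
        return [("brake-hard", 0.7), ("swerve", 0.3)]
    return [("continue", 1.0)]

def choose_response(own_goals, predicted):
    """Select our own action given the high-probability trajectory."""
    likely = max(predicted, key=lambda ap: ap[1])[0]
    if likely == "brake-hard" and "avoid-collision" in own_goals:
        return "slow-down"
    return "continue"

# Step 1-2: imagine the other driver's situation and infer a shared goal.
predicted = others_actions({"dog-ahead", "car-ahead"}, "avoid-dog")
# Step 3-4: weigh the simulated trajectory against our own goals.
response = choose_response({"avoid-collision"}, predicted)
```

The predicted hard braking, combined with our own collision-avoidance goal, yields the same "slow down just in case" decision as the narrative example.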
Architectural Extensions

In response, we are planning a number of changes to ICARUS:
• Replace deductive inference with generation of plausible beliefs (e.g., others' goals) – via abductive inference;
• Support use of the agent's own concepts/skills to reason about others – by encoding alternate worlds with inheritance;
• Extend the problem solver to support forward-chaining search – via mental simulation using repeated lookahead;
• Revise skill execution to consider the probability of future events – using desirability of likely trajectories.

These extensions will let ICARUS exhibit social cognition, but they should also support many other abilities.
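The "alternate worlds with inheritance" idea can be illustrated with a layered belief store: the imagined world overrides only what differs for the other agent and inherits everything else, so the agent's own concepts and skills apply unchanged. The belief contents are invented; `ChainMap` is just a convenient stand-in for the inheritance mechanism:

```python
# Sketch of an alternate world that inherits from the agent's own
# beliefs, letting its own concepts/skills be reused to reason about
# another agent.  collections.ChainMap supplies the lookup-with-
# fallback behavior.

from collections import ChainMap

own_world = {"dog-ahead": True, "my-speed": 30, "my-lane": "right"}

# The imagined world overrides only what differs for the other driver;
# everything else is inherited from the agent's own world model.
other_view = ChainMap({"my-speed": 25, "my-lane": "left"}, own_world)
```

Lookups in `other_view` fall through to `own_world` for unchanged facts, and writes to the overlay never disturb the agent's real beliefs, which is the property the inheritance encoding needs.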
Automating Social Cognition

But most social cognition seems more routine, in that it does not require complex reasoning or mental simulation. In response, we are extending ICARUS' learning mechanisms to:
• Detect when the agent's interactions with others lead to an undesirable situation (e.g., a collision);
• Analyze how this occurred and use counterfactual reasoning about how it might have been avoided (e.g., slowing down);
• Store new skills that avoid the undesired situation in an automated, reactive manner.

Over time, the agent will come to behave in socially relevant ways without explicit reasoning or simulation.
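The detect/analyze/store loop can be sketched with a toy forward model. The simulator, action names, and state encoding below are invented for illustration; the point is the counterfactual replay that turns one bad episode into a cached reactive skill:

```python
# Sketch of learning a reactive skill from an undesirable episode:
# detect the bad outcome, replay the episode counterfactually with
# alternative actions, and store the first action that avoids it.

def simulate(state, action):
    """Toy forward model of a following-too-closely episode."""
    if "lead-car-brakes" in state and action != "slow-down":
        return state | {"collision"}
    return state

def learn_avoidance(state, taken, alternatives, learned_skills):
    """If the action taken led to an undesirable situation, find an
    alternative that would have avoided it and cache it as a skill."""
    if "collision" not in simulate(state, taken):
        return                        # nothing undesirable to learn from
    for alt in alternatives:          # counterfactual: replay with alt
        if "collision" not in simulate(state, alt):
            learned_skills[frozenset(state)] = alt
            return

skills = {}
episode = {"lead-car-brakes", "close-behind"}
learn_avoidance(episode, "continue", ["slow-down", "honk"], skills)
```

After one analyzed episode, the stored mapping lets the agent react to the same situation directly, without re-running the counterfactual reasoning — the "automated, reactive" behavior the slide describes.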
Concluding Remarks

ICARUS is a unified theory of the cognitive architecture that:
• includes hierarchical memories for concepts and skills;
• interleaves conceptual inference with reactive execution;
• resorts to problem solving when it lacks routine skills;
• learns such skills from successful resolution of impasses.

We have developed agents for a variety of simulated physical environments, including urban driving. We are extending ICARUS to reason about others' situations/goals, predict their behavior, and select appropriate responses.