Argumentation in Agent Systems, Part 2: Dialogue Henry Prakken EASSS-07 31-08-2007
Why study argumentation in agent technology? • For internal reasoning of single agents • Reasoning about beliefs, goals, intentions etc. is often defeasible • For interaction between multiple agents • Information exchange involves explanation • Collaboration and negotiation involve conflict of opinion and persuasion
Overview • Recent trends in argumentation logics • Argument schemes • Epistemic vs. practical reasoning • Argumentation in dialogue • Dialogue game approach • Types of dialogues • How they involve argumentation • The notion of commitment • Some dialogue systems • Agent behaviour in dialogues • Research issues
Argument schemes: general form • Scheme form: Premise 1, …, Premise n; therefore (presumably), conclusion • The same form as logical inference rules • But also critical questions • Pointers to undercutters
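To make the general form concrete, the sketch below (Python; the class and field names are my own, not from the slides) represents a scheme as a data structure whose critical questions travel with it as pointers to possible undercutters, instantiated with the witness testimony scheme discussed later in this section.

```python
from dataclasses import dataclass, field

@dataclass
class ArgumentScheme:
    """A presumptive inference pattern together with its critical questions."""
    name: str
    premises: list                  # premise patterns
    conclusion: str                 # presumptive conclusion pattern
    critical_questions: list = field(default_factory=list)

    def instantiate(self, **bindings: str) -> dict:
        """Fill the premise/conclusion/question patterns with concrete terms."""
        def fill(s: str) -> str:
            return s.format(**bindings)
        return {
            "premises": [fill(p) for p in self.premises],
            "conclusion": fill(self.conclusion),
            "critical_questions": [fill(q) for q in self.critical_questions],
        }

# The witness testimony scheme; each critical question points to a
# potential undercutter of the presumptive inference.
witness = ArgumentScheme(
    name="Witness testimony",
    premises=["Witness {W} says {P}"],
    conclusion="Presumably, {P}",
    critical_questions=["Is {W} sincere?",
                        "Did {W} really see {P}?",
                        "Did {P} occur?"],
)
print(witness.instantiate(W="Smith", P="the light was red"))
```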
Statistical syllogism • P and if P then usually Q is a reason to believe that Q • Birds usually fly • Critical question: subproperty defeater? • Conflicting generalisation about an exceptional class • Penguins don’t fly
“Normative syllogism” • P and if P then as a rule Q is a reason to accept that Q • Critical question: are there exceptions? • How does a lawyer argue for exceptions to a rule? • Say legislation makes an exception • Say it is motivated by the rule’s purpose • Find an overruling principle • Argue that rule application has bad consequences
Witness testimony • Scheme: Witness W says P; therefore (presumably), P • Critical questions: • Is W sincere? (veracity) • Did W really see P? (objectivity) • Did P occur? (observational sensitivity)
Temporal persistence • Scheme: P is true at T1, and T2 > T1; therefore (presumably), P is still true at T2 • Critical questions: • Was P known to be false between T1 and T2? • Is the gap between T1 and T2 too long?
Arguments from consequences • Scheme: Action A brings about good consequences; therefore (presumably), A should be done • Critical questions: • Does A also have bad consequences? • Are there other ways to bring about the good consequences?
Types of dialogues (Walton & Krabbe)
Dialogue type | Dialogue goal | Initial situation
Persuasion | resolution of conflict | conflict of opinion
Negotiation | making a deal | conflict of interest
Deliberation | reaching a decision | need for action
Information seeking | exchange of information | personal ignorance
Inquiry | growth of knowledge | general ignorance
Example
P: I offer you this Peugeot for $10,000.
O: I reject your offer.
P: Why do you reject my offer?
O: Since French cars are no good.
P: Why are French cars no good?
O: Since French cars are unsafe.
P: Why are French cars unsafe?
O: Since magazine Meinwagen says so.
P: Meinwagen is biased, since German car magazines usually are biased against French cars.
O: I concede that German car magazines usually are biased against French cars, but Meinwagen is not, since it has a very high reputation.
P: Why does Meinwagen have a very high reputation?
O: OK, I retract that French cars are no good. Still I cannot pay $10,000; I offer $8,000.
P: OK, I accept your offer.
Dialogue systems (according to Carlson 1983) • Dialogue systems define the conditions under which an utterance is appropriate • An utterance is appropriate if it furthers the goal of the dialogue in which it is made • Appropriateness defined not at speech act level but at dialogue level • Dialogue game approach
Dialogue game systems • A dialogue purpose • Participants (with roles) • A communication language Lc • With embedded topic language Lt and a logic for Lt • A protocol for Lc • Effect rules for Lc (“commitment rules”) • Termination and outcome rules
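Read as an interface, these components suggest a simple engine skeleton. The following is a minimal sketch under that reading (Python, with names of my own choosing); real systems differ in how the protocol and commitment rules are specified.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Move:
    speaker: str
    act: str        # element of Lc, e.g. "claim", "why", "concede"
    content: str    # element of the topic language Lt

@dataclass
class DialogueGame:
    participants: list              # players (with roles)
    legal: Callable                 # protocol for Lc: (history, move) -> bool
    effects: Callable               # commitment ("effect") rules: (stores, move) -> stores
    terminated: Callable            # termination rule: history -> bool
    outcome: Callable               # outcome rule: (history, stores) -> result

    def run(self, moves: list) -> object:
        history = []
        stores = {p: set() for p in self.participants}   # commitment stores
        for m in moves:
            if self.terminated(history):
                break
            if not self.legal(history, m):
                raise ValueError(f"illegal move: {m}")
            history.append(m)
            stores = self.effects(stores, m)
        return self.outcome(history, stores)
```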
Some history • In philosophy: formal dialectics • (Hamblin 1970, MacKenzie 1979, Walton & Krabbe 1995, …) • Deductive setting • In AI: procedural defeasibility • Loui (1998, first version 1992), Brewka (1994, 2001) • Adding counterarguments • In AI & Law: dispute resolution • (Gordon 1993, Bench-Capon 1998, Lodder 1999, Prakken 2000-2006, …) • Adding counterarguments and third parties • In MAS: agent interaction • Parsons, Sierra & Jennings 1998, Amgoud, Maudet & Parsons 2000, McBurney & Parsons 2002, … • Adding agents
Persuasion • Participants: proponent (P) and opponent (O) of a dialogue topic t • Dialogue goal: resolve the conflict of opinion on t • Participants’ goals: • P wants O to accept t • O wants P to give up t • Typical speech acts: • Claim p, Why p, Concede p, Retract p, p since S, …
Information seeking • Dialogue goal: information exchange • Agent’s goals: learning(?) • Typical speech acts: • Ask p, Tell p, Notell p, …
Negotiation • Dialogue goal: agreement on reallocation of scarce resources • Participants’ goals: maximise individual gain • Typical communication language: • Request p, Offer p, Accept p, Reject p, …
Deliberation • Participants: any • Dialogue goal: resolve need for action • Participants’ goals: • None initially • Possible set of speech acts: • Propose, ask-justify, prefer, accept, reject, …
Dialectical shifts to persuasion • Information exchange: explaining why something is the case or how I know it • Persuasion over fact • Negotiation: explaining why offer is good for you or bad for me • Persuasion over fact or action • Deliberation: explaining why proposal is good or bad for us • Persuasion over fact or action
Commitment in dialogue • Walton & Krabbe (1995): • General case: commitment to action • Special cases: • Commitment to action in dialogue (dialogical or propositional commitment) • Commitment to action outside dialogue (social commitment) • Negotiation and deliberation lead to social commitments • Persuasion leads to dialogical commitments
Quality aspects of dialogue protocols • Effectiveness: does the protocol further the dialogue goal? • Commitments • Agents’ logical and dialogical consistency • Efficiency (relevance, termination, ...) • Fairness: does the protocol respect the participants’ goals? • Flexibility, opportunity, … • Public semantics: can protocol compliance be externally observed?
Effectiveness vs fairness • Relevance and efficiency: moves should be related to the dialogue topic • Relevance is often enforced by rigid, and therefore efficient, “unique-move, immediate-response” protocols • But sometimes participants must have freedom to backtrack, to explore alternatives, to postpone responses, …
Public semantics: commitments in persuasion • A participant’s commitments are its publicly declared standpoints, so not the same as its beliefs! • Only commitments and dialogical behaviour should count for move legality: • “Claim p is allowed only if you believe p” vs. • “Claim p is allowed only if you are not committed to p and have not challenged p”
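As a sketch of why only the second rule has a public semantics: it can be checked by any observer from the dialogue record alone. A minimal illustration (the data layout is my own):

```python
# Legality of "claim p" checked purely against public dialogue state:
# the speaker's commitment store and the claims it has challenged.
def claim_allowed(speaker: str, p: str,
                  commitments: dict, challenged: dict) -> bool:
    """Claim p is allowed only if the speaker is not committed to p
    and has not challenged p earlier in the dialogue."""
    return p not in commitments[speaker] and p not in challenged[speaker]

commitments = {"P": set(), "O": set()}
challenged = {"P": set(), "O": {"safe"}}
print(claim_allowed("O", "safe", commitments, challenged))  # False
print(claim_allowed("P", "safe", commitments, challenged))  # True
```

The first rule ("only if you believe p") cannot be checked this way, since beliefs are private.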
Assertion/acceptance attitudes • Relative to the speaker’s own knowledge! • Confident/credulous agent: can assert/accept P iff he can construct an argument for P • Careful/cautious agent: can assert/accept P iff he can construct an argument for P and no stronger counterargument against it • Thoughtful/skeptical agent: can assert/accept P iff he can construct a justified argument for P • If part of the protocol, then the protocol has no public semantics!
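A minimal sketch of the three assertion attitudes, assuming a hypothetical argumentation engine with queries arguments_for(p), stronger_counterargument(p) and justified(p) (is there a justified argument for p, e.g. in the grounded extension); these method names are mine, not PWA's.

```python
class Agent:
    """Wraps an argumentation engine and an assertion attitude."""
    def __init__(self, engine, attitude: str):
        self.engine = engine
        self.attitude = attitude        # "confident", "careful" or "thoughtful"

    def may_assert(self, p: str) -> bool:
        if self.attitude == "confident":   # some argument for p
            return bool(self.engine.arguments_for(p))
        if self.attitude == "careful":     # ... and no stronger counterargument
            return (bool(self.engine.arguments_for(p))
                    and not self.engine.stronger_counterargument(p))
        if self.attitude == "thoughtful":  # a justified argument for p
            return self.engine.justified(p)
        raise ValueError(f"unknown attitude: {self.attitude}")
```

The acceptance attitudes (credulous, cautious, skeptical) would be defined analogously for deciding whether to concede the other party's claims.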
Two systems for persuasion dialogue • Parsons, Wooldridge & Amgoud • Journal of Logic and Computation 13(2003) • Prakken • Journal of Logic and Computation 15(2005)
PWA: languages, logic, agents • Lc: Claim p, Why p, Concede p, Claim S, Question p • with p ∈ Lt and S ⊆ Lt • Lt: propositional • Logic: argumentation logic • Arguments: (S, p) such that • S ⊆ Lt, S consistent • S propositionally implies p • Attack: (S, p) attacks (S’, p’) iff • ¬p ∈ S’ and • level(S) ≤ level(S’) • Semantics: grounded • Assumptions on agents: • Have a knowledge base KB ⊆ Lt • Have an assertion and an acceptance attitude
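A minimal sketch of PWA-style arguments and their attack relation, assuming literals are encoded as strings with "~" for negation and leaving the preference levels abstract; building the arguments themselves (finding consistent, entailing subsets of the knowledge base) is omitted.

```python
from dataclasses import dataclass

def neg(p: str) -> str:
    """Syntactic negation over string literals ("~p" negates "p")."""
    return p[1:] if p.startswith("~") else "~" + p

@dataclass(frozen=True)
class Argument:
    support: frozenset      # S: a consistent subset of Lt entailing the conclusion
    conclusion: str         # p

def attacks(a: Argument, b: Argument, level) -> bool:
    """(S, p) attacks (S', p') iff ~p is in S' and level(S) <= level(S')."""
    return neg(a.conclusion) in b.support and level(a.support) <= level(b.support)

# Toy example with a flat preference order (all arguments at level 0):
# the argument for "safe" attacks an argument built on the premise "~safe".
a1 = Argument(frozenset({"airbag", "airbag -> safe"}), "safe")
a2 = Argument(frozenset({"~safe", "~safe -> ~sell"}), "~sell")
print(attacks(a1, a2, level=lambda s: 0))  # True
print(attacks(a2, a1, level=lambda s: 0))  # False
```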
PWA: protocol • 1. W claims p; • 2. B concedes if allowed; if not, claims ¬p if allowed, or else challenges p; • 3. If B claims ¬p, then go to 2 with the players’ roles reversed and ¬p in place of p; • 4. If B has challenged, then: • a. W claims S, an argument for p; • b. go to 2 for each s ∈ S in turn; • 5. B concedes if allowed, or the dialogue terminates. • Outcome: do the players agree at termination?
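The numbered rules translate almost directly into a recursive control structure. Below is a sketch under simplifying assumptions of my own: agents expose can_concede, can_claim and argument_for according to their attitudes, the outcome is compressed into a Boolean, and termination safeguards against repetition are omitted.

```python
def neg(p: str) -> str:
    """Syntactic negation over string literals ("~p" negates "p")."""
    return p[1:] if p.startswith("~") else "~" + p

def pwa(W, B, p) -> bool:
    """W claims p; returns True iff the players end up agreeing on p."""
    if B.can_concede(p):                  # step 2: concede if allowed ...
        return True
    if B.can_claim(neg(p)):               # ... else claim ~p if allowed ...
        return not pwa(B, W, neg(p))      # step 3: roles reversed, topic ~p
    S = W.argument_for(p)                 # step 4a: W states an argument for p
    if S is None:                         # W cannot defend p: dialogue ends
        return False
    return all(pwa(W, B, s) for s in S)   # step 4b: go to 2 for each premise s
```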
Example persuasion dialogue
P1: My car is safe. (claim)
O1: Why is your car safe? (challenge)
P2: Since it has an airbag. (argument)
O2: That is true, (concession) but your car is still not safe. (counterclaim)
P3: Why does that not make my car safe? (challenge)
O3: Since the newspapers recently reported on airbags exploding without cause. (rebuttal)
P4: Yes, that is what the newspapers say, (concession) but that does not prove anything, since newspapers are unreliable sources of technological information. (undercutter)
O4: Still your car is not safe, since its maximum speed is very high. (alternative rebuttal)
P5: OK, I was wrong that my car is safe. (retraction)
PWA: example dialogues
Dialogue 1 (P: careful/cautious; O: thoughtful/cautious)
P1: claim safe
O1: why safe
P2: claim {airbag, airbag → safe}
O2a: concede airbag
O2b: why airbag → safe
P3: claim {airbag → safe}

Dialogue 2 (P: careful/cautious; O: confident/cautious)
P1: claim safe
O1: claim ¬safe
P2: why ¬safe
O2: claim {newspaper, newspaper → ¬safe}
P3a: concede newspaper
P3b: why newspaper → ¬safe
O3: claim {newspaper → ¬safe}
PWA: characteristics • Protocol • multi-move • (almost) unique-reply • Deterministic in Lc • Dialogues • Short (no stepwise construction of arguments, no alternative replies) • Only one side develops arguments • Logic • used for single agent: check attitudes and construct argument
Prakken: languages, logic, agents • Lc: Any, provided it has a reply structure (attacks + surrenders) • Lt: any • Logic: argumentation logic • Arguments: trees of conclusive and/or defeasible inferences • Attack: depends on chosen logic • Semantics: grounded • Assumptions on agents: none.
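A minimal sketch of what "arguments as trees of inferences" can look like as a data structure; the node layout and field names are my own, not fixed by the framework.

```python
from dataclasses import dataclass, field

@dataclass
class Inference:
    """A node: a conclusion drawn from zero or more sub-inferences."""
    conclusion: str
    premises: list = field(default_factory=list)   # sub-inferences
    defeasible: bool = True                         # defeasible vs. conclusive step

    def leaves(self) -> list:
        """The argument's ultimate premises."""
        if not self.premises:
            return [self.conclusion]
        return [leaf for p in self.premises for leaf in p.leaves()]

# "safe" inferred defeasibly from the premises "airbag" and "airbag -> safe".
arg = Inference("safe", [Inference("airbag"), Inference("airbag -> safe")])
print(arg.leaves())   # ['airbag', 'airbag -> safe']
```

Which arguments attack which then depends on the logic chosen for Lt.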
Prakken: example Lc (with reply structure)
Act | Attacked by | Surrendered by
claim p | why p | concede p
why p | p since S | retract p
p since S | p’ since S’, why s (s ∈ S) | concede (p since S), concede s (s ∈ S)
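This reply structure can be encoded directly as a lookup table, as in the sketch below (the string notation, e.g. "argue S=>p" for "p since S", is mine):

```python
# For each act of Lc: which replies attack it and which surrender to it.
REPLIES = {
    "claim p":    {"attackers":  ["why p"],
                   "surrenders": ["concede p"]},
    "why p":      {"attackers":  ["argue S=>p"],
                   "surrenders": ["retract p"]},
    "argue S=>p": {"attackers":  ["argue S'=>p' (counterargument)",
                                  "why s (for some premise s in S)"],
                   "surrenders": ["concede (p since S)",
                                  "concede s (for some premise s in S)"]},
}

def is_attacking_reply(reply: str, target: str) -> bool:
    return reply in REPLIES.get(target, {}).get("attackers", [])

print(is_attacking_reply("why p", "claim p"))     # True
print(is_attacking_reply("retract p", "why p"))   # False (it surrenders)
```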
Protocol variations • Unique vs. multiple moves per turn • Unique vs. multiple replies • Immediate response or not • …
Prakken: protocols (basic rules) • Each noninitial move replies to some previous move of hearer • Replying moves must be defined in Lc as a reply to their target • Argue moves must respect underlying argumentation logic • Termination: if player to move has no legal moves • Outcome: what is dialogical status of initial move at termination?
Dialogical status of moves • Each move in a dialogue is in or out: • A surrendering move is out • An attacking move is: • in if it has been surrendered to; otherwise • in iff all its attacking replies (children) are out
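The labelling can be computed by a simple recursion over the dialogue's reply tree. A sketch under my own data layout, in which each move records whether it is an attacking reply and whether the other party has surrendered to it:

```python
from dataclasses import dataclass, field

@dataclass
class MoveNode:
    name: str
    attacking: bool = True                  # attacking reply vs. surrender
    surrendered: bool = False               # has the other party surrendered to it?
    replies: list = field(default_factory=list)

def status(m: MoveNode) -> str:
    """Return "in" or "out" for a move, per the rules above."""
    if not m.attacking:
        return "out"                        # surrenders are always out
    if m.surrendered:
        return "in"                         # a surrendered-to attacker is in
    attackers = [r for r in m.replies if r.attacking]
    return "in" if all(status(r) == "out" for r in attackers) else "out"

# P1 is attacked by O1, which is attacked by P2: P1 is in.
p2 = MoveNode("P2")
o1 = MoveNode("O1", replies=[p2])
p1 = MoveNode("P1", replies=[o1])
print(status(p1), status(o1), status(p2))   # in out in
```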
[Dialogue tree diagram with move statuses: P1+, O1-, P2-, P4+, O2-, O3+, P3+]
Functions of dialogical status • Can determine winning • Plaintiff wins iff at termination the initial claim is in; defendant wins otherwise • Can determine turntaking • Turn shifts if dialogical status of initial move has changed • Immediate response protocols (Loui 1998) • Can be used in defining relevance
Relevant protocols • A move must reply to a relevant target • A target is relevant if changing its status changes the status of the initial claim • Turn shifts if dialogical status of initial move has changed • Immediate response protocols
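Relevance can then be tested by hypothetically flipping a target's status and checking whether the initial move's status changes. The sketch below reuses the MoveNode/status sketch from the dialogical-status slide above; it illustrates the definition and is not the original formalisation.

```python
def status_override(m, target, forced: str) -> str:
    """Like status(), but pretend that move `target` has status `forced`."""
    if m is target:
        return forced
    if not m.attacking:
        return "out"
    if m.surrendered:
        return "in"
    attackers = [r for r in m.replies if r.attacking]
    return "in" if all(status_override(r, target, forced) == "out"
                       for r in attackers) else "out"

def relevant(root, target) -> bool:
    """A target is relevant iff flipping its status flips the initial move's."""
    flipped = "out" if status(target) == "in" else "in"
    return status_override(root, target, flipped) != status(root)

# In the P1 <- O1 <- P2 tree above, O1 is relevant: making it in makes P1 out.
print(relevant(p1, o1))   # True
```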
[Dialogue tree diagrams illustrating relevance; successive states of the same dialogue tree as moves are added: P1+, O1-, P2-, P4+, O2-, O3+, P3+; then P1+, O1-, P2-, P4+, O2+, O3+, P3-, O4+; then P1-, O1+, P2-, P4-, O2-, O3+, O4+, P3+]
Prakken: example dialogue
P1: claim safe
O1: why safe
P2: safe since airbag, airbag → safe
O2a: concede airbag
O2b: ¬safe since newspaper, newspaper → ¬safe
P3a: concede newspaper