Proposition Bank: a resource of predicate-argument relations
Martha Palmer, University of Pennsylvania
October 9, 2001, Columbia University
Outline
• Overview (ACE consensus: BBN, NYU, MITRE, Penn)
• Motivation
• Approach
  • Guidelines, lexical resources, frames files
  • Tagging process, hand correction of automatic tagging
• Status: accuracy, progress
• Colleagues: Joseph Rosenzweig, Paul Kingsbury, Hoa Dang, Karin Kipper, Scott Cotton, Lauren Delfs, Christiane Fellbaum
Proposition Bank: Generalizing from Sentences to Propositions
Powell met Zhu Rongji -> meet(Somebody1, Somebody2)
The same proposition surfaces in many forms (compare near-neighbors of meet: consult, join, debate, wrestle, battle):
• Powell met with Zhu Rongji
• Powell and Zhu Rongji met
• Powell and Zhu Rongji had a meeting . . .
All map to the proposition: meet(Powell, Zhu Rongji)
When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
-> meet(Powell, Zhu)
-> discuss([Powell, Zhu], return(X, plane))
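A minimal sketch of how such propositions could be represented programmatically; the class and names here are illustrative, not part of PropBank:

```python
from dataclasses import dataclass
from typing import Tuple, Union

# Hypothetical encoding of the slide's propositions; an argument may itself
# be a nested proposition (as in the discuss/return case below).
Term = Union[str, "Proposition", Tuple]

@dataclass(frozen=True)
class Proposition:
    predicate: str
    args: Tuple[Term, ...]

    def __str__(self) -> str:
        return f"{self.predicate}({', '.join(map(str, self.args))})"

# "Powell met Zhu Rongji" -> meet(Somebody1, Somebody2), instantiated:
meet = Proposition("meet", ("Powell", "Zhu"))

# "... they discussed the return of the spy plane."
discuss = Proposition(
    "discuss",
    (("Powell", "Zhu"), Proposition("return", ("X", "plane"))),
)

print(meet)     # meet(Powell, Zhu)
print(discuss)  # discuss(('Powell', 'Zhu'), return(X, plane))
```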
Penn English Treebank
• 1.3 million words
• Wall Street Journal and other sources
• Tagged with part-of-speech
• Syntactically parsed
• Widely used in the NLP community
• Available from the Linguistic Data Consortium
A TreeBanked Sentence
Analysts have been expecting a GM-Jaguar pact that would give the U.S. car maker an eventual 30% stake in the British company.
(S (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    (NP the U.S. car maker)
                                    (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
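Bracketings in this format can be read directly with NLTK's Tree class; a small sketch, assuming the nltk package is available:

```python
from nltk import Tree  # assumes nltk is installed

# The slide's bracketing, loaded as a tree; traces like *T*-1 are plain leaves.
parse = Tree.fromstring("""
(S (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    (NP the U.S. car maker)
                                    (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
""")

print(" ".join(parse.leaves()))  # the token sequence, trace included
print(parse[1].label())          # 'VP' -- the top-level verb phrase
```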
The same sentence, PropBanked
(S Arg0 (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               Arg1 (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S Arg0 (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    Arg2 (NP the U.S. car maker)
                                    Arg1 (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
expect(Analysts, GM-J pact)
give(GM-J pact, US car maker, 30% stake)
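One way to picture the resulting annotation is as per-predicate records; the real annotation is standoff, referencing tree nodes in an external file (as described later), so the record layout below is an assumption for illustration:

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical record in the spirit of PropBank's standoff annotation:
# labels point at Treebank constituents; text is inlined here for readability.
@dataclass
class Instance:
    predicate: str
    args: Dict[str, str]

instances = [
    Instance("expect", {"Arg0": "Analysts",
                        "Arg1": "a GM-Jaguar pact that ... the British company"}),
    Instance("give",   {"Arg0": "*T*-1 (= a GM-Jaguar pact)",
                        "Arg2": "the U.S. car maker",
                        "Arg1": "an eventual 30% stake in the British company"}),
]

for inst in instances:
    print(f"{inst.predicate}({', '.join(inst.args.values())})")
```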
Motivation
• Why do we need accurate predicate-argument relations?
• They have a major impact on Information Processing.
• Ex: Korean/English Machine Translation: ARL/SBIR
  • CoGenTex, Penn, Systran (K/E Bilingual Lexicon, 20K)
  • 4K words (< 500 words from Systran, military messages)
  • Plug-and-play architecture based on DsyntS (rich dependency structure)
  • A converter bug led to random relabeling of predicate arguments
  • Correcting the predicate-argument labels alone tripled the acceptable sentence output
Focusing on parser comparisons
• 200 sentences hand-selected to represent "good" translations given a correct parse.
• Used to compare:
  • Corrected DsyntS output
  • Juntae's parser output (off-the-shelf)
  • Anoop's parser output (Treebank-trained, 95% F)
Evaluating translation quality
• Compare DLI human translation to system output (200)
• Criteria used by human judges (2 or more, not blind):
  • [g] = good, exactly right
  • [f1] = fairly good, but small grammatical mistakes
  • [f2] = needs fixing, but vocabulary basically there
  • [f3] = needs quite a bit of fixing, usually some untranslated vocabulary, but most vocabulary is right
  • [m] = seems grammatical, but semantically wrong, actually misleading
  • [i] = irredeemable, really wrong, major problems
Results Comparison (200 sentences)
[Results chart: judged translation quality per system; figures not recoverable from the slide text]
Plug and play?
• A converter maps parser outputs into the MT DsyntS format
• A bug in the converter affected both systems: predicate-argument structure labels were lost in conversion and relabeled randomly
• The converter was also still tuned to Juntae's parse output and needed to be customized to Anoop's
Anoop's parse -> MTW DsyntS
• 0010 Target: Unit designations are normally transmitted in code.
• 0010 Corrected: Normally unit designations are notified in the code.
• 0010 Anoop: Normally it is notified unit designations in code.
[Dependency tree for "notified", children: unit designations (P = Arg0, C = Arg1), code, normally]
Anoop's parse -> MTW DsyntS
• 0022 Target: Under what circumstances does radio interference occur?
• 0022 Corrected: In what circumstances does the interference happen in the radio?
• 0022 Anoop: Do in what circumstance happen interference in radio?
[Dependency tree for "happen", children: interference, circumstances, radio, what; labels P = Arg0 / ArgM vs. C = Arg0 / Arg1]
New and Old Results Comparison
[Results chart not recoverable from the slide text]
English PropBank
• 1M words of Treebank over 2 years, May '01-'03
• New semantic augmentations
• Predicate-argument relations for verbs
  • label arguments: Arg0, Arg1, Arg2, …
• First subtask: 300K-word financial subcorpus (12K sentences, 35K+ predicates)
• Spin-off: Guidelines (necessary for annotators)
• English lexical resource
  • 6000+ verbs with labeled examples, rich semantics
Task: not just undoing passives
• The earthquake shook the building.
  <arg0> <WN3> <arg1>
• The walls shook; the building rocked.
  <arg1> <WN3>; <arg1> <WN1>
• The guidelines = lexicon with examples: Frames Files
Guidelines: Frames Files
• Created manually by Paul Kingsbury
  • working on semi-automatic expansion
• Refer to VerbNet, WordNet, and FrameNet
• Currently in place for 230 verbs
• Can expand to 2000+ using VerbNet
  • will need hand correction
• Use "semantic role glosses" unique to each verb (mapped to the Arg0, Arg1 labels appropriate to the class)
Frames Example: expect
Roles:
  Arg0: expecter
  Arg1: thing expected
Example: transitive, active:
  Portfolio managers expect further declines in interest rates.
  Arg0: Portfolio managers
  REL: expect
  Arg1: further declines in interest rates
Frames File Example: give
Roles:
  Arg0: giver
  Arg1: thing given
  Arg2: entity given to
Example: double object:
  The executives gave the chefs a standing ovation.
  Arg0: The executives
  REL: gave
  Arg2: the chefs
  Arg1: a standing ovation
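A sketch of how a frames-file entry like this could be consulted programmatically; the dictionary layout is illustrative, not PropBank's actual file format:

```python
# Hypothetical in-memory form of the 'give' frames-file entry above.
FRAMES = {
    "give": {
        "roles": {
            "Arg0": "giver",
            "Arg1": "thing given",
            "Arg2": "entity given to",
        },
        "examples": [
            {
                "name": "double object",
                "text": "The executives gave the chefs a standing ovation.",
                "args": {
                    "Arg0": "The executives",
                    "REL": "gave",
                    "Arg2": "the chefs",
                    "Arg1": "a standing ovation",
                },
            }
        ],
    },
}

def describe(verb: str, label: str) -> str:
    """Map a numbered argument label to its verb-specific role gloss."""
    return FRAMES[verb]["roles"].get(label, label)

print(describe("give", "Arg2"))  # entity given to
```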
The same sentence, PropBanked
(S Arg0 (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               Arg1 (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S Arg0 (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    Arg2 (NP the U.S. car maker)
                                    Arg1 (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
expect(Analysts, GM-J pact)
give(GM-J pact, US car maker, 30% stake)
Complete Sentence
Analysts have been expecting a GM-Jaguar pact that *T*-1 would give the U.S. car maker an eventual 30% stake in the British company and create joint ventures that *T*-2 would produce an executive-model range of cars.
How are arguments numbered?
• Examination of example sentences
• Determination of required / highly preferred elements
• Sequential numbering; Arg0 is the typical first argument, except for ergative/unaccusative verbs (shake example)
• Arguments mapped for "synonymous" verbs
Additional tags (arguments or adjuncts?)
• Variety of ArgMs (Arg# > 4):
  • TMP - when?
  • LOC - where at?
  • DIR - where to?
  • MNR - how?
  • PRP - why?
  • REC - himself, themselves, each other
  • PRD - this argument refers to or modifies another
  • ADV - others
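These functional tags lend themselves to a simple lookup; a small sketch mirroring the list above (the gloss helper itself is hypothetical):

```python
# The slide's functional tags as a lookup table (a convenience sketch,
# not an official PropBank artifact).
ARGM_TAGS = {
    "TMP": "when?",
    "LOC": "where at?",
    "DIR": "where to?",
    "MNR": "how?",
    "PRP": "why?",
    "REC": "himself, themselves, each other",
    "PRD": "refers to or modifies another argument",
    "ADV": "others",
}

def gloss(label: str) -> str:
    """Expand a label like 'ArgM-TMP' into its gloss."""
    prefix, _, func = label.partition("-")
    if prefix == "ArgM" and func in ARGM_TAGS:
        return f"{label}: {ARGM_TAGS[func]}"
    return label

print(gloss("ArgM-TMP"))  # ArgM-TMP: when?
```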
Tense/aspect
• Verbs also marked for tense/aspect:
  • passive
  • perfect
  • progressive
  • infinitival
• Modals and negation marked as ArgMs
Ergative/Unaccusative Verbs: rise
Roles:
  Arg1 = logical subject, patient, thing rising
  Arg2 = EXT, amount risen
  Arg3* = start point
  Arg4 = end point
Sales rose 4% to $3.28 billion from $3.16 billion.
*Note: the preposition must be mentioned explicitly, Arg3-from, Arg4-to; ArgM-Source and ArgM-Goal could have been used instead. The distinction is arbitrary.
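The example sentence labeled with this roleset; spans are shown as plain strings purely for illustration:

```python
# The slide's 'rise' example, labeled with the roleset above
# (an illustrative sketch, not PropBank's storage format).
sentence = "Sales rose 4% to $3.28 billion from $3.16 billion."

rise_instance = {
    "REL": "rose",
    "Arg1": "Sales",              # thing rising
    "Arg2-EXT": "4%",             # amount risen
    "Arg4-to": "$3.28 billion",   # end point (preposition noted explicitly)
    "Arg3-from": "$3.16 billion", # start point
}

for label, span in rise_instance.items():
    print(f"{label:10s} {span}")
```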
Synonymous Verbs: add (in the sense of rise)
Roles:
  Arg1 = logical subject, patient, thing rising/gaining/being added to
  Arg2 = EXT, amount risen
  Arg4 = end point
The Nasdaq composite index added 1.01 to 456.6 on paltry volume.
Phrasal Verbs
• put together
• put in
• put off
• put on
• put out
• put up
• ...
Frames: Multiple Rolesets
• Rolesets are not necessarily consistent between different senses of the same verb
• A verb with multiple senses can have multiple frames, but not necessarily
• Roles and their mappings onto argument labels are consistent between different verbs that share similar argument structures (similar to FrameNet)
  • Levin / VerbNet classes
  • http://www.cis.upenn.edu/~dgildea/VerbNet/
• Of the 179 most frequent verbs:
  • 1 roleset - 92
  • 2 rolesets - 45
  • 3+ rolesets - 42 (includes light verbs)
Annotation procedure
• Extraction of all sentences with a given verb
• First pass: automatic tagging
• Second pass: double-blind hand correction
  • annotators from a variety of backgrounds
  • less syntactic training than for treebanking
  • script to discover discrepancies (see the sketch below)
• Third pass: Solomonization (adjudication)
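The discrepancy-discovery step can be pictured as a comparison of two annotators' label-to-span maps; a minimal sketch (the function name is hypothetical, and the data echoes the Solomonization example later in the talk):

```python
from typing import Dict, Tuple

def discrepancies(a: Dict[str, str], b: Dict[str, str]) -> Dict[str, Tuple]:
    """Return every argument label on which two annotations differ."""
    diffs = {}
    for label in sorted(set(a) | set(b)):
        if a.get(label) != b.get(label):
            diffs[label] = (a.get(label), b.get(label))
    return diffs

kate  = {"Arg0": "Intel", "Arg2": "analysts",
         "Arg1": "the company will resume shipments ..."}
erwin = {"Arg0": "Intel", "Arg2": "analysts",
         "Arg1": "that the company will resume shipments ..."}

# Only Arg1 differs (span with vs. without the complementizer "that").
print(discrepancies(kate, erwin))
```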
Inter-annotator agreement
[Agreement figures chart not recoverable from the slide text]
Annotator Accuracy vs. Gold Standard
• One version of the annotation is chosen (the senior annotator's)
• The adjudicator ("Solomon") modifies it => Gold Standard
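A sketch of how annotator accuracy against the gold standard could be computed, as exact match over (predicate, label, span) triples; the scoring scheme here is an assumption, not necessarily the one behind the reported figures:

```python
def score(annotations, gold):
    """Precision/recall/F1 of an annotator vs. gold; both are sets of triples."""
    ann, ref = set(annotations), set(gold)
    tp = len(ann & ref)
    precision = tp / len(ann) if ann else 0.0
    recall = tp / len(ref) if ref else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if tp else 0.0
    return precision, recall, f1

gold = {("give", "Arg0", "*T*-1"), ("give", "Arg2", "the U.S. car maker"),
        ("give", "Arg1", "an eventual 30% stake in the British company")}
ann  = {("give", "Arg0", "*T*-1"), ("give", "Arg2", "the U.S. car maker"),
        ("give", "Arg1", "an eventual 30% stake")}  # span boundary error

print(score(ann, gold))  # (0.666..., 0.666..., 0.666...)
```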
Status
• 179 verbs framed (+ Senseval-2 verbs)
• 97 verbs first-passed
  • 12,300+ predicates
  • does not include ~3000 predicates tagged for Senseval
• 54 verbs second-passed
  • 6600+ predicates
• 9 verbs solomonized
  • 885 predicates
Throughput
• Framing: approximately 2 verbs per hour
• Annotation: approximately 50 sentences per hour
• Solomonization: approximately 1 hour per verb
Automatic Predicate Argument Tagger
• Assigns predicate-argument labels
• Uses TreeBank "cues"
• Consults a lexical semantic KB:
  • hierarchically organized verb subcategorization frames and alternations associated with tree templates
  • ontology of noun-phrase referents
  • multi-word lexical items
• Matches annotated tree templates against the parse, Tree-Adjoining Grammar style
• Standoff annotation in an external file referencing tree nodes
• Preliminary accuracy rate of 83.7% (800+ predicates)
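A deliberately simplified sketch of the idea behind the first pass: propose labels from Treebank cues around the verb. The actual tagger matches full tree templates and consults the lexical KB; none of that is reproduced here, and the heuristics below are assumptions for illustration only.

```python
from nltk import Tree  # assumes nltk is installed

def propose_args(sent_tree: Tree, verb: str):
    """Propose argument labels from positional Treebank cues (toy heuristic)."""
    proposals = {}
    for vp in sent_tree.subtrees(lambda t: t.label() == "VP"):
        if vp[0] == verb:  # VP headed by the target verb
            nps = [c for c in vp if isinstance(c, Tree) and c.label() == "NP"]
            if len(nps) == 1:
                proposals["Arg1"] = " ".join(nps[0].leaves())
            elif len(nps) >= 2:  # double-object frame: V NP NP
                proposals["Arg2"] = " ".join(nps[0].leaves())
                proposals["Arg1"] = " ".join(nps[1].leaves())
    for np in sent_tree.subtrees(lambda t: t.label() == "NP-SBJ"):
        proposals.setdefault("Arg0", " ".join(np.leaves()))
        break
    return proposals

tree = Tree.fromstring(
    "(S (NP-SBJ The executives) (VP gave (NP the chefs) (NP a standing ovation)))")
print(propose_args(tree, "gave"))
# {'Arg2': 'the chefs', 'Arg1': 'a standing ovation', 'Arg0': 'The executives'}
```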
Summary
• Predicate-argument structure labels are arbitrary to a certain degree, but still consistent, and generic enough to be mappable to particular theoretical frameworks
• Automatic tagging as a first pass makes the task feasible
• Agreement and accuracy figures are reassuring
Solomonization
Source tree: Intel told analysts that the company will resume shipments of the chips within two to three weeks.
*** kate said:
  arg0: Intel
  arg1: the company will resume shipments of the chips within two to three weeks
  arg2: analysts
*** erwin said:
  arg0: Intel
  arg1: that the company will resume shipments of the chips within two to three weeks
  arg2: analysts
Solomonization
Such loans to Argentina also remain classified as non-accruing, *TRACE*-1 costing the bank $ 10 million *TRACE*-*U* of interest income in the third period.
*** kate said:
  argM-TMP: in the third period
  arg3: the bank
  arg2: $ 10 million *TRACE*-*U* of interest income
  arg1: *TRACE*-1
*** erwin said:
  argM-TMP: in the third period
  arg3: the bank
  arg2: $ 10 million *TRACE*-*U* of interest income
  arg1: *TRACE*-1 Such loans to Argentina
Solomonization
Also, substantially lower Dutch corporate tax rates helped the company keep its tax outlay flat relative to earnings growth.
*** kate said:
  argM-MNR: relative to earnings growth
  arg3-PRD: flat
  arg1: its tax outlay
  arg0: the company
*** katherine said:
  argM-ADV: relative to earnings growth
  arg3-PRD: flat
  arg1: its tax outlay
  arg0: the company