
Determining the Hierarchical Structure of Perspective and Speech Expressions


Presentation Transcript


  1. Determining the Hierarchical Structure of Perspective and Speech Expressions Eric Breck and Claire Cardie Cornell University Department of Computer Science

  2. Events in the News

  3. Reporting events

  4. Reporting in text • Clapp sums up the environmental movement’s reaction: “The polluters are unreasonable” • Charlie was angry at Alice’s claim that Bob was unhappy

  5. Perspective and Speech Expressions (pse’s) • A perspective expression is text denoting an explicit opinion, belief, sentiment, etc. • The actor was elated that … • John’s firm belief in … • A speech expression is text denoting spoken or written communication • … argued the attorney … • … the 9/11 Commission’s final report …

  6. Grand Vision • Charlie was angry at Alice’s claim that Bob was unhappy • (figure: nested hierarchy writer (implicit) → Charlie: angry → Alice: claim → Bob: unhappy)

  7. This Work • (figure: hierarchy over pse’s only: (implicit) → angry → claim → unhappy)

  8. System Output: Pse Hierarchy • 78% accurate! • Charlie was angry at Alice’s claim that Bob was unhappy • (figure: predicted hierarchy (implicit) → angry → claim → unhappy)

  9. Related Work: Abstract • Bergler, 1993 • Lexical semantics of reporting verbs • Gerard, 2000 • Abstract model of news reader

  10. Related Work: Concrete • Bethard et al., 2004 • Extract propositional opinions & holders • Wiebe, 1994 • Tracks “point of view” in narrative text • Wiebe et al., 2003 • Preliminary results on pse identification • Gildea and Jurafsky, 2002 • Semantic Role ID - use for finding sources?

  11. Baseline 1: Only filter through writer • Only 66% correct • (figure: angry, claim and unhappy all attached directly to the writer’s (implicit) pse)

  12. Baseline 2: Dependency Tree • 72% correct • (figure: attachments taken from the dependency parse)
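
For concreteness, here is a minimal Python sketch of the two baselines as I read slides 11-12; the `Pse` handling, variable names, and especially the walk-up-the-dependency-parse logic of baseline 2 are my assumptions, not code from the paper.

```python
# Illustrative sketch only; the paper's actual baselines may differ in detail.
WRITER = "(implicit)"  # the writer's implicit pse, used as the root

def baseline1(pses):
    """Baseline 1: attach every pse directly to the writer's pse (66% correct)."""
    return {pse: WRITER for pse in pses}

def baseline2(pses, dep_parent):
    """Baseline 2 (assumed reading): walk up the dependency parse from each pse
    and attach it to the first ancestor that is itself a pse; otherwise attach
    it to the writer's pse (72% correct)."""
    pse_set = set(pses)
    parents = {}
    for pse in pses:
        node = dep_parent.get(pse)
        while node is not None and node not in pse_set:
            node = dep_parent.get(node)
        parents[pse] = node if node is not None else WRITER
    return parents

# Toy run on "Charlie was angry at Alice's claim that Bob was unhappy"
pses = ["angry", "claim", "unhappy"]
dep_parent = {"angry": None, "claim": "angry", "unhappy": "claim"}  # toy parse fragment
print(baseline1(pses))
print(baseline2(pses, dep_parent))
```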

  13. A Learning Approach • How do we cast the recovery of hierarchical structure as a learning problem? • Simplest solution • Learn pairwise attachment decisions • Is pse_parent the parent of pse_target? • Combine decisions to form tree • Other solutions are possible (n-ary decisions, tree-modeling, etc.)

  14. Training instances • (figure: the example pse’s (implicit), angry, claim, unhappy)

  15. Training instances • <unhappy, (implicit)>

  16. Training instances • <unhappy, (implicit)> • <claim, (implicit)>

  17. Training instances • <unhappy, (implicit)> • <claim, (implicit)> • <angry, (implicit)>

  18. Training instances • <unhappy, (implicit)> • <claim, (implicit)> • <angry, (implicit)> • <unhappy, claim> • <claim, unhappy>

  19. Training instances • <unhappy, (implicit)> • <claim, (implicit)> • <angry, (implicit)> • <unhappy, claim> • <claim, unhappy> • <unhappy, angry> • <angry, unhappy>

  20. Training instances • <unhappy, (implicit)> • <claim, (implicit)> • <angry, (implicit)> • <unhappy, claim> • <claim, unhappy> • <unhappy, angry> • <angry, unhappy> • <angry, claim> • <claim, angry>
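
The pair generation walked through on slides 14-20 can be sketched as below. This is a minimal illustration under the assumption that a pair is labelled positive exactly when the candidate really is the target's parent in the gold hierarchy; the function and variable names are mine.

```python
WRITER = "(implicit)"  # the writer's implicit pse

def make_instances(pses, gold_parent):
    """Pair each real pse (the target) with every candidate parent: the
    writer's implicit pse plus every other pse in the sentence.  The label
    is True iff the candidate is the target's parent in the gold hierarchy."""
    instances = []
    for target in pses:
        for candidate in [WRITER] + [p for p in pses if p != target]:
            instances.append((target, candidate, gold_parent[target] == candidate))
    return instances

# "Charlie was angry at Alice's claim that Bob was unhappy"
# Gold hierarchy (slide 6): (implicit) -> angry -> claim -> unhappy
gold = {"angry": WRITER, "claim": "angry", "unhappy": "claim"}
for target, candidate, label in make_instances(["angry", "claim", "unhappy"], gold):
    print(f"<{target}, {candidate}> -> {label}")
```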

  21. Decision Combination • (figure: the example pse’s (implicit), angry, claim, unhappy)

  22. Decision Combination • angry: 0.9 <angry, (implicit)>, 0.1 <angry, claim>, 0.1 <angry, unhappy>

  23. Decision Combination • (figure: angry attached to (implicit))

  24. Decision Combination • claim: 0.5 <claim, (implicit)>, 0.4 <claim, angry>, 0.3 <claim, unhappy>

  25. Decision Combination • (figure: claim attached)

  26. Decision Combination • unhappy: 0.7 <unhappy, claim>, 0.5 <unhappy, (implicit)>, 0.2 <unhappy, angry>

  27. Decision Combination • (figure: the combined hierarchy)
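
Slides 21-27 read as a greedy combination step: each pse is attached to the candidate parent its classifier scores highest. A minimal sketch under that assumption, using the confidences shown on slides 22, 24 and 26:

```python
# Pairwise confidences from slides 22, 24 and 26:
# probability that <target, candidate> is a true parent link.
scores = {
    ("angry", "(implicit)"): 0.9, ("angry", "claim"): 0.1, ("angry", "unhappy"): 0.1,
    ("claim", "(implicit)"): 0.5, ("claim", "angry"): 0.4, ("claim", "unhappy"): 0.3,
    ("unhappy", "claim"): 0.7, ("unhappy", "(implicit)"): 0.5, ("unhappy", "angry"): 0.2,
}

def combine(scores):
    """Greedy combination: attach each target pse to its highest-scoring
    candidate parent."""
    best = {}
    for (target, candidate), p in scores.items():
        if target not in best or p > best[target][1]:
            best[target] = (candidate, p)
    return {target: candidate for target, (candidate, p) in best.items()}

print(combine(scores))
# -> {'angry': '(implicit)', 'claim': '(implicit)', 'unhappy': 'claim'}
```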

  28. Features (1) • All features based on error analysis • Parse-based features • Domination + variants • Positional features • Relative position of pse_parent and pse_target

  29. Features (2) • Lexical features • writer’s implicit pse • “said” • “according to” • part of speech • Genre-specific features • Charlie, she noted, dislikes Chinese food. • “Alice disagrees with me,” Bob said.
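
For concreteness, the feature groups on slides 28-29 might be computed along these lines. This is an illustrative sketch, not the paper's actual feature set; `dominates` is a hypothetical stand-in for the parse-based tests derived from the parses, and part-of-speech features are omitted.

```python
def extract_features(target, candidate, sentence, dominates):
    """Illustrative feature vector for one <target, candidate-parent> pair."""
    def in_quotes(word):
        # crude genre feature: does the word fall inside a pair of double quotes?
        return sentence[:sentence.find(word)].count('"') % 2 == 1

    return {
        # parse-based (slide 28): domination and variants
        "parent_dominates_target": dominates(candidate, target),
        # positional (slide 28): relative position of pse_parent and pse_target
        "parent_precedes_target": (candidate == "(implicit)"
                                   or sentence.find(candidate) < sentence.find(target)),
        # lexical (slide 29)
        "parent_is_implicit": candidate == "(implicit)",
        "parent_is_said": candidate == "said",
        "parent_is_according_to": candidate == "according to",
        # genre-specific (slide 29): reported-speech cues
        "target_in_quotes": in_quotes(target),
    }

# toy usage with a stub domination test
feats = extract_features("unhappy", "claim",
                         "Charlie was angry at Alice's claim that Bob was unhappy",
                         dominates=lambda a, b: True)
print(feats)
```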

  30. Resources • GATE toolkit (Cunningham et al., 2002) - part-of-speech, tokenization, sentence boundaries • Collins parser (1999) - extracted dependency parses • CASS partial parser (Abney, 1997) • IND decision trees (Buntine, 1993)

  31. Data • From the NRRC Multi-Perspective Question Answering workshop (Wiebe, 2002) • 535 newswire documents (66 for development, 469 for evaluation) • All pse’s annotated, along with sources and other information • Hierarchical pse structure annotated for each sentence*

  32. Example (truncated) model • One learned tree, truncated to depth 3: • pse0 is parent of pse1 iff • pse0 is (implicit) • And pse1 is not in quotes • OR pse0 is said • Typical trees on development data: • Depth ~20, ~700 leaves
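
Written out as code, the truncated rule on slide 32 reads roughly as follows; a sketch only, since the full learned trees are far deeper (~20 levels, ~700 leaves), and `in_quotes` is a hypothetical helper.

```python
def is_parent(pse0, pse1, in_quotes):
    """Truncated depth-3 rule from slide 32: pse0 is predicted to be the
    parent of pse1 iff pse0 is the writer's implicit pse and pse1 is not
    inside quotation marks, or pse0 is the word "said"."""
    return (pse0 == "(implicit)" and not in_quotes(pse1)) or pse0 == "said"

print(is_parent("(implicit)", "angry", in_quotes=lambda p: False))  # True
```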

  33. Evaluation • Dependency-based metric (Lin, 1995) • Percentage of pse’s whose parents are identified correctly • Percentage of sentences with perfectly identified structure • Performance of binary classifier
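
The first two numbers of the dependency-based evaluation can be sketched as follows; a minimal illustration of the metric as described on slide 33, with data structures of my own choosing (each sentence is a pair of {pse: parent} maps).

```python
def evaluate(sentences):
    """`sentences` is a list of (gold, predicted) pairs, each a {pse: parent} map.
    Returns (fraction of pse's whose parent is identified correctly,
             fraction of sentences whose structure is perfect)."""
    correct = total = perfect = 0
    for gold, predicted in sentences:
        hits = sum(predicted.get(pse) == parent for pse, parent in gold.items())
        correct += hits
        total += len(gold)
        perfect += hits == len(gold)
    return correct / total, perfect / len(sentences)

# Toy example: one sentence, one wrong attachment out of three
gold = {"angry": "(implicit)", "claim": "angry", "unhappy": "claim"}
pred = {"angry": "(implicit)", "claim": "(implicit)", "unhappy": "claim"}
print(evaluate([(gold, pred)]))  # (0.666..., 0.0)
```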

  34. Results

  35. Error Analysis • Pairwise decisions prevent the model from learning larger structure • Speech events and perspective expressions behave differently • Treebank-style parses don’t always have the structure we need

  36. Future Work • Identify pse’s • Identify sources • Evaluate alternative structure-learning methods • Use the structure to generate perspective-oriented summaries

  37. Conclusions • Understanding pse structure is important for understanding text • Automated analysis of pse structure is possible

  38. Thank you!
