Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis (HLT/EMNLP 2005) Theresa Wilson, Janyce Wiebe, Paul Hoffmann (University of Pittsburgh) Acknowledgements: These slides are based on the presentation slides from http://www.cs.pitt.edu/~wiebe/
Outline • Introduction • Manual Annotations • Corpus • Prior-Polarity Subjectivity Lexicon • Experiments • Conclusions
Introduction (1/6) • Sentiment analysis: the task of identifying positive and negative opinions, emotions, and evaluations • How detailed? Depends on the application • Flame detection, review classification → document-level analysis • Question answering, review mining → sentence- or phrase-level analysis
Introduction (2/6) • QA example: • Q: What is the international reaction to the reelection of Robert Mugabe as President of Zimbabwe? • A: African observers generally approved of his victory while Western Governments denounced it.
Introduction (3/6) • Prior polarity: • Use a lexicon of positive and negative words • Examples: • beautiful → positive • horrid → negative • Out of context • Contextual polarity: • A word may appear in a phrase that expresses a different polarity in context • Example: • Cheers to Timothy Whitfield for the wonderfully horrid visuals.
Introduction (4/6) • Another interesting example: • Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable.
Introduction (5/6) • Same example; the original slide highlights words in the sentence to contrast their prior polarity with their contextual polarity (e.g., reason and reasonable have positive prior polarity, but the sentence conveys a negative sentiment).
Introduction (6/6) • Goal: automatically distinguish contextual polarity • Approach: use machine learning and a variety of features • Two-step pipeline (shown as a diagram on the slide): Step 1 takes all instances of lexicon clues found in the corpus and asks "neutral or polar?"; Step 2 takes the polar instances and asks "what contextual polarity?" (see the sketch below)
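A minimal sketch of the two-step pipeline just described; the model objects, the predict interface, and the instance attributes are hypothetical placeholders, not the authors' code.

```python
# Minimal sketch of the two-step pipeline (hypothetical interfaces,
# not the authors' implementation).

def classify_contextual_polarity(instances, step1_model, step2_model):
    """Step 1: neutral vs. polar; Step 2: contextual polarity of the polar ones."""
    results = {}
    for inst in instances:
        if step1_model.predict(inst) == "neutral":
            results[inst.id] = "neutral"
        else:
            # Only instances judged polar in Step 1 reach Step 2,
            # which labels them positive / negative / both / neutral.
            results[inst.id] = step2_model.predict(inst)
    return results
```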
Manual Annotation (1/3) • Need: sentiment expressions (positive and negative expressions of emotions, evaluations, and stances) with contextual polarity • Had: subjective-expression annotations (words/phrases expressing emotions, evaluations, stances, speculations, etc.) in the MPQA Opinion Corpus • Decision: annotate the subjective expressions in the MPQA Corpus with their contextual polarity
Manual Annotation (2/3) • Mark the polarity of subjective expressions as positive, negative, both, or neutral: • African observers generally approved (positive) of his victory while Western governments denounced (negative) it. • Besides, politicians refer to good and evil (both) … • Jerome says the hospital feels (neutral) no different than a hospital in the states. • Judge the contextual polarity of the sentiment ultimately being conveyed: • They have not succeeded, and will never succeed (positive), in breaking the will of this valiant people.
Manual Annotation (3/3) • Agreement study: • 2 annotators, 10 documents with 447 subjective expressions • Kappa: 0.72 (82% agreement) • Removing cases that at least one annotator marked as uncertain (18% of cases): • Kappa: 0.84 (90% agreement) • All data, including the uncertain cases, are used in the experiments
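As a quick sanity check on these numbers (a back-of-the-envelope derivation, not a figure from the paper), Cohen's kappa relates observed agreement $p_o$ to expected chance agreement $p_e$, so the reported pair (0.72, 82%) implies the chance-agreement level:

$$\kappa = \frac{p_o - p_e}{1 - p_e} \quad\Rightarrow\quad 0.72 = \frac{0.82 - p_e}{1 - p_e} \;\Rightarrow\; p_e = \frac{0.82 - 0.72}{1 - 0.72} \approx 0.36$$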
Corpus • 425 documents from MPQA Opinion Corpus • 15,991 subjective expressions in 8,984 sentences • Divided into two sets • Development set • 66 docs / 2,808 subjective expressions • Experiment set • 359 docs / 13,183 subjective expressions • Divided into 10 folds for cross-validation
Prior-Polarity Subjectivity Lexicon • Over 8,000 words from a variety of sources • Both manually and automatically identified • Positive/negative words from the General Inquirer and Hatzivassiloglou and McKeown (1997) • All words in the lexicon tagged with: • Prior polarity: positive, negative, both, or neutral • Reliability: strongly subjective (strongsubj) or weakly subjective (weaksubj)
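A sketch of what a lexicon lookup might look like given the two tags above; the entries and the dict layout are illustrative assumptions, not the actual lexicon format.

```python
# Hypothetical shape of a prior-polarity lexicon (illustrative entries only;
# the real lexicon's file format is not shown on the slides).
PRIOR_LEXICON = {
    "beautiful": {"prior_polarity": "positive", "reliability": "strongsubj"},
    "horrid":    {"prior_polarity": "negative", "reliability": "strongsubj"},
    "feel":      {"prior_polarity": "neutral",  "reliability": "weaksubj"},
}

def prior_polarity(word):
    """Return the out-of-context polarity of a word, or None if not a clue."""
    entry = PRIOR_LEXICON.get(word.lower())
    return entry["prior_polarity"] if entry else None
```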
Experiments • Both steps: • BoosTexter AdaBoost.MH, 5,000 rounds of boosting • 10-fold cross-validation • Each instance is given its own label • Step 1 uses 28 features; Step 2 uses 10 features
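BoosTexter's AdaBoost.MH is not available in common Python libraries; the sketch below uses scikit-learn's AdaBoostClassifier (the SAMME variant, a different multiclass booster) purely as a stand-in, with feature vectorization and data loading assumed to happen elsewhere.

```python
# Rough stand-in for the experimental setup: boosted stumps + 10-fold CV.
# AdaBoostClassifier implements SAMME, not BoosTexter's AdaBoost.MH.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

def evaluate_step1(X, y):
    """X: feature vectors (28 features per instance); y: neutral/polar labels."""
    clf = AdaBoostClassifier(n_estimators=5000)  # 5,000 rounds of boosting
    # 10-fold cross-validation, mirroring the setup on the slide
    return cross_val_score(clf, X, y, cv=10, scoring="accuracy")
```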
Definition of Gold Standard • Given an instance inst from the lexicon: • if inst not in a subjective expression: goldclass(inst) = neutral • else if inst in at least one positive and one negative subjective expression: goldclass(inst) = both • else if inst in a mixture of negative and neutral: goldclass(inst) = negative • else if inst in a mixture of positive and neutral: goldclass(inst) = positive • else: goldclass(inst) = contextual polarity of subjective expression
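The slide's rules transcribed directly as a function; the input representation (the list of contextual polarities of the subjective expressions containing the instance) is an assumption for illustration.

```python
# Gold-standard rules from the slide above (input representation assumed).
def gold_class(expression_polarities):
    """expression_polarities: contextual polarities of the manually annotated
    subjective expressions that contain this lexicon instance."""
    polarities = set(expression_polarities)
    if not polarities:                       # not in any subjective expression
        return "neutral"
    if "positive" in polarities and "negative" in polarities:
        return "both"
    if polarities == {"negative", "neutral"}:
        return "negative"
    if polarities == {"positive", "neutral"}:
        return "positive"
    return polarities.pop()                  # all expressions agree: use that
```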
Features • Many inspired by Polanyi & Zaenen (2004): Contextual Valence Shifters • Examples: little threat, little truth • Others capture dependency relationships between words • Example: wonderfully horrid (linked by a mod dependency)
Step 1 feature groups: word features, modification features, structure features, sentence features, document feature • Word features: • Word token: terrifies • Word part-of-speech: VB • Context (3 word tokens): that terrifies me • Prior polarity: negative • Reliability: strongsubj
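A sketch of extracting the word features listed above; NLTK's pos_tag is an assumption (the slides do not name a tagger; it also requires the tagger model to be downloaded), and the lexicon dict follows the earlier sketch.

```python
# Illustrative word-feature extraction (tagger choice is an assumption).
import nltk  # requires: nltk.download("averaged_perceptron_tagger")

def word_features(tokens, i, lexicon):
    """Features for the clue instance at position i in a tokenized sentence."""
    word = tokens[i]
    pos = nltk.pos_tag(tokens)[i][1]
    entry = lexicon.get(word.lower(), {})
    return {
        "token": word,                                 # e.g. "terrifies"
        "pos": pos,                                    # e.g. "VBZ"
        "context": " ".join(tokens[max(0, i - 1):i + 2]),  # e.g. "that terrifies me"
        "prior_polarity": entry.get("prior_polarity", "neutral"),
        "reliability": entry.get("reliability"),       # strongsubj / weaksubj
    }
```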
Modification features (binary, from the dependency parse tree): • Preceded by: • adjective • adverb (other than not) • intensifier (e.g., deeply, entirely, …) • Self intensifier • Modifies: • strongsubj clue • weaksubj clue • Modified by: • strongsubj clue • weaksubj clue
Structure features (binary), found by climbing up the dependency parse tree toward the root: • In subject: The human rights report poses … • In copular: I am confident • In passive voice: must be regarded (The original slide shows the dependency parse of "The human rights report poses a substantial challenge".)
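A sketch of the climb-toward-the-root idea; the Node class and the dependency labels (subj, pred, vpassive) are assumptions about the parse representation, not the authors' actual labels.

```python
# Sketch of the structure features (parse representation assumed:
# each Node has .deprel and .parent, root has parent None).
def structure_features(node):
    feats = {"in_subject": False, "in_copular": False, "in_passive": False}
    while node is not None:               # climb toward the root
        if node.deprel == "subj":         # inside a subject constituent
            feats["in_subject"] = True
        elif node.deprel == "pred":       # assumed label: copular predicate
            feats["in_copular"] = True
        elif node.deprel == "vpassive":   # assumed label: passive verb
            feats["in_passive"] = True
        node = node.parent
    return feats
```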
Sentence features: • Count of strongsubj clues in the previous, current, and next sentence • Count of weaksubj clues in the previous, current, and next sentence • Counts of various parts of speech (adjectives, adverbs, whether there is a pronoun, …)
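A sketch of the clue-count features, reusing the lexicon dict sketched earlier; sentences are assumed to be pre-tokenized lists of words.

```python
# Counting subjectivity clues in the previous, current, and next sentence.
def sentence_features(sentences, idx, lexicon):
    def count(sent, reliability):
        return sum(1 for w in sent
                   if lexicon.get(w.lower(), {}).get("reliability") == reliability)

    feats = {}
    for label, j in (("prev", idx - 1), ("cur", idx), ("next", idx + 1)):
        sent = sentences[j] if 0 <= j < len(sentences) else []
        feats[f"strongsubj_{label}"] = count(sent, "strongsubj")
        feats[f"weaksubj_{label}"] = count(sent, "weaksubj")
    return feats
```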
Document feature: • Document topic (one of 15): • economics • health • Kyoto protocol • presidential election in Zimbabwe • … • For example, a document on health may contain the word "fever", but the word is not being used to express a sentiment.
Results 1a (Step 1 results table, not reproduced here)
Results 1b (Step 1 results table, not reproduced here)
Step 2: Polarity Classification • Classes: positive, negative, both, neutral • Of the 19,506 instances entering Step 1, the 5,671 classified as polar are passed to Step 2
Step 2 features (10): • Word token • Word prior polarity • Negated • Negated subject • Modifies polarity • Modified by polarity • Conjunction polarity • General polarity shifter • Negative polarity shifter • Positive polarity shifter
Word features: • Word token: terrifies • Word prior polarity: negative
Negation features (binary): • Negated: • not good • does not look very good • Negated subject: • No politically prudent Israeli could support either of them.
Modification features (example: substantial (pos) challenge (neg)): • Modifies polarity (5 values: positive, negative, neutral, both, not mod): • substantial → negative (it modifies a negative word) • Modified by polarity (same 5 values): • challenge → positive (it is modified by a positive word)
Conjunction polarity (5 values: positive, negative, neutral, both, not mod) (example: good (pos) and evil (neg)): • good → negative (it is conjoined with a negative word)
Polarity shifter features, checked in the 4 words before the instance (see the sketch below): • General polarity shifter: • pose little threat • contains little truth • Negative polarity shifter: • lack of understanding • Positive polarity shifter: • abate the damage
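A sketch of the windowed shifter and negation features; the word lists below are tiny illustrative stand-ins for the real ones, and the paper's negation handling is more involved than a simple fixed window.

```python
# Step 2 negation and polarity-shifter features (word lists are stand-ins).
GENERAL_SHIFTERS = {"little", "lack", "rarely"}
NEGATIVE_SHIFTERS = {"lack"}
POSITIVE_SHIFTERS = {"abate", "alleviate"}
NEGATIONS = {"not", "no", "never", "n't"}

def shifter_features(tokens, i):
    """Binary features over the 4 words preceding the instance at position i."""
    window = [w.lower() for w in tokens[max(0, i - 4):i]]
    return {
        "negated": any(w in NEGATIONS for w in window),  # crude negation check
        "general_shifter": any(w in GENERAL_SHIFTERS for w in window),
        "negative_shifter": any(w in NEGATIVE_SHIFTERS for w in window),
        "positive_shifter": any(w in POSITIVE_SHIFTERS for w in window),
    }
```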
Results 2a (Step 2 results table, not reproduced here)
Results 2b (Step 2 results table, not reproduced here)
Ablation experiments, removing feature groups: • AB1: negated, negated subject • AB2: modifies polarity, modified by polarity • AB3: conjunction polarity • AB4: general, negative, and positive polarity shifters • Results: • The only significant difference is in neutral F-measure when the AB2 features are removed → the combination of features is needed to achieve the best performance
Conclusion • Automatically identified the contextual polarity of a large subset of sentiment expressions • Presented a two-step approach to phrase-level sentiment analysis: • Determine whether an expression is neutral or polar • Determine the contextual polarity of the expressions that are polar • Achieved significant results for a large subset of sentiment expressions