
Old Wine and Warm Beer: Target-Specific Sentiment Analysis of Adjectives






  1. Old Wine and Warm Beer: Target-Specific Sentiment Analysis of Adjectives Authors: Angela Fahrni & Manfred Klenner Source: Artificial Intelligence and the Simulation of Behaviour (AISB) 2008

  2. Motivation & Goal • Rather than having a fixed prior polarity, adjectives often bear a target-specific polarity • A single adjective can even switch polarity depending on the accompanying noun • This study focuses on determining the target-specific polarity of adjectives

  3. Approach • A two-stage model is proposed • Identification of domain-specific targets • Construction of a target-specific adjective polarity lexicon via a bootstrapping approach • In sentiment analysis, this model outperforms a baseline system based on a prior adjective lexicon derived from SentiWordNet

  4. Prior polarity • Several resources provide prior polarity • Adjective lists • SentiWordNet • WordNet-Affect • However, the polarity of a word is not always domain-independent • For example: ’unpredictable’ • ’unpredictable plot’ in the movie domain is a good thing • ’unpredictable boss’ is not • Some kind of target-specific sentiment disambiguation therefore seems necessary (see the sketch below)
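To make the disambiguation idea concrete, here is a minimal Python sketch of a lexicon lookup that prefers a target-specific entry and falls back to a prior polarity. The dictionaries and the fallback behaviour are illustrative assumptions, not the authors' actual data structures.

```python
# Illustrative lexicons; entries are assumptions, not the paper's data.
PRIOR_LEXICON = {"unpredictable": "negative"}              # context-free polarity
TARGET_LEXICON = {("unpredictable", "plot"): "positive"}   # target-specific override

def polarity(adjective, target):
    """Prefer a target-specific entry; otherwise fall back to the prior."""
    return TARGET_LEXICON.get((adjective, target),
                              PRIOR_LEXICON.get(adjective, "neutral"))

print(polarity("unpredictable", "plot"))  # positive (movie domain)
print(polarity("unpredictable", "boss"))  # negative (prior polarity)
```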

  5. Identifying the targets of a domain • Wikipedia’s category system is used to organize the stock of Wikipedia articles • Items from Wikipedia’s and Wiktionary’s category systems are used as targets

  6. Fast food domain • This study identifies 46,807 targets in the fast food domain • The hierarchy is kept in order to propagate polarities • if ’cold coca cola’ is positive • then ’cold coca cola cherry’ is positive
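A minimal sketch of how polarities could be propagated along the kept hierarchy, assuming a simple child-to-parent mapping derived from the category system; the mappings below are illustrative.

```python
# Illustrative child -> parent links and one known adjective-target polarity.
PARENT = {"coca cola cherry": "coca cola", "coca cola": "soft drink"}
PAIR_POLARITY = {("cold", "coca cola"): "positive"}

def inherited_polarity(adjective, target):
    """Walk up the target hierarchy until a known polarity is found."""
    while target is not None:
        if (adjective, target) in PAIR_POLARITY:
            return PAIR_POLARITY[(adjective, target)]
        target = PARENT.get(target)
    return None

print(inherited_polarity("cold", "coca cola cherry"))  # positive (inherited)
```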

  7. Vague adjectives • Only a few adjectives have a clear prior positive or negative polarity • vague adjectives such as ’big’, ’young’, ’large’, ’deep’ are best understood as bearing neutral prior polarity

  8. Three contextual effects • The context acts as an intensifier of the intrinsic positive or negative polarity of a target noun • (e.g. ’deep insight’, ’deep disappointment’) • A neutral adjective combines with a neutral noun to form a non-neutral NP • (e.g. ’old bread’) • A single neutral adjective yields positive or negative polarity depending on the (neutral) noun • e.g. the violation (’cold pizza’) or affirmation (’cold coke’) of intrinsic or common-sense properties of the target objects (pizza, coke)

  9. Inverse effect • Even adjectives with a clear prior polarity can invert the polarity of the noun • the adjective ’lost’ has a (prior) negative polarity • ’lost virtue’ (virtue = positive) is negative • ’lost glasses’ (glasses = neutral) is negative • ’lost anger’ (anger = negative) is positive
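The intensifier effect from slide 8 and the inversion effect from this slide can be read as a small composition rule over adjective and noun polarities. The sketch below is a plausible reconstruction under that reading, not the paper's exact calculus; the set of inverting adjectives is illustrative.

```python
INVERTERS = {"lost"}  # illustrative; adjectives that flip the noun's polarity

def np_polarity(adj, adj_polarity, noun_polarity):
    """Compose the polarity of an adjective-noun phrase."""
    if adj in INVERTERS:
        if noun_polarity == "positive":
            return "negative"   # 'lost virtue'
        if noun_polarity == "negative":
            return "positive"   # 'lost anger'
        return "negative"       # 'lost glasses': default to the adjective's prior
    if noun_polarity != "neutral":
        return noun_polarity    # intensifier: 'deep insight', 'deep disappointment'
    return adj_polarity         # neutral noun takes the adjective's polarity

print(np_polarity("lost", "negative", "negative"))  # positive
print(np_polarity("deep", "neutral", "positive"))   # positive
```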

  10. Identifying the polarity of non-seed adjectives • In the literature, adjectives with a clear prior polarity have often been used as a seed list in order to identify the polarity of additional adjectives • These approaches assume that the augmented list again forms a set of adjectives with a prior polarity • Contradictory polarities of an adjective encountered in a corpus are interpreted as a kind of noise and resolved to one (predominant) polarity using statistical measures

  11. Building a target-specific adjective lexicon from contextual patterns • The same method is used: contextual patterns such as coordination identify the polarity of non-seed adjectives • However, a target-specific adjective lexicon is built instead of a domain-independent one

  12. Target-specific polarity lexicon • The seed adjective lexicon consists of 120 negative and 80 positive adjectives • their polarity is assumed to be domain- and target-independent • Two different corpora are used • 1,600 texts from epinions.com (corpus I) • Tagged • All targets are identified • Because the articles about fast food were chosen manually, the adjectives and targets are relevant to the target-specific adjective lexicon • The most frequent targets from corpus I are used to find new texts in corpus II • the World Wide Web (corpus II) • Corpus II is the pool used to identify the polarity of the non-seed adjectives from corpus I with respect to specific targets
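The two-corpus setup can be pictured as a small bootstrapping loop. In the sketch below, fetch_web_texts and extract_pairs are hypothetical stand-ins for the web retrieval (corpus II) and the pattern search of slides 13-14; the cut-off k is also an assumption.

```python
from collections import Counter

def bootstrap(corpus_i_targets, fetch_web_texts, extract_pairs, k=100):
    """Use the k most frequent corpus-I targets to gather corpus-II evidence."""
    frequent = [t for t, _ in Counter(corpus_i_targets).most_common(k)]
    evidence = []
    for target in frequent:
        for text in fetch_web_texts(target):         # query the web (corpus II)
            evidence.extend(extract_pairs(text, target))
    return evidence
```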

  13. Target-specific polarity lexicon • Following the contextual patterns, both corpora are searched for tag sequences that relate a target and at least two adjectives • the noun or noun sequence is a target • at least one of the adjectives is from the seed list • at least one of the adjectives comes from the stock of target-relevant adjectives

  14. Target-specific polarity lexicon • Two sequence patterns are considered: • adjective coordination • e.g. ’good and tasteful burger’ • copula constructions, e.g. NP BE Adj Adj+ • e.g. ’the french fries are soggy and rather tasteless’ • It is assumed that adjectives in such constructions share the same polarity • A target must be present relative to which the sentiment disambiguation is done • The adjective and target must be of interest according to a reference corpus (corpus I)
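A minimal sketch of the coordination pattern over POS-tagged text, restricted to a single-token target. The seed list, target set, and tag names are illustrative; the copula pattern (NP BE Adj Adj+) would be handled analogously.

```python
SEEDS = {"good": "positive", "soggy": "negative"}   # illustrative seed entries
TARGETS = {"burger", "pizza"}                       # illustrative targets

def coordination_pairs(tagged):
    """Yield (seed_adj, other_adj, target) for 'ADJ and ADJ TARGET' spans."""
    words = [w for w, _ in tagged]
    tags = [t for _, t in tagged]
    for i in range(len(tagged) - 3):
        if (tags[i] == "JJ" and words[i + 1] == "and"
                and tags[i + 2] == "JJ" and words[i + 3] in TARGETS):
            a, b = words[i], words[i + 2]
            if a in SEEDS and b not in SEEDS:
                yield a, b, words[i + 3]
            elif b in SEEDS and a not in SEEDS:
                yield b, a, words[i + 3]

sentence = [("good", "JJ"), ("and", "CC"), ("tasteful", "JJ"), ("burger", "NN")]
print(list(coordination_pairs(sentence)))  # [('good', 'tasteful', 'burger')]
```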

  15. Polarity-tagged pairs • Examples of polarity-tagged pairs generated by the system

  16. Polarity values of non-seed adjectives • The polarity value of a non-seed adjective is given as the mean of the polarity values of its peers • Peers: e.g. the seed adjectives that occur together with it in a coordination
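A minimal sketch of the mean-of-peers rule, assuming seed polarities are encoded as +1/-1; the seed values and the observed peer list are illustrative.

```python
SEED_VALUES = {"good": 1.0, "delicious": 1.0, "soggy": -1.0}  # illustrative

def polarity_value(peer_occurrences):
    """Mean polarity of the seed adjectives observed alongside an adjective."""
    values = [SEED_VALUES[p] for p in peer_occurrences if p in SEED_VALUES]
    return sum(values) / len(values) if values else 0.0

# 'tasteful' seen twice coordinated with 'good' and once with 'soggy':
print(polarity_value(["good", "good", "soggy"]))  # 0.333...
```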

  17. SentiWordNet • SentiWordNet relies on a seed of paradigm words with a clear polarity • the adjective ’hot’ has 22 senses • neutral: 7 • negative: 5 • positive: 10 • Since only the polarities are of interest, the positive, negative and neutral senses are merged into one polarity entry

  18. SentiWordNet • for the adjective ’hot’ this yields • neutral: 0.6 • positive: 0.28 • negative: 0.12 • An adjective lexicon was generated from SentiWordNet in this way • 21,194 adjective entries have been derived
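A minimal sketch of the sense-merging step, assuming per-synset (positive, negative) scores with objectivity = 1 - pos - neg, as in SentiWordNet, averaged over all senses; the three example senses are made up (the real ’hot’ has 22).

```python
def merge_senses(senses):
    """senses: list of (pos, neg) scores, one per synset of the adjective."""
    n = len(senses)
    pos = sum(p for p, _ in senses) / n
    neg = sum(q for _, q in senses) / n
    return {"neutral": 1.0 - pos - neg, "positive": pos, "negative": neg}

# Three hypothetical senses; yields roughly the distribution on this slide.
print(merge_senses([(0.75, 0.0), (0.0, 0.25), (0.125, 0.125)]))
# {'neutral': 0.583..., 'positive': 0.291..., 'negative': 0.125}
```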

  19. Evaluation • 3,891 manually classified noun phrases • the resulting gold standard comprises 1,832 positive, 415 negative and 1,644 neutral instances

  20. Three experimental settings • First, the polarity decisions of SentiWordNet (the baseline system) and the new system are compared on the whole data set (all) • Second, only those classifications that received different polarities from the two systems are taken (conflict) • Third, only the instances where both systems agreed in their polarity assignment are taken (agree)
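A minimal sketch of splitting the instances into the three settings; the (instance, baseline label, system label) triples are illustrative.

```python
def split_settings(decisions):
    """decisions: iterable of (instance, baseline_label, system_label)."""
    all_, conflict, agree = [], [], []
    for instance, baseline, system in decisions:
        all_.append(instance)
        (agree if baseline == system else conflict).append(instance)
    return {"all": all_, "conflict": conflict, "agree": agree}

data = [("cold pizza", "positive", "negative"),
        ("cold coke", "positive", "positive")]
print(split_settings(data))
# {'all': ['cold pizza', 'cold coke'], 'conflict': ['cold pizza'], 'agree': ['cold coke']}
```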

  21. Results

  22. Results (continued)

  23. Domain-specific polarity • Domain-specific polarity: all adjectives that have a single polarity with all of their targets • If the polarity of an adjective depends on the target, an adjective-target pair is added to the target-specific lexicon instead (see the sketch below)
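A minimal sketch of this lexicon-building rule: an adjective with one polarity across all observed targets gets a single domain-level entry, otherwise each adjective-target pair is stored separately. The observations are illustrative.

```python
from collections import defaultdict

def build_lexicon(observations):
    """observations: iterable of (adjective, target, polarity) triples."""
    by_adjective = defaultdict(dict)
    for adjective, target, polarity in observations:
        by_adjective[adjective][target] = polarity
    domain_level, target_specific = {}, {}
    for adjective, targets in by_adjective.items():
        polarities = set(targets.values())
        if len(polarities) == 1:
            domain_level[adjective] = polarities.pop()   # one entry suffices
        else:
            for target, polarity in targets.items():
                target_specific[(adjective, target)] = polarity
    return domain_level, target_specific

observations = [("soggy", "fries", "negative"), ("soggy", "burger", "negative"),
                ("cold", "pizza", "negative"), ("cold", "coke", "positive")]
print(build_lexicon(observations))
# ({'soggy': 'negative'}, {('cold', 'pizza'): 'negative', ('cold', 'coke'): 'positive'})
```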

  24. Target-specific polarity of adjectives • The target-specific polarity of adjectives is determined in a corpus-driven manner by searching for combinations of a target-specific adjective with adjectives that have a known prior polarity
