
Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis - PowerPoint PPT Presentation



  1. Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis Presented by Kay Lu, Ashley Gao, Qusheng Sun

  2. Introduction to Sentiment Analysis
Definition: the task of identifying positive and negative opinions, emotions, and evaluations.
Prior Polarity: the polarity of a word out of context, positive or negative.
Contextual Polarity: the polarity of the phrase in which a word appears, which may differ from the word's prior polarity.

  3. Example: Prior Polarity vs. Contextual Polarity
"Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable."
The slide shows the same sentence twice, once with the prior-polarity clues marked and once with their contextual polarity: words such as Trust, well, reason, and reasonable have positive prior polarity, yet in this context most of them are not being used to express positive sentiment.

  4. Manual Annotation
MPQA - a corpus of news articles and other text documents manually annotated for opinion, sentiment, etc.
Human annotation - based on the MPQA scheme, annotators were instructed to tag the polarity of subjective expressions as positive, negative, both, or neutral.

  5. Agreement Study & Corpus
Agreement study - measures the reliability of the polarity annotation by comparing the judgments of two annotators (see the sketch below).
Corpus - the annotated documents are divided to obtain the experiment data.
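A standard way to quantify agreement between two annotators is a chance-corrected statistic such as Cohen's kappa; a minimal sketch with scikit-learn (the toy labels and the metric choice are illustrative, not the paper's reported figures):

```python
# Sketch: inter-annotator agreement on four-way polarity tags.
# The labels below are toy data in the positive/negative/both/neutral scheme.

from sklearn.metrics import cohen_kappa_score

annotator_a = ["positive", "negative", "neutral", "negative", "both", "neutral"]
annotator_b = ["positive", "negative", "neutral", "positive", "both", "neutral"]

kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"kappa = {kappa:.2f}")  # chance-corrected agreement between the two annotators
```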

  6. Prior-Polarity Subjectivity Lexicon
All words in the lexicon are tagged with:
- Prior polarity: positive, negative, both, or neutral
- Reliability: strongly subjective (strongsubj) or weakly subjective (weaksubj)
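The released MPQA subjectivity lexicon stores one clue per line as key=value fields; a minimal loading sketch, assuming that format (field names may vary by lexicon version):

```python
# Sketch: load a prior-polarity subjectivity lexicon.
# Assumes key=value entries such as:
# "type=strongsubj len=1 word1=abuse pos1=verb stemmed1=y priorpolarity=negative"

def load_lexicon(path):
    """Map each clue word to its (prior polarity, reliability) pair."""
    lexicon = {}
    with open(path) as f:
        for line in f:
            fields = dict(kv.split("=", 1) for kv in line.split() if "=" in kv)
            if "word1" in fields:
                lexicon[fields["word1"]] = (
                    fields.get("priorpolarity", "neutral"),  # positive/negative/both/neutral
                    fields.get("type", "weaksubj"),          # strongsubj or weaksubj
                )
    return lexicon

# Example lookup: lexicon.get("outrageous") -> ("negative", "strongsubj")
```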

  7. Experiment - Two Steps
Step 1: find polar phrases (classify each clue instance as neutral or polar)
Step 2: disambiguation (classify the contextual polarity of the polar instances)
Both steps use AdaBoost, with 10-fold cross-validation for testing (see the pipeline sketch below).
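A rough sketch of how this two-step setup could be wired up; scikit-learn's AdaBoostClassifier stands in for the AdaBoost implementation used in the experiments, and the feature extractors and gold-label fields are hypothetical placeholders:

```python
# Sketch: two-step classification with 10-fold cross-validation.
# `step1_features` / `step2_features` and the "step1_gold" / "step2_gold"
# fields are assumed names, not the paper's actual code.

from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_extraction import DictVectorizer
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

def run_two_step(instances, step1_features, step2_features):
    # Step 1: neutral vs. polar, evaluated with 10-fold cross-validation.
    step1 = make_pipeline(DictVectorizer(), AdaBoostClassifier())
    X1 = [step1_features(inst) for inst in instances]
    y1 = [inst["step1_gold"] for inst in instances]          # "neutral" or "polar"
    step1_pred = cross_val_predict(step1, X1, y1, cv=10)

    # Step 2: only instances predicted polar move on; some true neutrals
    # slip through as noise, as noted later in the slides.
    polar = [inst for inst, p in zip(instances, step1_pred) if p == "polar"]
    step2 = make_pipeline(DictVectorizer(), AdaBoostClassifier())
    X2 = [step2_features(inst) for inst in polar]
    y2 = [inst["step2_gold"] for inst in polar]               # positive/negative/both/neutral
    return cross_val_predict(step2, X2, y2, cv=10)
```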

  8. Step 1 - Gold Class of Clue Instances
Given an instance inst from the lexicon, its gold class is defined as:
- if inst is not in a subjective expression: goldclass(inst) = neutral
- else if inst is in at least one positive and one negative subjective expression: goldclass(inst) = both
- else if inst is in a mixture of negative and neutral expressions: goldclass(inst) = negative
- else if inst is in a mixture of positive and neutral expressions: goldclass(inst) = positive
- else: goldclass(inst) = the contextual polarity of the subjective expression
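The same rules written out as a small Python function; `expressions` is an assumed representation of the annotated subjective expressions that contain the instance, each carrying its contextual polarity:

```python
# Sketch: gold-class assignment for a clue instance, following the rules above.

def gold_class(expressions):
    if not expressions:                        # not in any subjective expression
        return "neutral"
    polarities = {e["polarity"] for e in expressions}
    if "positive" in polarities and "negative" in polarities:
        return "both"
    if polarities == {"negative", "neutral"}:  # mixture of negative and neutral
        return "negative"
    if polarities == {"positive", "neutral"}:  # mixture of positive and neutral
        return "positive"
    return expressions[0]["polarity"]          # single polarity: inherit it
```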

  9. Step 1 - Features
1. Word features (e.g. outrageous in "Outrageous crimes against humanity."): token, part of speech, context, prior polarity, reliability
2. Modification features (binary fields):
- preceded by an adjective, an adverb (other than not), or an intensifier?
- is the word itself an intensifier?
- does it modify a strongsubj and/or weaksubj clue?
- is it modified by a strongsubj and/or weaksubj clue?
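A sketch of the word features as a feature dictionary; the sentence representation (a list of token/POS dicts) and the context encoding as previous/next word are illustrative assumptions:

```python
# Sketch: word features for a clue instance at position i in a sentence.
# `lexicon` maps a word to (prior polarity, reliability), as in the loading sketch.

def word_features(sent, i, lexicon):
    word = sent[i]["token"].lower()
    prior, reliability = lexicon.get(word, ("neutral", "weaksubj"))
    return {
        "token": word,
        "pos": sent[i]["pos"],
        "prev_word": sent[i - 1]["token"].lower() if i > 0 else "BOS",
        "next_word": sent[i + 1]["token"].lower() if i + 1 < len(sent) else "EOS",
        "prior_polarity": prior,      # positive / negative / both / neutral
        "reliability": reliability,   # strongsubj / weaksubj
    }
```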

  10. Step 1 - Features (Dependency Tree)

  11. Step 1 - Features (Cont'd)
3. Structure features: in subject (Human Rights), in copular (She's right), in passive voice (must be right)
4. Sentence features: count of strongsubj clues in the previous, current, and next sentence; count of weaksubj clues in the previous, current, and next sentence; counts of various parts of speech
5. Document feature: the topic of the document
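A sketch of the sentence features, counting strongsubj and weaksubj clues in the previous, current, and next sentence; the tokenized `sentences` input and the lexicon format follow the earlier sketches:

```python
# Sketch: clue counts per neighboring sentence.
# `sentences` is a list of token lists; `lexicon` maps word -> (polarity, reliability).

def sentence_features(sentences, sent_idx, lexicon):
    feats = {}
    for label, j in (("prev", sent_idx - 1), ("cur", sent_idx), ("next", sent_idx + 1)):
        strong = weak = 0
        if 0 <= j < len(sentences):
            for tok in sentences[j]:
                _, reliability = lexicon.get(tok.lower(), (None, None))
                if reliability == "strongsubj":
                    strong += 1
                elif reliability == "weaksubj":
                    weak += 1
        feats[f"strongsubj_{label}"] = strong
        feats[f"weaksubj_{label}"] = weak
    return feats
```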

  12. Step 2: Polarity Classification
Note that Step 1 is automatic, so noise remains: some instances that are actually neutral do get passed on to Step 2.
For this step, the polarity classification task is therefore still four-way: (1) positive, (2) negative, (3) both, (4) neutral.

  13. Step 2: Polarity Classification

  14. Step 2: Polarity Classification
Binary features:
Negated: 1. not good 2. does not look very good 3. not only good but amazing (here "not only" intensifies rather than negates, so it does not count as negation)
Negated subject: "No politically prudent Israeli could support either of them"
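A sketch of the two negation features; the word lists and the four-token lookback window are illustrative choices, and the "not only" case is handled by skipping negation terms followed by words like only or just:

```python
# Sketch: binary negation features for a clue at position i.

NEGATION_WORDS = {"not", "no", "never", "n't", "neither", "nor"}
FALSE_NEGATION_NEXT = {"only", "just", "merely"}   # "not only good" is not a negation

def negated(tokens, i, window=4):
    """True if a real negation term appears shortly before position i."""
    for j in range(max(0, i - window), i):
        tok = tokens[j].lower()
        nxt = tokens[j + 1].lower() if j + 1 < len(tokens) else ""
        if tok in NEGATION_WORDS and nxt not in FALSE_NEGATION_NEXT:
            return True
    return False

def negated_subject(subject_tokens):
    """True if the syntactic subject itself contains a negation term,
    as in 'No politically prudent Israeli could support either of them'."""
    return any(tok.lower() in NEGATION_WORDS for tok in subject_tokens)
```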

  15. Step 2: Polarity Classification
- Modifies polarity (5 values: positive, negative, neutral, both, not mod): in "substantial (pos) challenge (neg)", substantial modifies a negative word, so substantial : negative
- Modified by polarity (5 values: positive, negative, neutral, both, not mod): challenge is modified by a positive word, so challenge : positive
- Conjunction polarity (5 values: positive, negative, neutral, both, not mod): in "good (pos) and evil (neg)", good is conjoined with a negative word, so good : negative
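A sketch of these three features computed from dependency edges; the edge representation, the relation names (amod, advmod, conj), and the prior_pol lookup are assumptions for illustration, not the paper's exact parser output:

```python
# Sketch: modifies / modified-by / conjunction polarity features for word i.
# `deps` is a list of (head_index, relation, dependent_index) edges;
# prior_pol(j) returns the prior polarity of the word at position j.

def polarity_features(deps, i, prior_pol):
    feats = {"modifies_pol": "notmod", "modified_by_pol": "notmod", "conj_pol": "notmod"}
    for head, rel, dep in deps:
        if rel == "conj" and i in (head, dep):
            other = dep if i == head else head
            feats["conj_pol"] = prior_pol(other)          # good (pos) and evil (neg) -> negative
        elif rel in ("amod", "advmod"):
            if dep == i:                                  # word i modifies its head
                feats["modifies_pol"] = prior_pol(head)   # substantial -> challenge (neg)
            elif head == i:                               # word i is modified by dep
                feats["modified_by_pol"] = prior_pol(dep) # challenge <- substantial (pos)
    return feats
```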

  16. Step 2: Polarity Classification
- General polarity shifter: "pose little threat", "contains little truth"
- Negative polarity shifter: "lack of understanding"
- Positive polarity shifter: "abate the damage"
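A sketch of the three polarity-shifter features; the shifter word lists and the lookback window are illustrative stand-ins for the lists used in the paper:

```python
# Sketch: binary polarity-shifter features for a clue at position i.

GENERAL_SHIFTERS  = {"little", "pose", "hardly", "rarely"}   # e.g. "pose little threat"
NEGATIVE_SHIFTERS = {"lack", "absence", "loss"}              # e.g. "lack of understanding"
POSITIVE_SHIFTERS = {"abate", "alleviate", "mitigate"}       # e.g. "abate the damage"

def shifter_features(tokens, i, window=4):
    before = {t.lower() for t in tokens[max(0, i - window):i]}
    return {
        "general_shifter": bool(before & GENERAL_SHIFTERS),
        "negative_shifter": bool(before & NEGATIVE_SHIFTERS),
        "positive_shifter": bool(before & POSITIVE_SHIFTERS),
    }
```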

  17. Results of Step 2: Polarity Classification
- The classifier using all ten features significantly outperforms the two baseline classifiers
- A combination of features is needed to achieve significant improvement over the baselines

  18. Conclusions
Presented a two-step approach to phrase-level sentiment analysis:
(1) determine whether an expression is neutral or polar
(2) determine the contextual polarity of the expressions that are polar
The approach automatically identifies the contextual polarity of a large subset of sentiment expressions.

  19. Thank you! Q&A
Works Cited:
[1] Theresa Wilson, Janyce Wiebe, and Paul Hoffmann. 2005. Recognizing contextual polarity in phrase-level sentiment analysis. In Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing (HLT/EMNLP '05). Association for Computational Linguistics, Stroudsburg, PA, USA, 347-354. DOI: https://doi.org/10.3115/1220575.1220619
[2] Theresa Wilson, Janyce Wiebe, and Paul Hoffmann. Slides: https://www.slideserve.com/brendy/recognizing-contextual-polarity-in-phrase-level-sentiment-analysis
