Parts of Speech
• Generally speaking, the "grammatical type" of a word:
  • Verb, Noun, Adjective, Adverb, Article, …
• We can also include inflection:
  • Verbs: tense, number, …
  • Nouns: number, proper/common, …
  • Adjectives: comparative, superlative, …
  • …
• Most commonly used POS tag sets for English have 50–80 different tags
BNC Parts of Speech
• Nouns:
  NN0 Common noun, neutral for number (e.g. aircraft)
  NN1 Singular common noun (e.g. pencil, goose, time)
  NN2 Plural common noun (e.g. pencils, geese, times)
  NP0 Proper noun (e.g. London, Michael, Mars, IBM)
• Pronouns:
  PNI Indefinite pronoun (e.g. none, everything, one)
  PNP Personal pronoun (e.g. I, you, them, ours)
  PNQ Wh-pronoun (e.g. who, whoever, whom)
  PNX Reflexive pronoun (e.g. myself, itself, ourselves)
• Verbs:
  VVB Finite base form of lexical verbs (e.g. forget, send, live, return)
  VVD Past tense form of lexical verbs (e.g. forgot, sent, lived)
  VVG -ing form of lexical verbs (e.g. forgetting, sending, living)
  VVI Infinitive form of lexical verbs (e.g. forget, send, live, return)
  VVN Past participle form of lexical verbs (e.g. forgotten, sent, lived)
  VVZ -s form of lexical verbs (e.g. forgets, sends, lives, returns)
  VBB Present tense of BE, except for is … and so on: VBD, VBG, VBI, VBN, VBZ
  VDB Finite base form of DO: do … and so on: VDD, VDG, VDI, VDN, VDZ
  VHB Finite base form of HAVE: have, 've … and so on: VHD, VHG, VHI, VHN, VHZ
  VM0 Modal auxiliary verb (e.g. will, would, can, could, 'll, 'd)
• Articles and determiners:
  AT0 Article (e.g. the, a, an, no)
  DPS Possessive determiner (e.g. your, their, his)
  DT0 General determiner (e.g. this, that)
  DTQ Wh-determiner (e.g. which, what, whose, whichever)
  EX0 Existential there, i.e. occurring in "there is …" or "there are …"
• Adjectives:
  AJ0 Adjective (general or positive) (e.g. good, old, beautiful)
  AJC Comparative adjective (e.g. better, older)
  AJS Superlative adjective (e.g. best, oldest)
• Adverbs:
  AV0 General adverb (e.g. often, well, longer (adv.), furthest)
  AVP Adverb particle (e.g. up, off, out)
  AVQ Wh-adverb (e.g. when, where, how, why, wherever)
• Miscellaneous:
  CJC Coordinating conjunction (e.g. and, or, but)
  CJS Subordinating conjunction (e.g. although, when)
  CJT The subordinating conjunction that
  CRD Cardinal number (e.g. one, 3, fifty-five, 3609)
  ORD Ordinal numeral (e.g. first, sixth, 77th, last)
  ITJ Interjection or other isolate (e.g. oh, yes, mhm, wow)
  POS The possessive or genitive marker 's or '
  TO0 Infinitive marker to
  PUL Punctuation: left bracket, i.e. ( or [
  PUN Punctuation: general separating mark, i.e. . ! , : ; - or ?
  PUQ Punctuation: quotation mark, i.e. ' or "
  PUR Punctuation: right bracket, i.e. ) or ]
  XX0 The negative particle not or n't
  ZZ0 Alphabetical symbols (e.g. A, a, B, b, c, d)
Task: Part-Of-Speech Tagging
• Goal: assign the correct part of speech to each word (and punctuation mark) in a text.
• Example: The/AT0 old/AJ0 cat/NN1 sleeps/VVZ ./PUN
• Learn a local model of POS dependencies, usually from pre-tagged data
• No parsing
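As a minimal sketch of what pre-tagged data looks like (the sentence and its tags are a made-up illustration using the BNC-style tags listed above):

```python
# Pre-tagged training data: each sentence is a sequence of (word, tag) pairs.
# The sentence below is a made-up illustration using BNC-style tags from the
# tables above; a real corpus contains many thousands of such sentences.
tagged_sentence = [
    ("The", "AT0"), ("old", "AJ0"), ("cat", "NN1"), ("sleeps", "VVZ"), (".", "PUN"),
]

words = [w for w, t in tagged_sentence]   # observed output sequence
tags  = [t for w, t in tagged_sentence]   # underlying state (tag) sequence
```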
Hidden Markov Models
• Assume: the POS (state) sequence is generated as a time-invariant random process, and each POS randomly generates a word (output symbol)
[Diagram: a small example HMM with tag states AT0, AJ0, NN1, NN2, transition probabilities between them, and emission probabilities for words such as "the", "a", "cat", "cats", "men", "bet"]
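A minimal sketch of this generative view: a toy tag Markov chain that emits one word per tag. All probabilities, the start pseudo-tag "<t0>", and the explicit end marker "<end>" (the slides only mention a start tag) are illustrative assumptions, not values from the diagram.

```python
import random

# Hypothetical toy HMM: probabilities are made up; tags loosely follow the BNC set.
transitions = {                       # transitions[t_prev][t] = P(t | t_prev)
    "<t0>": {"AT0": 0.7, "NN1": 0.3},
    "AT0":  {"NN1": 0.6, "NN2": 0.2, "AJ0": 0.2},
    "AJ0":  {"NN1": 0.7, "NN2": 0.3},
    "NN1":  {"VVZ": 0.8, "<end>": 0.2},
    "NN2":  {"VVB": 0.8, "<end>": 0.2},
    "VVZ":  {"<end>": 1.0},
    "VVB":  {"<end>": 1.0},
}
emissions = {                         # emissions[t][w] = P(w | t)
    "AT0": {"the": 0.6, "a": 0.4},
    "AJ0": {"old": 0.5, "good": 0.5},
    "NN1": {"cat": 0.5, "man": 0.5},
    "NN2": {"cats": 0.5, "men": 0.5},
    "VVZ": {"sleeps": 0.5, "runs": 0.5},
    "VVB": {"sleep": 0.5, "run": 0.5},
}

def sample(dist):
    """Draw one key from a {outcome: probability} dict."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def generate():
    """Follow the tag Markov chain, emitting one word per tag, until <end>."""
    tag, output = "<t0>", []
    while True:
        tag = sample(transitions[tag])
        if tag == "<end>":
            return output
        output.append((sample(emissions[tag]), tag))

print(generate())   # e.g. [('the', 'AT0'), ('cat', 'NN1'), ('sleeps', 'VVZ')]
```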
Definition of HMM for Tagging
• Set of states – all possible tags
• Output alphabet – all words in the language
• State/tag transition probabilities
• Initial state probabilities: the probability of beginning a sentence with tag t, i.e. P(t0→t)
• Output probabilities – probability of producing word w at state t
• Output sequence – observed word sequence
• State sequence – underlying tag sequence
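One way to bundle these components in code (the class name, field names, and the tiny probabilities below are my own illustrative choices, not from the slides):

```python
from dataclasses import dataclass

@dataclass
class HMMTagger:
    """Container for the five model components listed above (names illustrative)."""
    tags: set          # set of states = the tagset
    vocabulary: set    # output alphabet = all words
    initial: dict      # P(tag begins a sentence), i.e. P(t0 -> t)
    transition: dict   # transition[t_prev][t] = P(t | t_prev)
    emission: dict     # emission[t][w] = P(w | t)

# Tiny illustrative instance (all probabilities are made up):
model = HMMTagger(
    tags={"AT0", "NN1", "VVZ"},
    vocabulary={"the", "cat", "sleeps"},
    initial={"AT0": 0.8, "NN1": 0.15, "VVZ": 0.05},
    transition={"AT0": {"NN1": 0.9, "VVZ": 0.1},
                "NN1": {"VVZ": 0.9, "AT0": 0.1},
                "VVZ": {"AT0": 0.5, "NN1": 0.5}},
    emission={"AT0": {"the": 1.0},
              "NN1": {"cat": 1.0},
              "VVZ": {"sleeps": 1.0}},
)
```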
HMMs For Tagging
• First-order (bigram) Markov assumptions:
  • Limited horizon: a tag depends only on the previous tag
    P(t_{i+1} = t_k | t_1 = t_{j1}, …, t_i = t_{ji}) = P(t_{i+1} = t_k | t_i = t_j)
  • Time invariance: no change over time
    P(t_{i+1} = t_k | t_i = t_j) = P(t_2 = t_k | t_1 = t_j) = P(t_j→t_k)
• Output probabilities:
  • Probability of getting word w_k for tag t_j: P(w_k | t_j)
  • Assumption: not dependent on other tags or words!
Combining Probabilities
• Probability of a tag sequence:
  P(t_1 t_2 … t_n) = P(t_1) P(t_1→t_2) P(t_2→t_3) … P(t_{n-1}→t_n)
  Assume a starting tag t_0:
  = P(t_0→t_1) P(t_1→t_2) P(t_2→t_3) … P(t_{n-1}→t_n)
• Probability of a word sequence and tag sequence together:
  P(W, T) = ∏_i P(t_{i-1}→t_i) P(w_i | t_i)
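To make the product concrete, here is a minimal sketch that scores one (word sequence, tag sequence) pair under the first-order assumptions from the previous slide. The function name, the start pseudo-tag "<t0>", and all toy probabilities are my own illustrative choices.

```python
import math

def joint_log_prob(words, tags, transition, emission, start="<t0>"):
    """log P(W, T) = sum_i [ log P(t_{i-1} -> t_i) + log P(w_i | t_i) ],
    with t_0 a start pseudo-tag, as on the slide."""
    logp, prev = 0.0, start
    for w, t in zip(words, tags):
        logp += math.log(transition[prev][t]) + math.log(emission[t][w])
        prev = t
    return logp

# Made-up toy model; the "<t0>" row plays the role of P(t0 -> t1):
transition = {"<t0>": {"AT0": 0.8, "NN1": 0.2},
              "AT0": {"NN1": 1.0},
              "NN1": {"VVZ": 1.0}}
emission = {"AT0": {"the": 0.6, "a": 0.4},
            "NN1": {"cat": 0.5, "dog": 0.5},
            "VVZ": {"sleeps": 1.0}}

lp = joint_log_prob(["the", "cat", "sleeps"], ["AT0", "NN1", "VVZ"],
                    transition, emission)
print(math.exp(lp))   # P(W, T) = 0.8 * 0.6 * 1.0 * 0.5 * 1.0 * 1.0 = 0.24
```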
Training from Labeled Corpus
• Labeled training data = each word has a POS tag
• Thus:
  P_MLE(t_j) = C(t_j) / N
  P_MLE(t_j→t_k) = C(t_j, t_k) / C(t_j)
  P_MLE(w_k | t_j) = C(t_j : w_k) / C(t_j)
• Smoothing applies as usual
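A minimal sketch of these relative-frequency estimates, with no smoothing; the function name, the start pseudo-tag, and the two-sentence toy corpus are illustrative assumptions.

```python
from collections import Counter, defaultdict

def train_mle(tagged_sentences, start="<t0>"):
    """Relative-frequency (MLE) estimates as on the slide, without smoothing:
       P(tj -> tk) = C(tj, tk) / C(tj)    and    P(wk | tj) = C(tj : wk) / C(tj)."""
    tag_count = Counter()
    trans_count = defaultdict(Counter)   # trans_count[tj][tk] = C(tj, tk)
    emit_count = defaultdict(Counter)    # emit_count[tj][wk]  = C(tj : wk)
    for sent in tagged_sentences:
        prev = start
        tag_count[start] += 1
        for w, t in sent:
            trans_count[prev][t] += 1
            emit_count[t][w] += 1
            tag_count[t] += 1
            prev = t
    transition = {j: {k: c / tag_count[j] for k, c in row.items()}
                  for j, row in trans_count.items()}
    emission = {j: {w: c / tag_count[j] for w, c in row.items()}
                for j, row in emit_count.items()}
    return transition, emission

# Made-up two-sentence "corpus" using BNC-style tags from the tables above:
corpus = [
    [("the", "AT0"), ("cat", "NN1"), ("sleeps", "VVZ")],
    [("a", "AT0"), ("dog", "NN1"), ("barks", "VVZ")],
]
transition, emission = train_mle(corpus)
print(transition["AT0"]["NN1"])   # C(AT0, NN1) / C(AT0) = 2/2 = 1.0
print(emission["NN1"]["cat"])     # C(NN1 : cat) / C(NN1) = 1/2 = 0.5
```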
Viterbi Tagging
• Most probable tag sequence given the text:
  T* = arg max_T P_m(T | W)
     = arg max_T P_m(W | T) P_m(T) / P_m(W)    (Bayes' theorem)
     = arg max_T P_m(W | T) P_m(T)             (W is constant for all T)
     = arg max_T ∏_i [ m(t_{i-1}→t_i) m(w_i | t_i) ]
     = arg max_T ∑_i log[ m(t_{i-1}→t_i) m(w_i | t_i) ]
• Exponential number of possible tag sequences – use dynamic programming for efficient computation
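For contrast with the dynamic-programming solution on the next slide, here is a brute-force sketch of the same argmax that simply enumerates every tag sequence (exponential in sentence length, which is exactly what Viterbi avoids). The toy model, the tiny floor probability for unseen events, and the function name are illustrative assumptions.

```python
import math
from itertools import product

def brute_force_tag(words, tags, transition, emission, start="<t0>"):
    """arg max over ALL tag sequences of sum_i log[ m(t_{i-1}->t_i) m(w_i|t_i) ].
    Exponential in sentence length; shown only to motivate the Viterbi DP."""
    def score(seq):
        logp, prev = 0.0, start
        for w, t in zip(words, seq):
            logp += math.log(transition[prev].get(t, 1e-12))   # floor, not smoothing
            logp += math.log(emission[t].get(w, 1e-12))
            prev = t
        return logp
    return max(product(tags, repeat=len(words)), key=score)

# Made-up toy model: "saw" can be a noun (NN1) or a past-tense verb (VVD).
transition = {"<t0>": {"AT0": 0.7, "NN1": 0.2, "VVD": 0.1},
              "AT0": {"NN1": 0.8, "VVD": 0.2},
              "NN1": {"VVD": 0.6, "NN1": 0.2, "AT0": 0.2},
              "VVD": {"AT0": 0.6, "NN1": 0.4}}
emission = {"AT0": {"the": 1.0},
            "NN1": {"saw": 0.3, "man": 0.7},
            "VVD": {"saw": 1.0}}

print(brute_force_tag(["the", "man", "saw", "the", "saw"],
                      ["AT0", "NN1", "VVD"], transition, emission))
# -> ('AT0', 'NN1', 'VVD', 'AT0', 'NN1')
```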
[Diagram: Viterbi trellis for words w1 w2 w3, with a start state t0 and tag states t1, t2, t3 at each position; edges and nodes are labelled with accumulated log probabilities]
Viterbi Algorithm
• D(0, START) = 0
• for each tag t ≠ START do: D(0, t) = −∞
• for i ← 1 to N do:
  • for each tag t_j do:
    D(i, t_j) ← max_k [ D(i−1, t_k) + lm(t_k→t_j) ] + lm(w_i | t_j)
    Record best(i, j) = k which yielded the max
• log P(W, T) = max_j D(N, t_j)
• Reconstruct the path from max_j backwards
Where: lm(.) = log m(.), and D(i, t_j) is the maximal joint log probability of the state and word sequences up to position i, ending at t_j.
Complexity: O(N_t² · N), where N_t is the number of tags and N the sentence length.
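A minimal runnable sketch of the algorithm above in log space, with back-pointers for path reconstruction. The dictionary-based model format, the start pseudo-tag "<t0>", and the tiny floor probability standing in for proper smoothing are illustrative assumptions.

```python
import math

def viterbi(words, tags, transition, emission, start="<t0>"):
    """Log-space Viterbi following the slide's recurrence:
        D(i, tj) = max_k [ D(i-1, tk) + lm(tk -> tj) ] + lm(wi | tj)
    with back-pointers best(i, tj) for reconstructing the best tag sequence."""
    def lm(p):                        # lm(.) = log m(.), with a tiny floor
        return math.log(p) if p > 0 else math.log(1e-12)

    N = len(words)
    D = [{} for _ in range(N + 1)]    # D[i][t]: best joint log prob ending in t at i
    best = [{} for _ in range(N + 1)]
    D[0][start] = 0.0                 # D(0, START) = 0; other D(0, t) treated as -inf

    for i in range(1, N + 1):
        for tj in tags:
            score, back = max(
                ((D[i - 1][tk] + lm(transition.get(tk, {}).get(tj, 0.0)), tk)
                 for tk in D[i - 1]),
                key=lambda pair: pair[0])
            D[i][tj] = score + lm(emission.get(tj, {}).get(words[i - 1], 0.0))
            best[i][tj] = back        # best(i, tj) = the tk that yielded the max

    # log P(W, T) = max_j D(N, tj); reconstruct the path backwards from there.
    last = max(D[N], key=D[N].get)
    path = [last]
    for i in range(N, 1, -1):
        path.append(best[i][path[-1]])
    return list(reversed(path))

# Toy model reused from the brute-force sketch above ("saw" is noun or verb):
transition = {"<t0>": {"AT0": 0.7, "NN1": 0.2, "VVD": 0.1},
              "AT0": {"NN1": 0.8, "VVD": 0.2},
              "NN1": {"VVD": 0.6, "NN1": 0.2, "AT0": 0.2},
              "VVD": {"AT0": 0.6, "NN1": 0.4}}
emission = {"AT0": {"the": 1.0},
            "NN1": {"saw": 0.3, "man": 0.7},
            "VVD": {"saw": 1.0}}

print(viterbi(["the", "man", "saw", "the", "saw"],
              ["AT0", "NN1", "VVD"], transition, emission))
# ['AT0', 'NN1', 'VVD', 'AT0', 'NN1'] -- same answer as the brute force above
```

Unlike the brute-force enumeration, each position only needs the best score for each possible previous tag, which is what gives the O(N_t² · N) complexity stated on the slide.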