CS460/626: Natural Language Processing/Speech, NLP and the Web (Lecture 29 – CYK; Inside Probability; Parse Tree Construction). Pushpak Bhattacharyya, CSE Dept., IIT Bombay, 22nd March 2011
Penn POS Tags • John wrote those words in the Book of Proverbs. [John/NNP ] wrote/VBD [ those/DT words/NNS ] in/IN [ the/DT Book/NN ] of/IN [ Proverbs/NNS ]
Penn Treebank
• John wrote those words in the Book of Proverbs.
(S (NP-SBJ (NP John))
   (VP wrote
       (NP those words)
       (PP-LOC in
           (NP (NP-TTL (NP the Book)
                       (PP of (NP Proverbs)))))))
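Bracketed parses like the one above can be loaded and inspected programmatically. A minimal sketch, assuming the NLTK library is installed; the bracketing string is simply the balanced parse above.

```python
# Minimal sketch: read a Penn-Treebank-style bracketing with NLTK and inspect it.
# Assumes NLTK is installed (pip install nltk).
from nltk import Tree

bracketing = """
(S (NP-SBJ (NP John))
   (VP wrote
       (NP those words)
       (PP-LOC in
           (NP (NP-TTL (NP the Book)
                       (PP of (NP Proverbs)))))))
"""

tree = Tree.fromstring(bracketing)
print(tree.leaves())   # the words of the sentence, in order
tree.pretty_print()    # ASCII rendering of the constituency tree
```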
PSG Parse Tree
• Official trading in the shares will start in Paris on Nov 6.
• (Tree figure: S → NP VP; the subject NP "official trading in the shares" expands to AP (A official) + N (trading) + PP (in the shares); the VP "will start in Paris on Nov 6" expands to Aux (will) + V (start) + PP (in Paris) + PP (on Nov 6).)
Penn POS Tags • Official trading in the shares will start in Paris on Nov 6. [ Official/JJ trading/NN ] in/IN [ the/DT shares/NNS ] will/MD start/VB in/IN [ Paris/NNP ] on/IN [ Nov./NNP 6/CD ]
Penn POS Tag Set
• Adjective: JJ
• Adverb: RB
• Cardinal Number: CD
• Determiner: DT
• Preposition: IN
• Coordinating Conjunction: CC
• Subordinating Conjunction: IN
• Singular Noun: NN
• Plural Noun: NNS
• Personal Pronoun: PRP
• Proper Noun: NNP
• Verb base form: VB
• Modal verb: MD
• Verb (3sg Pres): VBZ
• Wh-determiner: WDT
• Wh-pronoun: WP
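Taggers trained on the Penn Treebank emit exactly these tags. Below is a minimal sketch using NLTK's default English tagger (an assumption; any Penn-tagset tagger would do), whose tokenizer and tagger resources must be downloaded first.

```python
# Minimal sketch: obtain Penn-Treebank POS tags with NLTK's default tagger.
# Assumes NLTK is installed and its resources have been downloaded, e.g.
# nltk.download('punkt') and nltk.download('averaged_perceptron_tagger').
import nltk

sentence = "John wrote those words in the Book of Proverbs."
tokens = nltk.word_tokenize(sentence)
print(nltk.pos_tag(tokens))   # [(word, Penn tag), ...], e.g. ('John', 'NNP'), ('wrote', 'VBD'), ...
```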
CYK Parsing (some slides borrowed from Jimmy Lin's "Syntactic Parsing with CFGs")
Shared Sub-Problems • Observation: ambiguous parses still share sub-trees • We don’t want to redo work that’s already been done • Unfortunately, naïve backtracking leads to duplicate work
Efficient Parsing
• Dynamic programming to the rescue!
• Intuition: store partial results in tables, thereby:
  • Avoiding repeated work on shared sub-problems
  • Efficiently storing ambiguous structures with shared sub-parts
• Two algorithms:
  • CKY: roughly, bottom-up
  • Earley: roughly, top-down
CKY Parsing: CNF
• CKY parsing requires that the grammar consist of ε-free, binary rules = Chomsky Normal Form
• All rules of one of two forms:
  • A → B C (two non-terminals), or
  • D → w (a single terminal)
• What does the tree look like?
• What if my CFG isn't in CNF?
CKY Parsing with Arbitrary CFGs
• Problem: my grammar has rules like VP → NP PP PP
  • Can't apply CKY!
• Solution: rewrite the grammar into CNF
  • Introduce new intermediate non-terminals into the grammar
  • For example, A → B C D becomes A → X D and X → B C, where X is a symbol that doesn't occur anywhere else in the grammar (see the sketch below)
• What does this mean? = weak equivalence
  • The rewritten grammar accepts (and rejects) the same set of strings as the original grammar…
  • But the resulting derivations (trees) are different
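A minimal sketch of this binarization step only (illustrative; unit rules and ε-removal, which a full CNF conversion also needs, are not handled here):

```python
# Binarize long right-hand sides: A -> B C D ... becomes X -> B C, A -> X D ...
def binarize(rules):
    """rules: list of (lhs, [rhs symbols]); returns a weakly equivalent rule
    list in which every right-hand side has at most two symbols."""
    out, fresh = [], 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            fresh += 1
            x = f"X{fresh}"            # fresh non-terminal, used nowhere else
            out.append((x, rhs[:2]))   # X -> B C
            rhs = [x] + rhs[2:]        # continue with A -> X D ...
        out.append((lhs, rhs))
    return out

print(binarize([("VP", ["VBD", "NP", "PP", "PP"])]))
# [('X1', ['VBD', 'NP']), ('X2', ['X1', 'PP']), ('VP', ['X2', 'PP'])]
```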
CKY Parsing: Intuition
• Consider the rule D → w
  • A terminal (word) forms a constituent
  • Trivial to apply
• Consider the rule A → B C
  • If there is an A spanning part of the input, then there must be a B followed by a C inside that span
  • First, precisely define span [ i, j ]
  • If A spans from i to j in the input, then there must be some k with i < k < j such that B spans [ i, k ] and C spans [ k, j ]
  • Easy to apply: we just need to try different values for k
CKY Parsing: Table • Any constituent can conceivably span [ i, j ] for all 0≤i<j≤N, where N = length of input string • We need an N × N table to keep track of all spans… • But we only need half of the table • Semantics of table: cell [ i, j ] contains A iff A spans i to j in the input string • Of course, must be allowed by the grammar!
CKY Parsing: Table-Filling
• In order for A to span [ i, j ]:
  • A → B C is a rule in the grammar, and
  • There must be a B in [ i, k ] and a C in [ k, j ] for some i < k < j
• Operationally:
  • To apply rule A → B C, look for a B in [ i, k ] and a C in [ k, j ]
  • In the table: look left in the row and down in the column
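The table-filling rule above translates almost line for line into code. A minimal CKY recognizer sketch (the grammar representation and the tiny example grammar are illustrative assumptions, not from the slides):

```python
# Minimal CKY recognizer sketch for a grammar in CNF.
# 'lexical' maps a word to the set of non-terminals rewriting to it (D -> w);
# 'binary' lists (A, B, C) for rules A -> B C.
from collections import defaultdict

def cky_recognize(words, lexical, binary, start="S"):
    n = len(words)
    table = defaultdict(set)                     # cell (i, j) = non-terminals spanning words i..j
    for i, w in enumerate(words):                # width-1 spans come from the lexicon
        table[(i, i + 1)] |= lexical.get(w, set())
    for width in range(2, n + 1):                # fill wider spans bottom-up
        for i in range(0, n - width + 1):
            j = i + width
            for k in range(i + 1, j):            # try every split point
                for A, B, C in binary:           # look left in the row, down in the column
                    if B in table[(i, k)] and C in table[(k, j)]:
                        table[(i, j)].add(A)
    return start in table[(0, n)]

# Tiny usage example with a toy grammar (assumed for illustration):
lex = {"dogs": {"NP"}, "bark": {"VP"}}
print(cky_recognize(["dogs", "bark"], lex, [("S", "NP", "VP")]))   # True
```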
CKY Parsing: Recognize or Parse • Is this really a parser? • Recognizer to parser: add backpointers!
CKY: Algorithmic Complexity • What's the asymptotic complexity of CKY? • O(n³)
CKY: Analysis
• Since it's bottom-up, CKY populates the table with a lot of "phantom constituents"
  • Spans that are constituents, but cannot really occur in the context in which they are suggested
• Conversion of the grammar to CNF adds additional non-terminals
  • Leads to weak equivalence w.r.t. the original grammar
  • The additional non-terminal nodes are not (linguistically) meaningful, but they can be cleaned up with post-processing
• Is there a parsing algorithm for arbitrary CFGs that combines dynamic programming and top-down control?
  • Yes: Earley parsing
Penn Treebank
• Official trading in the shares will start in Paris on Nov 6.
( (S (NP-SBJ (NP Official trading)
             (PP in (NP the shares)))
     (VP will
         (VP start
             (PP-LOC in (NP Paris))
             (PP-TMP on (NP (NP Nov 6)))))) )
Probabilistic Context Free Grammars
• DT → the 1.0
• NN → gunman 0.5
• NN → building 0.5
• VBD → sprayed 1.0
• NNS → bullets 1.0
• S → NP VP 1.0
• NP → DT NN 0.5
• NP → NNS 0.3
• NP → NP PP 0.2
• PP → P NP 1.0
• VP → VP PP 0.6
• VP → VBD NP 0.4
Example Parse t1
• The gunman sprayed the building with bullets.
(S1.0 (NP0.5 (DT1.0 The) (NN0.5 gunman))
      (VP0.6 (VP0.4 (VBD1.0 sprayed)
                    (NP0.5 (DT1.0 the) (NN0.5 building)))
             (PP1.0 (P1.0 with) (NP0.3 (NNS1.0 bullets)))))
P(t1) = 1.0 * 0.5 * 1.0 * 0.5 * 0.6 * 0.4 * 1.0 * 0.5 * 1.0 * 0.5 * 1.0 * 1.0 * 0.3 * 1.0 = 0.0045
Another Parse t2
• The gunman sprayed the building with bullets.
(S1.0 (NP0.5 (DT1.0 The) (NN0.5 gunman))
      (VP0.4 (VBD1.0 sprayed)
             (NP0.2 (NP0.5 (DT1.0 the) (NN0.5 building))
                    (PP1.0 (P1.0 with) (NP0.3 (NNS1.0 bullets))))))
P(t2) = 1.0 * 0.5 * 1.0 * 0.5 * 0.4 * 1.0 * 0.2 * 0.5 * 1.0 * 0.5 * 1.0 * 1.0 * 0.3 * 1.0 = 0.0015
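The two tree probabilities can be checked by multiplying the probabilities of the rules used in each tree. A small sketch; the rule P → with 1.0, which appears in the trees but not in the rule list above, is included as an assumption, and math.prod needs Python 3.8+.

```python
# Recompute P(t1) and P(t2) as products of the rule probabilities used in each tree.
from math import prod

rules = {  # probability of each rule in the toy PCFG
    ("S", "NP VP"): 1.0, ("NP", "DT NN"): 0.5, ("NP", "NNS"): 0.3,
    ("NP", "NP PP"): 0.2, ("PP", "P NP"): 1.0, ("VP", "VP PP"): 0.6,
    ("VP", "VBD NP"): 0.4, ("DT", "the"): 1.0, ("NN", "gunman"): 0.5,
    ("NN", "building"): 0.5, ("VBD", "sprayed"): 1.0,
    ("NNS", "bullets"): 1.0, ("P", "with"): 1.0,   # P -> with assumed, as in the trees
}

t1 = [("S", "NP VP"), ("NP", "DT NN"), ("DT", "the"), ("NN", "gunman"),
      ("VP", "VP PP"), ("VP", "VBD NP"), ("VBD", "sprayed"),
      ("NP", "DT NN"), ("DT", "the"), ("NN", "building"),
      ("PP", "P NP"), ("P", "with"), ("NP", "NNS"), ("NNS", "bullets")]

t2 = [("S", "NP VP"), ("NP", "DT NN"), ("DT", "the"), ("NN", "gunman"),
      ("VP", "VBD NP"), ("VBD", "sprayed"), ("NP", "NP PP"),
      ("NP", "DT NN"), ("DT", "the"), ("NN", "building"),
      ("PP", "P NP"), ("P", "with"), ("NP", "NNS"), ("NNS", "bullets")]

print(prod(rules[r] for r in t1))   # ≈ 0.0045
print(prod(rules[r] for r in t2))   # ≈ 0.0015
```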
Illustrating the CYK [Cocke, Younger, Kasami] Algorithm
• DT → the 1.0
• NN → gunman 0.5
• NN → building 0.5
• VBD → sprayed 1.0
• NNS → bullets 1.0
• S → NP VP 1.0
• NP → DT NN 0.5
• NP → NNS 0.3
• NP → NP PP 0.2
• PP → P NP 1.0
• VP → VP PP 0.6
• VP → VBD NP 0.4
CYK Chart Filling, Step by Step
• Input positions: 0 The 1 gunman 2 sprayed 3 the 4 building 5 with 6 bullets 7 .
• (The CYK table figures are filled in cell by cell across the following slides:)
  • CYK: Start with (0,1)
  • CYK: Keep filling diagonals
  • CYK: Try getting higher level structures
  • CYK: Diagonal continues
  • CYK: Starts filling the 5th column
  • CYK: S found, but NO termination!
  • CYK: Control moves to last column
  • CYK: Filling the last column
  • CYK: Terminates with S in (0,7)
CYK: Extracting the Parse Tree
• The parse tree is obtained by keeping back pointers.
• Tree recovered from the chart, with each node annotated by its span:
(S(0-7) (NP(0-2) (DT(0-1) The) (NN(1-2) gunman))
        (VP(2-7) (VBD(2-3) sprayed)
                 (NP(3-7) (NP(3-5) (DT(3-4) the) (NN(4-5) building))
                          (PP(5-7) (P(5-6) with) (NP(6-7) (NNS(6-7) bullets))))))
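Putting the pieces together, here is a minimal sketch of probabilistic CYK with back pointers for the toy grammar, plus tree reconstruction from those back pointers. As an assumption, the unit rule NP → NNS is folded into the lexicon (so "bullets" enters its cell as both NNS and NP) to keep the grammar effectively in CNF; the sketch keeps, for each cell and non-terminal, only the highest-probability derivation.

```python
# Probabilistic CYK with back pointers for the toy PCFG (illustrative sketch).
from collections import defaultdict

lexicon = {  # word -> {tag: probability}
    "the": {"DT": 1.0}, "gunman": {"NN": 0.5}, "building": {"NN": 0.5},
    "sprayed": {"VBD": 1.0}, "with": {"P": 1.0},
    "bullets": {"NNS": 1.0, "NP": 0.3},   # NP via the unit rule NP -> NNS, folded in
}
binary = [("S", "NP", "VP", 1.0), ("NP", "DT", "NN", 0.5),
          ("NP", "NP", "PP", 0.2), ("PP", "P", "NP", 1.0),
          ("VP", "VP", "PP", 0.6), ("VP", "VBD", "NP", 0.4)]

def cky_parse(words):
    n = len(words)
    best = defaultdict(dict)                  # (i, j) -> {A: best probability}
    back = {}                                 # (i, j, A) -> (k, B, C)
    for i, w in enumerate(words):
        best[(i, i + 1)] = dict(lexicon[w])
    for width in range(2, n + 1):
        for i in range(0, n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                for A, B, C, p in binary:
                    if B in best[(i, k)] and C in best[(k, j)]:
                        prob = p * best[(i, k)][B] * best[(k, j)][C]
                        if prob > best[(i, j)].get(A, 0.0):
                            best[(i, j)][A] = prob
                            back[(i, j, A)] = (k, B, C)
    return best, back

def build_tree(words, back, i, j, A):
    """Follow back pointers to rebuild the highest-probability tree."""
    if (i, j, A) not in back:                 # lexical cell: no back pointer
        return (A, words[i])
    k, B, C = back[(i, j, A)]
    return (A, build_tree(words, back, i, k, B),
               build_tree(words, back, k, j, C))

words = "the gunman sprayed the building with bullets".split()
best, back = cky_parse(words)
print(best[(0, len(words))]["S"])             # ≈ 0.0045
print(build_tree(words, back, 0, len(words), "S"))
```

Under this grammar, the reconstructed tree is the VP-attachment analysis (parse t1), whose probability matches the hand computation above; the NP-attachment analysis (t2) is the lower-probability alternative.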