LING 581: Advanced Computational Linguistics Lecture Notes February 2nd
Homework Exercise • Use the bracketing guides and choose three “interesting” constructions • Find all occurrences in the WSJ PTB
Homework Exercise • 581 Homework rules • Due next lecture • Present your findings in class (slides)
Today’s Lecture • More on Bikel Collins parser • Evaluation: EVALB • Homework
Bikel Collins • Paper • Daniel M. Bikel. 2004. Intricacies of Collins’ Parsing Model. Computational Linguistics, 30(4), pp. 479–511. • http://www.cis.upenn.edu/~dbikel/papers/collins-intricacies.pdf
Observations from Training Data • (mod ((with IN) (milk NN) PP (+START+) ((+START+ +START+)) NP-A NPB () false right) 1.0) • modHeadWord (with IN) • headWord (milk NN) • modifier PP • previousMods (+START+) • previousWords ((+START+ +START+)) • parent NP-A • head NPB • subcat () • verbIntervening false • side right • (mod ((+STOP+ +STOP+) (milk NN) +STOP+ (PP) ((with IN)) NP-A NPB () false right) 1.0) • modHeadWord (+STOP+ +STOP+) • headWord (milk NN) • modifier +STOP+ • previousMods (PP) • previousWords ((with IN)) • parent NP-A • head NPB • subcat () • verbIntervening false • side right • Both events are observed with frequency 1 in the training data, for: • (NP (NP (DT a) (NN milk)) (PP (IN with) (NP (ADJP (CD 4) (NN %)) (NN butterfat))))
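To make the structure of these modifier events concrete, here is a minimal Python sketch representing the first event above as a record. This is illustrative only: Bikel's actual implementation is in Java, and the ModEvent name is invented here; the field names follow the listing above.

from collections import namedtuple

# One Collins-model modifier event; field names follow the listing above.
ModEvent = namedtuple("ModEvent", [
    "modHeadWord",      # head word/tag pair of the modifier constituent
    "headWord",         # head word/tag pair of the modified constituent
    "modifier",         # nonterminal label of the modifier
    "previousMods",     # labels of previously generated modifiers
    "previousWords",    # head words of previously generated modifiers
    "parent",           # parent nonterminal (NP-A: argument NP)
    "head",             # head child nonterminal (NPB: base NP)
    "subcat",           # remaining subcat frame (empty here)
    "verbIntervening",  # whether a verb intervenes between head and modifier
    "side",             # which side of the head the modifier is on
])

# The first event above: PP("with") modifying NPB("milk") inside NP-A.
event = ModEvent(("with", "IN"), ("milk", "NN"), "PP",
                 ("+START+",), (("+START+", "+START+"),),
                 "NP-A", "NPB", (), False, "right")
print(event.modifier, event.side)   # PP right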
Observations from Training Data • 76.8% of observed events occur only once (singletons) • 94.2% of observed events occur 5 or fewer times
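These sparsity figures are easy to compute once you have event counts; a minimal sketch, using a hypothetical toy event list in place of the real (mod ...) events:

from collections import Counter

# Hypothetical toy data; in practice, count the (mod ...) events
# extracted from the training trees.
counts = Counter(["e1", "e1", "e2", "e3", "e3", "e3", "e4"])

distinct = len(counts)
singletons = sum(1 for c in counts.values() if c == 1)
rare = sum(1 for c in counts.values() if c <= 5)

print(f"{100 * singletons / distinct:.1f}% singleton events")
print(f"{100 * rare / distinct:.1f}% occur 5 or fewer times")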
Example of Brittleness • “Milk” example …
EVALB • How to evaluate parsing accuracy? • Count bracketing matches • (LR) Bracketing recall = (number of correct constituents) / (number of constituents in the gold file) • (LP) Bracketing precision = (number of correct constituents) / (number of constituents in the parsed file) • The program is called evalb (a sketch of the two metrics follows below) • http://nlp.cs.nyu.edu/evalb/ • written in C • get it to compile on your system (Makefile)
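In code, the two measures are just ratios over constituent counts. A minimal sketch; the example numbers correspond to sentence 2 of the sample run shown later (3 correct brackets against 4 gold and 4 test constituents):

def bracketing_scores(correct, gold_total, test_total):
    """Labelled bracketing recall (LR) and precision (LP), as percentages."""
    recall = 100.0 * correct / gold_total      # against the gold file
    precision = 100.0 * correct / test_total   # against the parser output
    return recall, precision

print(bracketing_scores(3, 4, 4))   # (75.0, 75.0)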
EVALB • Running file evalb on the distributed binary: evalb: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.2.0, not stripped • This is a 32-bit Linux executable, so build from the C source instead: use the command make to build the executable on your machine
EVALB • Example (on Mac OS X) douglass-dhcp8:EVALB sandiway$ make gcc -Wall -g -o evalb evalb.c evalb.c:25:20: error: malloc.h: No such file or directory evalb.c: In function ‘main’: evalb.c:379: warning: pointer targets in passing argument 1 of ‘fgets’ differ in signedness
EVALB On Mac OS X the build fails because malloc.h does not exist: delete the #include <malloc.h> line (line 25 of evalb.c, per the error above); malloc is declared in <stdlib.h> on this platform
EVALB You can ignore the C compiler warnings • douglass-dhcp8:EVALB sandiway$ make • gcc -Wall -g -o evalb evalb.c • evalb.c: In function ‘main’: • evalb.c:378: warning: pointer targets in passing argument 1 of ‘fgets’ differ in signedness • evalb.c:385: warning: pointer targets in passing argument 1 of ‘__builtin___strcpy_chk’ differ in signedness • evalb.c:385: warning: pointer targets in passing argument 2 of ‘__builtin___strcpy_chk’ differ in signedness • evalb.c:385: warning: pointer targets in passing argument 1 of ‘__inline_strcpy_chk’ differ in signedness • evalb.c:385: warning: pointer targets in passing argument 2 of ‘__inline_strcpy_chk’ differ in signedness • evalb.c:388: warning: pointer targets in passing argument 1 of ‘fgets’ differ in signedness • evalb.c:403: warning: pointer targets in passing argument 1 of ‘fgets’ differ in signedness • evalb.c: In function ‘calc_result’: • evalb.c:878: warning: pointer targets in passing argument 2 of ‘__builtin___strncpy_chk’ differ in signedness • evalb.c:878: warning: pointer targets in passing argument 2 of ‘__inline_strncpy_chk’ differ in signedness • evalb.c:892: warning: pointer targets in passing argument 2 of ‘__builtin___strncpy_chk’ differ in signedness • evalb.c:892: warning: pointer targets in passing argument 2 of ‘__inline_strncpy_chk’ differ in signedness • evalb.c:904: warning: pointer targets in passing argument 2 of ‘__builtin___strncpy_chk’ differ in signedness • evalb.c:904: warning: pointer targets in passing argument 2 of ‘__inline_strncpy_chk’ differ in signedness • evalb.c:932: warning: pointer targets in passing argument 2 of ‘__builtin___strncpy_chk’ differ in signedness • evalb.c:932: warning: pointer targets in passing argument 2 of ‘__inline_strncpy_chk’ differ in signedness • douglass-dhcp8:EVALB sandiway$ file evalb • evalb: Mach-O 64-bit executable x86_64
EVALB [6] THE PARAMETER (.prm) FILE The .prm file sets options regarding the scoring method. COLLINS.prm gives the same scoring behaviour as the scorer used in (Collins 97). The options chosen were: 1) LABELED 1 to give labelled precision/recall figures, i.e. a constituent must have the same span *and* label as a constituent in the goldfile. 2) DELETE_LABEL TOP Don't count the "TOP" label (which is always given in the output of tgrep) when scoring. 3) DELETE_LABEL -NONE- Remove traces (and all constituents which dominate nothing but traces) when scoring. For example .... (VP (VBD reported) (SBAR (-NONE- 0) (S (-NONE- *T*-1)))) (. .))) would be processed to give .... (VP (VBD reported)) (. .))) 4) DELETE_LABEL , -- for the purposes of scoring remove punctuation DELETE_LABEL : DELETE_LABEL `` DELETE_LABEL '' DELETE_LABEL . 5) DELETE_LABEL_FOR_LENGTH -NONE- -- don't include traces when calculating the length of a sentence (important when classifying a sentence as <=40 words or >40 words) 6) EQ_LABEL ADVP PRT Count ADVP and PRT as being the same label when scoring.
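Collected as an actual parameter file, the options above would look roughly like this (an illustrative reconstruction from the list above, not a verbatim copy of the distributed COLLINS.prm):

LABELED 1
DELETE_LABEL TOP
DELETE_LABEL -NONE-
DELETE_LABEL ,
DELETE_LABEL :
DELETE_LABEL ``
DELETE_LABEL ''
DELETE_LABEL .
DELETE_LABEL_FOR_LENGTH -NONE-
EQ_LABEL ADVP PRT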
EVALB • To run the scorer: • > evalb -p Parameter_file Gold_file Test_file • For example, to use the sample files: • > evalb -p sample.prm sample.gld sample.tst
EVALB Gold standard: • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A-SBJ-1 (P this)) (B-WHATEVER (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (-NONE- *)) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (: *)) Test: • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (C (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (U test)))) • (S (C (P this)) (B (Q is) (A (R a) (U test)))) • (S (A (P this)) (B (Q is) (R a) (A (T test)))) • (S (A (P this) (Q is)) (A (R a) (T test))) • (S (P this) (Q is) (R a) (T test)) • (P this) (Q is) (R a) (T test) • (S (A (P this)) (B (Q is) (A (A (R a) (T test))))) • (S (A (P this)) (B (Q is) (A (A (A (A (A (R a) (T test)))))))) • (S (A (P this)) (B (Q was) (A (A (R a) (T test))))) • (S (A (P this)) (B (Q is) (U not) (A (A (R a) (T test))))) • (TOP (S (A (P this)) (B (Q is) (A (R a) (T test))))) • (S (A (P this)) (NONE *) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (S (NONE abc) (A (NONE *))) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (TT test)))) • (S (A (P This)) (B (Q is) (A (R a) (T test)))) • (S (A (P That)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test))) (A (P this)) (B (Q is) (A (R a) (T test)))) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (-NONE- *)) • (S (A (P this)) (B (Q is) (A (R a) (T test))) (: *))
EVALB Results: Sent. Matched Bracket Cross Correct Tag ID Len. Stat. Recal Prec. Bracket gold test Bracket Words Tags Accracy ============================================================================ 1 4 0 100.00 100.00 4 4 4 0 4 4 100.00 2 4 0 75.00 75.00 3 4 4 0 4 4 100.00 3 4 0 100.00 100.00 4 4 4 0 4 3 75.00 4 4 0 75.00 75.00 3 4 4 0 4 3 75.00 5 4 0 75.00 75.00 3 4 4 0 4 4 100.00 6 4 0 50.00 66.67 2 4 3 1 4 4 100.00 7 4 0 25.00 100.00 1 4 1 0 4 4 100.00 8 4 0 0.00 0.00 0 4 0 0 4 4 100.00 9 4 0 100.00 80.00 4 4 5 0 4 4 100.00 10 4 0 100.00 50.00 4 4 8 0 4 4 100.00 11 4 2 0.00 0.00 0 0 0 0 4 0 0.00 12 4 1 0.00 0.00 0 0 0 0 4 0 0.00 13 4 1 0.00 0.00 0 0 0 0 4 0 0.00 14 4 2 0.00 0.00 0 0 0 0 4 0 0.00 15 4 0 100.00 100.00 4 4 4 0 4 4 100.00 16 4 1 0.00 0.00 0 0 0 0 4 0 0.00 17 4 1 0.00 0.00 0 0 0 0 4 0 0.00 18 4 0 100.00 100.00 4 4 4 0 4 4 100.00 19 4 0 100.00 100.00 4 4 4 0 4 4 100.00 20 4 1 0.00 0.00 0 0 0 0 4 0 0.00 21 4 0 100.00 100.00 4 4 4 0 4 4 100.00 22 44 0 100.00 100.00 34 34 34 0 44 44 100.00 23 4 0 100.00 100.00 4 4 4 0 4 4 100.00 24 5 0 100.00 100.00 4 4 4 0 4 4 100.00 ============================================================================ 87.76 90.53 86 98 95 16 108 106 98.15 === Summary === -- All -- Number of sentence = 24 Number of Error sentence = 5 Number of Skip sentence = 2 Number of Valid sentence = 17 Bracketing Recall = 87.76 Bracketing Precision = 90.53 Complete match = 52.94 Average crossing = 0.06 No crossing = 94.12 2 or less crossing = 100.00 Tagging accuracy = 98.15 -- len<=40 -- Number of sentence = 23 Number of Error sentence = 5 Number of Skip sentence = 2 Number of Valid sentence = 16 Bracketing Recall = 81.25 Bracketing Precision = 85.25 Complete match = 50.00 Average crossing = 0.06 No crossing = 93.75 2 or less crossing = 100.00 Tagging accuracy = 96.88
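The summary figures follow directly from the per-sentence columns above. A small sanity check in Python, using the totals row (86 matched brackets, 98 gold constituents, 95 test constituents) and the 9 rows out of 17 valid sentences that score 100/100:

matched, gold, test = 86, 98, 95   # totals row of the evalb output above
valid, complete = 17, 9            # valid sentences / exact bracketing matches

print(f"Bracketing Recall    = {100 * matched / gold:.2f}")    # 87.76
print(f"Bracketing Precision = {100 * matched / test:.2f}")    # 90.53
print(f"Complete match       = {100 * complete / valid:.2f}")  # 52.94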
EVALB Paper describing the bracketing evaluation method behind evalb: http://www.aclweb.org/anthology-new/H/H91/H91-1060.pdf
EVALB [5] HOW TO CREATE A GOLDFILE FROM THE PENN TREEBANK The gold and parsed files are in a format similar to this: (TOP (S (INTJ (RB No)) (, ,) (NP (PRP it)) (VP (VBD was) (RB n't) (NP (NNP Black) (NNP Monday))) (. .))) To create a gold file from the treebank: tgrep -wn '/.*/' | tgrep_proc.prl will produce a goldfile in the required format. ("tgrep -wn '/.*/'" prints parse trees, "tgrep_proc.prl" just skips blank lines). For example, to produce a goldfile for section 23 of the treebank: tgrep -wn '/.*/' | tail +90895 | tgrep_proc.prl | sed 2416q > sec23.gold You probably don't have the ancient program tgrep, though…
EVALB • However, you can use tsurgeon from the Stanford tregex package you downloaded to accomplish the same thing • Example: • file: wsj_0927.mrg
EVALB ./tsurgeon.sh -treeFile wsj_0927.mrg -s ( (S (NP-SBJ-1 (NNP H.) (NNP Marshall) (NNP Schwarz)) (VP (VBD was) (VP (VBN named) (S (NP-SBJ (-NONE- *-1)) (NP-PRD (NP (NP (NN chairman)) (CC and) (NP (NN chief) (JJ executive) (NN officer))) (PP (IN of) (NP (NP (NNP U.S.) (NNP Trust) (NNP Corp.)) (, ,) (NP (NP (DT a) (JJ private-banking) (NN firm)) (PP (IN with) (NP (NP (NNS assets)) (PP (IN under) (NP (NN management))) (PP (IN of) (NP (QP (IN about) ($ $) (CD 17) (CD billion)) (-NONE- *U*)))))))))))) (. .))) ( (S (NP-SBJ (NP (NNP Mr.) (NNP Schwarz)) (, ,) (ADJP (NP (CD 52) (NNS years)) (JJ old)) (, ,)) (VP (MD will) (VP (VB succeed) (NP (NNP Daniel) (NNP P.) (NNP Davison)) (NP-TMP (NNP Feb.) (CD 1)) (, ,) (SBAR-TMP (RB soon) (IN after) (S (NP-SBJ (NNP Mr.) (NNP Davison)) (VP (VBZ reaches) (NP (NP (NP (DT the) (NN company) (POS 's)) (JJ mandatory) (NN retirement) (NN age)) (PP (IN of) (NP (CD 65))))))))) (. .))) ( (S (NP-SBJ-1 (NP (NNP Mr.) (NNP Schwarz)) (, ,) (SBAR (WHNP-2 (WP who)) (S (NP-SBJ (-NONE- *T*-2)) (VP (VBZ is) (NP-PRD (NP (NN president)) (PP (IN of) (NP (NNP U.S.) (NNP Trust))))))) (, ,)) (VP (MD will) (VP (VB be) (VP (VBN succeeded) (NP (-NONE- *-1)) (PP-LOC (IN in) (NP (DT that) (NN post))) (PP (IN by) (NP-LGS (NP (NNP Jeffrey) (NNP S.) (NNP Maurer)) (, ,) (NP (CD 42)) (, ,) (SBAR (WHNP-3 (WP who)) (S (NP-SBJ (-NONE- *T*-3)) (VP (VBZ is) (NP-PRD (NP (JJ executive) (NN vice) (NN president)) (PP (IN in) (NP (NP (NN charge)) (PP (IN of) (NP (NP (DT the) (NN company) (POS 's)) (NN asset-management) (NN group)))))))))))))) (. .))) ( (S (NP-SBJ (NP (NNP U.S.) (NNP Trust)) (, ,) (NP (NP (DT a) (JJ 136-year-old) (NN institution)) (SBAR (WHNP-2 (WDT that)) (S (NP-SBJ (-NONE- *T*-2)) (VP (VBZ is) (NP-PRD (NP (CD one)) (PP (IN of) (NP (NP (DT the) (JJS earliest) (NN high-net) (JJ worth) (NNS banks)) (PP-LOC (IN in) (NP (DT the) (NNP U.S.)))))))))) (, ,)) (VP (VBZ has) (VP (VBN faced) (NP (NP (VBG intensifying) (NN competition)) (PP (IN from) (NP (NP (JJ other) (NNS firms)) (SBAR (WHNP-3 (WDT that)) (S (NP-SBJ (-NONE- *T*-3)) (VP (VBP have) (VP (VP (VBN established) (NP (-NONE- *RNR*-1))) (, ,) (CC and) (VP (ADVP-MNR (RB heavily)) (VBN promoted) (NP (-NONE- *RNR*-1))) (, ,) (NP-1 (NP (JJ private-banking) (NNS businesses)) (PP (IN of) (NP (PRP$ their) (JJ own))))))))))))) (. .))) • You can then redirect standard output to a file …
EVALB Example • Put section 23 in one file (but not one tree per line): cat ~/research/TREEBANK_3/parsed/mrg/wsj/23/*.mrg > wsj_23.mrg • Run tsurgeon: ./tsurgeon.sh -treeFile wsj_23.mrg -s > wsj_23.gold • File wsj_23.gold then contains one tree per line
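If you would rather not depend on tsurgeon, a minimal Python sketch that flattens the .mrg files to one tree per line by tracking parenthesis balance (the paths are hypothetical; adjust to your Treebank installation):

import glob

def trees(text):
    """Yield complete top-level trees from Penn Treebank .mrg text."""
    depth, start = 0, 0
    for i, ch in enumerate(text):
        if ch == "(":
            if depth == 0:
                start = i
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth == 0:
                # collapse internal whitespace so each tree is one line
                yield " ".join(text[start:i + 1].split())

with open("wsj_23.gold", "w") as out:
    for path in sorted(glob.glob("TREEBANK_3/parsed/mrg/wsj/23/*.mrg")):
        with open(path) as f:
            for tree in trees(f.read()):
                out.write(tree + "\n")

This relies on the fact that literal parentheses never occur inside Treebank tokens (they are escaped as -LRB- and -RRB-).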
Homework • WSJ corpus: sections 00 through 24 • Evaluation: on section 23 • Training: normally 02–21 (20 sections) • How do the Bikel Collins parser's precision and recall vary if you randomly pick 1, 2, 3, … up to 20 sections to do the training with? • Plot a graph of the evalb scores (a plotting sketch follows below) • Present your results next time
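A minimal plotting sketch for presenting the learning curve; the score lists below are dummy placeholders, to be replaced by the LR/LP values evalb reports on section 23 for each training-set size:

import matplotlib.pyplot as plt

sections = list(range(1, 21))                  # training sections used
recall = [70 + 0.9 * n for n in sections]      # dummy values, not real results
precision = [72 + 0.8 * n for n in sections]   # dummy values, not real results

plt.plot(sections, recall, marker="o", label="Bracketing recall (LR)")
plt.plot(sections, precision, marker="s", label="Bracketing precision (LP)")
plt.xlabel("Number of WSJ training sections")
plt.ylabel("evalb score on section 23 (%)")
plt.legend()
plt.savefig("learning_curve.png")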
Bikel Collins Training • Relevant WSJ PTB files
Bikel Collins Parsing • Parsing • Command and the trained wsj_XX.obj.gz file • Input file format (sentences)
Bikel Collins Parsing • You can extract the sentences in section 23 for parsing yourself or you can download • wsj-23.txt • from the course webpage
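If you do extract the sentences yourself, here is a minimal sketch that pulls the surface tokens out of the section-23 trees, skipping traces, one sentence per line (the paths are hypothetical, and you should check the Bikel parser documentation for the exact input format it expects):

import glob
import re

# Match preterminals "(TAG token)"; neither part contains parentheses,
# which the Treebank escapes as -LRB-/-RRB-.
LEAF = re.compile(r"\(([^\s()]+) ([^\s()]+)\)")

def words(tree):
    """Return the surface tokens of one bracketed tree, skipping traces."""
    return [tok for tag, tok in LEAF.findall(tree) if tag != "-NONE-"]

with open("wsj-23.txt", "w") as out:
    for path in sorted(glob.glob("TREEBANK_3/parsed/mrg/wsj/23/*.mrg")):
        text = open(path).read()
        depth, start = 0, 0
        for i, ch in enumerate(text):   # split on top-level trees
            if ch == "(":
                if depth == 0:
                    start = i
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth == 0:
                    out.write(" ".join(words(text[start:i + 1])) + "\n")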