Dynamic Conditional Random Fields for Labeling and Segmenting Sequences
Khashayar Rohanimanesh
Joint work with Charles Sutton and Andrew McCallum
University of Massachusetts Amherst
Noun Phrase Segmentation (CoNLL-2000, Sang and Buchholz, 2000)
B        I             I     B  I     I    O    O  O
Rockwell International Corp. 's Tulsa unit said it signed
B I         I         O         B   I        O    B      I
a tentative agreement extending its contract with Boeing Co.
O  O       B          I     O   B      B  I   I
to provide structural parts for Boeing 's 747 jetliners.
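The B/I/O labels above can be decoded into noun-phrase spans mechanically. A minimal sketch (the function name and span representation are my own, not from the talk):

```python
def bio_to_spans(words, tags):
    """Decode B/I/O tags into (start, end, tokens) chunk spans.
    B opens a chunk, I continues it, O is outside any chunk."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "B" or tag == "O":
            if start is not None:
                spans.append((start, i, words[start:i]))
            start = i if tag == "B" else None
        # tag == "I" just extends the currently open chunk
    if start is not None:
        spans.append((start, len(words), words[start:]))
    return spans

words = ["Rockwell", "International", "Corp.", "'s", "Tulsa", "unit", "said", "it", "signed"]
tags = ["B", "I", "I", "B", "I", "I", "O", "O", "O"]
spans = bio_to_spans(words, tags)
print(spans)
```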
Named Entity Recognition [McCallum & Li, 2003]
CRICKET - MILLNS SIGNS FOR BOLAND
CAPE TOWN 1996-08-22
South African provincial side Boland said on Thursday they had signed Leicestershire fast bowler David Millns on a one year contract. Millns, who toured Australia with England A in 1992, replaces former England all-rounder Phillip DeFreitas as Boland's overseas professional.
Labels and examples:
PER: Yayuk Basuki, Innocent Butare
ORG: 3M, KDP, Leicestershire
LOC: Leicestershire, Nirmal Hriday, The Oval
MISC: Java, Basque, 1,000 Lakes Rally
Information Extraction
Seminar Announcements [Peshkin, Pfeffer 2003]
"a seminar entitled 'Nanorheology of Polymers & Complex Fluids,' at 4:30 p.m, Monday, Feb. 27, in Wean Hall 7500. The seminar will be given by Professor Steven Granick" (labels STIME, LOC, and SPEAKER mark the time, location, and speaker spans)
Biological Abstracts [Skounakis, Craven, Ray 2003]
"SNC1, a gene from the yeast Saccharomyces cerevisiae, encodes a homolog of vertebrate synaptic vesicle-associated membrane proteins (VAMPs) or synaptobrevins." (labels PROTEIN, LOC) => subcellular-localization(SNC1, vesicle)
Simultaneous noun-phrase & part-of-speech tagging
NP:  B I I B I I O O O
POS: N N N O N N V O V
     Rockwell International Corp. 's Tulsa unit said it signed
NP:  B I I O B I O B I
POS: O J N V O N O N N
     a tentative agreement extending its contract with Boeing Co.
Linear-Chain CRFs Finite-State
Linear-Chain CRFs: Graphical Model / Training
A chain of labels y conditioned on observations x. Um… what's p(y | x)?
Linear-Chain CRFs: Graphical Model / Training
Rewrite as:
p(y | x) = (1/Z(x)) exp( Σ_t Σ_k λ_k f_k(y_{t-1}, y_t, x, t) )
for some features f_k and weights λ_k. Now solve for λ_k by convex optimization.
General CRFs
A CRF is an undirected, conditionally-trained graphical model. Features f_k can be arbitrary, overlapping, and domain-specific. Train λ_k by convex optimization to maximize the conditional log-likelihood.
CRF Training
Train λ_k by convex optimization to maximize the conditional log-likelihood.
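For a linear chain, the conditional log-likelihood being maximized can be computed exactly with the forward algorithm. A minimal numpy sketch with made-up potentials (not the talk's model); as a sanity check, p(y|x) sums to 1 over all labelings of the toy problem:

```python
import itertools
import numpy as np

def crf_log_likelihood(unary, pairwise, y):
    """log p(y|x) for a linear-chain CRF.
    unary:    (T, S) per-position label log-scores (features dotted with weights)
    pairwise: (S, S) transition log-scores
    y:        length-T label sequence"""
    T, S = unary.shape
    # score of the given labeling
    score = unary[0, y[0]]
    for t in range(1, T):
        score += pairwise[y[t - 1], y[t]] + unary[t, y[t]]
    # log Z(x) via the forward algorithm, in log-space for stability
    alpha = unary[0].copy()
    for t in range(1, T):
        m = alpha[:, None] + pairwise + unary[t][None, :]  # (prev, cur)
        alpha = np.logaddexp.reduce(m, axis=0)
    log_Z = np.logaddexp.reduce(alpha)
    return score - log_Z

# toy problem: 3 positions, 2 labels (numbers are illustrative only)
unary = np.log(np.array([[0.7, 0.3], [0.4, 0.6], [0.5, 0.5]]))
pairwise = np.log(np.array([[0.8, 0.2], [0.3, 0.7]]))
total = sum(np.exp(crf_log_likelihood(unary, pairwise, list(y)))
            for y in itertools.product(range(2), repeat=3))
print(total)  # conditional probabilities over all labelings sum to 1
```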
Optimization Methods
• Generalized Iterative Scaling (GIS)
• Improved Iterative Scaling (IIS)
• First-order methods: non-linear conjugate gradient
• Second-order methods: limited-memory quasi-Newton (L-BFGS)
From Generative to Conditional Models
• HMMs: model the observations
• MEMMs: do not model the observations; suffer from the label bias problem
• Linear-chain CRFs: do not model the observations; eliminate the label bias problem
Simultaneous noun-phrase & part-of-speech tagging
NP:  B I I B I I O O O
POS: N N N O N N V O V
     Rockwell International Corp. 's Tulsa unit said it signed
NP:  B I I O B I O B I
POS: O J N V O N O N N
     a tentative agreement extending its contract with Boeing Co.
Features • Word identity “International” • Capitalization Xxxxxxx • Character classes Contains digits • Character n-gram …ment • Lexicon memberships In list of company names • WordNet synset (speak, say, tell) • … • Part of speech Proper Noun
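A feature set like the list above is typically materialized as a sparse dictionary of binary indicators per token. A sketch in that spirit (the names and exact feature set are illustrative, not the paper's):

```python
def token_features(words, t):
    """Overlapping binary features for the token at position t."""
    w = words[t]
    feats = {
        "word=" + w.lower(): 1,                       # word identity
        "capitalized": int(w[:1].isupper()),          # capitalization Xxxxxxx
        "contains_digit": int(any(c.isdigit() for c in w)),
        "suffix3=" + w[-3:].lower(): 1,               # character n-gram
    }
    if t > 0:
        feats["prev_word=" + words[t - 1].lower()] = 1
    return feats

feats = token_features(["Rockwell", "International", "Corp."], 1)
print(feats)
```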
Multiple Nested Predictions on the Same Sequence
Noun phrase, part-of-speech (output predictions)
Word identity (input observation)
Rockwell Int'l Corp. 's Tulsa
Multiple Nested Predictions on the Same Sequence
Noun phrase (output prediction)
Part-of-speech (input observation)
Word identity (input observation)
Rockwell Int'l Corp. 's Tulsa
But errors in each stage compound, and uncertainty is not preserved from one stage to the next.
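The compounding can be made concrete: if each stage is independently correct with probability p, the whole cascade is correct with probability that shrinks multiplicatively. Illustrative numbers only, not from the slides:

```python
def cascade_accuracy(stage_accuracies):
    """Probability every stage of a pipeline is correct, assuming
    independent per-stage accuracies (a simplifying assumption)."""
    prob = 1.0
    for p in stage_accuracies:
        prob *= p
    return prob

# three stages at 95% each
result = cascade_accuracy([0.95, 0.95, 0.95])
print(result)  # 0.95**3 = 0.857375
```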
Cascaded Predictions Named-entity tag Part-of-speech Segmentation (output prediction) Chinese character (input observation)
Cascaded Predictions Named-entity tag Part-of-speech (output prediction) Segmentation (input observation) Chinese character (input observation)
Cascaded Predictions
Named-entity tag (output prediction)
Part-of-speech (input observation)
Segmentation (input observation)
Chinese character (input observation)
Even more stages here, so the compounding of errors is worse.
Joint Prediction: Cross-Product over Labels
2 x 45 x 11 = 990 possible states, e.g. state label = (Wordbeg, Noun, Person)
O(|V| x 990²) parameters
O(T x 990²) running time
Segmentation+POS+NE (output prediction)
Chinese character (input observation)
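The counts on this slide follow from simple arithmetic over the cross-product state space (reading the "2" as two segmentation labels is my assumption):

```python
# segmentation labels x POS tags x named-entity tags
seg_labels, pos_tags, ne_tags = 2, 45, 11
joint_states = seg_labels * pos_tags * ne_tags
transition_entries = joint_states ** 2  # one weight per (state, state) pair
print(joint_states, transition_entries)
```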
Joint Prediction: Factorial CRF
O(|V| x 990) parameters
Named-entity tag (output prediction)
Part-of-speech (output prediction)
Segmentation (output prediction)
Chinese character (input observation)
Linear-Chain to Factorial CRFs: Model Definition
Linear-chain: a single label chain y over observations x.
Factorial: parallel label chains u, v, w over observations x, with cotemporal connections between the chains.
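The model-definition equations on this slide were lost in extraction. A plausible reconstruction of the three-chain factorial model, following the linear-chain form given earlier (the factor names Ψ, Φ are my notation):

```latex
p(u, v, w \mid x) \;=\; \frac{1}{Z(x)} \prod_{t}
\Psi(u_{t-1}, u_t, x, t)\,
\Psi'(v_{t-1}, v_t, x, t)\,
\Psi''(w_{t-1}, w_t, x, t)\,
\Phi(u_t, v_t, x, t)\,
\Phi'(v_t, w_t, x, t)
```

where each factor has the exponential form exp( Σ_k λ_k f_k(·) ), matching the linear-chain rewrite above.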
Linear-Chain to Factorial CRFs: Log-likelihood Training
Linear-chain: label chain y over observations x.
Factorial: label chains u, v, w over observations x.
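The training equations were likewise lost. The standard penalized conditional log-likelihood and its gradient, consistent with the "convex optimization" setup described earlier (the Gaussian-prior variance σ² is my assumption):

```latex
\mathcal{L}(\Lambda) \;=\; \sum_i \log p(y^{(i)} \mid x^{(i)})
\;-\; \sum_k \frac{\lambda_k^2}{2\sigma^2}
```

```latex
\frac{\partial \mathcal{L}}{\partial \lambda_k}
\;=\; \sum_{i,t} f_k(y^{(i)}_{t-1}, y^{(i)}_t, x^{(i)}, t)
\;-\; \sum_{i,t} \sum_{y_{t-1}, y_t} p(y_{t-1}, y_t \mid x^{(i)})\, f_k(y_{t-1}, y_t, x^{(i)}, t)
\;-\; \frac{\lambda_k}{\sigma^2}
```

The middle term is an expectation under the model, which is why training needs the marginal distributions discussed on the inference slides.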
Dynamic CRFs
Undirected, conditionally-trained analogue of Dynamic Bayes Nets (DBNs).
Variants: Factorial, Higher-Order, Hierarchical.
Need for Inference
• Marginal distributions: used during training
• Most-likely (Viterbi) labeling: used to label a sequence
9,000 training instances x 100 maximizer iterations = 900,000 calls to the inference algorithm!
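The Viterbi labeling mentioned above can be sketched for a linear chain in a few lines of numpy (scores are illustrative; a real DCRF runs this on the junction tree or an approximation):

```python
import numpy as np

def viterbi(unary, pairwise):
    """Most-likely label sequence for a linear chain.
    unary: (T, S) per-position label log-scores; pairwise: (S, S) transition log-scores."""
    T, S = unary.shape
    delta = unary[0].copy()              # best score ending in each label
    back = np.zeros((T, S), dtype=int)   # backpointers
    for t in range(1, T):
        cand = delta[:, None] + pairwise          # (prev label, current label)
        back[t] = np.argmax(cand, axis=0)
        delta = cand[back[t], np.arange(S)] + unary[t]
    path = [int(np.argmax(delta))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# toy scores (illustrative): the best labeling alternates 0, 1, 0
path = viterbi(np.array([[5., 0.], [0., 5.], [5., 0.]]), np.zeros((2, 2)))
print(path)
```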
Inference (Exact): Junction Tree
Max clique (NP, POS): 3 x 45 x 45 = 6,075 assignments
Inference (Exact): Junction Tree
Max clique (NER, POS, SEG): 3 x 45 x 45 x 11 = 66,825 assignments
Inference (Approximate): Loopy Belief Propagation
[Figure: messages m_i(v_j) passed between neighboring nodes v_1 … v_6 of a loopy graph.]
Inference (Approximate): Tree Re-parameterization [Wainwright, Jaakkola, Willsky 2001]
Experiments: Simultaneous noun-phrase & part-of-speech tagging
NP:  B I I B I I O O O
POS: N N N O N N V O V
     Rockwell International Corp. 's Tulsa unit said it signed
NP:  B I I O B I O B I
POS: O J N V O N O N N
     a tentative agreement extending its contract with Boeing Co.
• Data from CoNLL Shared Task 2000 (newswire)
• Training subsets of various sizes: from 223 to 894 sentences
• Features include: word identity, neighboring words, capitalization, lexicons of parts-of-speech, company names (1,358,227 feature functions!)
Experiments: Simultaneous noun-phrase & part-of-speech tagging
NP:  B I I B I I O O O
POS: N N N O N N V O V
     Rockwell International Corp. 's Tulsa unit said it signed
NP:  B I I O B I O B I
POS: O J N V O N O N N
     a tentative agreement extending its contract with Boeing Co.
Two experiments:
• Compare exact and approximate inference
• Compare accuracy of cascaded CRFs and factorial DCRFs
Accuracy
F1 for NP on 8936 sentences: 93.87
POS-tagger: (Brill, 1994)
Summary
• Many natural language tasks are solved by chaining errorful subtasks.
• Approach: jointly solve all subtasks in a single graphical model.
• Learn the dependence between subtasks.
• Allow higher-level predictions to inform lower levels.
• Improved joint and POS accuracy over the cascaded model, but NP accuracy was lower.
• Current work: emphasize one subtask.
Maximize Marginal Likelihood (Ongoing work) NP POS
State-of-the-art Performance
• POS tagging: 97% (Brill, 1999)
• NP chunking: 94.38% (Sha and Pereira), 94.39% (?)
Alternatives to Traditional Joint • Optimize Marginal Likelihood • Optimize Utility • Optimize Margin (M3N) [Taskar, Guestrin, Koller 2003]
Undirected Graphical Models
[Figure: a directed graphical model vs. an undirected graphical model.]
Hidden Markov Models: Graphical Model / Training
p(y, x) = p(y1) p(x1|y1) p(y2|y1) p(x2|y2) p(y3|y2) p(x3|y3)
Hidden Markov Models: Graphical Model / Finite-State
p(y, x) = p(y1) p(x1|y1) p(y2|y1) p(x2|y2) p(y3|y2) p(x3|y3)
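The HMM factorization above can be evaluated directly. A small numpy sketch with illustrative parameters (not from the slides):

```python
import numpy as np

def hmm_joint(pi, A, B, states, obs):
    """p(y, x) for an HMM: p(y1) p(x1|y1) * prod_t p(y_t|y_{t-1}) p(x_t|y_t).
    pi: initial distribution (S,); A: transitions (S, S); B: emissions (S, V)."""
    p = pi[states[0]] * B[states[0], obs[0]]
    for t in range(1, len(states)):
        p *= A[states[t - 1], states[t]] * B[states[t], obs[t]]
    return p

# toy HMM: 2 hidden states, 2 observation symbols
pi = np.array([1.0, 0.0])
A = np.array([[0.5, 0.5], [0.5, 0.5]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
p = hmm_joint(pi, A, B, states=[0, 1], obs=[0, 1])
print(p)  # 1.0 * 0.9 * 0.5 * 0.8 = 0.36
```

Note how this models the observations x jointly with the labels y, in contrast to the conditional p(y | x) of the CRFs above.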