
Finite State Parsing & Information Extraction



  1. Finite State Parsing & Information Extraction CMSC 35100 Intro to NLP January 10, 2006

  2. Roadmap • Motivation • Limitations & Advantages • Example: Fastus • Finite state cascades • Other applications

  3. Why NOT Finite State? • Fundamental representational limitations • Finite-state systems can’t handle unbounded recursion • Unsupported phenomena: center embedding, etc. • Regular languages are a strict subset of the context-free languages

  4. Why Finite State? • Significant computational advantages • FAST!!!! • 10 minutes vs 36 hours for 100 messages • Can compile rules, even CFGs, to transducers • Approximate CFGs, overgenerating in specific ways • Toolkits • Minimal representational limitations in practice • Most recursion is actually bounded • Human memory practically limits the depth of recursion • Unroll a finite number of recursions (sketched below) • Simple representation sufficient for many tasks • Information extraction, speech recognition
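To make the "unroll a finite number of recursions" point concrete, here is a minimal sketch (not from the lecture): a hypothetical recursive noun-phrase rule with PP attachment is expanded to an assumed fixed depth, which turns it into an ordinary regular expression over POS tags.

```python
import re

# Hypothetical recursive rule:  NP -> Det (Adj)* N (P NP)?
# Unrolled MAX_DEPTH times, it becomes a plain regular expression.
BASE_NP = r"Det(?:\s+Adj)*\s+N"   # flat noun group over an assumed POS tag set
MAX_DEPTH = 2                     # assumed bound on nested PP attachment

np_pattern = BASE_NP
for _ in range(MAX_DEPTH):
    # each unrolling permits one more "P <NP>" attachment
    np_pattern = rf"{BASE_NP}(?:\s+P\s+{np_pattern})?"

NP_RE = re.compile(np_pattern)

# Tag sequence for "the dog in the park" is accepted at depth 1;
# three levels of nesting are rejected, which is exactly the approximation.
print(bool(NP_RE.fullmatch("Det N P Det N")))                   # True
print(bool(NP_RE.fullmatch("Det N P Det N P Det N P Det N")))   # False (depth 3)
```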

  5. Fastus & MUC • MUC: Message Understanding Conference • DARPA shared-task evaluation • Task: Information extraction • Essentially form-filling • Only 10% info relevant, no nuance • Joint ventures, terrorist incidents • Original system: Deep syntax, KR, Semantics • High precision – best in task • SLOW!!!! 36 hours for 100 messages

  6. MUC Example
Bridgestone Sports Co. said Friday it has set up a joint venture in Taiwan with a local concern and a Japanese trading house to produce golf clubs to be shipped to Japan. The joint venture, Bridgestone Sports Taiwan Co., capitalized at 20 million new Taiwan dollars, will start production in January 1990 with production of 20,000 iron and “metal wood” clubs a month.
TIE-UP-1:
Relationship: TIE-UP
Entities: “Bridgestone Sports Co.”, “a local concern”, “a Japanese trading house”
Joint Venture Company: “Bridgestone Sports Taiwan Co.”
Activity: ACTIVITY-1
Amount: NT$20000000
ACTIVITY-1:
Activity: PRODUCTION
Company: “Bridgestone Sports Taiwan Co.”
Product: “iron and ‘metal wood’ clubs”
Start Date: DURING: January 1990
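For readers who prefer code, the two templates above can be rendered as plain record types. This is a hypothetical Python rendering of the slide's fields, not the official MUC schema.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Activity:
    activity: str        # e.g. "PRODUCTION"
    company: str
    product: str
    start_date: str

@dataclass
class TieUp:
    relationship: str    # e.g. "TIE-UP"
    entities: List[str]
    joint_venture_company: Optional[str] = None
    activity: Optional[Activity] = None
    amount: Optional[str] = None

TIE_UP_1 = TieUp(
    relationship="TIE-UP",
    entities=["Bridgestone Sports Co.", "a local concern",
              "a Japanese trading house"],
    joint_venture_company="Bridgestone Sports Taiwan Co.",
    activity=Activity("PRODUCTION", "Bridgestone Sports Taiwan Co.",
                      "iron and 'metal wood' clubs", "DURING: January 1990"),
    amount="NT$20000000",
)
```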

  7. Finite-State Cascade • Cascade of FSTs • Separates stages of processing • Early stages: smaller, linguistically basic units • Later stages: larger units, domain-specific information • Complex words: multiwords, proper names • Basic phrases: noun groups, verb groups, particles • Complex phrases: complex NGs, VGs • Domain events: application information • Merging structures: coreference, related info
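A minimal sketch of the cascade idea, assuming each stage is just a function over the previous stage's output. The stage names follow the slide, but the bodies are placeholders rather than the FASTUS transducers.

```python
# Each stage consumes the previous stage's output and builds larger units.
def complex_words(tokens):      return tokens          # multiwords, names
def basic_phrases(units):       return units           # noun/verb groups
def complex_phrases(units):     return units           # complex NGs/VGs
def domain_events(units):       return units           # application patterns
def merge_structures(events):   return events          # coreference, merging

CASCADE = [complex_words, basic_phrases, complex_phrases,
           domain_events, merge_structures]

def run_cascade(tokens):
    units = tokens
    for stage in CASCADE:
        units = stage(units)    # later stages never revisit earlier decisions
    return units

print(run_cascade("Bridgestone Sports Co. said Friday ...".split()))
```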

  8. Complex Words • Identifies “multiwords” • E.g. set up, trading house, joint venture • Company names, people, locations, etc. • Fixed expressions recognized with microgrammars • Subsequent stages can also identify names from context • E.g. a preceding appositive
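A rough sketch of what such a microgrammar stage might look like, assuming a hypothetical multiword lexicon and a toy company-name pattern; the actual FASTUS patterns are much richer.

```python
import re

MULTIWORDS = {"set up", "joint venture", "trading house"}   # assumed lexicon
COMPANY_RE = re.compile(r"(?:[A-Z][a-z]+ )+(?:Co\.|Corp\.|Inc\.)")

def mark_complex_words(text):
    for mw in MULTIWORDS:                       # glue multiwords into one token
        text = text.replace(mw, mw.replace(" ", "_"))
    # wrap company names so later stages see them as single units
    return COMPANY_RE.sub(lambda m: "[COMPANY " + m.group(0) + "]", text)

print(mark_complex_words("Bridgestone Sports Co. said it has set up a joint venture"))
```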

  9. MUC Example: Basic Phrases
Bridgestone Sports Co. said Friday it has set up a joint venture in Taiwan with a local concern and a Japanese trading house to produce golf clubs to be shipped to Japan.
Company name: Bridgestone Sports Co.
Verb Group: said
Noun Group: Friday
Noun Group: it
Verb Group: had set up
Noun Group: a joint venture
Preposition: in
Location: Taiwan
Preposition: with
Noun Group: a local concern
Conjunction: and
Noun Group: a Japanese trading house
Verb Group: to produce
Noun Group: golf clubs
Verb Group: to be shipped
Preposition: to
Location: Japan

  10. Noun Group Extraction • Noun Group: head noun + premodifiers
• NG -> Pronoun | Time-NP | Date-NP
       | (DETP) (Adjs) HdNns
       | DETP Ving HdNns
       | DETP-CP (and HdNns)
• DETP -> DETP-CP | DETP-INCP
• DETP-CP -> {Adv-pre-num | “another” | {Det | Pro-Poss} ({Adv-pre-num | “only” (“other”)})} Number
       | Q | Q-er | (“the”) Q-est | “another” | Det-cp | DetQ | Pro-Poss-cp
• DETP-INCP -> {{Det | Pro-Poss} “only” | “a” | “an” | Det-incomp | Pro-Poss-incomp} (“other”)
       | (DETP-CP) “other”

  11. Noun Group Extraction
• Adjs -> AdjP ({“,” | (“,”) Conj} {AdjP | Vparticiple})*
• AdjP -> Ordinal
       | {Q-er | Q-est} {Adj | Vparticiple}+
       | N[sing,!Time-NP] (“-”) Vparticiple
       | Number (“-”) {“month” | “day” | “year”} (“-”) “old”
• HdNns -> HdNn (“and” HdNn)
• HdNn -> PropN
       | {PreNs | PropN PreNs} N[!Time-NP]
       | PropN CommonN[!Time-NP]
• PreNs -> PreN (“and” PreN2)
• PreN -> (Adj “-”) Common-Sing-N
• PreN2 -> PreN | Ordinal | Adj-noun-like
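The sketch below collapses just one branch of this grammar, (DETP) (Adjs) HdNns, into a single regular expression over an assumed POS tag set. It is an illustration of the idea, not the FASTUS grammar.

```python
import re

NG_RE = re.compile(
    r"(?:Det\s+)?"          # optional determiner phrase (DETP)
    r"(?:Adj\s+)*"          # optional adjectives (Adjs)
    r"(?:N\s+)*N"           # head noun, possibly preceded by prenominal nouns
)

for tags in ["Det Adj N", "Det N N", "N"]:
    print(tags, "->", bool(NG_RE.fullmatch(tags)))
```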

  12. Noun Group Extraction: AdjP FSA [Figure: four-state automaton (states 0-3) accepting sequences of AdjP chunks joined by “,”, “and”, or a Vparticiple, with an epsilon (e) transition into the final state]

  13. Noun Group Extraction: Adj FSA [Figure: ten-state automaton (states 0-9) for adjective phrases, with transitions labeled Ordinal, Adj, Vparticiple, Nsing[!TimeNP], Q-er, Q-est, “-”, “month”, “day”, “year”, “old”, and epsilon (e)]
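One way to implement such automata is a plain transition table. The states and labels below are only an approximate reconstruction of the AdjP FSA of slide 12, so treat them as illustrative.

```python
# Transition table: (state, input symbol) -> next state.
TRANSITIONS = {
    (0, "AdjP"): 1,
    (1, ","): 2, (1, "and"): 2, (1, "EPS"): 3,       # comma/"and" continue the list
    (2, "AdjP"): 1, (2, "Vparticiple"): 1,
}
ACCEPTING = {1, 3}

def accepts(symbols, state=0):
    for s in symbols:
        nxt = TRANSITIONS.get((state, s))
        if nxt is None:
            return False
        state = nxt
    # allow one epsilon move into an accepting state at the end
    return state in ACCEPTING or TRANSITIONS.get((state, "EPS")) in ACCEPTING

print(accepts(["AdjP", ",", "AdjP"]))            # True
print(accepts(["AdjP", "and", "Vparticiple"]))   # True
```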

  14. Complex Phrases • Build up from basic noun and verb groups • Attach appositives • Construct measure phrases • Attach prepositional phrases • Conjoin noun phrases • Combine syntactic variants, modalities with common meaning • Identify domain entities and events
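A minimal sketch of one complex-phrase operation, attaching a prepositional phrase to the noun group immediately before it, assuming phrases are simple (label, text) pairs rather than the FASTUS representation.

```python
def attach_pps(phrases):
    out = []
    for label, text in phrases:
        if label == "PP" and out and out[-1][0] == "NG":
            _, ng_text = out.pop()                       # merge PP into the NG
            out.append(("ComplexNG", ng_text + " " + text))
        else:
            out.append((label, text))
    return out

phrases = [("NG", "a joint venture"), ("PP", "in Taiwan"), ("VG", "to produce")]
print(attach_pps(phrases))
# [('ComplexNG', 'a joint venture in Taiwan'), ('VG', 'to produce')]
```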

  15. Domain Events • Ordered list of complex phrases • Drops out all other elements -> robustness • Transitions driven by headword + phrase type • E.g. “company-NounGroup”, “Formed-PassiveVerbGroup” • <Company> <Set-up> <Joint-Venture> with <Company> • <Produce> <Product> • Map to the extracted units • E.g. entities in the set-up, production + product type
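A sketch of a domain-event pattern in the spirit of <Company> <Set-up> <Joint-Venture> with <Company>. The phrase labels, the pattern, and the template slots here are illustrative assumptions, not the FASTUS rules.

```python
import re

def phrase_labels(phrases):
    return " ".join(label for label, _ in phrases)

SETUP_JV = re.compile(r"Company SetUpVG JointVentureNG with Company")

phrases = [("Company", "Bridgestone Sports Co."),
           ("SetUpVG", "has set up"),
           ("JointVentureNG", "a joint venture"),
           ("with", "with"),
           ("Company", "a local concern")]

if SETUP_JV.search(phrase_labels(phrases)):
    # map the matched phrases into extracted template slots
    event = {"Relationship": "TIE-UP",
             "Entities": [text for label, text in phrases if label == "Company"]}
    print(event)
```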

  16. Multi-layer Cascades • Finesse the recursion problem • Automata construction expands rules into automata • AdjPs are duplicated, but never self-referential • AdjPs and NPs in conjunction are handled independently • One level identifies base, non-recursive NGs • Later levels combine them with • Measure phrases, prepositional phrases, conjunction • Limits the depth of possible “recursive” constructs

  17. More Complete FST Parsing • Roche 1996, 1997 • Construct a syntactic dictionary • S | N thinks that S; S | N kept N • N | John; N | Peter; N | the book • Convert entries to finite-state transducers • [S a thinks that b S] -> (S [N a N] <V thinks V> that [S b S] S) • [N John N] -> (N John N)

  18. Transducer Dictionary

  19. Transducer Dictionary

  20. Full Transducer Dictionary

  21. Transducers -> Parser • Transducer dictionary = Union of transducers • T_dic = U T_i • Parser = Repeated application of transducers • Repeat until output = input • Transduction causes no change
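A minimal sketch of slides 17-21, using regular-expression rewrites as crude stand-ins for Roche's transducers: each dictionary entry becomes one rewrite, their union is applied repeatedly, and parsing stops at the fixed point where the transduction causes no change.

```python
import re

# Dictionary entries as rewrites (stand-ins for the transducers):
#   S -> N thinks that S      S -> N kept N      N -> John | Peter | the book
REWRITES = [
    lambda s: re.sub(r"\[S (\w+) thinks that (.+) S\]",
                     r"(S [N \1 N] <V thinks V> that [S \2 S] S)", s),
    lambda s: re.sub(r"\[S (\w+) kept (.+) S\]",
                     r"(S [N \1 N] <V kept V> [N \2 N] S)", s),
    lambda s: re.sub(r"\[N (John|Peter|the book) N\]", r"(N \1 N)", s),
]

def parse(sentence, rewrites=REWRITES):
    current = sentence
    while True:
        output = current
        for t in rewrites:            # apply the "union" of transducers
            output = t(output)
        if output == current:         # fixed point: transduction causes no change
            return output
        current = output

print(parse("[S John thinks that Peter kept the book S]"))
# (S (N John N) <V thinks V> that (S (N Peter N) <V kept V> (N the book N) S) S)
```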

  22. Finite-State Extensions • Finite-State Approaches to • Tree Adjoining Grammars • Machine translation • Multimodal analysis and interpretation

  23. Probabilistic CFGs

  24. Handling Syntactic Ambiguity • Natural language syntax • Varied, has DEGREES of acceptability • Ambiguous • Probability: framework for preferences • Augment original context-free rules: PCFG • Add probabilities to the rules (each left-hand side sums to 1):
S -> NP VP 0.85
S -> S conj S 0.15
NP -> N 0.20
NP -> Det N 0.65
NP -> Det Adj N 0.05
NP -> NP PP 0.10
VP -> V 0.10
VP -> V NP 0.45
VP -> V NP PP 0.45
PP -> P NP 1.00

  25. PCFGs • Learning probabilities • Strategy 1: write the CFG by hand • Use a treebank (collection of parse trees) to estimate the rule probabilities (see the sketch below) • Parsing with PCFGs • Rank parse trees by probability • Provides graceful degradation • Even unusual constructions get some parse, just with low probability
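A sketch of the treebank strategy, assuming parse trees are given as nested (label, children...) tuples: each rule's probability is its relative frequency among all expansions of the same left-hand side.

```python
from collections import Counter, defaultdict

# Trees as nested tuples: (label, child, child, ...); leaves are plain strings.
def rules(tree):
    """Yield (lhs, rhs) for every non-lexical rule used in a parse tree."""
    label, *children = tree
    if children and isinstance(children[0], tuple):
        yield label, tuple(c[0] for c in children)
        for c in children:
            yield from rules(c)

def estimate_pcfg(treebank):
    counts = Counter(r for t in treebank for r in rules(t))
    lhs_totals = defaultdict(int)
    for (lhs, _), n in counts.items():
        lhs_totals[lhs] += n
    # relative-frequency (maximum likelihood) estimate per left-hand side
    return {rule: n / lhs_totals[rule[0]] for rule, n in counts.items()}

toy_tree = ("S", ("NP", ("N", "I")),
                 ("VP", ("V", "saw"), ("NP", ("Det", "the"), ("N", "duck"))))
print(estimate_pcfg([toy_tree]))
```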

  26. Parse Ambiguity • Two parse trees for “I saw the man with the duck”:
T1 (PP attached inside the object NP):
[S [NP [N I]] [VP [V saw] [NP [NP [Det the] [N man]] [PP [P with] [NP [Det the] [N duck]]]]]]
T2 (PP attached to the VP):
[S [NP [N I]] [VP [V saw] [NP [Det the] [N man]] [PP [P with] [NP [Det the] [N duck]]]]]

  27. Parse Probabilities • Notation: T (tree), S (sentence), n (node), R (rule); P(T, S) = product over nodes n in T of P(R(n)) • T1 = 0.85*0.2*0.1*0.65*1*0.65 ≈ 0.007 • T2 = 0.85*0.2*0.45*0.05*0.65*1*0.65 ≈ 0.002 • Select T1 (higher probability) • Best systems achieve 92-93% accuracy
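A sketch of the probability computation itself: P(T) as the product of rule probabilities over the nodes of the tree. The rule table reuses the probabilities as reconstructed on slide 24 and the tuple-tree format from the previous sketch, so the numeric output is illustrative rather than the slide's exact value.

```python
# Rule probabilities as reconstructed from slide 24 (the pairing is approximate).
P = {("S", ("NP", "VP")): 0.85,
     ("NP", ("N",)): 0.20, ("NP", ("Det", "N")): 0.65, ("NP", ("NP", "PP")): 0.10,
     ("VP", ("V", "NP")): 0.45, ("VP", ("V", "NP", "PP")): 0.45,
     ("PP", ("P", "NP")): 1.00}

def tree_prob(tree):
    """P(T): product of rule probabilities over the non-lexical nodes of T."""
    label, *children = tree
    if not children or not isinstance(children[0], tuple):
        return 1.0                              # word or preterminal node
    rhs = tuple(c[0] for c in children)
    prob = P[(label, rhs)]
    for child in children:
        prob *= tree_prob(child)
    return prob

# T1 from slide 26: "with the duck" attached inside the object noun phrase.
the_man  = ("NP", ("Det", "the"), ("N", "man"))
the_duck = ("NP", ("Det", "the"), ("N", "duck"))
t1 = ("S", ("NP", ("N", "I")),
           ("VP", ("V", "saw"),
                  ("NP", the_man, ("PP", ("P", "with"), the_duck))))
print(tree_prob(t1))   # value depends on the reconstructed rule table
```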
