
Learning for Semantic Parsing of Natural Language

Explore the shift from syntactic processing to semantic analysis in natural-language learning, contrasting full semantic parsing with smaller isolated tasks such as word sense disambiguation and information extraction. Learn about the domain-dependent applications CLang (RoboCup coaching) and GeoQuery (database querying). Discover how semantic parsers can be trained automatically from NL-LF pairs, making specific applications more tractable to develop. Gain insights into the CHILL, WOLFIE, SILT, and SCISSOR semantic-parser learners, and understand the cognitive-science motivation behind NL-LF training examples.


Presentation Transcript


  1. Learning for Semantic Parsing of Natural Language • Raymond J. Mooney, Ruifang Ge, Rohit Kate, Yuk Wah Wong, John Zelle, Cynthia Thompson • July 31, 2005

  2. Syntactic Natural Language Learning • Most computational research in natural-language learning has addressed “low-level” syntactic processing. • Morphology (e.g. past-tense generation) • Part-of-speech tagging • Shallow syntactic parsing • Syntactic parsing

  3. Semantic Natural Language Learning • Learning for semantic analysis has been restricted to relatively small, isolated tasks. • Word sense disambiguation (e.g. SENSEVAL) • Semantic role assignment (determining agent, patient, instrument, etc., e.g. FrameNet, PropBank) • Information extraction

  4. Semantic Parsing • A semantic parser maps a natural-language sentence to a complete, detailed semantic representation (logical form). • For many applications, the desired output is immediately executable by another program. • Two application domains: • CLang: RoboCup Coach Language • GeoQuery: A Database Query Application

  5. CLang: RoboCup Coach Language • In the RoboCup Coach competition, teams compete to coach simulated soccer players • The coaching instructions are given in a formal language called CLang • Example: NL (from the coach): "If the ball is in our penalty area, then all our players except player 4 should stay in our half." → Semantic Parsing → CLang: ((bpos (penalty-area our)) (do (player-except our {4}) (pos (half our))))

  6. GeoQuery: A Database Query Application • Query application for a U.S. geography database containing about 800 facts [Zelle & Mooney, 1996] • Example: NL (from the user): "How many cities are there in the US?" → Semantic Parsing → Query: answer(A, count(B, (city(B), loc(B, C), const(C, countryid(USA))), A))

  7. Learning Semantic Parsers • Manually programming robust semantic parsers is difficult due to the complexity of the task. • Semantic parsers can be learned automatically from sentences paired with their logical form. • [Diagram: NL-LF training examples → Semantic-Parser Learner → Semantic Parser; Natural Language → Semantic Parser → Logical Form]

  8. Engineering Motivation • Most computational language-learning research strives for broad coverage while sacrificing depth. • “Scaling up by dumbing down” • Realistic semantic parsing currently entails domain dependence. • Domain-dependent natural-language interfaces have a large potential market. • Learning makes developing specific applications more tractable. • Training corpora can be easily developed by tagging existing corpora of formal statements with natural-language glosses.

  9. Cognitive Science Motivation • Most natural-language learning methods require supervised training data that is not available to a child. • General lack of negative feedback on grammar. • No POS-tagged or treebank data. • Assuming a child can infer the likely meaning of an utterance from context, NL-LF pairs are more cognitively plausible training data.

  10. Our Semantic-Parser Learners • CHILL+WOLFIE (Zelle & Mooney, 1996; Thompson & Mooney, 1999, 2003) • Separates parser-learning and semantic-lexicon learning. • Learns a deterministic parser using ILP techniques. • SILT (Kate, Wong & Mooney, 2005) • Learns symbolic transformation rules for mapping directly from NL to LF. • SCISSOR (Ge & Mooney, 2005) • Integrates semantic interpretation into a statistical syntactic parser (Collins' head-driven model 2).

  11. CHILL(Zelle & Mooney, 1992-96) • Semantic parser acquisition system using Inductive Logic Programming (ILP) to induce a parser written in Prolog. • Starts with a parsing “shell” written in Prolog and learns to control the operators of this parser to produce the given I/O pairs. • Requires a semantic lexicon, which for each word gives one or more possible semantic representations. • Parser must disambiguate words, introduce proper semantic representations for each, and then put them together in the right way to produce a proper representation of the sentence.
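The parsers CHILL induces are Prolog programs, and the control knowledge is learned with ILP; purely as an illustration of the control idea, here is a minimal Python sketch (not CHILL's actual code) of a deterministic shift-reduce style parser whose operator choice at each step is delegated to a stand-in control_rule function. The lexicon, operator names, and "composition" step are all made up for the example; real CHILL composes meanings by unification.

```python
# Illustration only (not CHILL): a deterministic shift-reduce style parser
# whose choice of operator at each step is made by learned control rules.

# Hypothetical semantic lexicon: each word maps to a meaning fragment.
LEXICON = {
    "capital":    "capital(S,C)",
    "state":      "state(S)",
    "largest":    "largest(P,Goal)",
    "population": "population(S,P)",
}

def control_rule(stack, buffer):
    """Stand-in for CHILL's ILP-learned control rules: pick the parsing
    operator to apply in the current parser state."""
    if buffer and buffer[0] in LEXICON:
        return "introduce"      # push the meaning fragment for the next word
    if buffer and len(stack) < 2:
        return "skip"           # discard a word that adds no semantics here
    return "combine"            # otherwise compose the top two fragments

def parse(words):
    stack, buffer = [], list(words)
    while buffer or len(stack) > 1:
        op = control_rule(stack, buffer)
        if op == "introduce":
            stack.append(LEXICON[buffer.pop(0)])
        elif op == "skip":
            buffer.pop(0)
        else:                   # "combine"
            right, left = stack.pop(), stack.pop()
            stack.append(f"({left}, {right})")   # toy composition, not unification
    return stack[0] if stack else None

print(parse("what is the capital of the largest state".split()))
```

In the real system, each step's decision is made by induced Prolog control rules that inspect the parse state, and word meanings are combined by unifying shared logical variables rather than by string concatenation.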

  12. CHILL Example • U.S. Geographical database • Sample training pair • NL (Spanish): ¿Cuál es la capital del estado con la población más grande? ("What is the capital of the state with the largest population?") • LF: answer(C, (capital(S,C), largest(P, (state(S), population(S,P))))) • Sample semantic lexicon • cuál : answer(_,_) • capital : capital(_,_) • estado : state(_) • más grande : largest(_,_) • población : population(_,_)

  13. WOLFIE(Thompson & Mooney, 1995-1999) • Learns a semantic lexicon for CHILL from the same corpus of semantically annotated sentences. • Determines hypotheses for word meanings by finding largest isomorphic common subgraphs shared by meanings of sentences in which the word appears. • Uses a greedy-covering style algorithm to learn a small lexicon sufficient to allow compositional construction of the correct representation from the words in a sentence.
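WOLFIE's real candidate meanings come from the largest isomorphic common subgraphs of the logical forms of sentences containing a word; the Python sketch below substitutes a much cruder proxy (predicates shared by all such logical forms) just to show the greedy-covering loop. The toy corpus, helper names, and scoring are illustrative assumptions, not WOLFIE's implementation.

```python
# Toy NL-LF corpus (GeoQuery-style). The predicate sets are a crude stand-in
# for the meaning graphs that WOLFIE actually intersects.
CORPUS = [
    ("what is the capital of texas",
     {"answer", "capital", "const", "stateid"}),
    ("what is the capital of the largest state",
     {"answer", "capital", "largest", "state"}),
    ("how many people live in the largest state",
     {"answer", "population", "largest", "state"}),
]

def candidate_meanings(word):
    """Candidate meanings for a word: predicates shared by the LFs of every
    sentence containing it (a proxy for largest common subgraphs)."""
    lfs = [preds for sent, preds in CORPUS if word in sent.split()]
    return set.intersection(*lfs) if lfs else set()

def learn_lexicon():
    """Greedy covering: repeatedly add the (word, meaning) pair that accounts
    for the most not-yet-covered predicate occurrences in the corpus."""
    uncovered = {(i, p) for i, (_, preds) in enumerate(CORPUS) for p in preds}
    words = {w for sent, _ in CORPUS for w in sent.split()}
    lexicon = set()
    while uncovered:
        best, best_gain = None, 0
        for w in words:
            for m in candidate_meanings(w):
                if (w, m) in lexicon:
                    continue
                gain = sum(1 for i, p in uncovered
                           if p == m and w in CORPUS[i][0].split())
                if gain > best_gain:
                    best, best_gain = (w, m), gain
        if best is None:          # nothing left that covers anything new
            break
        lexicon.add(best)
        w, m = best
        uncovered = {(i, p) for i, p in uncovered
                     if not (p == m and w in CORPUS[i][0].split())}
    return sorted(lexicon)

print(learn_lexicon())
```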

  14. WOLFIE + CHILL Semantic Parser Acquisition • [Diagram: NL-LF training examples → WOLFIE Lexicon Learner → Semantic Lexicon; NL-LF training examples + Semantic Lexicon → CHILL Parser Learner → Semantic Parser; Natural Language → Semantic Parser → Logical Form]

  15. Semantic Parsing using Transformation Rules • SILT (Semantic Interpretation by Learning Transformations) • Uses pattern-based transformation rules which map natural-language phrases to formal-language constructs • Transformation rules are repeatedly applied to the sentence to construct its formal-language expression

  16. Formal Language Grammar • NL: If our player 4 has the ball, our player 4 should shoot. • CLang: ((bowner our {4}) (do our {4} shoot)) • CLang parse: RULE → CONDITION DIRECTIVE; CONDITION → (bowner TEAM {UNUM}); DIRECTIVE → (do TEAM {UNUM} ACTION); TEAM → our; UNUM → 4; ACTION → shoot • Non-terminals: RULE, CONDITION, ACTION… • Terminals: bowner, our, 4… • Productions: RULE → CONDITION DIRECTIVE, DIRECTIVE → do TEAM UNUM ACTION, ACTION → shoot

  17. Transformation Rule Representation • A rule has two components: a natural-language pattern and an associated formal-language template • Two versions of SILT: • String-based rules: used to convert the natural-language sentence directly to the formal language; patterns may contain word gaps • Tree-based rules: used to convert the sentence's syntactic tree to the formal language • Example syntactic-tree pattern: S → NP (TEAM UNUM) VP (VBZ has, NP (DT the, NN ball))
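To make the string-based version concrete, here is a hedged Python sketch (not SILT's actual rule format or code): each transformation rule pairs a token pattern over the partially parsed sentence with a constructor for the formal-language fragment, and rules are applied repeatedly until the top-level RULE is produced, mirroring the worked example on the following slides. All rule patterns and helper names are invented for this illustration.

```python
# Hypothetical string-based rules for the running CLang example. ALL-CAPS
# pattern items match a nonterminal token produced by an earlier rule;
# anything else matches a word literally.
RULES = [
    (["our"],                                    ("TEAM",      lambda g: "our")),
    (["player", "4"],                            ("UNUM",      lambda g: "4")),
    (["shoot"],                                  ("ACTION",    lambda g: "shoot")),
    (["TEAM", "UNUM", "has", "the", "ball"],     ("CONDITION",
        lambda g: f"(bowner {g['TEAM']} {{{g['UNUM']}}})")),
    (["TEAM", "UNUM", "should", "ACTION"],       ("DIRECTIVE",
        lambda g: f"(do {g['TEAM']} {{{g['UNUM']}}} {g['ACTION']})")),
    (["If", "CONDITION", ",", "DIRECTIVE", "."], ("RULE",
        lambda g: f"({g['CONDITION']} {g['DIRECTIVE']})")),
]

def match(pattern, tokens, start):
    """Match a rule pattern at position `start`; return slot bindings or None."""
    if start + len(pattern) > len(tokens):
        return None
    bindings = {}
    for item, tok in zip(pattern, tokens[start:start + len(pattern)]):
        if item.isupper():                       # nonterminal slot
            if isinstance(tok, tuple) and tok[0] == item:
                bindings[item] = tok[1]
            else:
                return None
        elif tok != item:                        # literal word
            return None
    return bindings

def semantic_parse(sentence):
    """Repeatedly apply transformation rules, replacing each matched span with
    a single nonterminal token, until no rule applies."""
    tokens = sentence.split()
    changed = True
    while changed:
        changed = False
        for pattern, (label, build) in RULES:
            for i in range(len(tokens)):
                bindings = match(pattern, tokens, i)
                if bindings is not None:
                    tokens[i:i + len(pattern)] = [(label, build(bindings))]
                    changed = True
                    break
            if changed:
                break
    return tokens

print(semantic_parse("If our player 4 has the ball , our player 4 should shoot ."))
```

Run on the example sentence, this reduces the token list to the single token ('RULE', '((bowner our {4}) (do our {4} shoot))'), matching the result reached on slide 29.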

  18. Example of Semantic Parsing • Sentence: If our player 4 has the ball, our player 4 should shoot.

  19. Example of Semantic Parsing • Apply TEAM → our to both occurrences of "our" (TEAM = our)

  20. Example of Semantic Parsing • Result: If TEAM player 4 has the ball, TEAM player 4 should shoot.

  21. Example of Semantic Parsing • Apply UNUM → 4 to both occurrences of "player 4" (UNUM = 4)

  22. Example of Semantic Parsing • Result: If TEAM UNUM has the ball, TEAM UNUM should shoot.

  23. Example of Semantic Parsing • Apply ACTION → shoot to "shoot" (ACTION = shoot)

  24. Example of Semantic Parsing • Result: If TEAM UNUM has the ball, TEAM UNUM should ACTION.

  25. Example of Semantic Parsing • Apply CONDITION → (bowner TEAM {UNUM}) to "TEAM UNUM has the ball" (CONDITION = (bowner our {4}))

  26. Example of Semantic Parsing • Result: If CONDITION, TEAM UNUM should ACTION.

  27. Example of Semantic Parsing • Apply DIRECTIVE → (do TEAM {UNUM} ACTION) to "TEAM UNUM should ACTION" (DIRECTIVE = (do our {4} shoot))

  28. Example of Semantic Parsing • Result: If CONDITION, DIRECTIVE.

  29. Example of Semantic Parsing • Apply RULE → CONDITION DIRECTIVE to the whole sentence: RULE = ((bowner our {4}) (do our {4} shoot))

  30. Learning Transformation Rules • SILT induces rules from a corpus of NL sentences paired with their formal representations • Patterns are learned for each production by bottom-up rule learning • For every production: • Call those sentences positives whose formal representations' parses use that production • Call the remaining sentences negatives

  31. Rule Learning for a Production • SILT applies a greedy-covering, bottom-up rule-induction method that repeatedly generalizes the positives until they start covering negatives • Production: CONDITION → (bpos REGION) • Positives: • The ball is in REGION , our player 7 is in REGION and no opponent is around our player 7 within 1.5 distance. • If the ball is in REGION and not in REGION then player 3 should intercept the ball. • During normal play if the ball is in the REGION then player 7 , 9 and 11 should dribble the ball to the REGION . • When the play mode is normal and the ball is in the REGION then our player 2 should pass the ball to the REGION . • All players except the goalie should pass the ball to REGION if it is in RP18. • If the ball is inside rectangle ( -54 , -36 , 0 , 36 ) then player 10 should position itself at REGION with a ball attraction of REGION . • Player 2 should pass the ball to REGION if it is in REGION . • Negatives: • If our player 6 has the ball then he should take a shot on goal. • If player 4 has the ball , it should pass the ball to player 2 or 10. • If the condition DR5C3 is true , then player 2 , 3 , 7 and 8 should pass the ball to player 3. • During play on , if players 6 , 7 or 8 is in REGION , they should pass the ball to players 9 , 10 or 11. • If "Clear_Condition" , players 2 , 3 , 7 or 5 should clear the ball REGION . • If it is before the kick off , after our goal or after the opponent's goal , position player 3 at REGION . • If the condition MDR4C9 is met , then players 4-6 should pass the ball to player 9. • If Pass_11 then player 11 should pass to player 9 and no one else.
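A rough Python sketch of this outer greedy-covering loop, under simplifying assumptions that are not SILT's: a pattern is just an ordered word subsequence, generalization uses difflib's common matching blocks as a stand-in for SILT's scoring, and a pattern "covers" a sentence when its words occur in it in order.

```python
from difflib import SequenceMatcher

def generalize(p1, p2):
    """Keep the words the two patterns share, in order (a crude stand-in for
    SILT's highest-scoring common subsequence with word gaps)."""
    w1, w2 = p1.split(), p2.split()
    blocks = SequenceMatcher(None, w1, w2).get_matching_blocks()
    return " ".join(w for b in blocks for w in w1[b.a:b.a + b.size])

def covers(pattern, sentence):
    """A pattern covers a sentence if the pattern's words occur in it in order."""
    words, i = sentence.split(), 0
    for p in pattern.split():
        try:
            i = words.index(p, i) + 1
        except ValueError:
            return False
    return True

def learn_rules(positives, negatives):
    """Greedy covering: grow one rule per pass by generalizing a seed positive
    with other positives, keeping a generalization only while it covers no
    negative; then remove the positives the new rule covers and repeat."""
    rules, remaining = [], list(positives)
    while remaining:
        pattern = remaining[0]                       # most specific seed rule
        for other in remaining[1:]:
            candidate = generalize(pattern, other)
            if candidate and not any(covers(candidate, n) for n in negatives):
                pattern = candidate                  # safe generalization: keep it
        rules.append(pattern)
        remaining = [p for p in remaining if not covers(pattern, p)]
    return rules

positives = [
    "the ball is in REGION",
    "if the ball is in the REGION then player 7 should dribble the ball to the REGION",
    "when the play mode is normal and the ball is in the REGION",
]
negatives = [
    "if our player 6 has the ball then he should take a shot on goal",
    "player 2 , 3 , 7 and 8 should pass the ball to player 3",
]
print(learn_rules(positives, negatives))    # -> ['the ball is in REGION']
```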

  32. Generalization of String Patterns • Production: ACTION → (pos REGION) • Pattern 1: Always position player UNUM at REGION . • Pattern 2: Whenever the ball is in REGION , position player UNUM near the REGION . • Find the highest-scoring common subsequence:

  33. Generalization of String Patterns • Production: ACTION → (pos REGION) • Pattern 1: Always position player UNUM at REGION . • Pattern 2: Whenever the ball is in REGION , position player UNUM near the REGION . • Highest-scoring common subsequence gives the generalization: position player UNUM [2] REGION . (where [2] is a word gap of up to 2 words)
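The generalization step itself can be sketched in Python with difflib as a stand-in for SILT's own subsequence scoring: keep the common word blocks and record how many words may be skipped between them as a word gap. On the two patterns above this reproduces the gap pattern shown on the slide; the function name and scoring are assumptions for illustration.

```python
from difflib import SequenceMatcher

def generalize(p1, p2):
    """Generalize two string patterns: keep their common word subsequence
    (via difflib's matching blocks) and mark each junction with the largest
    number of words either pattern allows in between, written as [n]."""
    w1, w2 = p1.split(), p2.split()
    blocks = SequenceMatcher(None, w1, w2).get_matching_blocks()
    out, prev1, prev2 = [], 0, 0
    for b in blocks:
        if b.size == 0:                      # sentinel block at the end
            break
        gap = max(b.a - prev1, b.b - prev2)  # words skipped before this block
        if out and gap > 0:
            out.append(f"[{gap}]")           # interior word gap of up to `gap` words
        out.extend(w1[b.a:b.a + b.size])     # the shared words themselves
        prev1, prev2 = b.a + b.size, b.b + b.size
    return " ".join(out)                     # leading unshared words are dropped

p1 = "Always position player UNUM at REGION ."
p2 = "Whenever the ball is in REGION , position player UNUM near the REGION ."
print(generalize(p1, p2))    # -> position player UNUM [2] REGION .
```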

  34. Generalization of Tree Patterns • Production: REGION → (penalty-area TEAM) • Pattern 1 (e.g. "TEAM penalty area"): NP → PRP$ (TEAM), NN (penalty), NN (area) • Pattern 2 (e.g. "TEAM 's penalty box"): NP → NP (TEAM, POS 's), NN (penalty), NN (box) • Find common subgraphs.

  35. Generalization of Tree Patterns • Production: REGION → (penalty-area TEAM) • Pattern 1 (e.g. "TEAM penalty area"): NP → PRP$ (TEAM), NN (penalty), NN (area) • Pattern 2 (e.g. "TEAM 's penalty box"): NP → NP (TEAM, POS 's), NN (penalty), NN (box) • Common-subgraph generalization: NP → * (TEAM), NN (penalty), NN

  36. Rule Learning for a Production • Production: CONDITION → (bpos REGION) • Positives: • The ball is in REGION , our player 7 is in REGION and no opponent is around our player 7 within 1.5 distance. • If the ball is in REGION and not in REGION then player 3 should intercept the ball. • During normal play if the ball is in the REGION then player 7 , 9 and 11 should dribble the ball to the REGION . • When the play mode is normal and the ball is in the REGION then our player 2 should pass the ball to the REGION . • All players except the goalie should pass the ball to REGION if it is in REGION. • If the ball is inside REGION then player 10 should position itself at REGION with a ball attraction of REGION . • Player 2 should pass the ball to REGION if it is in REGION . • Negatives: • If our player 6 has the ball then he should take a shot on goal. • If player 4 has the ball , it should pass the ball to player 2 or 10. • If the condition DR5C3 is true , then player 2 , 3 , 7 and 8 should pass the ball to player 3. • During play on , if players 6 , 7 or 8 is in REGION , they should pass the ball to players 9 , 10 or 11. • If "Clear_Condition" , players 2 , 3 , 7 or 5 should clear the ball REGION . • If it is before the kick off , after our goal or after the opponent's goal , position player 3 at REGION . • If the condition MDR4C9 is met , then players 4-6 should pass the ball to player 9. • If Pass_11 then player 11 should pass to player 9 and no one else. • Both sets are fed to the Bottom-up Rule Learner.

  37. Rule Learning for a Production • Production: CONDITION → (bpos REGION) • Positives (after a learned rule is applied, the matched phrase is replaced by CONDITION): • The CONDITION , our player 7 is in REGION and no opponent is around our player 7 within 1.5 distance. • If the CONDITION and not in REGION then player 3 should intercept the ball. • During normal play if the CONDITION then player 7 , 9 and 11 should dribble the ball to the REGION . • When the play mode is normal and the CONDITION then our player 2 should pass the ball to the REGION . • All players except the goalie should pass the ball to REGION if CONDITION. • If the CONDITION then player 10 should position itself at REGION with a ball attraction of REGION . • Player 2 should pass the ball to REGION if CONDITION . • Negatives: • If our player 6 has the ball then he should take a shot on goal. • If player 4 has the ball , it should pass the ball to player 2 or 10. • If the condition DR5C3 is true , then player 2 , 3 , 7 and 8 should pass the ball to player 3. • During play on , if players 6 , 7 or 8 is in REGION , they should pass the ball to players 9 , 10 or 11. • If "Clear_Condition" , players 2 , 3 , 7 or 5 should clear the ball REGION . • If it is before the kick off , after our goal or after the opponent's goal , position player 3 at REGION . • If the condition MDR4C9 is met , then players 4-6 should pass the ball to player 9. • If Pass_11 then player 11 should pass to player 9 and no one else. • Both sets are fed to the Bottom-up Rule Learner.

  38. Rule Learning for All Productions • Transformation rules for different productions should cooperate globally to generate complete semantic parses • Redundantly cover every positive example with the β = 5 best rules (rules are scored by their accuracy and coverage) • Find the subset of these rules that best cooperates to generate complete semantic parses on the training data

  39. SCISSOR: Semantic Composition that Integrates Syntax and Semantics to get Optimal Representations • Based on a fairly standard approach to compositional semantics [Jurafsky and Martin, 2000] • A statistical parser is used to generate a semantically augmented parse tree (SAPT) • Augment Collins' head-driven model 2 (Bikel's implementation, 2004) to incorporate semantic labels • Translate the SAPT into a complete formal meaning representation (MR) • Example SAPT for "our player 2 has the ball": S-bowner → NP-player (PRP$-team our, NN-player player, CD-unum 2), VP-bowner (VB-bowner has, NP-null (DT-null the, NN-null ball)) • MR: bowner(player(our,2))

  40. Overview of SCISSOR • TRAINING: SAPT training examples → learner → Integrated Semantic Parser • TESTING: NL sentence → Integrated Semantic Parser → SAPT → ComposeMR → MR

  41. ComposeMR • Semantic labels on the SAPT for "our player 2 has the ball": bowner at the root; player on the subject NP (team on "our", player on "player", unum on "2"); bowner on the VP and on "has"; null on "the" and "ball"

  42. ComposeMR • Each semantic label is replaced by a meaning-representation fragment with open argument slots: bowner(_) at the root, on the VP, and on "has"; player(_,_) on the subject NP and on "player"; team on "our"; unum on "2"; null elsewhere

  43. ComposeMR • Composition fills the argument slots bottom-up: player(team, unum) → player(our,2); bowner(player) → bowner(player(our,2))
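A minimal Python sketch of the ComposeMR idea (not SCISSOR's code): the SAPT is a nested tuple of (syntactic label, semantic label, children or word), the argument slots of each predicate are given by the target MR language, and each node's MR is built bottom-up by filling its predicate's slots with the MRs of children of the matching semantic type; null nodes contribute nothing. The SAPT encoding, ARITY table, and slot-filling rule are simplifying assumptions for illustration.

```python
# Hypothetical SAPT for "our player 2 has the ball".
SAPT = ("S", "bowner",
        [("NP", "player",
          [("PRP$", "team", "our"),
           ("NN", "player", "player"),
           ("CD", "unum", "2")]),
         ("VP", "bowner",
          [("VB", "bowner", "has"),
           ("NP", "null",
            [("DT", "null", "the"),
             ("NN", "null", "ball")])])])

# Argument slots (by semantic type) of each predicate in the target MR language.
ARITY = {"player": ["team", "unum"], "bowner": ["player"]}

def compose_mr(node):
    """Return (semantic type, MR string) for a SAPT node, composing bottom-up:
    children's MRs fill the argument slots of the node's own predicate."""
    _syn, sem, rest = node
    if sem == "null":
        return None
    if isinstance(rest, str):                        # leaf: a single word
        return (sem, sem if sem in ARITY else rest)  # predicate name or constant
    children = [mr for mr in map(compose_mr, rest) if mr is not None]
    if sem not in ARITY:                             # constant-typed phrase: pass child up
        return next(c for c in children if c[0] == sem)
    args = [next((mr for t, mr in children if t == slot), "_")
            for slot in ARITY[sem]]                  # fill slots left to right by type
    return (sem, f"{sem}({','.join(args)})")

print(compose_mr(SAPT)[1])    # -> bowner(player(our,2))
```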

  44. Collins' Head-Driven Model 2 • A generative, lexicalized model • Each node in the tree has a syntactic label and is also lexicalized with its head word • Example for "our player 2 has the ball": S(has) → NP(player) (PRP$ our, NN player, CD 2), VP(has) (VB has, NP(ball) (DT the, NN ball))

  45. Modeling Rule Productions as Markov Processes • Expanding S(has): first generate the head child VP(has) • Probability so far: Ph(VP | S, has)

  46. Modeling Rule Productions as Markov Processes • Generate the left and right subcategorization frames {NP} and { } • Probability so far: Ph(VP | S, has) × Plc({NP} | S, VP, has) × Prc({ } | S, VP, has)

  47. Modeling Rule Productions as Markov Processes • Generate the left dependent NP(player) • Probability so far: Ph(VP | S, has) × Plc({NP} | S, VP, has) × Prc({ } | S, VP, has) × Pd(NP(player) | S, VP, has, LEFT, {NP})

  48. Modeling Rule Productions as Markov Processes • The generated NP fills the NP slot, so the left subcat frame becomes { } • Probability so far: Ph(VP | S, has) × Plc({NP} | S, VP, has) × Prc({ } | S, VP, has) × Pd(NP(player) | S, VP, has, LEFT, {NP})

  49. Modeling Rule Productions as Markov Processes • Generate STOP on the left • Probability so far: Ph(VP | S, has) × Plc({NP} | S, VP, has) × Prc({ } | S, VP, has) × Pd(NP(player) | S, VP, has, LEFT, {NP}) × Pd(STOP | S, VP, has, LEFT, { })

  50. Modeling Rule Productions as Markov Processes • Generate STOP on the right • Total probability: Ph(VP | S, has) × Plc({NP} | S, VP, has) × Prc({ } | S, VP, has) × Pd(NP(player) | S, VP, has, LEFT, {NP}) × Pd(STOP | S, VP, has, LEFT, { }) × Pd(STOP | S, VP, has, RIGHT, { })
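To tie the six steps together, here is a small Python sketch that simply multiplies the conditional probabilities listed above for the lexicalized rule S(has) → NP(player) VP(has). The probability values are made up, and real implementations (e.g. Bikel's) estimate and smooth these distributions from a treebank; the table and function names are assumptions for illustration.

```python
# Made-up conditional probabilities for the rule S(has) -> NP(player) VP(has),
# following the generation story above: head child, subcat frames, then
# left/right dependents (ending in STOP), each conditioned on the parent,
# head label, and head word.
P_head = {("VP", "S", "has"): 0.7}                        # Ph(VP | S, has)
P_lsub = {(frozenset({"NP"}), "S", "VP", "has"): 0.6}     # Plc({NP} | S, VP, has)
P_rsub = {(frozenset(), "S", "VP", "has"): 0.8}           # Prc({} | S, VP, has)
P_dep  = {                                                # Pd(child | S, VP, has, side, subcat)
    ("NP(player)", "S", "VP", "has", "LEFT",  frozenset({"NP"})): 0.25,
    ("STOP",       "S", "VP", "has", "LEFT",  frozenset()):       0.9,
    ("STOP",       "S", "VP", "has", "RIGHT", frozenset()):       0.85,
}

def rule_probability(parent, head, head_word, left_deps, right_deps, lsub, rsub):
    """Score one lexicalized rule expansion as the product of its head,
    subcat, and dependent generation steps (Collins-style, simplified)."""
    p = P_head[(head, parent, head_word)]
    p *= P_lsub[(frozenset(lsub), parent, head, head_word)]
    p *= P_rsub[(frozenset(rsub), parent, head, head_word)]
    for side, deps, subcat in (("LEFT", left_deps, set(lsub)),
                               ("RIGHT", right_deps, set(rsub))):
        for child in deps + ["STOP"]:                     # dependents, then STOP
            p *= P_dep[(child, parent, head, head_word, side, frozenset(subcat))]
            subcat.discard(child.split("(")[0])           # a generated NP fills the NP slot
    return p

# Probability of S(has) -> NP(player) VP(has):
print(rule_probability("S", "VP", "has",
                       left_deps=["NP(player)"], right_deps=[],
                       lsub={"NP"}, rsub=set()))
```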
