Learning Semantic Context-sensitive Term Associations for Information Retrieval Tamsin Maxwell School of Informatics, University of Edinburgh Dawei Song School of Computing, The Robert Gordon University
Outline • Motivation • Context-sensitive Information Inference and Semantics • Event Extraction Algorithm • Application in Information Retrieval
Motivation: “Reagan” in different contexts
• T1 = “President Ronald Reagan” → US former president, administration, budget, tax, etc.
• T2 = “President Reagan and Iran-Contra affair” → Iran arms sales scandal
• T3 = “Reagan and Nakasone” → Japan trade war
Motivation: information inference
• T2 = “President Reagan and Iran-Contra affair” → Iran arms sales scandal
• “Reagan” in the context of “Iran-Contra” carries/implies the information “arms sales scandal”
Context-sensitive Information Inference • Automatic derivation of implicit term associations from text • Multi-dimensional representation of information • Concept combination • Information flow computation
Multi-dimensional Representation of Information Hyperspace Analogue to Language (HAL)
Reagan = <administration: 0.46, bill: 0.07, budget: 0.08, congress: 0.07, economic: 0.05, house: 0.09, officials: 0.05, president: 0.80, reagan: 0.09, senate: 0.05, tax: 0.06, trade: 0.09, veto: 0.08, white: 0.06, …> Collection: Reuters-21578
Handling Complex Sentences
“…presence on German soil. The Germans, given as they are to romanticism, pacifism and self-absorption, aren't sure whether they will allow American nuclear weapons to remain in Germany much longer.” --WSJ 1990
Sliding a window of size 6 over the text, skipping the bracketed stop words:
...presence (on) German soil. (The) Germans, given (as they are to) romanticism, pacifism...
weight = window_size - distance + 1
• HAL vector for “Germans”: soil: 6, given: 6, German: 5, romanticism: 5, presence: 4, pacifism: 4
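A minimal sketch of this windowed weighting, assuming stop words have already been removed and using a symmetric window so that the slide's numbers for “Germans” are reproduced; the tokenization and window direction are simplifications, not necessarily the exact HAL construction used by the authors.

```python
from collections import defaultdict

def build_hal(tokens, window_size=6):
    """Accumulate co-occurrence weights: every word within `window_size`
    positions of the target contributes window_size - distance + 1."""
    hal = defaultdict(lambda: defaultdict(float))
    for i, target in enumerate(tokens):
        lo = max(0, i - window_size)
        hi = min(len(tokens), i + window_size + 1)
        for j in range(lo, hi):
            if j != i:
                hal[target][tokens[j]] += window_size - abs(i - j) + 1
    return hal

# stop-word-filtered prefix of the WSJ sentence above
tokens = ["presence", "German", "soil", "Germans", "given",
          "romanticism", "pacifism"]
print(sorted(build_hal(tokens)["Germans"].items(), key=lambda kv: -kv[1]))
# [('soil', 6.0), ('given', 6.0), ('German', 5.0), ('romanticism', 5.0),
#  ('presence', 4.0), ('pacifism', 4.0)]
```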
HAL vs Semantic HAL
“The Germans, given as they are to romanticism, pacifism and self-absorption, aren't sure whether they will allow American nuclear weapons to remain in Germany much longer.”
Extracted events:
• (allow, Germans, weapons, American, nuclear)
• (want, not, Germans, missiles)
• (seem, Germans)
• (believe, Germans)
• Semantic HAL vector for “Germans”: allow: 6, weapons: 6, want: 6, missiles: 6, seem: 6, believe: 6, American: 5, nuclear: 4
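A hedged sketch of how such a Semantic HAL vector could be accumulated: co-occurrence is restricted to terms that share an extracted event, with the weight decaying with distance along the predicate-argument-modifier chain. The event chains and the decay scheme below are one reading of the slide's numbers, not a definitive reconstruction of the authors' weighting.

```python
from collections import defaultdict

def semantic_hal(event_chains, window_size=6):
    """Like HAL, but 'distance' is position within an event chain
    (predicate, argument, modifiers ...) instead of surface word order."""
    hal = defaultdict(lambda: defaultdict(float))
    for chain in event_chains:
        for i, target in enumerate(chain):
            for j, other in enumerate(chain):
                if i != j:
                    hal[target][other] += max(window_size - abs(i - j) + 1, 1)
    return hal

events = [
    ["allow", "Germans", "weapons", "American", "nuclear"],
    ["want", "Germans", "missiles"],   # negation ("not") kept aside in this toy
    ["seem", "Germans"],
    ["believe", "Germans"],
]
print(dict(semantic_hal(events)["Germans"]))
# reproduces the slide: allow/weapons/want/missiles/seem/believe: 6,
# American: 5, nuclear: 4
```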
Combining Vectors in HAL Space
Reagan + Iran = ?
Information flow offers a more general and flexible way of deriving the meaning of an arbitrary composition of related terms, without being limited to syntactically valid phrases.
Combining Vectors in HAL Space
• Order the concepts by dominance values (based on IDF)
• Scale the dimensions of the dominant concept higher
• Increase the weights of intersecting dimensions
• Add the vectors
• Normalize the composite vector and apply a threshold to cut off low-weighted dimensions
• For more than two concepts, apply the procedure recursively (a sketch follows the worked example below)
Combining Vectors in HAL Space
Reagan = <administration: 0.46, bill: 0.07, budget: 0.08, congress: 0.07, economic: 0.05, house: 0.09, officials: 0.05, president: 0.80, reagan: 0.09, senate: 0.05, tax: 0.06, trade: 0.09, veto: 0.08, white: 0.06, …>
Iran = <arms: 0.71, attack: 0.18, gulf: 0.21, iran: 0.33, iraq: 0.31, missiles: 0.11, offensive: 0.13, oil: 0.18, reagan: 0.10, sales: 0.20, scandal: 0.25, war: 0.20, …>
Reagan ⊕ Iran = <administration: 0.11, affair: 0.06, arms: 0.72, attack: 0.08, contra: 0.14, deal: 0.08, diversion: 0.07, gulf: 0.11, house: 0.10, initiative: 0.06, iran: 0.22, november: 0.06, policy: 0.07, president: 0.26, profits: 0.08, reagan: 0.23, sales: 0.15, scandal: 0.31, secret: 0.06, senate: 0.06, war: 0.12>
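A minimal sketch of the combination heuristic from the previous slide, applied to a few dimensions of the Reagan and Iran vectors. The scaling factor for the non-dominant concept, the boost for intersecting dimensions, and the cut-off threshold are illustrative placeholders, not the authors' parameter values.

```python
import math

def combine(dominant, other, other_scale=0.5, boost=2.0, threshold=0.05):
    """Combine two HAL vectors: weight the dominant concept higher, boost
    intersecting dimensions, add, normalize, then drop low-weight dimensions.
    For more than two concepts, fold the result with the next vector."""
    combined = {}
    for d in set(dominant) | set(other):
        w = dominant.get(d, 0.0) + other_scale * other.get(d, 0.0)
        if d in dominant and d in other:   # intersecting dimension
            w *= boost
        combined[d] = w
    norm = math.sqrt(sum(w * w for w in combined.values())) or 1.0
    return {d: w / norm for d, w in combined.items() if w / norm >= threshold}

reagan = {"administration": 0.46, "president": 0.80, "reagan": 0.09, "trade": 0.09}
iran = {"arms": 0.71, "iran": 0.33, "scandal": 0.25, "sales": 0.20, "reagan": 0.10}
# assume "iran" dominates "reagan" (higher IDF in a general news collection)
print(combine(iran, reagan))
```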
Combining Vectors in HAL Space with Semantics
• Concepts can be ordered by semantic dominance (based on IDF): weapons dominates American, which dominates nuclear
• Use the modification dictionary in the event parser: Pred=allow, Arg0=they, modArg0a=weapons, modArg0b=American, modArg0c=nuclear
• Proceed as for the normal HAL space
HAL-based Information Flow
Information described by tokens i1, ..., in carries the information described by j (written i1, ..., in ⊢ j) with respect to a given collection iff the combined concept of i1, ..., in is included, to a sufficient degree, in the concept of j (cf. Barwise & Seligman, 1997).
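A short sketch of how such an inclusion degree can be computed from HAL vectors, following the general idea in Song & Bruza (2001); the threshold value and the use of all dimensions (rather than only quality properties above a weight cut-off) are simplifications.

```python
def degree_of_inclusion(source, target):
    """Fraction of the source concept's weight that falls on dimensions
    also present in the target concept."""
    total = sum(source.values())
    shared = sum(w for d, w in source.items() if d in target)
    return shared / total if total else 0.0

def information_flow(combined, candidate, lam=0.75):
    """i1, ..., in |- j  iff  degree(combined included in candidate) >= lambda."""
    return degree_of_inclusion(combined, candidate) >= lam

# e.g. with `combine` and HAL vectors from the earlier sketches:
#   information_flow(combine(iran, reagan), hal["scandal"])
# would test whether "Reagan, Iran |- scandal" holds.
```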
Event Extraction Algorithm • Preprocessing • Combined syntactic-semantic parsing • Semantic role labeling • Dependency parsing • Trace the dependency tree from predicates and arguments to identify event structure • Event or modifier pruning
Semantic-Syntactic Parsing • Not all predicates indicate events • Events are interpreted using dependencies
Event Extraction
“The defendant replied that no City permit was necessary as defendant lands enjoy interjurisdictional immunity…”
Extracted events:
• (replied, defendant, permit, not)
• (replied, defendant, enjoy, lands)
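A hedged sketch of this extraction step: semantic-role frames whose arguments are themselves clauses are expanded recursively, yielding one flattened event per embedded predicate. The frames below are hand-written for the example sentence, not the output of any particular parser, and the resulting tuples only roughly match the two events on this slide.

```python
# pred -> {role: filler}; a list-valued role marks a clausal argument
frames = {
    "replied": {"Arg0": "defendant", "Arg1": ["necessary", "enjoy"]},
    "necessary": {"Arg1": "permit", "Neg": "not"},
    "enjoy": {"Arg0": "lands", "Arg1": "immunity"},
}

def expand(pred, frames):
    """Yield flattened events rooted at `pred`, recursing into clausal args."""
    frame = frames[pred]
    plain = [v for v in frame.values() if not isinstance(v, list)]
    clausal = [v for v in frame.values() if isinstance(v, list)]
    if not clausal:
        yield [pred] + plain
        return
    for sub_predicates in clausal:
        for sub in sub_predicates:
            for sub_event in expand(sub, frames):
                yield [pred] + plain + sub_event

for event in expand("replied", frames):
    print(event)
# ['replied', 'defendant', 'necessary', 'permit', 'not']
# ['replied', 'defendant', 'enjoy', 'lands', 'immunity']
```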
Application in Information Retrieval
• IR can be viewed as a reasoning process that captures a transformation of information
• Query expansion: Q → Q’
• Use information flow to derive an improved query
Information Flow for Query Expansion space program |- program:1.00 space:1.00 nasa:0.97 U.S.:0.96 agency:0.95 shuttle:0.95 national:0.95 soviet:0.95 aeronautics:0.87 satellite:0.87 scientists:0.83 flights:0.78 pentagon:0.78 • Q as initial query submitted to a search system • Apply information flow computation to a number (e.g., 30) of pseudo-relevant documents • A number of top ranked information flows derived from Q and their associated weights form an expanded query • Submit the expanded query back to the retrieval system and evaluate the average precision of the newly retrieved documents
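A sketch of the expansion loop described above. The `search`, `hal_from_documents`, `combine`, and `degree_of_inclusion` callables are stand-ins for the retrieval system and the HAL/information-flow machinery from the earlier sketches; the 30 feedback documents come from the slide, while the 20-term cut-off and the unit weight on original query terms are assumptions.

```python
def expand_query(query_terms, search, hal_from_documents, combine,
                 degree_of_inclusion, n_docs=30, n_terms=20):
    """Pseudo-relevance feedback with information-flow based expansion."""
    # 1. initial retrieval with the original query
    feedback_docs = search(query_terms)[:n_docs]
    # 2. build HAL vectors from the pseudo-relevant documents
    hal = hal_from_documents(feedback_docs)
    # 3. combine the query-term concepts (dominance ordering omitted here)
    combined = hal[query_terms[0]]
    for term in query_terms[1:]:
        combined = combine(combined, hal[term])
    # 4. rank candidate terms by information-flow degree and keep the top ones
    flows = sorted(((degree_of_inclusion(combined, hal[t]), t) for t in hal),
                   reverse=True)[:n_terms]
    # 5. expanded query: original terms plus weighted flow terms
    expanded = {t: 1.0 for t in query_terms}
    for weight, term in flows:
        expanded.setdefault(term, weight)
    return expanded
```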
Aspect Hidden Markov Model
Q = {space program}; query aspects Qj: {{space}, {program}, {space program}}
[Figure: aspect HMM relating the importance of each Qj in Q to information flow]
Huang, Q., and Song, D. (2008). A Latent Variable Model for Query Expansion Using the Hidden Markov Model. CIKM 2008, poster, pp. 1417-1418.
Food for Thought • Can incorporating semantic word dependencies consistently enhance IR precision and overall performance? • Can they be incorporated into existing IR systems?
References • Dawei Song and Peter Bruza (2001), Discovering Information Flow Using a High Dimensional Conceptual Space. SIGIR 2001: 327-333. • Dawei Song and Peter Bruza (2003), Towards Context Sensitive Information Inference. JASIST 54(4): 321-334. • K. Tamsin Maxwell, Jon Oberlander and Victor Lavrenko (2008), Evaluation of Semantic Events for Legal Case Retrieval. ESAIR 2008: 39-41. • Q. Huang and Dawei Song (2008), A Latent Variable Model for Query Expansion Using the Hidden Markov Model. CIKM 2008 (poster): 1417-1418.
Questions? Thank you!
Sample Legal Query Query: “What is the liability of the United States under the Federal Tort Claims Act for injuries sustained by employees of an independent contractor working under contract with an agency of the United States government?” Document: “The DEFENDANT replied that no City permit was necessary because DEFENDANT lands enjoy interjurisdictional immunity as public property within the meaning of STATUTE of the Constitution Act, 1867, or because the management of those lands is vital to the DEFENDANT's federal undertaking pursuant to the federal STATUTE jurisdiction over navigation and shipping.”