Tensor Query Expansion: a cognitively motivated relevance model
Mike Symonds, Peter Bruza, Laurianne Sitbon and Ian Turner
Queensland University of Technology
Introduction
Outline: Introduction | Background | Tensor Query Expansion | Results | Future Work
• We use a formal model of word meaning to simulate the cognitive processes a user engages when formulating a query.
• We apply this approach to query expansion in an ad hoc retrieval task.
• Our approach shows significant improvements in retrieval effectiveness over the state of the art for:
• short queries, and
• newswire TREC data sets.
Query Expansion (QE)
• Geometric representations: Rocchio (Rocchio, 1971)
• Probabilistic representations: relevance models (Lavrenko and Croft, 2001), which estimate P(w|R)
• Term dependency approaches: latent concept expansion (Metzler and Croft, 2007) and the positional relevance model (Lv and Zhai, 2010)
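The slides name the relevance model estimate P(w|R) but do not show the estimator. As a reference point, here is a minimal sketch of the RM1-style estimate from Lavrenko and Croft (2001) over pseudo-relevance feedback documents, using Dirichlet smoothing; the function name, the `mu` default, and the use of the feedback set as the smoothing background are illustrative assumptions, not details from the slides.

```python
from collections import Counter

def rm1(query_terms, feedback_docs, mu=2500):
    """Estimate P(w|R): weight each feedback document's language
    model by its query likelihood, then mix and normalise."""
    # background statistics for Dirichlet smoothing
    # (here taken from the feedback set itself, as a simplification)
    coll = Counter()
    for doc in feedback_docs:
        coll.update(doc)
    coll_len = sum(coll.values())

    def p_w_d(w, doc_counts, doc_len):
        # Dirichlet-smoothed document language model
        p_coll = coll[w] / coll_len
        return (doc_counts[w] + mu * p_coll) / (doc_len + mu)

    expanded = Counter()
    for doc in feedback_docs:
        counts, dlen = Counter(doc), len(doc)
        # query likelihood acts as the document's relevance weight
        q_lik = 1.0
        for q in query_terms:
            q_lik *= p_w_d(q, counts, dlen)
        for w in counts:
            expanded[w] += p_w_d(w, counts, dlen) * q_lik
    # normalise to a distribution over candidate expansion terms
    total = sum(expanded.values())
    return {w: p / total for w, p in expanded.items()}
```

Terms that occur in documents that score well against the query receive most of the probability mass, which is what makes P(w|R) useful for expansion.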
Motivation
• The user’s information need is a cognitive construct.
• The use of cognitive models in query expansion has not been extensively studied.
• Trend in QE research: term dependency approaches are outperforming term-independent ones.
• However, current semantic features have little, if any, linguistic meaning.
Hypothesis
• Using a cognitively motivated model of word meaning within the query expansion process can significantly improve retrieval effectiveness.
• Model of word meaning: the Tensor Encoding model (Symonds, 2011)
• Structural linguistic theory: Ferdinand de Saussure (1916)
• Syntagmatic associations (hot-sun)
• Paradigmatic associations (quick-fast)
Modeling word meaning
• Syntagmatic associations: scored with an efficient cosine measure, Ssyn(Q, w)
• Paradigmatic associations: scored with a probabilistic measure
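The slide's definition of Ssyn(Q, w) did not survive extraction, and the Tensor Encoding model's actual measure is defined over its tensor representation. Purely as an illustration of a cosine-style syntagmatic score, here is a simplified sketch that scores a candidate term by how strongly its co-occurrence profile aligns with the query terms; the window size, helper names, and the unit-mass query vector are all assumptions, not the paper's formulation.

```python
import math
from collections import defaultdict

def cooccurrence_vectors(corpus, window=2):
    """Build sparse word-by-word co-occurrence vectors
    within a sliding window over each document."""
    vecs = defaultdict(lambda: defaultdict(float))
    for doc in corpus:
        for i, w in enumerate(doc):
            for j in range(max(0, i - window), min(len(doc), i + window + 1)):
                if j != i:
                    vecs[w][doc[j]] += 1.0
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(u[k] * v.get(k, 0.0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def s_syn(query_terms, w, vecs):
    """Syntagmatic score: cosine between w's co-occurrence profile
    and a vector with unit mass on each query term, so terms that
    directly co-occur with the query score highly."""
    q_vec = {q: 1.0 for q in query_terms}
    return cosine(vecs[w], q_vec)
```

Syntagmatic association is about words appearing *together* (hot-sun), so the score rewards direct co-occurrence rather than substitutability.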
Tensor Query Expansion
• Formally combine the syntagmatic and paradigmatic features using a Markov random field, and
• fit the result into the relevance modeling framework, replacing P(w|R) with P_{G,Γ}(w|Q)
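The combination step can be sketched as a simple log-linear mix of the two feature scores, normalised into a distribution that stands in for P(w|R). This is a hedged simplification: the paper's MRF parameterisation (P_{G,Γ}) is richer, and the interpolation weight `gamma` here stands in for the syntagmatic/paradigmatic mix the results slides vary.

```python
import math

def tensor_query_expansion(query_terms, candidates, s_syn, s_par, gamma=0.5):
    """Sketch of the combination step: interpolate syntagmatic and
    paradigmatic feature scores with weight gamma, then exponentiate
    (log-linear / MRF style) and normalise into P(w|Q)."""
    raw = {w: (1 - gamma) * s_syn(query_terms, w)
              + gamma * s_par(query_terms, w)
           for w in candidates}
    exp_scores = {w: math.exp(s) for w, s in raw.items()}
    z = sum(exp_scores.values())
    return {w: v / z for w, v in exp_scores.items()}
```

Setting gamma to 0 or 1 recovers a purely syntagmatic or purely paradigmatic expansion model, which is exactly the axis swept in the parameter-sensitivity results.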
Ad Hoc Retrieval Results
• Mean average precision (MAP)
Ad Hoc Retrieval Results
• Robustness (Associated Press, Wall Street Journal)
Ad Hoc Retrieval Results
• Parameter sensitivity: observe the change in MAP for different mixes of the syntagmatic and paradigmatic features (i.e., gamma), on Associated Press and Wall Street Journal.
Summary of contribution
• A cognitively motivated approach to performing query expansion.
• Semantic features that have explicit linguistic meaning.
• Demonstrated significant improvement in retrieval effectiveness over the unigram relevance model.
Future Work
• Evaluate on larger data sets (TREC GOV2, ClueWeb)
• Compare to the positional relevance model and latent concept expansion (LCE)
• Evaluate on verbose queries, which carry more semantic information

Questions?