Advanced Question Answering: Plenty of Challenges to Go Around Dr. John D. Prange AQUAINT Program Director JPrange@nsa.gov 301-688-7092 http://www.ic-arda.org 25 March 2002
Outline • Introducing ARDA • Advanced Question Answering • There is Room for Multiple Approaches • The AQUAINT Program • Challenges from an AQUAINT Perspective • Some Final Thoughts . . . • Questions and Comments
Introducing ARDA A joint Department of Defense / Intelligence Community organization launched in Dec 98 • MISSION: Incubate revolutionary R&D for the shared benefit of the Intelligence Community • MEANS: A nimble, cross-community organization; a modest, yet significant budget; a small, outward-looking staff working as “honest brokers” and “agents provocateurs”
What ARDA Does We originate and manage R&D programs • With fundamental impact on future operational needs and strategies • That demand substantial, long-term venture investment to spur risk-taking • That progress measurably toward mid-term and final goals • That take many forms and employ many delivery vehicles
How ARDA Interacts • Community organizations: plans, forecasts, oversight; customer champions • Thrust panels / managers: R&D problem statements; internal peer review • Industry and academia: principal funding recipients; external peer review and staff
Where Is ARDA? • National Security Agency, Fort George G. Meade, MD • Room 12A69, NBP #1 Building • 301-688-7092 / 800-276-3747 • 301-688-7410 (FAX) • http://www.ic-arda.org • ARDA@nsa.gov
Current ARDA Programs • R&D Thrusts: Community Participation & Exploratory Research Programs; Information Exploitation; Novel Intelligence from Massive Data; Quantum Information Science; Digital Networking; Resource Enhancement Program • Thrust Managers: Mr. John Lyons, Dr. John Prange, Dr. Dean Collins, Dr. Greg Smith, Mr. Greg Puffenbarger, Ms. Penny Lehtola
Outline • Introducing ARDA • Advanced Question Answering • There is Room for Multiple Approaches • The AQUAINT Program • Challenges from an AQUAINT Perspective • Some Final Thoughts . . . • Questions and Comments
Open Domain Factoid Question Answering • Input: a single, factoid question • Traditional Information Retrieval: a single data source; output is a ranked list of hopefully “relevant” documents • QA moves closer to the question (e.g. question classification), producing a system-specific query, often tailored to the question type, with shallow analysis of the sources • QA moves closer to the answer (e.g. passage retrieval), returning an “Answer” rather than whole documents
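The shift from ranked documents to short answer-bearing passages can be made concrete with a toy example. The sketch below scores fixed-width character windows of a document by their term overlap with the question, a crude stand-in for the 50-byte passage retrieval used in the TREC evaluations; all function and variable names are illustrative, not part of any AQUAINT system.

```python
# Minimal passage-retrieval sketch: return the short spans of a document
# that share the most content words with the question.

def retrieve_passages(question: str, document: str,
                      window: int = 50, step: int = 25, top_k: int = 5):
    stop = {"the", "a", "an", "of", "in", "is", "was", "did",
            "what", "who", "when", "where", "how"}
    q_terms = {w.strip("?.,").lower() for w in question.split()} - stop
    scored = []
    for start in range(0, max(1, len(document) - window + 1), step):
        span = document[start:start + window]
        span_terms = {w.strip("?.,").lower() for w in span.split()}
        scored.append((len(q_terms & span_terms), start, span))
    scored.sort(key=lambda t: -t[0])           # highest term overlap first
    return [(span, score) for score, _, span in scored[:top_k]]

doc = ("Denzel Washington starred in the film Glory and won an "
       "Academy Award for Best Supporting Actor for the role.")
for span, score in retrieve_passages("What film did Denzel Washington star in?", doc):
    print(score, repr(span))
```

Real systems classify the question first and use far richer matching, but even this crude overlap moves the output from whole documents toward candidate answer regions.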
TREC QA Track Results • ARDA & DARPA co-sponsor the Question Answering Track in NIST’s Text REtrieval Conference (TREC) program (starting with TREC-8 in Nov 1999) • TREC-10 Results (Nov 2001): 500 factual questions; about 50 questions had no answer in the TREC-10 data sources; used “real” questions • Data source: approx. 3 GByte database of ~980K news stories • 36 US & international organizations participated; 92 separate runs evaluated • System output: top 5 regions (50 bytes each) in a single story believed to contain the answer to the given question • Top system: 70% of the “answers” found in their top 5 50-byte passages
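The TREC QA main task was scored by mean reciprocal rank: each question earns the reciprocal of the rank of the first correct response among the system's five answers, zero if none is correct, averaged over all questions. A minimal sketch of the metric (the `judge` callables stand in for NIST's human assessors):

```python
# Mean reciprocal rank (MRR) over a set of questions, five responses each.

def mean_reciprocal_rank(runs) -> float:
    """runs: iterable of (ranked_responses, judge) pairs, where judge(r)
    reports whether response r contains a correct answer."""
    runs = list(runs)
    total = 0.0
    for responses, judge in runs:
        for rank, response in enumerate(responses[:5], start=1):
            if judge(response):
                total += 1.0 / rank          # only the first hit counts
                break
    return total / len(runs)

# Toy usage: answered at rank 1, answered at rank 3, missed entirely.
runs = [
    (["Paris", "Lyon"],         lambda r: "Paris" in r),
    (["red", "blue", "green"],  lambda r: r == "green"),
    (["no", "none", "n/a"],     lambda r: False),
]
print(mean_reciprocal_rank(runs))  # (1 + 1/3 + 0) / 3 ≈ 0.444
```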
Pilot Evaluations: TREC-10 QA Track • The “List Task” • Sample questions: “Name 4 US cities that have a ‘Shubert’ Theater”; “Name 30 individuals who served as a cabinet officer under Ronald Reagan” • Evaluation metric: number of distinct instances found, divided by the target number of instances, averaged over 25 questions • Top system among 18 runs achieved 76% accuracy • The “Context Task” • Sample series of questions: “How many species of spiders are there?”; “How many are poisonous to humans?”; “What percentage of spider bites in the US are fatal?” • Evaluation metric: same as the main task (10 series of questions; 42 total questions) • Top system: found answers for 34 of the 42 total questions (81%)
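The list-task metric above (distinct correct instances found, divided by the target number, averaged over the question set) reduces to a few lines; the names below are illustrative only:

```python
# List-task accuracy: distinct correct instances / target count, averaged.

def list_task_accuracy(questions) -> float:
    """questions: list of (system_answers, gold_answers, target_n) triples."""
    scores = []
    for answers, gold, target_n in questions:
        hits = {a.lower() for a in answers} & {g.lower() for g in gold}
        scores.append(len(hits) / target_n)          # duplicates count once
    return sum(scores) / len(scores)

# Toy example: asked for 4 "Shubert Theater" cities, the system returned
# four strings but only three distinct correct ones.
print(list_task_accuracy([
    (["New Haven", "Boston", "Chicago", "Boston"],
     ["New Haven", "Boston", "Chicago", "Philadelphia"], 4),
]))  # 0.75
```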
“Ask Jeeves” Approach • Start with your question • Identify key words & classify the type of question • Respond with rephrased “questions” for which “Ask Jeeves” knows the answer • Provide additional web sites as a fallback position (à la a more traditional web search engine)
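The "classify the type of question" step can be approximated with surface patterns. The sketch below is a generic illustration of question-type classification, not Ask Jeeves' actual method:

```python
# Crude pattern-based question classifier: map a question's leading words
# to an expected answer type.

import re

QUESTION_PATTERNS = [
    (r"^(who|whose|whom)\b",    "PERSON"),
    (r"^where\b",               "LOCATION"),
    (r"^when\b",                "DATE"),
    (r"^how (many|much)\b",     "QUANTITY"),
    (r"^why\b",                 "REASON"),
    (r"^(what|which|name)\b",   "ENTITY"),
]

def classify_question(question: str) -> str:
    q = question.strip().lower()
    for pattern, answer_type in QUESTION_PATTERNS:
        if re.search(pattern, q):
            return answer_type
    return "OTHER"

print(classify_question("Who is this advisor?"))                         # PERSON
print(classify_question("How many species of spiders are there?"))       # QUANTITY
print(classify_question("Name 4 US cities that have a Shubert Theater")) # ENTITY
```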
Tailored Question Answering Approaches Multiple commercial/research groups are currently pursuing the application of question answering methods to: • FAQs (Frequently Asked Questions) • Help desks / customer service phone centers • Accessing complex sets of technical maintenance manuals • Integrating QA into knowledge management and portals • A wide variety of other e-business applications
Structured Knowledge-Base Approach • Create comprehensive knowledge base(s) or other structured database(s) • At the 10K-axiom level: capable of answering factual questions within the domain • At the 100K-axiom level: answer cause & effect / capability questions • At the 1000K-axiom level: answer novel questions; identify alternatives • Deepest QA, but limited to the given subject domain
AQUAINT: Advanced QUestion & Answering for INTelligence Overarching Context / Operational Requirement — Intelligence Analysts In a foreign news broadcast a team of analysts observe a previously unknown individual conferring with the Foreign Minister. They suspect that he/she is really a new senior advisor. Who is this advisor? What do we know about him/her? What influence does he/she have on the FM? What are his/her views? Does this signal that other policy changes are coming? And still more questions ???
Interpreting a Complex QA Scenario within a Larger Context • Question types span the full range: factoid questions, why questions, interpretive questions, judgement questions, predictive questions, other questions • Question understanding and interpretation produces system-specific queries, fully tailored to a series of questions • Multi-media data sources: voice, text, structured, other • Extend traditional information retrieval: ranked lists of “relevant” data objects • Determine the answer: deeper, automated understanding; extract & analyze results
AQUAINT: Advanced QUestion & Answering for INTelligence • Multiple heterogeneous data sources • Advanced QA: interpret results and formulate the answer • Provide answers in the form analysts want, within the overarching context / operational requirement
Outline • Introducing ARDA • Advanced Question Answering • There is Room for Multiple Approaches • The AQUAINT Program • Challenges from an AQUAINT Perspective • Some Final Thoughts . . . • Questions and Comments
AQUAINT: ARDA’s Plan of Attack • ARDA’s newest major Info-X R&D program • Envisioned as a high-risk, long-term R&D program: • Phase I: Fall 2001 - Fall 2003 • Phase II: Fall 2003 - Fall 2005 • Phase III: Fall/Winter 2005 - Fall/Winter 2007 • Focus on the final objective from the start • Incrementally add media, data sources, & complexity of questions & answers during each phase • Each of AQUAINT’s 3 phases will: • Use zero-based, open BAA-styled solicitations • Focus on key research objectives • Be closely linked to parallel system integration/testbed efforts & data collection/preparation and evaluation efforts
AQUAINT: R&D Focused on Three Functional Components • Question Understanding and Interpretation: accepts a natural statement of the question (including multimedia examples), draws on question & requirement context and analyst background, and supports clarification dialogue; also serves assessment, advisor, and collaboration roles • Information Retrieval: translates queries into source-specific retrieval languages for multiple sources, multiple media, multi-lingual data, and multiple agencies where possible; sources include knowledge bases, technical databases, and partially annotated & structured data, with automatic metadata creation; multiple ranked lists are merged into a single ranked list of relevant “documents,” with query refinement based on analyst feedback • Determine the Answer / Answer Formulation: relevant information is extracted and combined across “documents”; cross-“document” summaries are created using a language/media-independent concept representation; inconsistencies are noted; proposed conclusions and inferences are generated; the answer is formulated in the form the analyst wants, with multimedia navigation tools for reviewing the results of analysis, iterative refinement based on analyst feedback, and accumulation of knowledge for supplemental use
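To make the division of labor concrete, here is a skeletal wiring of the three components into one loop. Every function below is a placeholder for an entire research component; none of the names reflect an actual AQUAINT interface:

```python
# Skeleton of the three functional components: question understanding,
# multi-source retrieval, and answer determination/formulation, with
# session context carried across questions.

from dataclasses import dataclass, field

@dataclass
class QAContext:
    """Question & answer context accumulated across the analyst's session."""
    analyst_background: str = ""
    history: list = field(default_factory=list)

def understand(question: str, ctx: QAContext) -> dict:
    # Placeholder: interpret the question in context and emit
    # source-specific queries (text, knowledge base, ...).
    return {"text": question, "kb": question}

def retrieve(queries: dict, sources: dict) -> list:
    # Placeholder: fan queries out to heterogeneous sources and merge
    # the per-source ranked lists into a single ranked list.
    hits = []
    for name, search in sources.items():
        hits.extend(search(queries["text"]))         # (score, item) pairs
    return sorted(hits, key=lambda h: -h[0])

def determine_answer(hits: list, ctx: QAContext) -> str:
    # Placeholder: extract, combine, and formulate the answer.
    return hits[0][1] if hits else "no answer found"

def answer(question: str, ctx: QAContext, sources: dict) -> str:
    result = determine_answer(retrieve(understand(question, ctx), sources), ctx)
    ctx.history.append((question, result))           # accumulation of knowledge
    return result

# Toy usage with one fake source:
ctx = QAContext(analyst_background="Fiji desk")
sources = {"news": lambda q: [(0.9, "Speight's supporters attacked an officer")]}
print(answer("Who attacked the police officer?", ctx, sources))
```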
AQUAINT: Cross-Cutting / Enabling Technologies R&D Areas Specifically solicited research areas include: 1) Advanced Reasoning for Question Answering 2) Sharable Knowledge Sources 3) Content Representation 4) Interactive Question Answering Sessions 5) Role of Context 6) Role of Knowledge 7) Deep, Human Language Processing and Understanding
AQUAINT: Separate, Coordinated Activities • AQUAINT Phase I solicitation covers the end-to-end flow: QUESTION → question understanding and interpretation → information retrieval process → analysis & synthesis process (determine the answer) → answer formulation → FINAL ANSWER • Cross-cutting / enabling technologies research issues • Separate coordinated activities: annotated and “ground-truthed” data; component-level / end-to-end testing & evaluation; component integration and system architecture issues
AQUAINT: User Testbed / System Integration • Pull together the best available system components emerging from AQUAINT Program research efforts • Couple AQUAINT components with existing GOTS and COTS software • Develop end-to-end AQUAINT prototype(s) aimed at specific operational QA environments • Government-led effort: • Directly linked into sponsoring agencies’ technology insertion organizations • Close working relationship with working analysts • Provide external system development support • MITRE/Bedford will lead external system integration / testbed efforts • Plan to also utilize additional external researchers as consultants / advisors
AQUAINT: Data & Evaluation Issues • Data • Start by using existing data collections • NIST’s TREC text corpora • Linguistic Data Consortium (LDC) human language corpora (e.g. TDT, Switchboard, Call Home, Call Friend corpora) • Existing knowledge bases and other structured databases • Future data collection & annotation and question/answer key development will be a major effort • Will likely use the combined efforts of NIST and LDC • Evaluation • Build upon the highly successful TREC QA Track evaluations -- NIST has the lead and is currently developing a phased evaluation plan tied to AQUAINT Program plans • Cooperate to the maximum extent possible with DARPA’s RKF (Rapid Knowledge Formation) Program evaluation efforts
ARDA’s AQUAINT Partners • Program Committee • Active External Stakeholders
Univ. of Colorado-Boulder Carnegie Mellon Univ. Univ. of Massachusetts Univ. of Albany IBM Univ. of California- Berkeley BBN (2) Columbia Univ. Stanford Univ. Rutgers Univ. Princeton Univ. SRI Univ. of Southern California / Info Science Institute Cycorp SAIC Univ. of Texas-Dallas Language Computer Corp. (2) AQUAINT Program Contractors
AQUAINT Phase I Projects (Fall 01 - Fall 03) Total End-to-End Systems (6)
AQUAINT Phase I Projects (Fall 01- Fall 03) Emphasis on One or more Advanced QA System Components (6)
AQUAINT Phase I Projects (Fall 01- Fall 03) Focused Effort -- Cross Cutting / Enabling Technologies (4)
Northeast Regional Research Center Hosted by MITRE, Bedford, MA; administered by CIA • Conduct 6-8 week workshops on multiple AQUAINT-related challenge problems during FY 2002 • Sep 2001: Planning workshop held at MITRE • Attended by government technical leaders, MITRE, and an invited set of industrial, FFRDC, and academic researchers in the field • Four potential challenge problems identified; formal proposals developed for each challenge problem • Two full workshops funded (Temporal Issues & Multiple Perspectives) • One mini-workshop planned to further explore a challenge problem (Re-Use of Accumulated Knowledge)
FY2002 NRRC Workshop Challenge Problems • Temporal Issues • Generate a sequence of events and activities along an evolving timeline, resolving multiple levels of time references across a series of documents/sources (a toy illustration follows below) • Leader: James Pustejovsky, Brandeis University • Multiple Perspectives • Develop approaches for handling situations where relevant information is obtained from multiple sources on the same topic but generated from different perspectives (e.g. cultural or political differences) • Leader: Jan Wiebe, University of Pittsburgh NRRC Web Site: http://nrrc.mitre.org
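As a toy illustration of the temporal challenge problem, the sketch below anchors relative time expressions ("today", "yesterday", weekday names) to a document's publication date so that events from different reports can be placed on one timeline. Real work in this area is far richer; this only shows the flavor, and all names are invented:

```python
# Resolve simple relative time expressions against a document date.

from datetime import date, timedelta

WEEKDAYS = ["monday", "tuesday", "wednesday", "thursday",
            "friday", "saturday", "sunday"]

def resolve_time_expression(expr: str, doc_date: date) -> date:
    e = expr.lower()
    if e == "today":
        return doc_date
    if e == "yesterday":
        return doc_date - timedelta(days=1)
    if e in WEEKDAYS:
        # Most recent such weekday on or before the document date.
        delta = (doc_date.weekday() - WEEKDAYS.index(e)) % 7
        return doc_date - timedelta(days=delta)
    raise ValueError(f"unhandled time expression: {expr}")

# "A CBS television report on Sunday said ..." in a story dated
# Monday, June 5 (here, arbitrarily, 2000) refers to Sunday, June 4:
print(resolve_time_expression("Sunday", date(2000, 6, 5)))  # 2000-06-04
```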
NRRC Planning Workshops • Re-Use of Accumulated Knowledge • Investigate strategies for structuring and maintaining previously generated knowledge for possible future use. E.g. previous knowledge might include questions and answers (original and amplified) as well as relevant and background information retrieved and processed. • Leaders: Marc Light, MITRE, and Abraham Ittycheriah, IBM
Supporting Roles • Evaluation • User Testbed & Northeast Regional Research Center (NRRC) • Data / Operational Scenarios • Other Support (TBD)
Outline • Introducing ARDA • Advanced Question Answering • There is Room for Multiple Approaches • The AQUAINT Program • Challenges from an AQUAINT Perspective • Some Final Thoughts . . . • Questions and Comments
Top 10 Challenges 1) Satisfy QA requirements of the “Professional” Information Analyst
Professional Information Analysts: Target Audience for AQUAINT -- Who Are They? • For ARDA and AQUAINT they are: • Intelligence Community and military analysts • But there are other potential target audiences of “professional information analysts”: • Investigative / “CNN-type” reporters • Financial industry analysts / investors • Historians / biographers • Lawyers / law clerks • Law enforcement detectives • And others
Professional Information Analysts: What Do They Have in Common? • They are far more than just casual users of information • They work in an information-rich environment where they have access to large quantities of heterogeneous data • They are almost always subject matter experts within their assigned task areas • They track and follow a given event, scenario, problem, or situation for an extended period of time • They frequently collaborate extensively with other analysts • They are focused on their assigned task or mission and will do whatever it takes to accomplish it • The end product of their analysis is often judged against the standards of: timeliness, accuracy, usability, completeness, relevance
Top 10 Challenges 1) Satisfy QA requirements of the “Professional” Information Analyst 2) Pursue QA Scenarios and not just isolated, factually based QA
Implications of QA Scenarios Question types — factoid questions? why questions? interpretive questions? judgement questions? predictive questions? other questions? — all within the overarching context / operational requirement of information analysts • Requires handling a full range of complexity & continuity of questions • Need to understand & track the analysts’ line of reasoning and flow of argument • The QA system requires significantly greater insight into the knowledge, desires, past experiences, likes and dislikes of the “questioner” • Places a much higher value on recognizing and capturing “background” information • Questioner/system dialogue is now more than just a means for clarification (a minimal sketch of context tracking across a question series follows)
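The context-task series shown earlier ("How many species of spiders are there?" → "How many are poisonous to humans?") only works if the system carries the topic forward across turns. A minimal, purely illustrative sketch of that tracking; nothing here reflects an actual AQUAINT component:

```python
# Carry topic words forward so elliptical follow-up questions stay grounded.

class QuestionSeries:
    STOP = {"how", "many", "much", "are", "is", "what", "of", "the",
            "there", "to", "in", "a", "an", "us", "percentage"}

    def __init__(self):
        self.focus: list[str] = []          # accumulated topic words

    def interpret(self, question: str) -> str:
        for word in (w.strip("?.,").lower() for w in question.split()):
            if word not in self.STOP and word not in self.focus:
                self.focus.append(word)
        # The retrieval query is the question plus the running focus.
        return f"{question} [context: {' '.join(self.focus)}]"

series = QuestionSeries()
print(series.interpret("How many species of spiders are there?"))
# ... [context: species spiders]
print(series.interpret("How many are poisonous to humans?"))
# ... [context: species spiders poisonous humans]
```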
AQUAINT: Intermediate Goals — Increasing Complexity Levels of Questions & Answers • Level 1: “Simple Factual QA’s” (Current) • Level 2: “Template & Multi-valued QA’s” (Near Term) • Level 3: “Cross Media & Cross Document QA’s” (Mid Term) • Level 4: “Context-Based QA Scenarios” (Long Term)
Top 10 Challenges 1) Satisfy QA requirements of the “Professional” Information Analyst 2) Pursue QA Scenarios and not just isolated, factually based QA 3) Support a collaborative, multiple analyst environment
Collaboration within QA • Standard Collaboration (from an analyst perspective): Who else is working all or a portion of my task? What do they know that I don’t, and vice versa? Can we share / work together? • Non-Standard Discovery (from a system perspective): Identify previous QA scenarios that have “similarity” to the current QA scenario; compare & contrast; use / build on / update previous results; uncover new data sources; borrow a successful “line of reasoning” or “argument flow”; alert the analyst to different interpretations or to overlooked / undervalued data (a rough sketch of scenario similarity follows)
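At its simplest, "identify previous QA scenarios that have similarity to the current QA scenario" means comparing scenarios as term vectors. A rough sketch using cosine similarity over bag-of-words profiles; nothing here reflects an actual AQUAINT component, and the scenario texts are invented from examples elsewhere in this briefing:

```python
# Rank stored QA scenarios by cosine similarity to the current one.

import math
from collections import Counter

def profile(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

past_scenarios = {
    "fiji-coup": profile("speight supporters attack police officer parliament suva"),
    "pan-am":    profile("defector behbahani iran libya pan am bombing cia"),
}
current = profile("police officer beaten at fiji parliament by coup supporters")
for name, vec in sorted(past_scenarios.items(),
                        key=lambda kv: -cosine(current, kv[1])):
    print(name, round(cosine(current, vec), 3))
```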
Top 10 Challenges 1) Satisfy QA requirements of the “Professional” Information Analyst 2) Pursue QA Scenarios and not just isolated, factually based QA 3) Support a collaborative, multiple analyst environment 4) Sometimes SMALL things really matter and other times BIG things don’t
“Small & Big” - Can We Tell the Difference? • Sometimes SMALL differences can produce significantly different results/interpretations: • Stop words: “Books {by; for; about} kids” • Attachments: “The man saw the woman in the park with the telescope.” • Co-reference: “John {persuaded; promised} Bill to go. He just left.”; “Mary took the pill from the bottle. She swallowed it.” • Other times BIG differences can produce the same/similar results: • “Name the films in which Denzel Washington starred.” • “Denzel Washington played a leading role in which movies?” • “In what Hollywood productions did Denzel Washington receive top billing?”
Top 10 Challenges 1) Satisfy QA requirements of the “Professional” Information Analyst 2) Pursue QA Scenarios and not just isolated, factually based QA 3) Support a collaborative, multiple analyst environment 4) Sometimes SMALL things really matter and other times BIG things don’t 5) Advanced QA must attack the “Data Chasm”
Attacking the Data Chasm • Questions (today → future): Level I: single, factual, isolated questions → Level II: multi-valued factual questions; cross media; cross document; simple judgement → Level III: full context-based question scenarios • The Data Chasm: increasing volumes (petabytes & up); MANY heterogeneous data sources of all types, sizes, and locations; synthesis across “documents”/media; contradictory data; multiple perspectives; missing data; reliability of data • Answers (today → future): 50/250-byte passage from a single text document → fixed templates or tabular lists → variable narrative summaries, multi-media presentations, simple interpreted results → fully intersected, automatically generated, variable structure/format, full-context responses
Some Challenges: Alternative Wording * • AP — Supporters of Fiji’s coup leader detained and beat a police officer today at the parliament building… • CNN — Meanwhile, dozens of Speight’s followers attacked a police officer, an indigenous Fijian, after he arrived at the parliament complex… • SMH — … journalists at the gate of the parliamentary complex in Suva witnessed a violent assault on a man believed to be a plainclothes police officer. • Answers may be stated with a wide variety of terminology * Reference: BBN Technology AQUAINT Briefing
Some Challenges: Synthesizing Info * • AP — Dozens of Speight’s supporters kicked and punched him even as he lay on the ground. • CNN — The officer was kicked and punched, even as he lay on the ground. • SMH — The man was dragged inside the compound and set upon by up to 30 men, who punched and kicked him for five minutes. • Q: Tell me about the Fiji police officer’s attackers. A: Up to 30 of Speight’s supporters attacked him for five minutes. • Need to synthesize information across sources * Reference: BBN Technology AQUAINT Briefing
Some Challenges: Evolving Info * • Reuters, June 5 — A CBS television report on Sunday said a senior intelligence service defector, now being debriefed by the CIA in Turkey, had said he had documents to prove Iran trained a group of Libyans to stage the bombing of the Pan Am flight. • Reuters, June 5 — Much would depend … on verifying the identity of the man who CBS said gave his name as Ahmad Behbahani. It also said he had been the czar of Iranian state-sponsored “terrorism” until four months ago. • Reuters, June 11 — But following debriefing sessions in Turkey, where the man is in protective custody, the CIA and FBI have concluded the 32-year-old defector is not Behbahani, the Post quoted a senior U.S. official as saying. • Is the defector Behbahani? • Synthesis of information across sources and time • Update as the understanding evolves * Reference: BBN Technology AQUAINT Briefing