
Project Halo



Presentation Transcript


  1. Project Halo • Chris Kirkland • Bret Wilson

  2. Background • Vulcan, Inc.’s Goal • Pilot Phase (2002) • 30-50% correct (SRI, Cycorp, Ontoprise) • Halo Phase II • Initiated 2004 • Intermediate Testing 2006 • Final Testing 2008-9

  3. Halo Phase II • AP Chemistry, Biology, and Physics • Questions (4 types): • Conceptual Questions • Mathematical Equations • Tables* • Diagrams**

  4. AURA • Automated User-Centered Reasoning and Acquisition System • Layout • Document Base • Knowledge Base • Inference Engine • User Interface

  5. Document Base • Biology: 44 pp (23%) • Cell structure, function, and division; DNA replication and protein synthesis • Chemistry: 67 pp (11%) • Stoichiometry, chemical equilibria, aqueous reactions, acids and bases • Physics: 78 pp (15%) • Kinematics and Newtonian dynamics

  6. Knowledge Machine (KM) • Prototypes • Semantic Nets • Unification Mapping (UMAP) • Component Library (CLP) • Independent Library • “Attach, Penetrate, Physical Object, Location…” (37)
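The prototype and semantic-net ideas on this slide can be illustrated with a minimal sketch: a store of (subject, relation, object) triples queried by pattern. The class and method names below are hypothetical illustrations, not KM's or the Component Library's actual API.

```python
# Hypothetical sketch of a semantic net as a set of
# (subject, relation, object) triples; not KM's real interface.

class SemanticNet:
    def __init__(self):
        self.triples = set()

    def add(self, subj, rel, obj):
        self.triples.add((subj, rel, obj))

    def query(self, subj=None, rel=None, obj=None):
        # Return all triples matching the pattern; None acts as a wildcard.
        return [t for t in self.triples
                if (subj is None or t[0] == subj)
                and (rel is None or t[1] == rel)
                and (obj is None or t[2] == obj)]

net = SemanticNet()
# A tiny "prototype" for a generic cell, stated as a cluster of triples.
net.add("Cell", "has-part", "Nucleus")
net.add("Cell", "has-part", "Membrane")
net.add("Nucleus", "contains", "DNA")

print(net.query(subj="Cell", rel="has-part"))
```

In a prototype-based scheme like KM's, such a triple cluster describes a generic individual (a cell) whose structure can be unified with a specific instance mentioned in a question.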

  7. Inference Engine • Pattern Matching • Equation Solver • Inference Tracker* • Explanation Generator* • Outputs “chain of reasoning” to user
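The inference tracker and explanation generator can be sketched together: each derived value is logged with the step that produced it, so the "chain of reasoning" can be shown to the user. The function below is an illustrative toy (a single kinematics equation), not AURA's actual solver.

```python
# Hypothetical sketch of an equation solver that records each inference
# step so an explanation can be replayed; not AURA's actual design.

def solve_kinematics(known):
    """Fill in v = v0 + a*t from known values, logging each derivation."""
    vals = dict(known)
    trace = []  # human-readable chain of reasoning
    if "v" not in vals and {"v0", "a", "t"} <= vals.keys():
        vals["v"] = vals["v0"] + vals["a"] * vals["t"]
        trace.append("Applied v = v0 + a*t, giving v = %.2f" % vals["v"])
    if "a" not in vals and {"v", "v0", "t"} <= vals.keys():
        vals["a"] = (vals["v"] - vals["v0"]) / vals["t"]
        trace.append("Applied a = (v - v0)/t, giving a = %.2f" % vals["a"])
    return vals, trace

vals, trace = solve_kinematics({"v0": 0.0, "a": 2.0, "t": 3.0})
print(vals["v"])   # 6.0
print(trace)       # one logged step explaining the derivation
```

The same trace list doubles as the explanation: the direct answer comes from `vals`, and the basic-English justification comes from replaying `trace`.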

  8. Computer Processable Language (CPL) • Simplified English syntax • Multiple choice questions split into list of true/false questions • Complex questions broken into simpler parts • Simple format: • Subject + Verb + Complements + Adjuncts • Not allowed: “probably”, “mostly”
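The multiple-choice rewrite described on this slide can be sketched directly: one stem plus its options becomes a list of independent true/false statements. The phrasing template below is illustrative, not CPL's real grammar.

```python
# Hypothetical sketch of splitting a multiple-choice question into a
# list of true/false questions, as CPL requires; template is illustrative.

def split_multiple_choice(stem, options):
    """Combine the stem with each option to form true/false statements."""
    return [f"{stem} {opt}. (true/false)" for opt in options]

questions = split_multiple_choice(
    "The gas produced by photosynthesis is",
    ["oxygen", "carbon dioxide", "nitrogen", "hydrogen"])
for q in questions:
    print(q)
```

Each resulting statement fits the Subject + Verb + Complements pattern, so the system can evaluate every option independently instead of parsing the multiple-choice form.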

  9. User Querying • User enters question in CPL • System notifies of any errors • Correct questions are shown as graphical feedback • Answer contains: • Simple direct answer • Explanation in basic English

  10. BBN Evaluation • Results • Biology • with nonexpert KF (knowledge formulation), nonexpert QF (question formulation) outperformed expert QF (why?) • Overall 47% success • Chemistry • no significant differences • Overall 18% success • Physics • Expert KF, QF outperformed nonexpert KF, QF • Overall 36% success

  11. Experts vs. non-experts • Experts • Expert in domain (physics, chemistry, biology) • Extensive training, previous experience with AURA • Collaboration with AURA team members • Non-experts • Graduate-level experience in domain (KF) • Undergraduate-level experience in domain (QF) • Limited training, no previous experience with AURA

  12. MUKE (India) • Multi-User Knowledge Entry • High performance (75% success rate on all novel questions in biology)

  13. Difficulties / Concerns • CPL format is limited and handles multiple choice inelegantly • KF not automated • No handling for diagrams in Document Base • Optimized to small subset of curriculum • Difficult to encode physics vector equations
