Knowledge Engineering for Bayesian Networks Ann Nicholson School of Computer Science and Software Engineering Monash University
Overview • Representing uncertainty • Introduction to Bayesian Networks • Syntax, semantics, examples • The knowledge engineering process • Case Study: Intelligent Tutoring • Summary of other BN research • Open research questions
Sources of Uncertainty • Ignorance • Inexact observations • Non-determinism • AI representations • Probability theory • Dempster-Shafer • Fuzzy logic
Probability theory for representing uncertainty • Assigns a numerical degree of belief between 0 and 1 to propositions • e.g. “it will rain today” is either true or false • Prior probability (unconditional): P(“it will rain today”) = 0.2 • Posterior probability (conditional): P(“it will rain today” | “rain is forecast”) = 0.8 • Bayes’ Rule: P(H|E) = P(E|H) P(H) / P(E)
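A minimal worked example of Bayes’ Rule in Python, using the slide’s rain numbers; the two likelihoods are illustrative assumptions, chosen here so the posterior matches the slide’s 0.8:

```python
# Bayes' Rule: P(H|E) = P(E|H) * P(H) / P(E)
# H = "it will rain today", E = "rain is forecast"

p_h = 0.2               # prior P(rain), from the slide
p_e_given_h = 0.80      # assumed likelihood P(forecast | rain)
p_e_given_not_h = 0.05  # assumed P(forecast | no rain)

# Total probability: P(E) = P(E|H) P(H) + P(E|~H) P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

p_h_given_e = p_e_given_h * p_h / p_e
print(f"P(rain | forecast) = {p_h_given_e:.2f}")  # 0.80, as on the slide
```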
Bayesian networks • Directed acyclic graphs • Nodes: random variables • R: “it is raining”, discrete values T/F • T: temperature, continuous or discrete variable • C: colour, discrete values {red, blue, green} • Arcs indicate dependencies (can have a causal interpretation)
Bayesian networks • Conditional Probability Distribution (CPD) • associated with each variable • probability of each state given parent states • Example network: Flu → Te → Th • “Jane has the flu”: P(Flu=T) = 0.05 • “Jane has a high temp” (models causal relationship): P(Te=High|Flu=T) = 0.4, P(Te=High|Flu=F) = 0.01 • “Thermometer temp reading” (models possible sensor error): P(Th=High|Te=High) = 0.95, P(Th=High|Te=Low) = 0.1
BN inference • Evidence: observation of a specific state • Task: compute the posterior probabilities for query node(s) given evidence • Types of reasoning (shown in the slide’s four network diagrams): diagnostic inference (from Th back to Flu), causal inference (from Flu forward to Th), intercausal inference (between competing causes such as Flu and TB), mixed inference
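A minimal sketch of diagnostic inference by enumeration on the Flu → Te → Th chain, using the CPDs from the earlier slide; the joint factorizes as P(Flu, Te, Th) = P(Flu) P(Te|Flu) P(Th|Te):

```python
from itertools import product

# CPDs from the slides (True = flu / high temperature / high reading)
p_flu = {True: 0.05, False: 0.95}
p_te_given_flu = {True: {True: 0.4, False: 0.6},     # P(Te | Flu=T)
                  False: {True: 0.01, False: 0.99}}  # P(Te | Flu=F)
p_th_given_te = {True: {True: 0.95, False: 0.05},    # P(Th | Te=High)
                 False: {True: 0.1, False: 0.9}}     # P(Th | Te=Low)

def joint(flu, te, th):
    """Chain factorization: P(Flu, Te, Th) = P(Flu) P(Te|Flu) P(Th|Te)."""
    return p_flu[flu] * p_te_given_flu[flu][te] * p_th_given_te[te][th]

# Diagnostic inference: P(Flu=T | Th=High), summing out Te
num = sum(joint(True, te, True) for te in (True, False))
den = sum(joint(flu, te, True) for flu, te in product((True, False), repeat=2))
print(f"P(Flu=T | Th=High) = {num / den:.3f}")  # ~0.176
```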
BN software • Commercial packages: Netica, Hugin, Analytica (all with demo versions) • Free software: Smile, Genie, JavaBayes, … http://HTTP.CS.Berkeley.EDU/~murphyk/Bayes/bnsoft.html • Example: running the Netica software
Decision networks • Extension to basic BN for decision making • Decision nodes • Utility nodes • Expected utility of an action A given evidence E: EU(A|E) = Σ_o P(o|A,E) U(o), summing over outcomes o • choose the action with the highest expected utility • Example
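A minimal expected-utility sketch; the actions, outcomes, probabilities and utilities below are illustrative assumptions, not from the slides:

```python
# EU(A|E) = sum over outcomes o of P(o|A,E) * U(o)
utilities = {"cured": 100.0, "unchanged": 0.0, "side_effects": -30.0}

p_outcome_given_action = {
    "treat":    {"cured": 0.7, "unchanged": 0.1, "side_effects": 0.2},
    "no_treat": {"cured": 0.1, "unchanged": 0.9, "side_effects": 0.0},
}

def expected_utility(action):
    """Weight each outcome's utility by its probability under the action."""
    return sum(p * utilities[o]
               for o, p in p_outcome_given_action[action].items())

for a in p_outcome_given_action:
    print(f"EU({a}) = {expected_utility(a):.1f}")
best = max(p_outcome_given_action, key=expected_utility)
print("choose:", best)  # the maximum-expected-utility action
```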
Elicitation from experts • Variables • important variables? values/states? • Structure • causal relationships? • dependencies/independencies? • Parameters (probabilities) • quantify relationships and interactions? • Preferences (utilities)
Expert Elicitation Process • An iterative cycle between the domain expert, the BN expert, and BN tools • These stages are done iteratively • Stops when further expert input is no longer cost effective • Process is difficult and time consuming • Current BN tools • inference engine • GUI • Next generation of BN tools?
Knowledge discovery • There is much interest in automated methods for learning BNs from data • parameters, structure (causal discovery) • Computationally complex problem, so current methods have practical limitations • e.g. limit number of states, require variable ordering constraints, do not specify all arc directions • Evaluation methods
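The parameter-learning half is the simpler part; a minimal sketch estimating P(Te=High | Flu) by maximum likelihood from hypothetical complete data (structure learning, e.g. CaMML, is much harder and not shown):

```python
from collections import Counter

# Hypothetical complete data: (has_flu, high_temp) observations
data = [(True, True), (True, False), (False, False), (False, False),
        (False, True), (True, True), (False, False), (False, False)]

counts = Counter(data)
for flu in (True, False):
    n_flu = sum(n for (f, _), n in counts.items() if f == flu)
    n_high = counts[(flu, True)]
    # Maximum-likelihood CPT entry: relative frequency in the data
    print(f"P(Te=High | Flu={flu}) = {n_high}/{n_flu} = {n_high / n_flu:.2f}")
```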
The knowledge engineering process 1. Building the BN • variables, structure, parameters, preferences • combination of expert elicitation and knowledge discovery 2. Validation/Evaluation • case-based, sensitivity analysis, accuracy testing 3. Field Testing • alpha/beta testing, acceptance testing 4. Industrial Use • collection of statistics 5. Refinement • Updating procedures, regression testing
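For step 2, sensitivity analysis can be as simple as sweeping one parameter and watching a posterior of interest; a minimal sketch on the Flu → Te → Th chain from the earlier slides (the sweep values are assumptions):

```python
# Sensitivity of P(Flu=T | Th=High) to the thermometer's false-positive
# rate P(Th=High | Te=Low), on the Flu -> Te -> Th chain.
p_flu = 0.05
p_te_high = {True: 0.4, False: 0.01}  # P(Te=High | Flu)
p_th_high_given_te_high = 0.95        # P(Th=High | Te=High)

def posterior_flu(fp_rate):
    """P(Flu=T | Th=High) as a function of P(Th=High | Te=Low)."""
    def p_th_high(flu):
        p_te = p_te_high[flu]
        return p_te * p_th_high_given_te_high + (1 - p_te) * fp_rate
    num = p_flu * p_th_high(True)
    return num / (num + (1 - p_flu) * p_th_high(False))

for fp in (0.01, 0.05, 0.1, 0.2):
    print(f"P(Th=H|Te=L) = {fp:.2f} -> P(Flu=T|Th=H) = {posterior_flu(fp):.3f}")
```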
Case Study: Intelligent tutoring • Tutoring domain: primary and secondary school students’ misconceptions about decimals • Based on the Decimal Comparison Test (DCT) • student asked to choose the larger of pairs of decimals • different types of pairs reveal different misconceptions • The ITS is built around computer games involving decimals • This research also looks at a combination of expert elicitation and automated methods (UAI2001)
Expert classification of Decimal Comparison Test (DCT) results • Misconception classes include “apparent expert”, “longer is larger”, “shorter is larger” • Each class is identified by its pattern of H/L scores across the item types • H = high (all correct or only one wrong) • L = low (all wrong or only one correct)
The ITS architecture • Inputs (all optional): decimal comparison test item answers, information about the student (e.g. age), classroom diagnostic test results • Adaptive Bayesian network (generic BN model of the student): diagnose misconception, predict outcomes, identify most useful information • Computer games: Flying photographer, Decimaliens, Hidden number, Number between, … • System controller module (sequencing tactics): select next item type, decide to present help, decide to change to a new game, identify when expertise gained • Outputs: feedback and help within the games, report on student to the teacher, classroom teaching activities
Expert Elicitation • Variables • two classification nodes: fine and coarse (mutually exclusive values) • item types: (i) H/M/L (ii) 0-N • Structure • arcs from classification to item type • item types independent given classification • Parameters • careless mistake (3 different values) • expert ignorance: ‘-’ entries in the table (filled with a uniform distribution)
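One way to realise the ‘expert ignorance’ convention: CPT rows the expert leaves blank get a uniform distribution over the item-type states. A sketch; the row representation is an assumption:

```python
def fill_ignorance(cpt_rows, n_states):
    """Replace rows the expert left unspecified (None) with a
    uniform distribution over the n_states item-type states."""
    uniform = [1.0 / n_states] * n_states
    return [uniform if row is None else row for row in cpt_rows]

# e.g. P(item type | classification), one row per classification state;
# the expert gave no opinion ('-') for the second class
rows = [[0.8, 0.15, 0.05], None, [0.1, 0.3, 0.6]]
print(fill_ignorance(rows, 3))
```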
Evaluation process • Case-based evaluation • experts checked individual cases • sometimes, if the prior was low, the ‘true’ classification did not have the highest posterior (but usually had the biggest change in ratio) • Adaptiveness evaluation • priors change after each set of evidence • Comparison evaluation • differences in classification between BN and expert rule • differences in predictions between different BNs
Comparison evaluation • Development of a measure: same classification, desirable and undesirable re-classification • Use of item type predictions • Investigation of the effect of item type granularity and the probability of careless mistake
Comparison: expert BN vs rule [Figure: proportions of same, desirable and undesirable re-classifications]
Results [Figures: same/desirable/undesirable re-classification rates, (i) varying the probability of careless mistake, (ii) varying the granularity of item type: 0-N vs H/M/L]
Investigation by automated methods • Classification (using the SNOB program, based on Minimum Message Length (MML)) • Parameters • Structure (using CaMML)
Another Case Study: Seabreeze prediction • 2000 Honours project, joint with the Bureau of Meteorology (with Russell Kennett and Kevin Korb, PAKDD’2001 paper, TR) • BN built from an existing simple expert rule • Several years of data available for Sydney seabreezes • CaMML (Wallace and Korb, 1999) and Tetrad-II (Spirtes et al. 1993) programs used to learn BNs from data • Comparative analysis showed automated methods gave improved predictions
Other BN-related projects • DBNs for discrete monitoring (PhD, 1992) • Approximate BN inference algorithms based on a mutual information measure for relevance (with Nathalie Jitnah, ICONIP97, ECSQARU97, PRICAI98, AI99) • Plan recognition: DBNs for predicting users’ actions and goals in an adventure game (with David Albrecht, Ingrid Zukerman, UM97, UMUAI1999, PRICAI2000) • Bayesian Poker (with Kevin Korb, UAI’99, honours students)
Other BN-related projects (cont.) • DBNs for ambulation monitoring and fall diagnosis (with biomedical engineering, PRICAI’96) • Autonomous aircraft monitoring and replanning (with Ph.D. student Tim Wilkin, PRICAI2000) • Ecological risk assessment (2003 honours project with the Water Studies Centre) • Writing a textbook! (with Kevin Korb) Bayesian Artificial Intelligence
Open Research Questions • Methodology for combining expert elicitation and automated methods • expert knowledge used to guide search • automated methods provide alternatives to be presented to experts • Evaluation measures and methods • may be domain dependent • Improved tools to support elicitation • e.g. visualisation of d-separation • Industry adoption of BN technology