Cognitive Computer Vision. Kingsley Sage (khs20@sussex.ac.uk) and Hilary Buxton (hilaryb@sussex.ac.uk). Prepared under ECVision Specific Action 8-3, http://www.ecvision.org
Lecture 6 • Inference in Bayesian networks • Predictive inference • Diagnostic inference • Combined inference • Intercausal inference • General approaches for inference • Bayesian inference tools
So why is Bayesian inference relevant to Cognitive CV? • Provides a well-founded methodology for reasoning with uncertainty • These methods are the basis for our model of perception guided by expectation • We can develop well-founded methods of learning rather than relying solely on hand-coded models
Inference • Inference • Calculating a probability over a set of nodes given the values of other nodes • Four modes of inference: • PREDICTIVE (from root to leaf) • DIAGNOSTIC (from leaf to root) • COMBINED (predictive and diagnostic) • INTERCAUSAL
Inference • Inference is also called conditioning or belief updating • We will have values for some nodes (evidence nodes) and want to establish the values of others (query nodes) • Don’t confuse priors with evidence • Priors are statistical statements of how likely something is to “happen” (the frequentist view) • Evidence means that you know it has happened
A vision example • [Network diagram: A → O ← B, with O → N and O → C] • All nodes are discrete • A and B are feature detectors for some area in an image (perhaps A is colour based and B is shape based) • O is an object detector that bases its decision solely on A and B • N determines how likely another object is to be found nearby when the object detector finds its object • C represents an action context that is relevant when the object detector finds its object
A vision example • Concretely, A detects red areas, B detects the cup shape, O detects the cup of tea, the potential nearby object is a saucer, and the action context is someone picking up the tea to drink it
A vision example • The priors on A and B and the conditional probability tables (CPTs) for O, N and C are established during a training process • The CPT for O specifies the performance of the object detector, where T = detected and F = not detected • The context is “will be picked up” if c = T; the saucer object is nearby if n = T • [The slide’s prior and CPT tables are not reproduced here]
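To make the example concrete, here is a minimal plain-Python sketch of this network. The graph structure follows the slides; every numerical value below is an invented placeholder rather than the lecture's trained priors and CPTs, since those tables are not reproduced above.

from itertools import product

# Priors on the root nodes (the two feature detectors).
# All numbers here are illustrative placeholders only.
P_A = {True: 0.3, False: 0.7}              # p(a)
P_B = {True: 0.2, False: 0.8}              # p(b)

# p(o = T | a, b): performance of the object detector
P_O = {(True, True): 0.95, (True, False): 0.40,
       (False, True): 0.30, (False, False): 0.01}

P_N = {True: 0.80, False: 0.10}            # p(n = T | o): saucer nearby
P_C = {True: 0.70, False: 0.05}            # p(c = T | o): "will be picked up"

def bern(p_true, value):
    """Probability that a binary variable takes `value` when p(True) = p_true."""
    return p_true if value else 1.0 - p_true

def joint(a, b, o, n, c):
    """p(a, b, o, n, c) using the factorisation implied by the graph."""
    return (P_A[a] * P_B[b] *
            bern(P_O[(a, b)], o) *
            bern(P_N[o], n) *
            bern(P_C[o], c))

def query(var, evidence=None):
    """p(var = True | evidence) by brute-force enumeration of the joint."""
    evidence = evidence or {}
    names = ['a', 'b', 'o', 'n', 'c']
    numerator = denominator = 0.0
    for values in product([True, False], repeat=5):
        assignment = dict(zip(names, values))
        if any(assignment[k] != v for k, v in evidence.items()):
            continue
        p = joint(*values)
        denominator += p
        if assignment[var]:
            numerator += p
    return numerator / denominator

print(query('o'))                              # predictive, no evidence
print(query('o', {'a': True, 'b': True}))      # predictive, with evidence

Brute-force enumeration is fine for five binary nodes; the message-passing schemes mentioned later in the lecture are what make inference tractable for large networks.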
Predictive inference • Let’s see this applied to our example • We use marginalisation to evaluate our queries based on the evidence we have observed (if we have any)
Predictive inference • In the absence of any observed evidence, the query on O is evaluated by marginalising over the root nodes A and B, as sketched below
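As a sketch in the slides' own notation, the no-evidence query sums over the states of both root nodes:

p(o=T) = Σ_a Σ_b p(o=T | a, b) p(a) p(b)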
Predictive inference • Let’s say we now have evidence that a=T: we then marginalise over B only • And if a=T and b=T, no marginalisation is needed at all
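With evidence a=T the sum runs only over B; with both roots observed, no sum remains:

p(o=T | a=T) = Σ_b p(o=T | a=T, b) p(b)
p(o=T | a=T, b=T) is read directly from the CPT for O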
Diagnostic inference • Reasoning from leaf nodes upwards to root nodes • Use Bayes’ rule
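For example (a sketch, choosing the leaf N and the query node O), Bayes' rule gives:

p(o=T | n=T) = p(n=T | o=T) p(o=T) / p(n=T), where p(o=T) comes from the predictive marginalisation above and p(n=T) = Σ_o p(n=T | o) p(o)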
Diagnostic inference • [Diagram: an extra node X with a link into N] • If there had been a link from another node X into N, we would also have needed to marginalise (sum) our expression over that additional node
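A sketch under the assumption that the extra node X is a root with prior p(x) and no other connections: the likelihood term in the Bayes' rule expression above becomes a sum over X,

p(n=T | o) = Σ_x p(n=T | o, x) p(x)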
Combined inference • [Diagram: evidence at B and N, query at the intermediate node O] • Where you have evidence from, say, N and B and form a query on an intermediate node • E.g. combine diagnostic inference from the evidence at N with predictive inference from the evidence at B • Can compute, for example, p(o=T | n=T, b=T)
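A sketch of this query in the slides' notation, writing α for the constant that normalises over the two states of O:

p(o | n=T, b=T) = α p(n=T | o) Σ_a p(o | a, b=T) p(a)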
Intercausal inference: “explaining away” • [Diagram: A → O ← B, with evidence at O and B and a query at A] • A and B are independent • A is dependent on B given O • If, for example, p(a=T | o=T) > p(a=T | o=T, b=T), then observing b=T has made a=T less likely: the evidence b=T already accounts for o=T • We say that a=T has been “explained away” by b=T
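Using the query() helper sketched earlier (with its invented placeholder CPTs, so the numbers are purely illustrative), the effect can be checked directly:

# Explaining away, with the placeholder network defined above.
p1 = query('a', {'o': True})                    # p(a=T | o=T)
p2 = query('a', {'o': True, 'b': True})         # p(a=T | o=T, b=T)
print(p1, p2)   # with these placeholder CPTs, p1 > p2: b=T explains away a=T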
General approach to inference • These methods have their origins in Pearl’s work on junction trees (“Probabilistic Reasoning in Intelligent Systems”, Pearl 1988) • Efficient schemes exist for global computation of probabilities using local message passing (e.g. Jensen and Lauritzen 1990, and Lauritzen and Spiegelhalter 1988) • Beyond the scope of this course, but …
Bayesian inference tools • There are a number of packages out there to do the work for you • http://www.cs.ubc.ca/~murphyk/Software : Kevin Murphy’s BNT • http://www.csse.monash.edu.au/bai/book/appendix_b.pdf : an excellent summary of various packages and their capabilities
Summary • Bayesian inference allows the values of evidence nodes to be used systematically to update beliefs about query nodes • We can distinguish four modes of inference: predictive, diagnostic, combined and intercausal (explaining away) • Large Bayesian networks can be evaluated efficiently using the Bayesian inference toolkits available on the Internet
Next time … • Gaussian mixtures • A lot of excellent reference material on Bayesian reasoning can be found at: http://www.csse.monash.edu.au/bai and http://www.dcs.qmw.ac.uk/~norman/BBNs/idxlist.htm