
Object-Oriented Bayesian Networks: An Overview


Presentation Transcript


1. Object-Oriented Bayesian Networks: An Overview
Presented by: Asma Sanam Larik
Course: Probabilistic Reasoning

2. Limitations of BN
• The standard BN representation makes it hard to construct, update, reuse, learn, and reason with complex models.

3. Scaling up
• Our goal is to scale BNs to more complex domains:
• large-scale diagnosis;
• monitoring complex processes: highway traffic, military situation assessment;
• controlling intelligent agents in complex environments: smart robots, intelligent buildings.

4. Problem: Knowledge Engineering
• Main reuse mechanism: cut & paste.
• How is the model updated?
• How do we construct large BNs?

5. Problem: BN Inference
• BN inference can be exponential.
• Inference complexity depends on subtle properties of the BN structure.
• => Will a large BN support efficient inference?

6. Approach 1: Network Fragments
• Proposed by Laskey.
• A network fragment is basically a set of related variables together with knowledge about the probabilistic relationships among those variables.
• Two types of fragment were identified: input and result fragments. Input fragments are composed together to form a result fragment.
• To join input fragments together, an influence combination rule is needed to compute the local probability distribution.
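The slide names an influence combination rule without giving one; the noisy-OR model is a commonly cited example of such a rule. The sketch below is only an illustration of that idea, with invented fragment and variable names and arbitrary numbers, not Laskey and Mahoney's actual formulation.

```python
# Illustrative sketch: two input fragments each contribute an influence on a
# shared result variable; a noisy-OR combination rule turns those separate
# influences into one local probability. Names and numbers are made up.

def noisy_or(influence_strengths, cause_states, leak=0.01):
    """P(result = true) when each active cause can independently trigger it."""
    p_result_false = 1.0 - leak
    for cause, is_active in cause_states.items():
        if is_active:
            p_result_false *= 1.0 - influence_strengths[cause]
    return 1.0 - p_result_false

# Each input fragment supplies one cause -> result influence strength.
fragment_influences = {"Burglary": 0.9, "Earthquake": 0.3}

# Result fragment: combine the imported influences for a given cause state.
print(noisy_or(fragment_influences, {"Burglary": True, "Earthquake": False}))
# ~0.901: the burglary influence dominates, plus a small leak probability
```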

7. Exploit structure! The Architecture of Complexity [Herbert Simon, 1962]
• Many complex systems have a nearly decomposable, hierarchic structure.
• Hierarchic systems are usually composed of only a few different kinds of subsystems.
• By appropriate "recoding", the redundancy that is present but unobvious in the structure of a complex system can often be made patent.

8. Our goal?
• Our goal is a more expressive representation language that:
• has rigorous probabilistic semantics;
• is model-based;
• supports hierarchical structure & redundancy;
• exploits structure for effective inference!

9. Object-Oriented Bayesian Networks
• Classes represent types of object.
– Attributes of a class are represented as OOBN nodes.
– Input nodes refer to instances of another class.
– Output nodes can be referred to by other classes.
– Encapsulated nodes are private:
» conditionally independent of other objects given the input and output nodes.
• Classes may have subclasses.
– A subclass inherits attributes from its superclass.
– A subclass may have additional attributes not in the superclass.
• Classes may be instantiated.
– Instances represent particular members of the class.
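As a concrete (and purely hypothetical) rendering of the vocabulary above, the sketch below bundles input, output, and encapsulated nodes into a class that can be subclassed and instantiated; it is not the API of Hugin or any other OOBN tool, and the Cow/MilkCow names are invented.

```python
# Minimal sketch of the class/instance vocabulary on this slide.
from dataclasses import dataclass, field

@dataclass
class OOBNClass:
    name: str
    input_nodes: set = field(default_factory=set)        # refer to nodes supplied by other instances
    output_nodes: set = field(default_factory=set)       # visible to enclosing objects
    encapsulated_nodes: set = field(default_factory=set) # private internals
    superclass: "OOBNClass | None" = None

    def all_attributes(self):
        """A subclass inherits attributes from its superclass and may add more."""
        inherited = self.superclass.all_attributes() if self.superclass else set()
        return inherited | self.input_nodes | self.output_nodes | self.encapsulated_nodes

@dataclass
class Instance:
    cls: OOBNClass
    name: str  # a particular member of the class, e.g. "cow_1"

cow = OOBNClass("Cow", input_nodes={"Food"}, output_nodes={"Milk"},
                encapsulated_nodes={"Metabolism"})
milk_cow = OOBNClass("MilkCow", input_nodes={"Food"},
                     output_nodes={"Milk", "Butterfat"}, superclass=cow)
herd = [Instance(milk_cow, f"cow_{i}") for i in range(3)]
print(milk_cow.all_attributes())  # inherited attributes plus the added "Butterfat"
```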

10. Example
Reference: F. V. Jensen and T. D. Nielsen, "Bayesian Networks and Decision Graphs", 2nd ed., Springer, 2007.

11. OOBN
• An OOBN models a domain with hierarchical structure & redundancy.
• An OOBN consists of a set of objects:
• simple objects: random variables;
• complex objects: objects with attributes, which are themselves enclosed objects.

12. Inter-Object Interaction
• Related objects can influence each other via imports and exports.
• X imports A from Y =>
• the value of X can depend on the value of A;
• objects related to X can import A from X.

13. Imports and Exports / Input and Output Variables
• The value of an object depends probabilistically on the values of its imports.
• A simple object is associated with a conditional probability table:
• a distribution over its values given values for its imports.
• The value of a complex object X is composed of the values of its attributes.
• Its probabilistic model is defined recursively from the models of its attributes.
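As a toy illustration of "a distribution over its values given values for its imports", the table below conditions an invented Milk variable on two invented imports; both the names and the numbers are arbitrary, not taken from the slides.

```python
# Hedged sketch: a simple object "Milk" whose conditional probability table
# is indexed by the values of its imports "Food" and "Health".
cpt_milk = {
    # (food, health)      : (P(Milk = "good"), P(Milk = "poor"))
    ("grass",  "healthy"): (0.90, 0.10),
    ("grass",  "sick"):    (0.40, 0.60),
    ("silage", "healthy"): (0.70, 0.30),
    ("silage", "sick"):    (0.20, 0.80),
}

def p_milk(value, food, health):
    good, poor = cpt_milk[(food, health)]
    return good if value == "good" else poor

print(p_milk("good", "grass", "healthy"))  # 0.9
```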

14. Semantics
• Theorem: The probabilistic model for an object X defines a conditional probability distribution
• P(value of X | imports into X from the enclosing object).
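One concrete way to read the theorem (an interpretation, assuming the usual BN factorization over X's attributes A1, ..., An): the enclosed model defines P(A1, ..., An | imports of X) = ∏i P(Ai | parents(Ai)), and marginalizing out the encapsulated attributes leaves a well-defined conditional distribution over X's output variables given only its imports. This is what lets an enclosing object treat X as if it were a single node with its own conditional probability table.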

15. Old MacDonald Case Study
Reference: O. Bangsø and P.-H. Wuillemin, "Top-down construction and repetitive structures representation in Bayesian networks", Proceedings of the 13th International Florida Artificial Intelligence Research Society Conference (FLAIRS-2000), pp. 282–286, AAAI Press, 2000.

16. Subclassing and Inheritance
• For a class C' to be a subclass of C, the following must hold:
• the set of input variables of C is a subset of the input variables of C';
• the set of output variables of C is a subset of the output variables of C'.
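The two subset conditions above amount to a one-line interface check. The sketch below reuses the hypothetical OOBNClass/Cow/MilkCow objects from the earlier sketch and is an illustration, not part of any OOBN specification.

```python
# Hedged sketch: C' can be a subclass of C only if it exposes at least
# C's input variables and at least C's output variables.
def is_valid_subclass(c_prime, c):
    return (c.input_nodes <= c_prime.input_nodes
            and c.output_nodes <= c_prime.output_nodes)

# With the earlier (invented) Cow / MilkCow classes:
#   is_valid_subclass(milk_cow, cow)  -> True   (interface preserved, output added)
#   is_valid_subclass(cow, milk_cow)  -> False  (Cow lacks the "Butterfat" output)
```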

17. Reference: F. V. Jensen and T. D. Nielsen, "Bayesian Networks and Decision Graphs", 2nd ed., Springer, 2007.

18. OOBN Inference
• The OOBN representation allows us to easily construct large, complex models.
• Can we do inference in these models?
• The constructed BN can be very large; will it still support efficient inference?

19. Approaches to Inference
• Convert the OOBN to a normal BN and use standard inference techniques.
• Convert the OOBN to an MSBN (multiply sectioned Bayesian network) and apply the MSBN inference approach.
• By exploiting the modularity we can obtain good results.
• Algorithms are still being developed in this area.
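The first bullet ("convert to a normal BN") can be pictured as a flattening pass that copies each instance's internal structure under a prefixed name and wires its input nodes to whatever they are bound to. The sketch below is only an illustration of that idea with invented names; it tracks structure only, and the resulting node and edge lists would still need CPTs before being handed to a standard inference engine.

```python
# Hedged sketch of "convert to a normal BN": unfold every instance of a class
# into plain BN nodes named <instance>.<attribute>, duplicating the class's
# internal edges per instance and adding an edge from each bound external node
# to the corresponding input node (standing in for the input's identification
# with the node it is bound to). All names are illustrative.

cow_class = {
    "attributes": ["Food", "Metabolism", "Milk"],
    "edges": [("Food", "Metabolism"), ("Metabolism", "Milk")],
    "inputs": ["Food"],
}

# Two instances whose "Food" input is bound to the same external node.
instances = {"cow_1": cow_class, "cow_2": cow_class}
bindings = {("cow_1", "Food"): "Farm.Feed", ("cow_2", "Food"): "Farm.Feed"}

def flatten(instances, bindings):
    nodes, edges = set(bindings.values()), []
    for inst, cls in instances.items():
        for attr in cls["attributes"]:
            nodes.add(f"{inst}.{attr}")
        for parent, child in cls["edges"]:
            edges.append((f"{inst}.{parent}", f"{inst}.{child}"))
        for inp in cls["inputs"]:
            edges.append((bindings[(inst, inp)], f"{inst}.{inp}"))
    return nodes, edges

nodes, edges = flatten(instances, bindings)
print(sorted(nodes))  # "Farm.Feed" plus the cow_1.* and cow_2.* copies
print(edges)          # per-instance structure repeated, sharing the Farm.Feed parent
```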

20. Conclusion
• In essence, where Bayesian networks contain two types of knowledge, relevance relationships and conditional probabilities, OOBNs contain a third type of knowledge: organizational structure.
• OOBNs can model static situations, but they cannot model situations where the set of instances is changing.

21. References
• D. Koller and A. Pfeffer. "Object-Oriented Bayesian Networks". Proceedings of the Thirteenth Annual Conference on Uncertainty in Artificial Intelligence, August 1-3, 1997, Brown University, Providence, Rhode Island, USA. Morgan Kaufmann Publishers Inc., San Francisco, 1997.
• K. B. Laskey and S. M. Mahoney. "Network Fragments: Representing Knowledge for Constructing Probabilistic Models". Proceedings of the Thirteenth Annual Conference on Uncertainty in Artificial Intelligence. Morgan Kaufmann Publishers Inc., San Francisco, 1997.
• O. Bangsø and P.-H. Wuillemin. "Top-down construction and repetitive structures representation in Bayesian networks". Proceedings of the 13th International Florida Artificial Intelligence Research Society Conference (FLAIRS-2000), pp. 282–286, AAAI Press, 2000.
• M. Neil, N. Fenton, and L. Nielsen (2000). "Building Large-Scale Bayesian Networks", The Knowledge Engineering Review 15(3): 257–284.
• J. Pearl (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Series in Representation and Reasoning, Morgan Kaufmann Publishers, San Mateo, CA.
• M. Julia Gallego. "Bayesian networks inference: Advanced algorithms for triangulation and partial abduction", Ph.D. dissertation, Departamento de Sistemas Informáticos, University of Castilla-La Mancha (UCLM), 2005.
• U. B. Kjaerulff and A. L. Madsen. "Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis", Springer, 2008, pp. 91-98.
• F. V. Jensen and T. D. Nielsen. "Bayesian Networks and Decision Graphs", 2nd ed., Springer, 2007, pp. 84-91.
• Hugin Tutorial, www.hugin.com/developer/tutorials/OOBN
• H. Simon. "The Architecture of Complexity", Proceedings of the American Philosophical Society, 1962.
