Presentation Transcript


  1. Bayesian Networks II: Dynamic Networks and Markov Chains. By Peter Woolf (pwoolf@umich.edu), University of Michigan. Michigan Chemical Process Dynamics and Controls Open Textbook, version 1.0, Creative Commons.

  2. Modeling workflow: • Existing plant measurements • Physics, chemistry, and chemical engineering knowledge & intuition • Bayesian network models to establish connections • Patterns of likely causes & influences • Efficient experimental design to test combinations of causes • ANOVA & probabilistic models to eliminate irrelevant or uninteresting relationships • Process optimization (e.g. controllers, architecture, unit optimization, sequencing, and utilization) • Dynamical process modeling

  3. Static Bayesian Network Example 1: Car failure diagnosis network. From http://www.norsys.com/netlib/car_diagnosis_2.htm

  4. Static Bayesian Network Example 2: ALARM network (A Logical Alarm Reduction Mechanism), a medical diagnostic system for patient monitoring with 8 diagnoses, 16 findings, and 13 intermediate values. From Beinlich, Ingo, H. J. Suermondt, R. M. Chavez, and G. F. Cooper (1989), "The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks," in Proc. of the Second European Conf. on Artificial Intelligence in Medicine (London, Aug.), 38, 247-256. Also Tech. Report KSL-88-84, Knowledge Systems Laboratory, Medical Computer Science, Stanford Univ., CA.

  5. Dynamic Bayesian Networks. [Figure: the variables ALT, RBC, procedure, survival, and weight drawn two ways: as a collapsed network, and as an unrolled network with separate time slices for yesterday (ti-1) and today (ti).] These are both examples of Dynamic Bayesian Networks (DBNs).

  6. [Figure: the same network unrolled over time slices ti-3 through ti+2, with a copy of ALT, RBC, procedure, survival, and weight in each slice.] The model is derived from past data and predicts future responses.
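
To make the unrolling concrete, here is a minimal Python sketch that applies one hypothetical two-slice conditional probability table repeatedly to push a survival probability forward in time. The variables "procedure" and "survival" come from the slides, but the CPT values and the treatment plan are invented for illustration and are not the fitted network.

```python
# Hypothetical two-slice CPT: P(survival = 1 at t | procedure, survival at t-1).
# Values are invented for illustration only.
P_SURVIVE = {
    (0, 0): 0.10, (0, 1): 0.70,   # no procedure in the previous slice
    (1, 0): 0.40, (1, 1): 0.90,   # procedure in the previous slice
}

def predict(p_survive_now, procedure_plan):
    """Propagate P(survival) forward one slice at a time (the 'unrolled' network)."""
    p = p_survive_now
    trajectory = [p]
    for procedure in procedure_plan:
        # Marginalize over survival in the previous slice.
        p = p * P_SURVIVE[(procedure, 1)] + (1 - p) * P_SURVIVE[(procedure, 0)]
        trajectory.append(p)
    return trajectory

# Predict three future slices (ti+1, ti+2, ti+3) under a fixed treatment plan.
print([round(p, 3) for p in predict(p_survive_now=0.8, procedure_plan=[1, 1, 0])])
```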

  7. Dynamic Bayesian Networks: predict to explore alternatives. [Figure: the two-slice network for today (ti) and tomorrow (ti+1), with ALT, RBC, procedure, survival, and weight in each slice.] DBNs provide a suitable environment for MPC!
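
In the same spirit, the sketch below uses that kind of forward prediction to compare candidate "procedure" plans and keep the one with the best predicted outcome, which is the basic loop behind model predictive control. Again, the CPT values, the horizon length, and the plans are invented for illustration only.

```python
from itertools import product

# Same invented CPT as the previous sketch: P(survival=1 at t | procedure, survival at t-1).
P_SURVIVE = {(0, 0): 0.10, (0, 1): 0.70, (1, 0): 0.40, (1, 1): 0.90}

def predicted_survival(p_now, plan):
    """P(survival) after applying each planned procedure decision in turn."""
    for procedure in plan:
        p_now = p_now * P_SURVIVE[(procedure, 1)] + (1 - p_now) * P_SURVIVE[(procedure, 0)]
    return p_now

# MPC-style search: enumerate all 3-step plans, score each by predicted survival,
# apply only the first action, then re-plan at the next time slice with new measurements.
plans = list(product((0, 1), repeat=3))
best = max(plans, key=lambda plan: predicted_survival(0.8, plan))
print("best plan:", best, "predicted survival:", round(predicted_survival(0.8, best), 3))
```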

  8. DBN: thermostat example. [Figure: collapsed and unrolled networks over time slices ti and ti+1 with the variables N: fluctuations, H: heater, T: temperature, G: temperature set point, S: switch, and V: value/cost.] From http://www.norsys.com/networklibrary.html#

  9. A Dynamic Bayesian Network can be recast as a Markov chain, which describes how the system transitions from one system state to the next. [Figure: simplified DBN over time slices ti and ti+1 with the variables N: fluctuations, H: heater, and T: temperature.] Assume each variable is binary (has state 1 or 0); any configuration can then be written as a triple such as {010}, meaning N=0, H=1, T=0, giving eight possible system states: {000}, {001}, {010}, {011}, {100}, {101}, {110}, {111}.
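
As a quick illustration of this encoding, the sketch below (plain Python, with the variable order N, H, T assumed) packs a binary configuration into one of the eight state labels and into a state index 0–7. It only shows the bookkeeping, not any part of the Norsys thermostat model.

```python
from itertools import product

VARIABLES = ("N", "H", "T")  # fluctuations, heater, temperature (assumed order)

def to_label(config):
    """{'N': 0, 'H': 1, 'T': 0} -> '{010}'"""
    return "{" + "".join(str(config[v]) for v in VARIABLES) + "}"

def to_index(config):
    """Map a binary configuration to a state index 0..7 (N is the high bit)."""
    return sum(config[v] << (len(VARIABLES) - 1 - i) for i, v in enumerate(VARIABLES))

# Enumerate all eight system states of the simplified thermostat DBN.
for values in product((0, 1), repeat=len(VARIABLES)):
    config = dict(zip(VARIABLES, values))
    print(to_index(config), to_label(config))
```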

  10. A Dynamic Bayesian Network can be recast as a Markov chain that describes how the system transitions from state to state. [Figure: state-transition diagram over the eight states {000} through {111}.] Each edge has a probability associated with it, and the outgoing probabilities from each state must sum to 1 (e.g. P1 + P2 = 1, P5 = 1, etc.); equivalently, all rows of the transition matrix must sum to 1.
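
A minimal NumPy sketch of that idea, with an entirely made-up 8×8 transition matrix (the real probabilities would come from the DBN's conditional probability tables): each row gives the distribution over next states, so each row must sum to 1, and sampling from the current row steps the chain forward.

```python
import numpy as np

rng = np.random.default_rng(1)
STATES = [f"{{{i:03b}}}" for i in range(8)]   # '{000}', '{001}', ..., '{111}'

# Hypothetical transition matrix: row i is P(next state | current state i).
P = rng.random((8, 8))
P /= P.sum(axis=1, keepdims=True)             # enforce "all rows sum to 1"
assert np.allclose(P.sum(axis=1), 1.0)

def simulate(n_steps, start=0):
    """Walk the Markov chain by repeatedly sampling from the current row."""
    state, path = start, [start]
    for _ in range(n_steps):
        state = rng.choice(8, p=P[state])
        path.append(state)
    return [STATES[s] for s in path]

print(simulate(n_steps=5, start=0))
```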

  11. Case Study 1: Synthetic Study. Situation: imagine that we are exploring the effect of a DNA-damaging drug and UV light on the expression of four genes: GFP, Gene A, Gene B, and Gene C.

  12. Case Study 1: Synthetic Study. [Figure: idealized time-course data for GFP, Gene A, Gene B, and Gene C.]

  13. Case Study 1: Synthetic Study. [Figure: noisy time-course data for GFP, Gene A, Gene B, and Gene C.]

  14. Case Study 1: Synthetic Study. Given idealized or noisy data, can we find any relationships between the drug, UV exposure, GFP, and the gene expression profiles? See miniTUBA.demodata.xls.
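
miniTUBA learns the DBN structure for you, but a rough sense of what "finding relationships" in time-series data means can be had from the toy sketch below: it scores how well each candidate parent at time t-1 predicts a target gene at time t using mutual information on binarized data. The column names and data are invented for illustration and this is not miniTUBA's actual scoring method (miniTUBA uses Bayesian model scoring).

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented binarized time series (1 = on/high, 0 = off/low); columns are
# illustrative only and do not come from miniTUBA.demodata.xls.
T = 200
drug = rng.integers(0, 2, T)
gene_a = np.r_[0, drug[:-1]]       # follows the drug with a one-step lag
gene_b = rng.integers(0, 2, T)     # unrelated noise

def lagged_mutual_information(parent, child):
    """Mutual information (in bits) between parent[t-1] and child[t]."""
    x, y = parent[:-1], child[1:]
    mi = 0.0
    for xv in (0, 1):
        for yv in (0, 1):
            p_xy = np.mean((x == xv) & (y == yv))
            p_x, p_y = np.mean(x == xv), np.mean(y == yv)
            if p_xy > 0:
                mi += p_xy * np.log2(p_xy / (p_x * p_y))
    return mi

for name, parent in [("drug", drug), ("gene_b", gene_b)]:
    print(f"MI({name}[t-1]; gene_a[t]) = {lagged_mutual_information(parent, gene_a):.3f}")
```

The lagged parent that actually drives the target scores close to one bit, while the unrelated series scores near zero, which is the kind of signal a structure-learning tool exploits.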

  15. Case Study 1: Synthetic Study. Google “miniTUBA” or go to http://ncibi.minituba.org

  16. Case Study 1: Synthetic Data. Observations: • Stronger relationships require fewer observations to identify • Noise in the measurements is okay • Moderate binning errors are forgivable • Uncontrolled experiments can be your friend in model learning

  17. Take Home Messages • Noisy, time-varying processes can be modeled as a Dynamic Bayesian Network (DBN) • A DBN can be recast as a Markov model of a stochastic system • DBNs can be learned directly from data using tools such as miniTUBA
