Introduction to probability theory and graphical models
Translational Neuroimaging Seminar on Bayesian Inference, Spring 2013
Jakob Heinzle, Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering (IBT), University and ETH Zürich
Literature and References
• Bishop (Chapters 1.2, 1.3, 8.1, 8.2)
• MacKay (Chapter 2)
• Barber (Chapters 1, 2, 3, 4)
Many images in this lecture are taken from the above references.
Bayesian Inference - Introduction to probability theory
Probability distribution
A probability P(X = x) is defined on a sample space (domain) and assigns to every possible event in the sample space the certainty that it occurs. Sample space: dom(X) = {0, 1}. Probabilities sum to one.
Bishop, Fig. 1.11
Probability theory: Basic rules
Sum rule*: P(X) = Σ_Y P(X, Y). P(X) is also called the marginal distribution.
Product rule: P(X, Y) = P(Y | X) P(X).
* According to Bishop
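The sum and product rules can be checked numerically on a small discrete joint distribution. A minimal sketch in Python; the joint table below is an invented example, not from the slides:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y), X in {0, 1} (rows), Y in {0, 1, 2} (cols)
P_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])
assert np.isclose(P_xy.sum(), 1.0)  # probabilities sum to one

# Sum rule: the marginal P(X) is obtained by summing the joint over Y
P_x = P_xy.sum(axis=1)

# Product rule: P(X, Y) = P(Y | X) P(X)
P_y_given_x = P_xy / P_x[:, None]
reconstructed = P_y_given_x * P_x[:, None]
assert np.allclose(reconstructed, P_xy)

print(P_x)  # marginal distribution of X
```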
Conditional and marginal probability
Conditional and marginal probability
Bishop, Fig. 1.11
Independent variables
X and Y are independent if P(X, Y) = P(X) P(Y).
Question for later: what does this mean for Bayes' theorem?
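Independence P(X, Y) = P(X) P(Y) can be verified directly on a joint table by comparing it against the outer product of its marginals. A small sketch; both tables are invented for illustration:

```python
import numpy as np

def is_independent(P_xy, tol=1e-12):
    """Check whether a discrete joint table factorizes as P(X) P(Y)."""
    P_x = P_xy.sum(axis=1, keepdims=True)   # sum rule over Y
    P_y = P_xy.sum(axis=0, keepdims=True)   # sum rule over X
    return np.allclose(P_xy, P_x * P_y, atol=tol)

# Outer product of marginals -> independent by construction
indep = np.outer([0.3, 0.7], [0.5, 0.5])
# A table where X and Y are correlated
dep = np.array([[0.4, 0.1],
                [0.1, 0.4]])

print(is_independent(indep), is_independent(dep))  # True False
```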
Probability theory: Bayes' theorem
Bayes' theorem is derived from the product rule: since P(X, Y) = P(Y | X) P(X) = P(X | Y) P(Y), it follows that
P(Y | X) = P(X | Y) P(Y) / P(X).
Rephrasing and naming of Bayes' rule
D: data, θ: parameters, H: hypothesis we put into the model.
P(θ | D, H) = P(D | θ, H) P(θ | H) / P(D | H), i.e. posterior = likelihood × prior / evidence.
MacKay
Example: Bishop Fig. 1.9
Box (B): blue (b) or red (r). Fruit (F): apple (a) or orange (o). p(B=r) = 0.4, p(B=b) = 0.6.
What is the probability of having a red box if one has drawn an orange?
Bishop, Fig. 1.9
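The question can be worked through with Bayes' theorem. The fruit counts come from Bishop Fig. 1.9 itself (red box: 2 apples, 6 oranges; blue box: 3 apples, 1 orange), not from the text above:

```python
# Priors over boxes (from the slide)
p_r, p_b = 0.4, 0.6
# Likelihood of drawing an orange from each box (counts from Bishop Fig. 1.9)
p_o_given_r = 6 / 8   # red box: 2 apples, 6 oranges
p_o_given_b = 1 / 4   # blue box: 3 apples, 1 orange

# Sum rule: total probability of drawing an orange
p_o = p_o_given_r * p_r + p_o_given_b * p_b
# Bayes' theorem: posterior probability of the red box given an orange
p_r_given_o = p_o_given_r * p_r / p_o

print(p_r_given_o)  # 2/3, i.e. about 0.667
```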
Probability density
A probability density p(x) satisfies p(x) ≥ 0 and integrates to one; P(x ∈ (a, b)) = ∫_a^b p(x) dx.
PDF and CDF
Bishop, Fig. 1.12
Cumulative distribution
Short example: how to use the cumulative distribution to transform a uniform distribution!
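The "short example" on the slide can be sketched in Python: if U is uniform on (0, 1) and F is an invertible CDF, then X = F⁻¹(U) is distributed with CDF F (inverse-transform sampling). An exponential distribution is used here as a hypothetical target; it is not specified on the slide:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)        # uniform samples on (0, 1)

# Target: exponential with rate lam, CDF F(x) = 1 - exp(-lam * x),
# so the inverse CDF is F^{-1}(u) = -log(1 - u) / lam
lam = 2.0
x = -np.log(1.0 - u) / lam

# The transformed samples should have mean 1 / lam = 0.5
print(round(x.mean(), 2))
```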
Marginal densities
p(x) = ∫ p(x, y) dy: integration instead of summing.
Two views on probability
Probability can …
• … describe the frequency of outcomes in random experiments → the classical (frequentist) interpretation.
• … describe the degree of belief about a particular event → the Bayesian viewpoint or subjective interpretation of probability.
MacKay, Chapter 2
Expectation of a function
E[f] = Σ_x p(x) f(x), or, for a probability density, E[f] = ∫ p(x) f(x) dx.
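Both forms of the expectation can be approximated by sampling: E[f] ≈ (1/N) Σ f(x_n), with the x_n drawn from p. A sketch with a standard normal and the illustrative choice f(x) = x²:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
x = rng.standard_normal(N)   # samples from p(x) = N(0, 1)

f = x ** 2                   # f(x) = x^2
estimate = f.mean()          # Monte Carlo estimate of E[f]

# For a standard normal, E[x^2] = Var(x) = 1
print(round(estimate, 1))
```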
Graphical models
• They provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate new models.
• Insights into the properties of the model, including conditional independence properties, can be obtained by inspection of the graph.
• Complex computations, required to perform inference and learning in sophisticated models, can be expressed in terms of graphical manipulations, in which underlying mathematical expressions are carried along implicitly.
Bishop, Chap. 8
Graphical models overview
Directed graph / Undirected graph
Names: nodes (vertices), edges (links), paths, cycles, loops, neighbours.
For a summary of definitions see Barber, Chapter 2
Graphical models overview
Barber, Introduction
Graphical models
Bishop, Fig. 8.1
Graphical models: parents and children
Node a is a parent of node b; node b is a child of node a.
Bishop, Fig. 8.1
Belief networks = Bayesian belief networks = Bayesian networks
In general: every probability distribution can be expressed as a directed acyclic graph (DAG).
Important: no directed cycles!
Bishop, Fig. 8.2
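A directed acyclic graph encodes a factorization p(x) = Π_k p(x_k | pa_k), where pa_k are the parents of node k. A minimal sketch for a hypothetical three-node DAG a → b, a → c, b → c, i.e. p(a, b, c) = p(a) p(b|a) p(c|a, b); all conditional probability tables are invented:

```python
from itertools import product

# Hypothetical conditional probability tables for binary variables a, b, c
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3},
               1: {0: 0.2, 1: 0.8}}                 # indexed as [a][b]
p_c_given_ab = {(0, 0): {0: 0.9, 1: 0.1}, (0, 1): {0: 0.5, 1: 0.5},
                (1, 0): {0: 0.4, 1: 0.6}, (1, 1): {0: 0.1, 1: 0.9}}  # [(a, b)][c]

def joint(a, b, c):
    # DAG factorization: p(a, b, c) = p(a) p(b|a) p(c|a, b)
    return p_a[a] * p_b_given_a[a][b] * p_c_given_ab[(a, b)][c]

total = sum(joint(a, b, c) for a, b, c in product([0, 1], repeat=3))
print(round(total, 10))  # a valid factorization sums to 1
```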
Conditional independence
A variable a is conditionally independent of b given c if p(a, b | c) = p(a | c) p(b | c).
In Bayesian networks conditional independence can be tested by following some simple rules.
Conditional independence – tail-to-tail path
Is a independent of b? Marginally: no! Conditioned on c: yes!
Bishop, Chapter 8.2
Conditional independence – head-to-tail path
Is a independent of b? Marginally: no! Conditioned on c: yes!
Bishop, Chapter 8.2
Conditional independence – head-to-head path
Is a independent of b? Marginally: yes! Conditioned on c: no!
Bishop, Chapter 8.2
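The head-to-head case ("explaining away") can be checked numerically: with a and b as independent causes of c, a and b are marginally independent but become dependent once c is observed. A sketch with binary variables and the invented mechanism c = OR(a, b):

```python
import numpy as np
from itertools import product

# a and b: independent fair coins; c = OR(a, b) is the head-to-head node
def joint(a, b, c):
    p_c_given_ab = 1.0 if c == (a or b) else 0.0
    return 0.5 * 0.5 * p_c_given_ab

# Marginally, a and b are independent: p(a, b) = 1/4 for every pair
p_ab = {(a, b): sum(joint(a, b, c) for c in (0, 1))
        for a, b in product((0, 1), repeat=2)}
assert all(np.isclose(p, 0.25) for p in p_ab.values())

# Conditioned on c = 1 they are not: p(a, b | c=1) != p(a | c=1) p(b | c=1)
p_c1 = sum(joint(a, b, 1) for a, b in product((0, 1), repeat=2))   # 3/4
p_ab_c1 = {(a, b): joint(a, b, 1) / p_c1 for a, b in product((0, 1), repeat=2)}
p_a1_c1 = p_ab_c1[1, 0] + p_ab_c1[1, 1]   # 2/3
p_b1_c1 = p_ab_c1[0, 1] + p_ab_c1[1, 1]   # 2/3
print(round(p_ab_c1[1, 1], 3), round(p_a1_c1 * p_b1_c1, 3))  # 0.333 0.444
```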
Conditional independence – notation
Bishop, Chapter 8.2
Conditional independence – three basic structures
Bishop, Chapter 8.2.2
More conventions in graphical notations
Regression model: short form vs. parameters made explicit.
Bishop, Chapter 8
More conventions in graphical notations
Complete model used for prediction, trained on data tn.
Bishop, Chapter 8
Summary – things to remember
• Probabilities and how to compute with them: product rule, Bayes' rule, sum rule
• Probability densities: PDF, CDF
• Conditional and marginal distributions
• Basic concepts of graphical models: directed vs. undirected, nodes and edges, parents and children
• Conditional independence in graphs and how to check it
Bishop, Chapter 8.2.2