
Introduction to Bayesian Framework: Philosophy, Neural Network, Priors, and More

Explore the Bayesian philosophy, Bayesian neural networks, priors, and Bayesian prediction in a hierarchical model. Learn about Bayesian framework applications to LS RBF kernel SVM multiuser detection (MUD) with a three-level inference framework, with results and discussion.


Presentation Transcript


  1. Bayesian Framework EE 645 ZHAO XIN

  2. A Brief Introduction to Bayesian Framework • The Bayesian Philosophy • Bayesian Neural Network • Some Discussion on Priors

  3. Bayes’ Rule • Likelihood • Prior Distribution • Normalizing Constant
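The three ingredients on this slide combine as posterior ∝ likelihood × prior, with the normalizing constant (evidence) making the result a distribution. A minimal numeric sketch, using a hypothetical coin-bias example that is not from the slides:

```python
import numpy as np

# Bayes' rule on a discrete grid of hypotheses:
# posterior = likelihood * prior / evidence (the normalizing constant).
theta = np.linspace(0.01, 0.99, 99)        # candidate coin biases
prior = np.ones_like(theta) / len(theta)   # flat prior
heads, flips = 7, 10                       # observed data
likelihood = theta**heads * (1 - theta)**(flips - heads)
evidence = np.sum(likelihood * prior)      # normalizing constant
posterior = likelihood * prior / evidence
print(theta[np.argmax(posterior)])         # MAP estimate, near heads/flips
```

With a flat prior the MAP estimate coincides with the maximum-likelihood estimate; an informative prior would pull it away from the raw frequency.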

  4. Bayesian Prediction
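Bayesian prediction averages the model's prediction over the entire posterior rather than plugging in a single point estimate. A self-contained sketch on the same kind of hypothetical coin example (an assumed illustration, not from the slides):

```python
import numpy as np

# Posterior predictive: p(next = head | data) = integral of
# p(head | theta) * p(theta | data) d(theta), done here on a grid.
theta = np.linspace(0.01, 0.99, 99)
heads, flips = 7, 10
posterior = theta**heads * (1 - theta)**(flips - heads)
posterior /= posterior.sum()            # flat prior absorbed by normalization
p_next_head = np.sum(theta * posterior) # posterior mean of theta
print(round(p_next_head, 3))            # close to (7+1)/(10+2), not 7/10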

  5. Hierarchical Model

  6. An Example Bayesian Network

  7. Some Discussion on Priors • Priors Converging to a Gaussian Process • If the Number of Hidden Units Is Infinite • Priors Leading to Smooth and Brownian Functions • Fractional Brownian Priors • Priors Converging to Non-Gaussian Stable Processes
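Neal's convergence result can be demonstrated empirically: with output weights scaled by 1/√H, the prior over a one-hidden-layer network's output at a fixed input becomes Gaussian as the number of hidden units H grows. A hedged sketch (the architecture and scaling choices here are assumptions for illustration):

```python
import numpy as np

# Sample a wide network's output at one input under a Gaussian weight prior;
# excess kurtosis shrinking toward 0 indicates convergence to a Gaussian.
rng = np.random.default_rng(0)
x = 0.5  # a fixed input point

def sample_output(H, n_samples=5000):
    w_in = rng.normal(size=(n_samples, H))   # input-to-hidden weights ~ N(0,1)
    b_in = rng.normal(size=(n_samples, H))   # hidden biases ~ N(0,1)
    hidden = np.tanh(w_in * x + b_in)
    w_out = rng.normal(size=(n_samples, H)) / np.sqrt(H)  # 1/sqrt(H) scaling
    return np.sum(hidden * w_out, axis=1)

def excess_kurtosis(s):
    s = s - s.mean()
    return np.mean(s**4) / np.mean(s**2)**2 - 3  # 0 for a Gaussian

kurts = {}
for H in (1, 10, 1000):
    kurts[H] = excess_kurtosis(sample_output(H))
    print(H, round(kurts[H], 2))  # heavy-tailed at H=1, near-Gaussian at H=1000
```

The variance stays finite for every H because of the 1/√H scaling; what changes with H is the shape of the distribution, which is what the kurtosis tracks.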

  8. Bayesian Framework for LS RBF Kernel SVM MUD • Basic Problem and Solution • Probabilistic Interpretation of the LS SVM • First Level Inference • Second Level Inference • Third Level Inference • Basic MUD Model • Results and Discussion • Summary

  9. Basic Problem for LS SVM

  10. Basic Solution for LS SVM
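The solution itself is not reproduced in this transcript. As a hedged sketch of the standard LS SVM classifier solution (following Suykens' formulation, with an RBF kernel; γ and σ are the hyper-parameters that the later inference levels tune), the KKT conditions reduce to a single linear system in the bias b and the dual weights α:

```python
import numpy as np

def rbf(X1, X2, sigma):
    # Gaussian (RBF) kernel matrix between two point sets
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf(X, X, sigma)
    # KKT system: [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, dual weights alpha

def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
    # decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )
    return np.sign(rbf(Xnew, X, sigma) @ (alpha * y) + b)

# Toy usage on two well-separated Gaussian blobs (assumed data)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
b, alpha = lssvm_fit(X, y)
acc = np.mean(lssvm_predict(X, y, alpha, b, X) == y)
print(acc)  # training accuracy
```

Unlike the standard SVM's quadratic program, the LS SVM's equality constraints make training a single linear solve, which is what makes the Bayesian evidence computations of the later levels tractable.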

  11. The Formula for SVM

  12. First Level Inference

  13. Some Assumptions at this Level • Separable Gaussian prior for the conditional P(w,b) • Independent data points • Gaussian distributed errors • Variance of b goes to infinity
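Under these assumptions, the first level of inference finds the MAP estimate of (w, b). As a sketch following Van Gestel and Suykens' formulation (the symbols μ and ζ for the prior and noise precisions are an assumption about this deck's notation), maximizing the posterior is equivalent to minimizing a regularized least-squares cost:

```latex
\max_{w,b}\; p(w,b \mid D, \mu, \zeta, \mathcal{H})
\;\Longleftrightarrow\;
\min_{w,b}\; J_1(w,b) = \frac{\mu}{2}\, w^{\top} w
  + \frac{\zeta}{2} \sum_{i=1}^{N} e_i^2,
\qquad e_i = 1 - y_i\,\bigl( w^{\top} \varphi(x_i) + b \bigr)
```

The flat prior on b (its variance going to infinity) is why b contributes no regularization term of its own.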

  14. Result of the First Level

  15. Conditional Distribution of Weight w and Bias b

  16. Unbalanced Case at the 1st Level If the means of the +1 class and –1 class do not project perfectly onto +1 and –1, a bias term arises. We introduce 2 new random variables as follows.

  17. Final Solution for the First Level

  18. Second Level Inference

  19. Result of Second Level Inference

  20. Final Solution for the Second Level

  21. Third Level Inference

  22. Some Assumptions at this Level

  23. Final Solution for the Third Level

  24. Some Comments on this Level • For a Gaussian kernel machine, the variance of the Gaussian function can represent the model H • It is impossible to compute over all possible models • Luckily, in general (e.g., in Gaussian kernel SVM), the classifier’s performance is fairly smooth with respect to variations of the model parameter, so we can simply sample models in the region of interest.
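The sampling idea above can be sketched in a few lines: evaluate a coarse grid of kernel widths (the model parameter) and rely on performance varying smoothly across the grid. Kernel ridge classification stands in here for the full LS SVM evidence computation, and the data is an assumed toy problem:

```python
import numpy as np

# Grid-sample the Gaussian kernel width sigma instead of integrating
# over all possible models; smoothness in sigma justifies the coarse grid.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1.5, 1, (30, 1)), rng.normal(1.5, 1, (30, 1))])
y = np.array([-1.0] * 30 + [1.0] * 30)

def accuracy(sigma, lam=0.1):
    # kernel ridge classifier: solve (K + lam*I) alpha = y, predict sign(K alpha)
    K = np.exp(-(X - X.T)**2 / (2 * sigma**2))
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return np.mean(np.sign(K @ alpha) == y)

for sigma in (0.2, 0.5, 1.0, 2.0, 5.0):
    print(sigma, accuracy(sigma))  # varies smoothly with the kernel width
```

In practice one would score each sampled model with the level-three evidence rather than training accuracy; the grid-over-a-region-of-interest structure is the same.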

  25. A Synchronous CDMA Transmitter

  26. The LS SVM Receiver Diagram

  27. Results and Discussions

  28. First Inference

  29. Second Inference

  30. Third Inference (Plot 1)

  31. Third Inference (Plot 2)

  32. A Sample of Parameter Chosen

  33. Detector Performance

  34. Some Discussion of this Detector • The first inference improves the performance of the LS SVM detector, especially in the high-SNR region, by accounting for the bias term. • The LS SVM detector is very smooth with respect to variations of the hyper-parameters, which means an adaptive LS SVM will work reasonably well as long as the channel properties do not vary quickly. • The computations for the 2nd and 3rd inference are very complex, so exact calculation is not worthwhile here; approximation formulas can be used instead.

  35. Summary of Bayesian Networks • Pick a basic neural network. • Choose the priors carefully (physically sensible and tractable for theoretical derivation). • Find a reasonable hierarchical framework (a three-level inference framework is typical), apply Bayes’ rule there, and find helpful assumptions to simplify the problem.

  36. Some Comments on the Bayesian Framework • It helps us understand a neural network model physically. • It gives a principled way to optimize the parameters and, more importantly, the hyper-parameters, which can otherwise be nearly impossible to set. • It can even complement some existing methods for certain problems.

  37. References • T. Van Gestel, J. A. K. Suykens, et al., A Bayesian Framework for Least Squares Support Vector Machine Classifiers • N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines, 2000 • R. M. Neal, Bayesian Learning for Neural Networks, 1996 • S. Verdú, Multiuser Detection
