Bayesian Framework EE 645 ZHAO XIN
A Brief Introduction to Bayesian Framework • The Bayesian Philosophy • Bayesian Neural Network • Some Discussion on Priors
Bayes’ Rule • Likelihood • Prior Distribution • Normalizing Constant
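As a minimal sketch of how these three pieces fit together (using w for the model parameters, D for the data, and H for the model; the notation is assumed here, not copied from the slides):

```latex
% Bayes' rule for the parameters w of a model H given data D
% (symbols w, D, H are assumed notation for illustration)
\[
  P(w \mid D, H)
  = \frac{\overbrace{P(D \mid w, H)}^{\text{likelihood}}
          \;\overbrace{P(w \mid H)}^{\text{prior}}}
         {\underbrace{P(D \mid H)}_{\text{normalizing constant (evidence)}}}
\]
```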
Some Discussion on Priors • Priors Converging to a Gaussian Process • If the Number of Hidden Units Is Infinite • Priors Leading to Smooth and Brownian Functions • Fractional Brownian Priors • Priors Converging to a Non-Gaussian Stable Process
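The first two bullets refer to Neal's result: a one-hidden-layer network with independent zero-mean Gaussian priors on its weights, and output weights scaled by 1/sqrt(H), defines a prior over functions that converges to a Gaussian process as the number of hidden units H grows. A minimal sketch of that experiment, with scalings and function names chosen for illustration rather than taken from the slides:

```python
import numpy as np

def sample_prior_function(x, n_hidden, rng):
    """Draw one function from the prior of a 1-hidden-layer tanh network
    with zero-mean Gaussian priors on all weights; the output weights are
    scaled by 1/sqrt(n_hidden) so the output variance stays finite."""
    # input-to-hidden weights and biases ~ N(0, 1)
    w_in = rng.normal(0.0, 1.0, size=(n_hidden, 1))
    b_in = rng.normal(0.0, 1.0, size=(n_hidden, 1))
    # hidden-to-output weights ~ N(0, 1/n_hidden), output bias ~ N(0, 1)
    w_out = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), size=(1, n_hidden))
    b_out = rng.normal(0.0, 1.0)
    hidden = np.tanh(w_in @ x[None, :] + b_in)   # shape (n_hidden, n_points)
    return (w_out @ hidden + b_out).ravel()      # shape (n_points,)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
# As n_hidden grows, the empirical covariance between function values at two
# inputs stabilizes toward that of the limiting Gaussian process.
for n_hidden in (1, 10, 1000):
    draws = np.array([sample_prior_function(x, n_hidden, rng) for _ in range(500)])
    print(n_hidden, np.cov(draws[:, 50], draws[:, 150])[0, 1])
```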
Bayesian Framework for LS RBF Kernel SVM MUD • Basic Problem and Solution • Probabilistic Interpretation of the LS SVM • First Level Inference • Second Level Inference • Third Level Inference • Basic MUD Model • Results and Discussion • Summary
Some Assumptions of this Level • Separable Gaussian Prior for the Conditional P(w,b) • Independent Data Points • Gaussian-Distributed Errors • Variance of b Goes to Infinity
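Under these assumptions, the first-level posterior in the Van Gestel–Suykens framework takes the familiar regularized least-squares form. A sketch of the standard formulation (the hyper-parameter symbols μ and ζ follow the usual notation of that paper and are assumed here):

```latex
% First-level inference: posterior over (w, b) given hyper-parameters mu, zeta
% and model H.  With a separable Gaussian prior on w (variance 1/mu), a flat
% prior on b (variance -> infinity), and i.i.d. Gaussian errors e_i (precision zeta):
\[
  p(w, b \mid D, \mu, \zeta, H) \;\propto\;
  \exp\!\big(-\mu E_W - \zeta E_D\big),
  \qquad
  E_W = \tfrac{1}{2}\, w^{\top} w,
  \quad
  E_D = \tfrac{1}{2} \sum_{i=1}^{N} e_i^{2},
\]
% so the maximum a posteriori estimate of (w, b) minimizes the LS-SVM cost
% mu * E_W + zeta * E_D.
```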
Unbalanced Case at the 1st Level If the means of the +1 and –1 classes do not project exactly onto +1 and –1, a bias term appears. We introduce two new random variables, as follows.
Some Comments for this Level • For a Gaussian kernel machine, the variance of the Gaussian function can represent the model H • It is impossible to carry out the calculation for every possible model • Fortunately, in general (e.g., for the Gaussian-kernel SVM) the classifier's performance is quite smooth with respect to variation of the model parameter, so we can simply sample the models in the region of interest.
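One way to read the last bullet: since performance varies smoothly with the kernel width, it is enough to evaluate a coarse grid of candidate widths and refine around the best one. A minimal sketch, assuming an RBF-kernel LS-SVM trained by solving its linear KKT system and scored by held-out accuracy as a cheap stand-in for the level-3 evidence (the function names and toy data are illustrative assumptions, not from the slides):

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svm(X, y, sigma, gamma=10.0):
    """Solve the LS-SVM classification KKT system for the bias b and dual weights alpha."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]

def predict(X_train, y_train, b, alpha, X_new, sigma):
    """LS-SVM decision: sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ (alpha * y_train) + b)

rng = np.random.default_rng(1)
# Toy two-class data: two Gaussian blobs labeled -1 and +1.
X = np.vstack([rng.normal(-1, 1, (60, 2)), rng.normal(+1, 1, (60, 2))])
y = np.concatenate([-np.ones(60), np.ones(60)])
X_tr, y_tr, X_va, y_va = X[::2], y[::2], X[1::2], y[1::2]

# Coarse grid over the kernel width: performance is smooth in sigma,
# so a few samples around the interesting region are enough.
for sigma in (0.1, 0.3, 1.0, 3.0, 10.0):
    b, alpha = train_ls_svm(X_tr, y_tr, sigma)
    acc = np.mean(predict(X_tr, y_tr, b, alpha, X_va, sigma) == y_va)
    print(f"sigma={sigma:5.1f}  validation accuracy={acc:.3f}")
```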
Some Discussion of this Detector • The first-level inference improves the performance of the LS-SVM detector, especially in the high-SNR region, by accounting for the bias term. • The LS-SVM detector is very smooth with respect to variation of the hyper-parameters, which means an adaptive LS-SVM should work reasonably well as long as the channel properties do not vary too fast. • The computations for the 2nd- and 3rd-level inferences are very complex, so the exact calculation is not worthwhile here; approximation formulas can be used instead.
Summary of Bayesian Network • Pick a basic neural network. • Properly choose the priors (physically sensible and easy for theoretical derivation). • Find a reasonable hierarchical framework (a three-level inference framework, sketched below, is very typical), apply Bayes' rule there, and find some beneficial assumptions to simplify the problem.
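A sketch of what that typical three-level hierarchy looks like when Bayes' rule is applied at each level (the notation with hyper-parameters μ, ζ and model H follows the evidence-framework convention and is assumed here):

```latex
% Level 1: infer the parameters (w, b) given hyper-parameters and model
\[ p(w, b \mid D, \mu, \zeta, H)
   = \frac{p(D \mid w, b, \zeta, H)\, p(w, b \mid \mu, H)}{p(D \mid \mu, \zeta, H)} \]
% Level 2: infer the hyper-parameters given the model
\[ p(\mu, \zeta \mid D, H)
   = \frac{p(D \mid \mu, \zeta, H)\, p(\mu, \zeta \mid H)}{p(D \mid H)} \]
% Level 3: compare models via the evidence
\[ p(H \mid D) \;\propto\; p(D \mid H)\, p(H) \]
% The normalizing constant of each level serves as the likelihood of the level above.
```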
Some Comments on the Bayesian Framework • It helps us understand a neural network model physically. • It theoretically shows how to optimize the parameters and, more importantly, the hyper-parameters, which can sometimes be impossible to set otherwise. • It can even complement some existing methods in certain problems.
References • T. Van Gestel, J. A. K. Suykens, et al., "A Bayesian Framework for Least Squares Support Vector Machine Classifiers," Neural Computation, 2002 • N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines, Cambridge University Press, 2000 • R. M. Neal, Bayesian Learning for Neural Networks, Springer, 1996 • S. Verdú, Multiuser Detection, Cambridge University Press, 1998