
Hidden Markov Models

Explore the applications and computational problems of Hidden Markov Models in identifying coding regions in DNA sequences, separating "bad" pixels in images, and more. Learn about the Baum-Welch algorithm and how to determine transition and emission probabilities.



Presentation Transcript


  1. Hidden Markov Models Usman Roshan CS 675 Machine Learning

  2. HMM applications • Determining coding and non-coding regions in DNA sequences • Separating “bad” pixels from “good” ones in image files • Classifying DNA sequences

  3. Hidden Markov Models • Alphabet of symbols • Set of states that emit symbols from the alphabet • Set of probabilities: • State transition probabilities akl: the probability of moving from state k to state l • Emission probabilities ek(b): the probability that state k emits symbol b

  4. Loaded die problem

  5. Loaded die automaton: two states, fair (F) and loaded (L), with transition probabilities aFF, aFL, aLF, aLL and emission probabilities eF(i), eL(i) for die face i.

  6. Loaded die problem • Consider the following rolls: Observed: 21665261, Underlying die: FFLLFLLF • What is the probability that the underlying path generated the observed sequence?
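
A minimal sketch of this computation: the joint probability of the rolls and an assumed hidden path is the start probability times, at each position, the emission probability of the roll and the transition probability into the next state. The numeric parameter values (aFF = 0.95, the loaded die favoring 6, a uniform start) are illustrative assumptions; the slides do not specify them.

```python
# Joint probability P(observed rolls, hidden path) for the fair/loaded die HMM.
# All numeric parameter values below are illustrative assumptions.

start = {'F': 0.5, 'L': 0.5}                      # initial state probabilities
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05,      # aFF, aFL
         ('L', 'L'): 0.90, ('L', 'F'): 0.10}      # aLL, aLF
emit = {'F': {r: 1 / 6 for r in '123456'},        # eF(i): fair die is uniform
        'L': {**{r: 0.1 for r in '12345'}, '6': 0.5}}  # eL(i): loaded die favors 6

def joint_probability(rolls, path):
    """P(x, pi) = P(start) * product over i of e_state(x_i) * a(state_i-1, state_i)."""
    p = start[path[0]] * emit[path[0]][rolls[0]]
    for i in range(1, len(rolls)):
        p *= trans[(path[i - 1], path[i])] * emit[path[i]][rolls[i]]
    return p

print(joint_probability("21665261", "FFLLFLLF"))
```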

  7. HMM computational problems • Probabilities unknown but hidden sequence known • Probabilities known but hidden sequence unknown • Both probabilities and hidden sequence unknown

  8. Probabilities unknown but hidden sequence known • Akl: number of transitions from state k to l • Ek(b): number of times state k emits symbol b • Maximum likelihood estimates: akl = Akl / (sum of Akl' over all states l') and ek(b) = Ek(b) / (sum of Ek(b') over all symbols b')
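
With the hidden sequence known, the estimates follow directly from the counts: accumulate Akl and Ek(b) and normalize each row. A sketch using the labeled example from slide 6; in practice pseudocounts would be added so unseen transitions and symbols do not get probability zero.

```python
from collections import defaultdict

# Estimate transition and emission probabilities when the hidden path is known,
# by counting Akl and Ek(b) and normalizing each row (maximum likelihood).

rolls = "21665261"     # observed symbols (from slide 6)
path = "FFLLFLLF"      # known hidden states (from slide 6)

A = defaultdict(float)   # A[(k, l)]: number of transitions from state k to l
E = defaultdict(float)   # E[(k, b)]: number of times state k emits symbol b

for i, (state, symbol) in enumerate(zip(path, rolls)):
    E[(state, symbol)] += 1
    if i + 1 < len(path):
        A[(state, path[i + 1])] += 1

states = sorted(set(path))
symbols = "123456"

# akl = Akl / sum over l' of Akl'   and   ek(b) = Ek(b) / sum over b' of Ek(b')
# (no pseudocounts here, so symbols never seen in a state get probability zero)
trans = {(k, l): A[(k, l)] / sum(A[(k, m)] for m in states) for k in states for l in states}
emit = {(k, b): E[(k, b)] / sum(E[(k, c)] for c in symbols) for k in states for b in symbols}

print(trans)
print(emit)
```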

  9. Probabilities known but hidden sequence unknown • Problem: Given an HMM and a sequence of rolls, find the most probable underlying generating path. • Let x = x0 x1 ... xn-1 be the sequence of rolls. • Let VF(i) denote the probability of the most probable path of x0 ... xi that ends in state F. (Define VL(i) similarly.)

  10. Probabilities known but hidden sequence unknown • Initialize: VF(0) = P(start in F) eF(x0) and VL(0) = P(start in L) eL(x0) • Recurrence: for i = 1..n-1, VF(i) = eF(xi) max( VF(i-1) aFF, VL(i-1) aLF ) and VL(i) = eL(xi) max( VL(i-1) aLL, VF(i-1) aFL ) • The most probable path has probability max( VF(n-1), VL(n-1) ) and is recovered by tracing back the maximizing choices (the Viterbi algorithm)
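
A sketch of the Viterbi recurrence just described, applied to the fair/loaded die model. The parameter values are illustrative assumptions, and log probabilities are used to avoid underflow on long sequences.

```python
import math

# Viterbi: most probable hidden path for the fair/loaded die HMM.
# Parameter values are illustrative assumptions; log space avoids underflow.

states = ('F', 'L')
start = {'F': 0.5, 'L': 0.5}
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1 / 6 for r in '123456'},
        'L': {**{r: 0.1 for r in '12345'}, '6': 0.5}}

def viterbi(rolls):
    # V[i][k]: log probability of the most probable path for x0..xi ending in state k
    V = [{k: math.log(start[k]) + math.log(emit[k][rolls[0]]) for k in states}]
    back = [{}]
    for i in range(1, len(rolls)):
        V.append({})
        back.append({})
        for l in states:
            best = max(states, key=lambda k: V[i - 1][k] + math.log(trans[(k, l)]))
            V[i][l] = V[i - 1][best] + math.log(trans[(best, l)]) + math.log(emit[l][rolls[i]])
            back[i][l] = best
    # Trace back from the best final state
    last = max(states, key=lambda k: V[-1][k])
    path = [last]
    for i in range(len(rolls) - 1, 0, -1):
        path.append(back[i][path[-1]])
    return ''.join(reversed(path))

print(viterbi("21665261"))
```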

  11. Probabilities and hidden sequence unknown • Use the Expectation-Maximization algorithm (also known as the EM algorithm) • Very popular, with many applications • For HMMs it is also called the Baum-Welch algorithm • Outline: • Start with a random assignment to the transition and emission probabilities • Find the expected transition and emission counts • Estimate new transition and emission probabilities from the expected counts in the previous step • Go to step 2 if the probabilities have not converged

  12. HMM forward probabilities • Consider the total probability of all hidden sequences under a given HMM. • Let fL(i) be the sum of the probabilities of all hidden sequences up to position i that end in state L. • Then fL(i) = eL(xi) ( fL(i-1) aLL + fF(i-1) aFL ). • We calculate fF(i) in the same way. • We call these forward probabilities: f(i) = fL(i) + fF(i)
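
The same recurrence as a sketch in code, again with assumed parameter values; the sum of the forward probabilities at the last position gives P(x), the total probability of the observed rolls.

```python
# Forward probabilities for the fair/loaded die HMM: fk(i) is the total
# probability of all hidden sequences for x0..xi that end in state k.
# Parameter values are illustrative assumptions.

states = ('F', 'L')
start = {'F': 0.5, 'L': 0.5}
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1 / 6 for r in '123456'},
        'L': {**{r: 0.1 for r in '12345'}, '6': 0.5}}

def forward(rolls):
    f = [{k: start[k] * emit[k][rolls[0]] for k in states}]
    for i in range(1, len(rolls)):
        f.append({l: emit[l][rolls[i]] * sum(f[i - 1][k] * trans[(k, l)] for k in states)
                  for l in states})
    return f

f = forward("21665261")
print(sum(f[-1][k] for k in states))   # P(x): total probability of the observed rolls
```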

  13. HMM backward probabilities • Similarly we can calculate backward probabilities bL(i). • Let bL(i) be the sum of the probabilities of all hidden sequences from i to the end that start in state L. • Then bL(i) = aLL eL(xi+1) bL(i+1) + aLF eF(xi+1) bF(i+1), with bL(n-1) = bF(n-1) = 1. • We calculate bF(i) in the same way. • We call these backward probabilities: b(i) = bL(i) + bF(i)
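
A matching sketch of the backward pass, with the same illustrative parameter assumptions as above; computing P(x) from bk(0) gives the same value as the forward pass, which is a useful sanity check.

```python
# Backward probabilities: bk(i) is the total probability of the remaining
# observations x(i+1)..x(n-1) given that the state at position i is k.
# Parameter values are illustrative assumptions.

states = ('F', 'L')
start = {'F': 0.5, 'L': 0.5}
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1 / 6 for r in '123456'},
        'L': {**{r: 0.1 for r in '12345'}, '6': 0.5}}

def backward(rolls):
    n = len(rolls)
    b = [dict() for _ in range(n)]
    b[n - 1] = {k: 1.0 for k in states}
    for i in range(n - 2, -1, -1):
        b[i] = {k: sum(trans[(k, l)] * emit[l][rolls[i + 1]] * b[i + 1][l] for l in states)
                for k in states}
    return b

b = backward("21665261")
# Same P(x) as from the forward pass: sum over k of P(start in k) * ek(x0) * bk(0)
print(sum(start[k] * emit[k]["21665261"[0]] * b[0][k] for k in states))
```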

  14. Baum Welch • How do we calculate expected transition and emission probabilities? • Consider the fair-loaded die problem. What is the expected number of transitions from fair (F) to loaded (L)? • To answer this, we count the number of times F transitions to L in each possible hidden sequence, weight that count by the probability of the hidden sequence, and sum over all hidden sequences.

  15. Baum Welch • For example, suppose the observed sequence is 12. • What are the different hidden sequences? • What is the probability of each? • What is the total probability? • What is the probability of all hidden sequences where the first state is F? • Can we determine these answers automatically with forward and backward probabilities?
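
For a two-roll input these questions can be answered by brute force, since there are only four hidden sequences (FF, FL, LF, LL); the enumeration below can then be checked against the forward and backward recursions. Parameter values are illustrative assumptions.

```python
from itertools import product

# Brute-force enumeration of all hidden sequences for the observed input "12".
# Parameter values are illustrative assumptions.

states = ('F', 'L')
start = {'F': 0.5, 'L': 0.5}
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1 / 6 for r in '123456'},
        'L': {**{r: 0.1 for r in '12345'}, '6': 0.5}}

rolls = "12"

def joint(path):
    p = start[path[0]] * emit[path[0]][rolls[0]]
    for i in range(1, len(rolls)):
        p *= trans[(path[i - 1], path[i])] * emit[path[i]][rolls[i]]
    return p

paths = [''.join(p) for p in product(states, repeat=len(rolls))]   # FF, FL, LF, LL
for p in paths:
    print(p, joint(p))                                  # probability of each hidden sequence
print("P(x) =", sum(joint(p) for p in paths))           # total probability
print("P(x, first state F) =", sum(joint(p) for p in paths if p[0] == 'F'))
```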

  16. Baum Welch • General formula for the expected number of transitions from state k to l: Akl = (1/P(x)) sum over i of fk(i) akl el(xi+1) bl(i+1) • General formula for the expected number of emissions of b from state k: Ek(b) = (1/P(x)) sum over positions i with xi = b of fk(i) bk(i) • With multiple training sequences, these sums are taken over every sequence • Equations from Durbin et al., 1998

  17. Baum Welch • Initialize random values for all parameters • Calculate forward and backward probabilities • Calculate new model parameters from the expected counts • Did the new probabilities (parameters) change by more than 0.001? If yes, go to step 2. Otherwise stop.
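
Putting the pieces together, here is one possible sketch of the full Baum-Welch loop for the two-state die model: forward and backward passes, expected counts Akl and Ek(b) as in the equations on slide 16, re-estimation, and the 0.001 convergence check. The starting parameter values and the training sequence are assumptions chosen for illustration, and the initial state probabilities are held fixed for simplicity.

```python
# Baum-Welch (EM) for the two-state fair/loaded die HMM on a single training
# sequence. Initial parameter values and the sequence are illustrative assumptions.

states = ('F', 'L')
symbols = '123456'
rolls = "21665261666166"                    # assumed training sequence

start = {'F': 0.5, 'L': 0.5}                # held fixed for simplicity
trans = {('F', 'F'): 0.8, ('F', 'L'): 0.2, ('L', 'L'): 0.7, ('L', 'F'): 0.3}
emit = {'F': {b: 1 / 6 for b in symbols},
        'L': {**{b: 0.12 for b in '12345'}, '6': 0.4}}

def forward(x):
    f = [{k: start[k] * emit[k][x[0]] for k in states}]
    for i in range(1, len(x)):
        f.append({l: emit[l][x[i]] * sum(f[i - 1][k] * trans[(k, l)] for k in states)
                  for l in states})
    return f

def backward(x):
    b = [dict() for _ in x]
    b[-1] = {k: 1.0 for k in states}
    for i in range(len(x) - 2, -1, -1):
        b[i] = {k: sum(trans[(k, l)] * emit[l][x[i + 1]] * b[i + 1][l] for l in states)
                for k in states}
    return b

for iteration in range(1000):
    f, b = forward(rolls), backward(rolls)
    px = sum(f[-1][k] for k in states)                 # P(x)
    # E-step: expected counts (Durbin et al., 1998)
    A = {(k, l): sum(f[i][k] * trans[(k, l)] * emit[l][rolls[i + 1]] * b[i + 1][l]
                     for i in range(len(rolls) - 1)) / px
         for k in states for l in states}
    E = {(k, s): sum(f[i][k] * b[i][k] for i in range(len(rolls)) if rolls[i] == s) / px
         for k in states for s in symbols}
    # M-step: re-estimate parameters from the expected counts
    new_trans = {(k, l): A[(k, l)] / sum(A[(k, m)] for m in states)
                 for k in states for l in states}
    new_emit = {k: {s: E[(k, s)] / sum(E[(k, t)] for t in symbols) for s in symbols}
                for k in states}
    # Stop when no parameter changes by more than 0.001
    delta = max(abs(new_trans[kl] - trans[kl]) for kl in trans)
    delta = max(delta, max(abs(new_emit[k][s] - emit[k][s]) for k in states for s in symbols))
    trans, emit = new_trans, new_emit
    if delta <= 0.001:
        break

print(trans)
print(emit)
```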
