Pairwise alignment using HMMs Wouter van Gool and Thomas Jellema
Pairwise alignment using HMMs Contents • Most probable path Thomas • Probability of an alignment Thomas • Sub-optimal alignments Thomas • Pause • Posterior probability that xi is aligned to yj Wouter • Pair HMMs versus FSAs for searching Wouter • Conclusion and summary Wouter • Questions
4.1 Most probable path Model that emits a single sequence
4.1 Most probable path Begin and end state
4.1 Most probable path Model that emits a pairwise alignment
4.1 Most probable path Example of an aligned pair of sequences Seq1: A C T _ C Seq2: T _ G G C States: M X M Y M
4.1 Most probable path Begin and end state
4.1 Most probable path Finding the most probable path • The path we choose is the path with the highest probability of being the correct alignment. • At each step we compute the probability that the next state is M, X or Y and choose the one with the highest probability. • If the probability of ending the alignment is higher than that of continuing with M, X or Y, we end the alignment.
4.1 Most probable path The probability of emitting an M is the highest of: 1. previous state X, new state M 2. previous state Y, new state M 3. previous state M, new state M
4.1 Most probable path Probability of going to the M state
4.1 Most probable path Viterbi algorithm for pair HMMs
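The Viterbi recursion for pair HMMs can be implemented directly in log space. A minimal sketch of the three-state M/X/Y model with gap-open δ, gap-extend ε and end probability τ; the emission tables p (match pairs) and q (single symbols) below are illustrative values, not taken from the slides:

```python
import math

def pair_viterbi(x, y, p, q, delta=0.2, eps=0.1, tau=0.1):
    """Viterbi for a three-state pair HMM (M, X, Y), in log space.
    p[(a, b)]: match-emission probabilities; q[a]: gap-state emissions."""
    n, m = len(x), len(y)
    NEG = float("-inf")
    # v*[i][j] = best log probability of any path emitting x[:i] and y[:j]
    vM = [[NEG] * (m + 1) for _ in range(n + 1)]
    vX = [[NEG] * (m + 1) for _ in range(n + 1)]
    vY = [[NEG] * (m + 1) for _ in range(n + 1)]
    vM[0][0] = 0.0                       # Begin state treated as M
    lmm = math.log(1 - 2 * delta - tau)  # M -> M
    lgm = math.log(1 - eps - tau)        # X/Y -> M
    ld, le, lt = math.log(delta), math.log(eps), math.log(tau)
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:          # M emits the pair (x[i-1], y[j-1])
                vM[i][j] = math.log(p[(x[i-1], y[j-1])]) + max(
                    lmm + vM[i-1][j-1], lgm + vX[i-1][j-1], lgm + vY[i-1][j-1])
            if i > 0:                    # X emits x[i-1] against a gap
                vX[i][j] = math.log(q[x[i-1]]) + max(ld + vM[i-1][j], le + vX[i-1][j])
            if j > 0:                    # Y emits y[j-1] against a gap
                vY[i][j] = math.log(q[y[j-1]]) + max(ld + vM[i][j-1], le + vY[i][j-1])
    return lt + max(vM[n][m], vX[n][m], vY[n][m])  # transition to End

# Illustrative emission parameters (assumed, not from the slides)
q = {b: 0.25 for b in "ACGT"}
p = {(a, b): (0.19 if a == b else 0.02) for a in "ACGT" for b in "ACGT"}
```

A traceback over the three tables (not shown) recovers the state path itself; identical sequences score higher than unrelated ones, as expected.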
4.1 Most probable path Finding the most probable path using FSAs • The most probable path is also the optimal FSA alignment
4.1 Most probable path Finding the most probable path using FSAs
4.1 Most probable path Recurrence relations
4.1 Most probable path The log-odds scoring function • We wish to know whether the alignment score is above or below the score of a random alignment. • The log-odds ratio: s(a,b) = log(p_ab / (q_a q_b)). • s(a,b) > 0 iff the probability that a and b are related under our model is larger than the probability that they are picked at random.
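A tiny numeric illustration of the log-odds score s(a,b) = log(p_ab / (q_a q_b)); the probabilities below are assumed example values, not taken from the slides:

```python
import math

# Hypothetical emission probabilities (illustrative only)
p = {("A", "A"): 0.19, ("A", "G"): 0.02}  # joint probs under the match model
q = {"A": 0.25, "G": 0.25}                # background frequencies

def s(a, b):
    """Log-odds substitution score s(a,b) = log(p_ab / (q_a * q_b))."""
    return math.log(p[(a, b)] / (q[a] * q[b]))
```

With these values s("A","A") = log(0.19/0.0625) is positive (the pair is more likely under the match model), while s("A","G") = log(0.02/0.0625) is negative (more likely under the random model).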
4.1 Most probable path Random model
4.1 Most probable path Transition probabilities

Model:
        M        X      Y      End
M    1-2δ-τ     δ      δ       τ
X    1-ε-τ      ε      0       τ
Y    1-ε-τ      0      ε       τ
End                            1

Random:
        X      Y      End
X     1-η     η       0
Y      0     1-η      η
End                    1
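The transition probabilities of the pair HMM and of the random model can be written out and sanity-checked in code: every row of a transition matrix must sum to 1. A minimal sketch with illustrative values for δ, ε, τ and η:

```python
delta, eps, tau, eta = 0.2, 0.1, 0.1, 0.1  # illustrative parameter values

# Pair-HMM transition matrix (rows: from-state; note no direct X<->Y moves)
model = {
    "M": {"M": 1 - 2 * delta - tau, "X": delta, "Y": delta, "End": tau},
    "X": {"M": 1 - eps - tau, "X": eps, "Y": 0.0, "End": tau},
    "Y": {"M": 1 - eps - tau, "X": 0.0, "Y": eps, "End": tau},
}
# Random model R: emit all of x from state X, then all of y from state Y
random_model = {
    "X": {"X": 1 - eta, "Y": eta, "End": 0.0},
    "Y": {"Y": 1 - eta, "End": eta},
}
for rows in (model, random_model):
    for state, row in rows.items():
        # each row is a probability distribution over next states
        assert abs(sum(row.values()) - 1.0) < 1e-12
```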
4.1 Most probable path Transitions
4.1 Most probable path Transitions
4.1 Most probable path Optimal log-odds alignment
4.1 Most probable path A pair HMM for local alignment
4.2 Probability of an alignment Probability that a given pair of sequences is related.
4.2 Probability of an alignment Summing the probabilities
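Replacing the max of the Viterbi recursion with a sum gives the forward algorithm, which yields the full probability P(x, y) summed over all alignments rather than the probability of the single best one. A sketch with illustrative (assumed) parameters; short sequences are used, so plain probability space is fine:

```python
def pair_forward(x, y, p, q, delta=0.2, eps=0.1, tau=0.1):
    """Forward algorithm for a three-state pair HMM: P(x, y) summed over
    ALL alignments (same recursion as Viterbi with max replaced by sum)."""
    n, m = len(x), len(y)
    fM = [[0.0] * (m + 1) for _ in range(n + 1)]
    fX = [[0.0] * (m + 1) for _ in range(n + 1)]
    fY = [[0.0] * (m + 1) for _ in range(n + 1)]
    fM[0][0] = 1.0  # Begin state treated as M
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:
                fM[i][j] = p[(x[i-1], y[j-1])] * (
                    (1 - 2 * delta - tau) * fM[i-1][j-1]
                    + (1 - eps - tau) * (fX[i-1][j-1] + fY[i-1][j-1]))
            if i > 0:
                fX[i][j] = q[x[i-1]] * (delta * fM[i-1][j] + eps * fX[i-1][j])
            if j > 0:
                fY[i][j] = q[y[j-1]] * (delta * fM[i][j-1] + eps * fY[i][j-1])
    return tau * (fM[n][m] + fX[n][m] + fY[n][m])

# Illustrative emission parameters (assumed, not from the slides)
q = {b: 0.25 for b in "ACGT"}
p = {(a, b): (0.19 if a == b else 0.02) for a in "ACGT" for b in "ACGT"}
```

For realistic sequence lengths the same recursion would be run in log space (or with scaling) to avoid underflow.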
Pairwise alignment using HMMs Contents • Most probable path Thomas • Probability of an alignment Thomas • Sub-optimal alignments Thomas • Pause • Posterior probability that xi is aligned to yi Wouter • Pair HMMs versus FSAs for searching Wouter • Conclusion and summary Wouter • Questions
4.3 Suboptimal alignment Finding suboptimal alignments How to make sample alignments?
4.3 Suboptimal alignment Finding distinct suboptimal alignments
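One way to see what sampling alignments means: for very short sequences the set of state paths can be enumerated outright and a path drawn in proportion to its probability. This brute force is only a stand-in for the probabilistic traceback through the forward tables that is used in practice; parameters are illustrative:

```python
import random

def enumerate_alignments(x, y, p, q, delta=0.2, eps=0.1, tau=0.1):
    """List every state path (string over M/X/Y) with its probability.
    Feasible only for tiny sequences; the model has no direct X<->Y moves."""
    n, m = len(x), len(y)
    out = []
    def go(i, j, prev, prob, path):
        if i == n and j == m:
            out.append((path, prob * tau))  # transition to End
            return
        if i < n and j < m:                 # M emits an aligned pair
            t = {"M": 1 - 2 * delta - tau, "X": 1 - eps - tau,
                 "Y": 1 - eps - tau}[prev]
            go(i + 1, j + 1, "M", prob * t * p[(x[i], y[j])], path + "M")
        if i < n and prev != "Y":           # X emits x[i] against a gap
            go(i + 1, j, "X", prob * (delta if prev == "M" else eps) * q[x[i]],
               path + "X")
        if j < m and prev != "X":           # Y emits y[j] against a gap
            go(i, j + 1, "Y", prob * (delta if prev == "M" else eps) * q[y[j]],
               path + "Y")
    go(0, 0, "M", 1.0, "")
    return out

def sample_alignment(paths, rng=random):
    """Draw one alignment in proportion to its probability."""
    total = sum(pr for _, pr in paths)
    r, acc = rng.random() * total, 0.0
    for path, pr in paths:
        acc += pr
        if acc >= r:
            return path
    return paths[-1][0]

# Illustrative emission parameters (assumed, not from the slides)
q = {b: 0.25 for b in "ACGT"}
p = {(a, b): (0.19 if a == b else 0.02) for a in "ACGT" for b in "ACGT"}
```

Repeated calls to `sample_alignment` return the optimal path most often and suboptimal paths with their correct relative frequencies.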
Posterior probability that xi is aligned to yj • Local accuracy of an alignment? • Reliability measure for each part of an alignment • HMM as a local alignment measure • Idea: P(all alignments through (xi,yj)) / P(all alignments of (x,y))
Posterior probability that xi is aligned to yj Notation: xi ◊ yj means xi is aligned to yj
Probability alignment • Miyazawa: it seems attractive to find the alignment by maximising P(xi ◊ yj) • This may lead to inconsistencies: e.g. crossing pairs (i1,j1) & (i2,j2) with i2 > i1 but j2 < j1 • Restricting to pairs (i,j) for which P(xi ◊ yj) > 0.5 avoids this
Posterior probability that xi is aligned to yj The expected accuracy of an alignment • Expected overlap between π and paths sampled from the posterior distribution • Dynamic programming
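The dynamic programme for the maximally reliable alignment takes a matrix of posteriors P(xi ◊ yj) and uses the recursion A(i,j) = max(A(i-1,j-1) + P(xi ◊ yj), A(i-1,j), A(i,j-1)). A sketch with made-up posterior values:

```python
def max_expected_accuracy(post):
    """Expected accuracy of the best alignment, given posterior match
    probabilities: post[i][j] = P(x_{i+1} aligned to y_{j+1})."""
    n, m = len(post), len(post[0])
    A = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            A[i][j] = max(A[i-1][j-1] + post[i-1][j-1],  # align x_i with y_j
                          A[i-1][j],                      # gap in y
                          A[i][j-1])                      # gap in x
    return A[n][m]
```

Unlike Miyazawa's proposal, the DP cannot pick crossing pairs: for posteriors [[0.0, 0.6], [0.7, 0.0]] it takes only the better of the two conflicting matches.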
Pair HMMs versus FSAs for searching • We score with the likelihood P(D | M), although what we really want is the posterior P(M | D) • HMM: the data likelihood is maximised by setting the parameters (i.e. transition and emission probabilities) to match the data • Bayesian model comparison with a random model R
Pair HMMs versus FSAs for searching Problems: 1. Most algorithms do not compute full probability P(x,y | M) but only best match or Viterbi path 2. FSA parameters may not be readily translated into probabilities
Pair HMMs vs FSAs for searching Example: a model whose parameters match the data need not be the best model
[Figure: two models. S: a single state with self-loop probability α and emission probabilities q_a, giving P_S(abac) = α⁴ q_a q_b q_a q_c. B: a deterministic chain for a, b, a, c entered with probability 1−α, giving P_B(abac) = 1−α.]
Model comparison using the best match rather than the total probability
Pair HMMs vs FSAs for searching Problem: no fixed scaling procedure can make the scores of this model into the log probabilities of an HMM
Pair HMMs vs FSAs for searching Bayesian model comparison: both HMMs have the same log-odds ratio as the previous FSA
Pair HMMs vs FSAs for searching • Conversion of an FSA into a probabilistic model • Probabilistic models may underperform standard alignment methods if Viterbi is used for database searching. • But if the forward algorithm is used, they can do better than standard methods.
Why try to use HMMs? • Many complicated alignment algorithms can be described as simple finite state automata (FSAs). • HMMs have many advantages: - Parameters can be trained to fit the data: no need for PAM/BLOSUM matrices - HMMs can keep track of all alignments, not just the best one
New things we can do with pair HMMs • Compute the probability over all alignments. • Compute the relative probability of the Viterbi alignment (or any other alignment). • Sample alignments in proportion to their probability. • Find distinct sub-optimal alignments. • Compute the reliability of each part of the best alignment. • Compute the maximally reliable alignment.
Conclusion • Pair HMMs work better for sequence alignment and database searching than penalty-score-based alignment algorithms. • Unfortunately both approaches are O(nm) and hence too slow for large database searches!