The Occasionally Dishonest Casino Narrated by: Shoko Asei, Alexander Eng
Basic Probability • Basic probability can be used to predict simple, isolated events, such as the likelihood that a tossed coin will land on heads or tails • The occasionally dishonest casino concept can be used to assess the likelihood of a particular sequence of events occurring
Loaded Dice • A casino uses a fair die most of the time, but occasionally switches to a loaded die • A loaded die favors landing on one or more particular faces • The switch is difficult to detect because the loaded die is used with low probability
Emissions • Model of a casino where two dice are rolled • One is fair, with all faces equally probable: P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6 • The other is loaded, with the number “6” accounting for 1/2 of the probability distribution over the faces: P(1) = P(2) = P(3) = P(4) = P(5) = 1/10, P(6) = 1/2 • These are emission probabilities
State Transitions • Changes of state are called transitions • An exchange of the fair and loaded dice • The state either stays the same or changes at each point in time • The probability of the outcome of a roll is different in each state • The casino may switch dice before each roll • Fair to loaded with probability 0.05 • Loaded to fair with probability 0.1 • These are transition probabilities
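A minimal sketch of these parameters as Python dictionaries (the stay probabilities 0.95 and 0.9 follow from the switch probabilities above; the variable names are illustrative, not from the slides):

```python
# Emission probabilities: chance of each face, given which die is in use
EMISSIONS = {
    "F": {face: 1 / 6 for face in range(1, 7)},                  # fair die
    "L": {**{face: 1 / 10 for face in range(1, 6)}, 6: 1 / 2},   # loaded die favors 6
}

# Transition probabilities: chance of keeping or switching dice before a roll
TRANSITIONS = {
    "F": {"F": 0.95, "L": 0.05},  # fair -> loaded with probability 0.05
    "L": {"F": 0.10, "L": 0.90},  # loaded -> fair with probability 0.1
}
```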
Bayes’ Rule • A rule for relating the conditional probabilities of two events • It relates: • The probability of event A conditional on event B • The probability of event B conditional on event A • These two conditional probabilities are not necessarily equal
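As an illustration using the emission probabilities above: if we assume, purely for this example, that the fair and loaded die are equally likely a priori, Bayes’ rule gives the probability that a single observed “6” came from the loaded die:

```python
# Hypothetical prior: assume, for illustration only, that either die is equally likely
p_loaded, p_fair = 0.5, 0.5
p6_given_loaded = 1 / 2   # emission probability of a 6 from the loaded die
p6_given_fair = 1 / 6     # emission probability of a 6 from the fair die

# Bayes' rule: P(loaded | 6) = P(6 | loaded) * P(loaded) / P(6)
p6 = p6_given_loaded * p_loaded + p6_given_fair * p_fair
p_loaded_given_6 = p6_given_loaded * p_loaded / p6
print(p_loaded_given_6)  # 0.75 under these assumed priors
```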
Markov Chain Models • A Markov chain model (MCM) is a sequence of states in which the probability of each state depends only on the state immediately preceding it • It is based on the Markov assumption, which states that “the probability of a future observation given past and present observations depends only on the present”
Hidden Markov Models • A Hidden Markov model (HMM) is a statistical model whose states are not directly observed • Transitions between states are nondeterministic, with known probabilities • It is an extension of a Markov chain model • The system has observable outputs from which the hidden states can be inferred
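A minimal simulation sketch of this structure, reusing the EMISSIONS and TRANSITIONS dictionaries defined above (the starting state, sequence length, and seed are illustrative assumptions):

```python
import random

def simulate_casino(n_rolls, start_state="F", seed=0):
    """Sample a hidden state path and an observed roll sequence from the HMM."""
    rng = random.Random(seed)
    states, rolls = [], []
    state = start_state
    for _ in range(n_rolls):
        states.append(state)
        # Emit a roll according to the current die's emission probabilities
        faces, weights = zip(*EMISSIONS[state].items())
        rolls.append(rng.choices(faces, weights=weights)[0])
        # Possibly switch dice before the next roll
        nxt, nxt_weights = zip(*TRANSITIONS[state].items())
        state = rng.choices(nxt, weights=nxt_weights)[0]
    return states, rolls

# The rolls are observable; the state path (which die was used) is hidden
states, rolls = simulate_casino(10)
```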
The Occasionally Dishonest Casino: An Example • Hidden Markov model structure • Emission and transition probabilities are known • The observed sequence is “656” • Whether the fair or the loaded die was used for each roll is unknown • Find the path, or sequence of states, that most likely produced the observed sequence
The Occasionally Dishonest Casino: An Example • The joint probability that the path “FFL” produced the sequence “656” is computed term by term: P(FFL, 656) = P(F) * P(6|F) * P(FF) * P(5|F) * P(FL) * P(6|L), where the three emission terms correspond to the 1st, 2nd, and 3rd tosses • P(F) and P(L) are the probabilities of starting with the fair or loaded die • Emission probabilities: P(6|F) = 1/6, P(5|F) = 1/6, P(6|L) = 1/2 • Transition probabilities: P(FF) = 0.95, P(FL) = 0.05
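A brute-force sketch that scores every length-3 path against the observed sequence “656”, reusing the dictionaries above (the slides do not give values for the starting probabilities P(F) and P(L), so equal starting probabilities of 0.5 are assumed here for illustration):

```python
from itertools import product

START = {"F": 0.5, "L": 0.5}  # assumed starting probabilities, not given on the slides
OBSERVED = [6, 5, 6]

def path_probability(path, rolls):
    """Joint probability of a hidden state path and an observed roll sequence."""
    p = START[path[0]] * EMISSIONS[path[0]][rolls[0]]
    for prev, curr, roll in zip(path, path[1:], rolls[1:]):
        p *= TRANSITIONS[prev][curr] * EMISSIONS[curr][roll]
    return p

scores = {path: path_probability(path, OBSERVED)
          for path in product("FL", repeat=len(OBSERVED))}
best = max(scores, key=scores.get)
print(best, scores[best])  # ('L', 'L', 'L') scores highest under the assumed start
```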
The Occasionally Dishonest Casino: An Example • The most likely path is the one that accounts for the greatest share of the probability over all possible paths • Under these probabilities, three consecutive tosses of a loaded die most likely produced the sequence “656”
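Enumerating every path grows exponentially with sequence length; the standard dynamic-programming approach for finding the most likely path is the Viterbi algorithm (not named on the slides). A minimal sketch under the same assumed starting probabilities:

```python
def viterbi(rolls, states=("F", "L")):
    """Return (probability, path) of the most likely hidden state path for the rolls."""
    # best[s] = (probability of the best path ending in state s, that path)
    best = {s: (START[s] * EMISSIONS[s][rolls[0]], [s]) for s in states}
    for roll in rolls[1:]:
        new_best = {}
        for s in states:
            # Pick the best previous state to extend into state s
            prob, path = max(
                (best[prev][0] * TRANSITIONS[prev][s], best[prev][1])
                for prev in states
            )
            new_best[s] = (prob * EMISSIONS[s][roll], path + [s])
        best = new_best
    return max(best.values())

print(viterbi([6, 5, 6]))  # (0.010125, ['L', 'L', 'L']) under the assumed start
```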
A Final Note • The occasionally dishonest casino concept is applicable to many systems • Commonly used in bioinformatics to model DNA or protein sequences • Consider a twenty-sided die with a different amino acid representing each face…