Probability theory and random number generation: a quick review
László Szirmay-Kalos
Probability space: throwing a die
Elementary events: the possible outcomes
Events: subsets of elementary events; the union and the complement of events are also events
P: probability = a measure on the events
• Kolmogorov axioms:
• the probability of every event is in [0, 1]
• the measure of the empty set is 0, of the total set is 1
• Pr(A+B) = Pr(A) + Pr(B) if A and B are disjoint
Random variables
A random variable f maps the elementary events onto real numbers, e.g. f(elementary event) = the number shown by the die: 1, 2, 3, 4, 5, 6
Probability distribution function: PDF(x) = Pr{ f < x }
Probability density function: pdf(x) = Pr{ f = x }
Expected value: E[ f ] = Σx x · pdf(x)
Variance: D²[ f ] = E[ (f - E[ f ])² ] = E[ f² ] - E²[ f ]
Standard deviation: D[ f ] = √D²[ f ]
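To make the definitions concrete, here is a minimal C++ sketch (not part of the original slides) that evaluates E[f], D²[f] and D[f] for the fair-die random variable above:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Fair die: f takes the values 1..6, each with probability 1/6
    const double pdf = 1.0 / 6.0;
    double E = 0.0, E2 = 0.0;
    for (int x = 1; x <= 6; ++x) {
        E  += x * pdf;        // E[f]   = sum of x * pdf(x)
        E2 += x * x * pdf;    // E[f^2] = sum of x^2 * pdf(x)
    }
    double var = E2 - E * E;  // D^2[f] = E[f^2] - E^2[f]
    std::printf("E[f] = %g, D2[f] = %g, D[f] = %g\n", E, var, std::sqrt(var));
    // Expected output: E[f] = 3.5, D2[f] ~ 2.9167, D[f] ~ 1.7078
}
```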
Continuous random variables
A continuous random variable also maps the elementary events onto real numbers, e.g. f = the temperature in Celsius, ranging from -40 (freezing cold) to 50 (hot)
Probability distribution function: PDF(x) = Pr{ f < x }
Probability density function: pdf(x) = dPDF(x)/dx
Expected value: E[ f ] = ∫ x · pdf(x) dx
Variance: D²[ f ] = E[ (f - E[ f ])² ] = E[ f² ] - E²[ f ]
Standard deviation: D[ f ] = √D²[ f ]
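For the continuous case, the integrals can be approximated numerically. The sketch below assumes, purely for illustration, a uniform temperature distribution on [-40, 50] °C; the distribution is hypothetical, not taken from the slides:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Hypothetical pdf: uniform temperature on [-40, 50] Celsius (illustration only)
    const double a = -40.0, b = 50.0;
    auto pdf = [&](double x) { return (x >= a && x <= b) ? 1.0 / (b - a) : 0.0; };

    // Midpoint rule for E[f] = integral of x*pdf(x) dx and for E[f^2]
    const int N = 100000;
    double h = (b - a) / N, E = 0.0, E2 = 0.0;
    for (int i = 0; i < N; ++i) {
        double x = a + (i + 0.5) * h;
        E  += x * pdf(x) * h;
        E2 += x * x * pdf(x) * h;
    }
    double var = E2 - E * E;
    std::printf("E[f] = %g, D[f] = %g\n", E, std::sqrt(var));
    // For the uniform distribution on [-40, 50]: E[f] = 5, D[f] = 90/sqrt(12) ~ 25.98
}
```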
Conditional probability and independence
We know that the outcome is in event A; what is the probability that it is also in B?
Pr(B|A) = Pr(AB)/Pr(A), where AB is the intersection of A and B in the event space
Independence: knowing A does not help: Pr(B|A) = Pr(B), that is, Pr(AB) = Pr(A)·Pr(B)
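A small Monte Carlo check can illustrate the definition. The events below (A = the die shows more than 3, B = the die shows an even number) are chosen only for illustration; they are dependent, so Pr(B|A) differs from Pr(B):

```cpp
#include <cstdio>
#include <random>

int main() {
    // Example events on a die (not from the slides): A = {outcome > 3}, B = {outcome is even}
    std::mt19937 gen(42);
    std::uniform_int_distribution<int> die(1, 6);

    const int M = 1000000;
    int nA = 0, nB = 0, nAB = 0;
    for (int i = 0; i < M; ++i) {
        int x = die(gen);
        bool A = x > 3, B = (x % 2 == 0);
        nA += A; nB += B; nAB += (A && B);
    }
    // Pr(B|A) = Pr(AB)/Pr(A) should approach 2/3, while Pr(B) ~ 1/2,
    // so A and B are not independent.
    std::printf("Pr(B|A) ~ %g, Pr(B) ~ %g\n", double(nAB) / nA, double(nB) / M);
}
```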
Operations on random variables
• Expected value is a linear operation:
• E[ f1 + f2 ] = E[ f1 ] + E[ f2 ]
• E[ a·f ] = a·E[ f ]
• E[ f1·f2 ] = E[ f1 ]·E[ f2 ] if they are independent
• Variance is a quadratic operation:
• D²[ a·f ] = a²·D²[ f ]
• D²[ f1 + f2 ] = D²[ f1 ] + D²[ f2 ] if they are independent
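A quick sketch (my own example, not from the slides) that checks the additivity of the expected value and of the variance for two independent die throws; the analytic values in the comments follow from E[f] = 3.5 and D²[f] = 35/12 per throw:

```cpp
#include <cstdio>
#include <random>

int main() {
    // Monte Carlo check of E[f1+f2] = E[f1]+E[f2] and D^2[f1+f2] = D^2[f1]+D^2[f2]
    // for two independent throws of a fair die.
    std::mt19937 gen(11);
    std::uniform_int_distribution<int> die(1, 6);

    const int M = 1000000;
    double Es = 0.0, Es2 = 0.0;
    for (int i = 0; i < M; ++i) {
        double s = die(gen) + die(gen);   // f1 + f2, independent throws
        Es  += s / M;
        Es2 += s * s / M;
    }
    // Analytic values: E = 2 * 3.5 = 7, D^2 = 2 * 35/12 ~ 5.833
    std::printf("E[f1+f2] ~ %g (7), D2[f1+f2] ~ %g (5.833)\n", Es, Es2 - Es * Es);
}
```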
Laws of large numbers
• f1, f2, …, fM are independent random variables of the same distribution
• weak and strong laws: (1/M) Σ fm → E[ f ]
• law of the iterated logarithm: sup{ (1/M) Σ fm - E[ f ] } = D[ f ]·√(2 log log M / M)
• central limit theorem: (1/M) Σ fm tends to a normal distribution with mean E[ f ] and standard deviation D[ f ]/√M
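The following sketch illustrates both statements empirically for the die: as M grows, the average of M throws concentrates around E[f] = 3.5 and its standard deviation shrinks roughly as D[f]/√M. The sample sizes and the number of repetitions are arbitrary choices for the demo:

```cpp
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    // f = one throw of a fair die: E[f] = 3.5, D[f] ~ 1.7078
    std::mt19937 gen(1);
    std::uniform_int_distribution<int> die(1, 6);

    for (int M : {10, 100, 1000, 10000}) {
        // Repeat the M-sample average many times and measure its spread
        const int runs = 2000;
        double meanOfAvg = 0.0, meanOfAvg2 = 0.0;
        for (int r = 0; r < runs; ++r) {
            double sum = 0.0;
            for (int m = 0; m < M; ++m) sum += die(gen);
            double avg = sum / M;            // (1/M) * sum of f_m
            meanOfAvg  += avg / runs;
            meanOfAvg2 += avg * avg / runs;
        }
        double stddev = std::sqrt(meanOfAvg2 - meanOfAvg * meanOfAvg);
        // The average tends to E[f] = 3.5 and its deviation shrinks as D[f]/sqrt(M)
        std::printf("M=%6d  mean=%.4f  stddev=%.4f  D[f]/sqrt(M)=%.4f\n",
                    M, meanOfAvg, stddev, 1.7078 / std::sqrt(M));
    }
}
```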
Random number generation
• It is enough to generate uniformly distributed numbers r in [0, 1]
• Transformation of random variables: f = PDF⁻¹(r)
• Proof: Pr{ f < x } = Pr{ PDF⁻¹(r) < x } = Pr{ r < PDF(x) } = PDF(x), where the last step uses that r is uniform on [0, 1]
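As an illustration of the transformation method, the sketch below samples an exponential distribution (chosen only as an example, not the distribution discussed in the slides), whose PDF is easy to invert analytically:

```cpp
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    // Inverse-transform sampling of the exponential distribution:
    // pdf(x) = lambda * exp(-lambda*x), PDF(x) = 1 - exp(-lambda*x),
    // so PDF^-1(r) = -ln(1 - r) / lambda.
    const double lambda = 2.0;
    std::mt19937 gen(7);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);   // r in [0,1)

    const int M = 1000000;
    double sum = 0.0;
    for (int i = 0; i < M; ++i) {
        double r = uniform(gen);
        double x = -std::log(1.0 - r) / lambda;   // x = PDF^-1(r)
        sum += x;
    }
    // The sample mean should approach E[f] = 1/lambda = 0.5
    std::printf("sample mean = %g (expected %g)\n", sum / M, 1.0 / lambda);
}
```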
Real random number generators
• A physical device generates a single digit of the number (the LSB): 0 or 1
• An analog noise source E0 drives a comparator; a counter, sampled by the clock Clk, counts how many times the noise crosses the threshold during a time window Δt
• Random digit: the number of changes mod 2
Pseudo random number generation
• deterministic iterated functions: rn+1 = F(rn)
• they behave like random sequences (chaos) when the derivative of F is large
• what counts as ”random”? check with statistical tests
Chaos: number of rabbits
rn+1 = C·rn·(1 - rn) (the logistic map)
The two plots compare C = 2, where the sequence settles down, with C = 4, where it behaves chaotically
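A few iterations of the logistic map already show the difference between the two parameter values; the starting value 0.3 below is an arbitrary choice:

```cpp
#include <cstdio>

int main() {
    // Logistic map r_{n+1} = C * r_n * (1 - r_n) for the two values of C above
    for (double C : {2.0, 4.0}) {
        double r = 0.3;   // arbitrary starting value in (0, 1)
        std::printf("C = %g: ", C);
        for (int n = 0; n < 12; ++n) {
            std::printf("%.3f ", r);
            r = C * r * (1.0 - r);
        }
        std::printf("\n");   // C = 2 settles at 0.5, C = 4 keeps jumping chaotically
    }
}
```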
Bad random number generators
rn+1 = F(rn)
The plots show successive (rn, rn+1) pairs in the unit square [0, 1]²; a bad generator does not fill the square densely
Requirements of F
• densely fills the rectangle of (rn, rn+1) pairs
• has a large derivative everywhere
• maps into [0, 1]
• the aperiodic length and the period should be long
Congruential generator
F(x) = { g·x + c }, the fractional part of g·x + c, where g is large
The selection of g, c and the seed x0 should make the aperiodic length and the period large
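A minimal sketch of such a generator is shown below. The constants g, c and the seed x0 are placeholders chosen only for the demo, not recommended values; practical implementations use integer modular arithmetic instead of floating-point fractional parts to get a well-defined period:

```cpp
#include <cstdio>

// Congruential generator on [0, 1): x_{n+1} = frac(g * x_n + c)
// The constants below are illustrative, not values from the lecture.
double nextRandom(double& x) {
    const double g = 997.0, c = 0.12345;
    double v = g * x + c;
    x = v - (long long)v;     // keep the fractional part
    return x;
}

int main() {
    double x = 0.5;           // seed x0
    for (int i = 0; i < 10; ++i)
        std::printf("%.6f\n", nextRandom(x));
}
```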
Is it ”random”?
• What properties of ”randomness” are needed?
• Statistical tests:
• uniformity: Kolmogorov-Smirnov test: for any interval A ⊂ [0, 1], the difference between the empirical measure m(A)/M and the length of A is small
• spectral test: the generated points form parallel planes (a lattice structure); the distances between the planes should be small
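The uniformity check can be sketched as follows: sort M generated numbers and take the largest deviation between their empirical distribution function and the uniform one. Here std::mt19937 merely stands in for the generator under test; any generator producing values in [0, 1) could be plugged in:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    // Kolmogorov-Smirnov style uniformity check:
    // compare the empirical CDF of M samples with the uniform CDF on [0, 1].
    std::mt19937 gen(3);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);

    const int M = 10000;
    std::vector<double> r(M);
    for (double& v : r) v = uniform(gen);
    std::sort(r.begin(), r.end());

    double D = 0.0;
    for (int i = 0; i < M; ++i) {
        // The empirical CDF jumps from i/M to (i+1)/M at r[i]; the uniform CDF is x itself
        D = std::max(D, std::fabs((i + 1.0) / M - r[i]));
        D = std::max(D, std::fabs(r[i] - double(i) / M));
    }
    // For a good generator the statistic D stays around 1/sqrt(M)
    std::printf("KS statistic D = %g (1/sqrt(M) = %g)\n", D, 1.0 / std::sqrt(double(M)));
}
```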