Distribution, Gamma Function, Stochastic Process. Tutorial 4, STAT1301, Fall 2010, 12 Oct 2010, MB103@HKU. By Joseph Dong
Reference: Wikipedia
Recall: Distribution of a Random Variable • One way to describe the random behavior of a random variable is to give its probability distribution, which specifies the probability of taking each element in its range (the sample space). • The representation of a probability distribution comes either in a differential form (the pdf/pmf) or in an integral form (the cdf). • The cdf is a non-decreasing, right-continuous function from $\mathbb{R}$ to $[0,1]$. • The pdf/pmf is a non-negative, normalized function from $\mathbb{R}$ to a subset of $[0,\infty)$.
Recall: $F$ versus $f$ • $F$ is non-decreasing • $F$ is right-continuous • $F(x) = \int_{-\infty}^{x} f(t)\,\mathrm{d}t$, and $f(x) = \dfrac{\mathrm{d}}{\mathrm{d}x}F(x)$ wherever the derivative exists • A slightly modified formula applies to the discrete case with pmf $p$: $F(x) = \sum_{t \le x} p(t)$.
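A small numerical illustration (not from the original slides) of the relation $F(x) = \int_{-\infty}^{x} f(t)\,\mathrm{d}t$, using the standard normal as a concrete example:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

X = stats.norm()   # standard normal, as a concrete continuous example
x = 1.3

# cdf value reported by SciPy
print(X.cdf(x))                          # ~0.9032

# the same value obtained by integrating the pdf up to x
val, _ = quad(X.pdf, -np.inf, x)
print(val)                               # ~0.9032
```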
Gamma Function • $\Gamma(z) = \int_0^{\infty} t^{z-1} e^{-t}\,\mathrm{d}t$ for $z > 0$ • $\Gamma(z+1) = z\,\Gamma(z)$, and $\Gamma(n) = (n-1)!$ for positive integers $n$.
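A quick numerical check of the two identities above (a minimal sketch using SciPy's gamma function, not part of the original handout):

```python
import math
from scipy.special import gamma

# Recurrence: Gamma(z + 1) = z * Gamma(z)
z = 2.5
print(gamma(z + 1), z * gamma(z))        # both ~ 3.3234

# Factorial connection: Gamma(n) = (n - 1)! for positive integers n
for n in range(1, 6):
    print(n, gamma(n), math.factorial(n - 1))
```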
Handout Problems 6 & 7 (Technical) • Problem 6: Gamma function and integration practice • Problem 7: important continuous distributions and their relationships
From Bernoulli Trials to Discrete Waiting Time (Handout Problems 1-4) • A single Bernoulli trial: tossing a coin; only two outcomes, and they are complementary to each other. • n Bernoulli trials: we want to count the number of successes; this gives rise to a Binomial random variable. • Indefinitely many Bernoulli trials: we want to know how long we should wait until the first success (Geometric random variable). • Indefinitely many Bernoulli trials: we want to know how long we should wait until the r-th success (Negative Binomial random variable). • Bernoulli trials: we want to know how long we should wait between two successes (?) • A simulation sketch of these waiting times follows below.
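A minimal simulation sketch (not from the handout) of the waiting-time view: run Bernoulli(p) trials until the r-th success and compare the average waiting time with the Negative Binomial mean r/p; setting r = 1 gives the Geometric case. The values of p, r, and the number of simulations are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
p, r, n_sims = 0.3, 5, 50_000

def wait_for_rth_success(p, r, rng):
    """Run Bernoulli(p) trials until the r-th success; return the number of trials used."""
    successes, trials = 0, 0
    while successes < r:
        trials += 1
        successes += rng.random() < p
    return trials

waits = np.array([wait_for_rth_success(p, r, rng) for _ in range(n_sims)])
print(waits.mean())     # ~ r / p = 16.67 (mean number of trials until the r-th success)
```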
Poisson [pwa’sɔ̃] Distribution • Poisson Approximation to the Binomial (PAB) • Handout Problem 5 • The true utility of the Poisson distribution: the Poisson process • Sort of the limiting case of Bernoulli trials (use PAB to facilitate thinking) • “Continuous” Bernoulli trials
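An illustrative check of the Poisson approximation to the Binomial (a sketch; n and p are chosen arbitrarily): for large n and small p, Binomial(n, p) is close to Poisson(np).

```python
from scipy import stats

n, p = 500, 0.01            # large n, small p (chosen for illustration)
lam = n * p                 # Poisson parameter lambda = np = 5

binom = stats.binom(n, p)
pois = stats.poisson(lam)

# compare the two pmfs over the first few counts
for k in range(10):
    print(k, round(binom.pmf(k), 5), round(pois.pmf(k), 5))
```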
Sequence of Random Variables • A sequence of random variables is an ordered and countable collection of random variables, usually indexed by integers starting from one: $X_1, X_2, \ldots, X_n, \ldots$, where the number of terms can be finite or infinite. • Written shortly as $\{X_n\}$. • A sequence of random variables is a discrete-time stochastic process. • For example, a sequence of Bernoulli trials is a discrete-time stochastic process called a Bernoulli process.
Stochastic Process: Discrete-Time and Continuous-Time • A stochastic process is (nothing but) an ordered, not necessarily countable, collection of random variables, indexed by an index set $T$. • Written shortly as $\{X_t\}_{t \in T}$. • The index $t$ usually bears the physical meaning of time. • If $T$ is a continuous (discrete) set, we call the indexed random variables a “continuous(discrete)-time process.” • In many continuous-time cases we choose $T = [0, \infty)$, and in that case we can write the stochastic process as $\{X_t\}_{t \ge 0}$ or $X(t)$.
Stochastic Process = Set of RVs + Index Set • Sample path of a stochastic process [figures: one discrete-time sample path and one continuous-time sample path]
Bernoulli Trials (Bernoulli Process) • Bernoulli trials with success probability $p$ • A discrete-time process • A sequence of independent and identically distributed (iid) Bernoulli random variables following the common distribution $\mathrm{Bernoulli}(p)$. • Written $\{X_n\}_{n \ge 1}$, where the $X_n$ are independent and all $X_n \sim \mathrm{Bernoulli}(p)$.
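A minimal sketch (not from the slides) that generates one sample path of a Bernoulli process, together with its running count of successes:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n_steps = 0.5, 20

X = (rng.random(n_steps) < p).astype(int)   # X_1, ..., X_n ~ iid Bernoulli(p)
S = np.cumsum(X)                            # running number of successes

print("sample path X:", X)
print("partial sums S:", S)
```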
Poisson Process • Poisson process with intensity $\lambda$ • A continuous-time process • The limiting case of Bernoulli trials when the index set becomes continuous. • “Poisson” is in the name because the count of successes on any interval of length $t$ follows $\mathrm{Poisson}(\lambda t)$, irrespective of the location of the chosen interval on the time axis. • Also, if two disjoint time intervals are chosen, then the counts of successes on them are independent.
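A simulation sketch (intensity and horizon chosen arbitrarily) that builds a Poisson process from Exponential($\lambda$) inter-arrival times and checks that the count over a window of length t has mean close to $\lambda t$:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, horizon, n_paths = 2.0, 10.0, 20_000

counts = []
for _ in range(n_paths):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1 / lam)   # Exponential(lam) inter-arrival time
        if t > horizon:
            break
        n += 1
    counts.append(n)

print(np.mean(counts), lam * horizon)   # both ~ 20: E[count on (0, horizon)] = lam * horizon
```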
Discrete Distributions Based on Bernoulli Trials • Bernoulli distribution $\mathrm{Bernoulli}(p)$: one trial • Binomial distribution $\mathrm{Binomial}(n, p)$: n trials • Poisson distribution $\mathrm{Poisson}(\lambda)$: infinitely many trials (as a limit) • Geometric distribution $\mathrm{Geometric}(p)$: indefinitely many but at least one trial • Negative Binomial distribution $\mathrm{NegBin}(r, p)$: indefinitely many but at least r trials.
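For concreteness, the SciPy counterparts of the distributions in this list (a sketch; the parameter values are arbitrary):

```python
from scipy import stats

p, n, lam, r = 0.3, 10, 3.0, 5

dists = {
    "Bernoulli(p)":           stats.bernoulli(p),
    "Binomial(n, p)":         stats.binom(n, p),
    "Poisson(lambda)":        stats.poisson(lam),
    "Geometric(p)":           stats.geom(p),       # number of trials until the 1st success
    "NegativeBinomial(r, p)": stats.nbinom(r, p),  # number of failures before the r-th success
}

for name, d in dists.items():
    print(name, "mean =", d.mean())
```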
Continuous Distributions Based on the Poisson Process • Poisson distribution (discrete) as the building block • The count on any infinitesimal time interval of length $\mathrm{d}t$ is $\mathrm{Poisson}(\lambda\,\mathrm{d}t)$, where $\lambda$ represents the intensity (a differential concept). • Additive: if $X \sim \mathrm{Poisson}(\lambda_1)$, $Y \sim \mathrm{Poisson}(\lambda_2)$, and $X$, $Y$ are independent, then $X + Y \sim \mathrm{Poisson}(\lambda_1 + \lambda_2)$ (proof: use the MGF). • Exponential distribution as the waiting time until the first success/arrival/occurrence, or as the inter-arrival time. • Gamma distribution as the waiting time until the $r$th success/arrival/occurrence.
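A sketch (parameters arbitrary) checking numerically that the sum of r iid Exponential($\lambda$) inter-arrival times matches the Gamma(r, $\lambda$) waiting-time distribution:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lam, r, n_sims = 2.0, 4, 100_000

# waiting time until the r-th arrival = sum of r iid Exponential(lam) inter-arrival times
waits = rng.exponential(1 / lam, size=(n_sims, r)).sum(axis=1)

gamma_rv = stats.gamma(a=r, scale=1 / lam)     # Gamma(shape = r, rate = lam)
print(waits.mean(), gamma_rv.mean())           # both ~ r / lam = 2.0
print(np.quantile(waits, 0.9), gamma_rv.ppf(0.9))
```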
Examples of Poisson Process • Radioactive disintegrations • Flying-bomb hits on London • Chromosome interchanges in cells • Connection to wrong number • Bacteria and blood counts Feller: An Introduction to Probability Theory and Its Applications (3e) Vol. 1. §VI.6.
Radioactive Disintegrations [images: Rutherford, Chadwick, Geiger; a Geiger counter]
Explanation • There are 57 time intervals (7.5 sec each) that recorded zero emissions. • There are 203 time intervals (7.5 sec each) that recorded 1 emission. • …… • There are 2608 time intervals (7.5 sec each) in total. • On average, each interval recorded 3.87 emissions. • Use 3.87 as the intensity (per 7.5 s interval) of the Poisson process that models the counts of emissions on each of the 2608 intervals.
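A quick check (a sketch using only the numbers quoted above) of how well Poisson(3.87) matches the observed counts of intervals with 0 and 1 emissions:

```python
from scipy import stats

lam, n_intervals = 3.87, 2608
pois = stats.poisson(lam)

expected_0 = n_intervals * pois.pmf(0)   # ~ 54.4 intervals expected, 57 observed
expected_1 = n_intervals * pois.pmf(1)   # ~ 210.5 intervals expected, 203 observed
print(expected_0, expected_1)
```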
What’s the waiting time until recording 40 emissions? • Assuming the emission mechanism follows a Poisson process with intensity $\lambda = 3.87$ per 7.5-second interval, the waiting time until recording the $r$th emission follows $\mathrm{Gamma}(r, 3.87)$, with time measured in 7.5-second intervals. • The waiting time until recording the 40th emission therefore follows $\mathrm{Gamma}(40, 3.87)$, and its expected value is $40/3.87 \approx 10.3$ intervals (each 7.5 s long), or about 78 seconds.
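A numerical companion (a sketch) for the computation above, using SciPy's Gamma distribution with shape 40 and rate 3.87 per 7.5-second interval:

```python
from scipy import stats

r, lam = 40, 3.87                       # rate is per 7.5-second interval
wait = stats.gamma(a=r, scale=1 / lam)  # Gamma(shape = r, rate = lam)

mean_intervals = wait.mean()            # = r / lam ~ 10.34 intervals
print(mean_intervals, mean_intervals * 7.5, "seconds")   # ~ 77.5 seconds

# e.g. probability that the 40th emission takes longer than 15 intervals (~112.5 s)
print(wait.sf(15))
```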