Chapter 5 Discrete Random Variables and Probability Distributions
Random Variables A random variable is a variable that takes on numerical values determined by the outcome of a random experiment.
Discrete Random Variables A random variable is discrete if it can take on no more than a countable number of values.
Discrete Random Variables (Examples)
• The number of defective items in a sample of twenty items taken from a large shipment.
• The number of customers arriving at a check-out counter in an hour.
• The number of errors detected in a corporation's accounts.
• The number of claims on a medical insurance policy in a particular year.
Continuous Random Variables A random variable is continuous if it can take any value in an interval.
Continuous Random Variables (Examples)
• The income in a year for a family.
• The amount of oil imported into the U.S. in a particular month.
• The change in the price of a share of IBM common stock in a month.
• The time that elapses between the installation of a new computer and its failure.
• The percentage of impurity in a batch of chemicals.
Discrete Probability Distributions The probability distribution function, P(x), of a discrete random variable X expresses the probability that X takes the value x, as a function of x. That is, P(x) = P(X = x), for all possible values x.
Discrete Probability Distributions Graph the probability distribution function for the roll of a single six-sided die: each outcome x = 1, 2, …, 6 has probability P(x) = 1/6 (Figure 5.1).
Required Properties of Probability Distribution Functions of Discrete Random Variables Let X be a discrete random variable with probability distribution function P(x). Then
• P(x) ≥ 0 for any value of x
• The individual probabilities sum to 1; that is, Σx P(x) = 1, where the notation Σx indicates summation over all possible values x.
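As an illustration, here is a minimal Python sketch that checks both required properties for the fair-die distribution of Figure 5.1; the dictionary die_pmf is our own construction for the example, not something from the text.

```python
# Probability distribution function for a fair six-sided die (Figure 5.1).
die_pmf = {x: 1 / 6 for x in range(1, 7)}

# Property 1: P(x) >= 0 for every value of x.
assert all(p >= 0 for p in die_pmf.values())

# Property 2: the individual probabilities sum to 1 (up to rounding).
assert abs(sum(die_pmf.values()) - 1.0) < 1e-12
```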
Cumulative Probability Function The cumulative probability function, F(x0), of a random variable X expresses the probability that X does not exceed the value x0, as a function of x0. That is, F(x0) = P(X ≤ x0), where the function is evaluated at all values x0.
Derived Relationship Between Probability and Cumulative Probability Function Let X be a random variable with probability function P(x) and cumulative probability function F(x0). Then it can be shown that F(x0) = Σx≤x0 P(x), where the notation implies that summation is over all possible values x that are less than or equal to x0.
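This relationship is easy to compute directly. A minimal sketch, again using the die distribution as a stand-in example:

```python
def cumulative_probability(pmf, x0):
    # F(x0) = sum of P(x) over all values x <= x0
    return sum(p for x, p in pmf.items() if x <= x0)

die_pmf = {x: 1 / 6 for x in range(1, 7)}
print(cumulative_probability(die_pmf, 3))  # 0.5, i.e. P(roll <= 3)
```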
Derived Properties of Cumulative Probability Functions for Discrete Random Variables Let X be a discrete random variable with a cumulative probability function, F(x0). Then we can show that
• 0 ≤ F(x0) ≤ 1 for every number x0
• If x0 and x1 are two numbers with x0 < x1, then F(x0) ≤ F(x1)
Expected Value The expected value, E(X), of a discrete random variable X is defined as E(X) = Σx x P(x), where the notation Σx indicates that summation extends over all possible values x. The expected value of a random variable is called its mean and is denoted μX.
Variance and Standard Deviation Let X be a discrete random variable. The expectation of the squared discrepancies about the mean, (X − μX)², is called the variance, denoted σ²X, and is given by σ²X = E[(X − μX)²] = Σx (x − μX)² P(x). The standard deviation, σX, is the positive square root of the variance.
Variance (Alternative Formula) The variance of a discrete random variable X can be expressed as σ²X = E(X²) − μ²X = Σx x² P(x) − μ²X.
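The sketch below computes the mean and the variance both ways for the die distribution, confirming that the definition and the alternative formula agree; it is a plain-Python illustration, not part of the text.

```python
def mean(pmf):
    # E(X) = sum over x of x * P(x)
    return sum(x * p for x, p in pmf.items())

def variance_definition(pmf):
    # sigma^2 = sum over x of (x - mu)^2 * P(x)
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

def variance_alternative(pmf):
    # sigma^2 = E(X^2) - mu^2
    mu = mean(pmf)
    return sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

die_pmf = {x: 1 / 6 for x in range(1, 7)}
print(mean(die_pmf))                  # 3.5
print(variance_definition(die_pmf))   # ~2.9167
print(variance_alternative(die_pmf))  # same value
```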
Expected Value and Variance for a Discrete Random Variable Using Microsoft Excel (Figure 5.4) Expected Value = 1.95 Variance = 1.9475
Bernoulli Distribution A Bernoulli distribution arises from a random experiment that can give rise to just two possible outcomes. These outcomes are usually labeled as either "success" or "failure." If π denotes the probability of a success and the probability of a failure is (1 − π), then the Bernoulli probability function is P(0) = (1 − π) and P(1) = π.
Mean and Variance of a Bernoulli Random Variable The mean is μX = E(X) = (0)(1 − π) + (1)(π) = π, and the variance is σ²X = E[(X − μX)²] = (0 − π)²(1 − π) + (1 − π)²(π) = π(1 − π).
Sequences of x Successes in n Trials The number of sequences with x successes in n independent trials is C(n, x) = n! / (x!(n − x)!), where n! = n × (n − 1) × (n − 2) × … × 1 and 0! = 1.
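For example, with n = 5 trials and x = 2 successes the count is 10. A quick check with the standard library (math.comb is Python 3.8+):

```python
from math import comb, factorial

n, x = 5, 2
# n! / (x! (n - x)!) computed directly from factorials...
by_factorials = factorial(n) // (factorial(x) * factorial(n - x))
# ...and via the standard-library combination function.
print(by_factorials, comb(n, x))  # 10 10
```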
Binomial Distribution Suppose that a random experiment can result in two possible mutually exclusive and collectively exhaustive outcomes, "success" and "failure," and that π is the probability of a success in a single trial. If n independent trials are carried out, the distribution of the resulting number of successes X is called the binomial distribution. Its probability distribution function is P(x) = P(x successes in n independent trials) = [n! / (x!(n − x)!)] π^x (1 − π)^(n − x), for x = 0, 1, 2, …, n.
Mean and Variance of a Binomial Probability Distribution Let X be the number of successes in n independent trials, each with probability of success π. Then X follows a binomial distribution with mean μ = E(X) = nπ and variance σ² = nπ(1 − π).
Binomial Probabilities: An Example (Example 5.7) An insurance broker, Shirley Ferguson, has five contracts, and she believes that for each contract, the probability of making a sale is 0.40. What is the probability that she makes at most one sale? P(at most one sale) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.078 + 0.259 = 0.337
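A short sketch reproducing the arithmetic of Example 5.7 from the binomial formula:

```python
from math import comb

def binomial_pmf(x, n, pi):
    # P(x successes in n independent trials), success probability pi
    return comb(n, x) * pi ** x * (1 - pi) ** (n - x)

# Example 5.7: n = 5 contracts, probability of a sale pi = 0.40.
p_at_most_one = binomial_pmf(0, 5, 0.40) + binomial_pmf(1, 5, 0.40)
print(round(p_at_most_one, 3))  # 0.337
```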
Hypergeometric Distribution Suppose that a random sample of n objects is chosen from a group of N objects, S of which are successes. The distribution of the number of successes X in the sample is called the hypergeometric distribution. Its probability function is P(x) = [C(S, x) C(N − S, n − x)] / C(N, n), where x can take integer values ranging from the larger of 0 and [n − (N − S)] to the smaller of n and S.
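A minimal sketch of this formula; the numbers (N = 20 items, S = 5 defective, sample of n = 4) are hypothetical, chosen only to illustrate sampling without replacement:

```python
from math import comb

def hypergeometric_pmf(x, N, S, n):
    # P(x) = C(S, x) * C(N - S, n - x) / C(N, n)
    return comb(S, x) * comb(N - S, n - x) / comb(N, n)

# Hypothetical: 20 items, 5 defective; probability of exactly
# 1 defective in a sample of 4 drawn without replacement.
print(hypergeometric_pmf(1, 20, 5, 4))  # ~0.4696
```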
Poisson Probability Distribution Assume that an interval is divided into a very large number of subintervals, so that the probability of the occurrence of an event in any subinterval is very small. The assumptions of a Poisson probability distribution are:
• The probability of an occurrence of an event is constant for all subintervals.
• There can be no more than one occurrence in each subinterval.
• Occurrences are independent; that is, the numbers of occurrences in non-overlapping intervals are independent of one another.
Poisson Probability Distribution The random variable X is said to follow the Poisson probability distribution if it has the probability function P(x) = (e^(−λ) λ^x) / x!, for x = 0, 1, 2, …, where P(x) = the probability of x successes over a given period of time or space, given λ; λ = the expected number of successes per time or space unit, λ > 0; and e ≈ 2.71828 (the base for natural logarithms).
Poisson Probability Distribution The mean and variance of the Poisson probability distribution are μX = E(X) = λ and σ²X = E[(X − μX)²] = λ.
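The sketch below evaluates the Poisson probability function and confirms numerically that the mean and variance both come out near λ; the rate λ = 2 is a hypothetical choice, and the sum is truncated where the tail probabilities are negligible.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(x) = e^(-lambda) * lambda^x / x!
    return exp(-lam) * lam ** x / factorial(x)

# Hypothetical rate: lambda = 2 occurrences per unit of time.
lam = 2.0
probs = [poisson_pmf(x, lam) for x in range(30)]
mu = sum(x * p for x, p in enumerate(probs))
var = sum((x - mu) ** 2 * p for x, p in enumerate(probs))
print(mu, var)  # both approximately equal to lambda = 2
```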
Partial Poisson Probabilities for λ = 0.03 Obtained Using Microsoft Excel PHStat (Figure 5.14)
Poisson Approximation to the Binomial Distribution Let X be the number of successes resulting from n independent trials, each with probability of success π. The distribution of the number of successes X is binomial, with mean nπ. If the number of trials n is large and nπ is of only moderate size (preferably nπ ≤ 7), this distribution can be approximated by the Poisson distribution with λ = nπ. The probability function of the approximating distribution is then P(x) = (e^(−nπ) (nπ)^x) / x!, for x = 0, 1, 2, …
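A brief numerical comparison of the two distributions; the values n = 1000 and π = 0.003 are hypothetical, picked so that n is large while nπ = 3 stays moderate:

```python
from math import comb, exp, factorial

# Hypothetical values: n large, pi small, so n*pi is moderate.
n, pi = 1000, 0.003
lam = n * pi  # 3.0

for x in range(5):
    binomial = comb(n, x) * pi ** x * (1 - pi) ** (n - x)
    poisson = exp(-lam) * lam ** x / factorial(x)
    print(x, round(binomial, 5), round(poisson, 5))  # nearly identical
```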
Joint Probability Functions Let X and Y be a pair of discrete random variables. Their joint probability function expresses the probability that X takes the specific value x and, simultaneously, Y takes the value y, as a function of x and y. The notation used is P(x, y), so P(x, y) = P(X = x ∩ Y = y).
Joint Probability Functions Let X and Y be a pair of jointly distributed random variables. In this context the probability function of the random variable X is called its marginal probability function and is obtained by summing the joint probabilities over all possible values of y; that is, P(x) = Σy P(x, y). Similarly, the marginal probability function of the random variable Y is P(y) = Σx P(x, y).
Properties of Joint Probability Functions Let X and Y be discrete random variables with joint probability function P(x, y). Then
• P(x, y) ≥ 0 for any pair of values x and y
• The sum of the joint probabilities P(x, y) over all possible values must be 1; that is, Σx Σy P(x, y) = 1.
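The sketch below checks both properties and computes the marginal probability functions. The joint distribution here is hypothetical (it is not Table 5.6 from the text); the four probabilities are made up so that they sum to 1.

```python
# Hypothetical joint probability function P(x, y).
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Both required properties hold for this joint distribution.
assert all(p >= 0 for p in joint.values())
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginal probability functions, obtained by summing the joint
# probabilities over the other variable.
marginal_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
marginal_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}
print(marginal_x)  # {0: 0.30..., 1: 0.70...}
print(marginal_y)  # {0: 0.40..., 1: 0.60...}
```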
Conditional Probability Functions Let X and Y be a pair of jointly distributed discrete random variables. The conditional probability function of the random variable Y, given that the random variable X takes the value x, expresses the probability that Y takes the value y, as a function of y, when the value x is specified for X. This is denoted P(y|x), and so by the definition of conditional probability P(y|x) = P(x, y) / P(x). Similarly, the conditional probability function of X, given Y = y, is P(x|y) = P(x, y) / P(y).
Independence of Jointly Distributed Random Variables The jointly distributed random variables X and Y are said to be independent if and only if their joint probability function is the product of their marginal probability functions, that is, if and only if P(x, y) = P(x)P(y) for all possible pairs of values x and y. And k random variables are independent if and only if P(x1, x2, …, xk) = P(x1)P(x2)⋯P(xk).
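Continuing with the same hypothetical joint distribution, the sketch below evaluates a conditional probability and tests for independence by comparing the joint probabilities to the product of the marginals:

```python
# Same hypothetical joint distribution as in the earlier sketch,
# together with its marginal probability functions.
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}
marginal_x = {0: 0.30, 1: 0.70}
marginal_y = {0: 0.40, 1: 0.60}

# Conditional probability P(y | x) = P(x, y) / P(x).
print(joint[(1, 1)] / marginal_x[1])  # P(Y = 1 | X = 1) ~ 0.571

# X and Y are independent iff P(x, y) = P(x) * P(y) for every pair.
independent = all(
    abs(joint[(x, y)] - marginal_x[x] * marginal_y[y]) < 1e-12
    for (x, y) in joint
)
print(independent)  # False: e.g. P(0, 0) = 0.10 but P(0) * P(0) = 0.12
```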
Expected Value Function of Jointly Distributed Random Variables Let X and Y be a pair of discrete random variables with joint probability function P(x, y). The expectation of any function g(X, Y) of these random variables is defined as E[g(X, Y)] = Σx Σy g(x, y) P(x, y).
Stock Returns, Marginal Probability, Mean, Variance (Example 5.16, Table 5.6)
Covariance Let X be a random variable with mean μX, and let Y be a random variable with mean μY. The expected value of (X − μX)(Y − μY) is called the covariance between X and Y, denoted Cov(X, Y). For discrete random variables Cov(X, Y) = E[(X − μX)(Y − μY)] = Σx Σy (x − μX)(y − μY) P(x, y). An equivalent expression is Cov(X, Y) = E(XY) − μXμY = Σx Σy xy P(x, y) − μXμY.
Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is ρ = Corr(X, Y) = Cov(X, Y) / (σXσY).
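Both quantities fall straight out of the joint probability function. A minimal sketch, once more using the hypothetical joint distribution from the earlier sketches:

```python
# Same hypothetical joint distribution as in the earlier sketches.
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}

mu_x = sum(x * p for (x, y), p in joint.items())  # 0.70
mu_y = sum(y * p for (x, y), p in joint.items())  # 0.60

# Covariance: Cov(X, Y) = E[(X - mu_x)(Y - mu_y)].
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

# Correlation: rho = Cov(X, Y) / (sigma_x * sigma_y).
var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
rho = cov / (var_x ** 0.5 * var_y ** 0.5)
print(cov, rho)  # -0.02 and about -0.089
```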
Covariance and Statistical Independence If two random variables are statistically independent, the covariance between them is 0. However, the converse is not necessarily true.
Portfolio Analysis The random variable X is the price of stock A and the random variable Y is the price of stock B. The market value, W, of the portfolio is given by the linear function W = aX + bY, where a is the number of shares of stock A and b is the number of shares of stock B.
Portfolio Analysis The mean value of W is μW = E(W) = aμX + bμY. The variance of W is σ²W = a²σ²X + b²σ²Y + 2ab Cov(X, Y), or, using the correlation, σ²W = a²σ²X + b²σ²Y + 2ab Corr(X, Y) σXσY.
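To close, a sketch of the portfolio formulas showing that the covariance form and the correlation form of the variance agree; every numeric input below is a hypothetical assumption, not a figure from the text.

```python
# Hypothetical inputs; none of these figures come from the text.
a, b = 100, 200              # shares held of stocks A and B
mu_x, mu_y = 25.0, 40.0      # mean prices of the two stocks
var_x, var_y = 81.0, 121.0   # variances of the two prices
cov_xy = 19.8                # covariance between the prices

# Mean of the portfolio value W = a*X + b*Y.
mu_w = a * mu_x + b * mu_y

# Variance of W, first with the covariance...
var_w = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy
# ...then with the correlation rho = cov / (sigma_x * sigma_y).
rho = cov_xy / (var_x ** 0.5 * var_y ** 0.5)
var_w_alt = (a ** 2 * var_x + b ** 2 * var_y
             + 2 * a * b * rho * var_x ** 0.5 * var_y ** 0.5)
print(mu_w, var_w, abs(var_w - var_w_alt) < 1e-6)  # forms agree
```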