Lecture Note 6: Continuous Random Variables and Probability Distribution
Continuous Random Variables A random variable is continuous if it can take any value in an interval.
Probability Distribution of Continuous Random Variables • For a continuous random variable, we use the following two concepts to describe the probability distribution: • Probability Density Function • Cumulative Distribution Function
Probability Density Function • The probability density function is a concept similar to the probability distribution function for a discrete random variable. • You can think of the probability density function as a "smoothed" probability distribution function.
Comparing the Probability Distribution Function (for a discrete r.v.) and the Probability Density Function (for a continuous r.v.)
Comparing the Probability Distribution Function (Discrete r.v.) and the Probability Density Function The probability distribution function for a discrete random variable shows the probability of each outcome.
Comparing the Probability Distribution Function (Discrete r.v.) and the Probability Density Function (Continuous r.v.) [Figure: a density curve f(x) with the area between a and b shaded] The probability density function shows the probability that the random variable falls in a particular range as the area under the curve over that range.
Probability density function • Let X be a continuous random variable, and let x denote the value that the random variable X takes. We use f(x) to denote the probability density function.
Example • Your client told you that he will visit you between noon and 2 pm. Within that window, the time he arrives at your company is completely random. Let X be the random variable for his arrival time, measured in hours after noon (X = 1.5 means he visits your office at 1:30 pm). • Let x be a possible value of the random variable X. Then the probability density function f(x) has the following shape.
Probability Density Function Example [Figure: the density f(x) equals 0.5 on the interval from x = 0 to x = 2 and is 0 elsewhere]
Probability Density Function Example The probability that your client visits your office between 12:30 and 1:00 is given by the shaded area under f(x) between x = 0.5 and x = 1. [Figure: the uniform density f(x) = 0.5 on [0, 2], with the region between 0.5 and 1 shaded]
Probability Density Function Example Note that the area between 0 and 2 must equal 1, since the probability that your client arrives between noon and 2 pm is 1 (assuming he keeps his promise to visit between noon and 2 pm). [Figure: the uniform density f(x) = 0.5 with the entire area from 0 to 2 shaded]
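The shaded areas in this example can also be checked numerically. Below is a minimal sketch (not part of the original lecture note) using Python's scipy.stats.uniform; the interval [0, 2] and the density value 0.5 are taken from the slides above.

```python
# A minimal sketch of the client-arrival example using scipy.stats.uniform.
from scipy.stats import uniform

# scipy parameterizes the uniform distribution by loc (left end) and scale
# (width of the interval), so the arrival time is uniform(loc=0, scale=2).
X = uniform(loc=0, scale=2)

# Probability of arriving between 12:30 (x = 0.5) and 1:00 (x = 1.0):
print(X.cdf(1.0) - X.cdf(0.5))   # 0.25, the shaded area on the earlier slide

# The total area under the density is 1, as the slide notes.
print(X.cdf(2.0) - X.cdf(0.0))   # 1.0
```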
Some Properties of a Probability Density Function • f(x) ≥ 0 for any x • The total area under f(x) is 1
Cumulative Distribution Function The cumulative distribution function, F(x), for a continuous random variable X expresses the probability that X does not exceed the value x, as a function of x: F(x) = P(X ≤ x).
In other words, the cumulative distribution function F(x) = P(X ≤ x) is given by the shaded area under the density f(x) to the left of x. [Figure: a density curve with the area to the left of x shaded]
Cumulative Distribution Function -Example- [Figure: the density function f(x) = 0.5 on [0, 2] alongside the cumulative distribution function F(x), which rises linearly from 0 at x = 0 to 1 at x = 2]
Probability Density Function and Cumulative Distribution Function -Exercise- • The daily revenue of a certain shop is a continuous random variable. Let X be the random variable for the daily revenue (in thousand dollars). Suppose that X has the following probability density function. Ex. 1: Graph f(x). Ex. 2: Find F(3). Ex. 3: Find P(2<X<3). Ex. 4: Graph F(x).
Relationship Between Probability Density Function and Cumulative Distribution Function • Let X be a continuous random variable. Then there is the following relationship between the probability density function and the cumulative distribution function: F(x) = ∫ f(t) dt, where the integral runs from the lowest possible value of X up to x; conversely, f(x) is the derivative of F(x).
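As an illustrative check of this relationship (the helper functions f and F below exist only for this sketch, not in the lecture), the density of the client-arrival example can be integrated numerically to recover its cumulative distribution function:

```python
# Numerical check that F(x) is the integral of f(t) from the lowest value to x.
from scipy.integrate import quad

def f(t):
    """Density of the arrival time: 0.5 on [0, 2] and 0 elsewhere."""
    return 0.5 if 0.0 <= t <= 2.0 else 0.0

def F(x):
    """CDF obtained by integrating the density from the lowest possible value (0) up to x."""
    area, _ = quad(f, 0.0, x)
    return area

print(F(0.5), F(1.0), F(2.0))  # 0.25, 0.5, 1.0
```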
Expectation for continuous random variable • Expectation for a continuous random variable is defined in a way similar to the expectation for a discrete random variable. See the next slide.
Expectation for a discrete random variable is defined as E(X) = Σ x P(x), summing over all possible values x. • For a continuous variable, f(x) corresponds to P(x). However, we cannot sum x f(x) since the value of x is continuous. Therefore, we define the expectation using an integral in the following way: E(X) = ∫ x f(x) dx, where the integral runs from x_min, the lowest possible value of X, to x_max, the highest possible value of X. We use μX to denote E(X).
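Here is a small sketch of this definition applied to the client-arrival example, with the integral evaluated numerically; the use of scipy.integrate.quad is an illustrative choice, not something prescribed by the lecture.

```python
# E(X) for the client-arrival example, computed by integrating x * f(x) over [0, 2].
from scipy.integrate import quad

f = lambda x: 0.5            # uniform density on [0, 2]
x_min, x_max = 0.0, 2.0      # lowest and highest possible values of X

mean_X, _ = quad(lambda x: x * f(x), x_min, x_max)
print(mean_X)  # 1.0 -- on average the client arrives at 1:00 pm
```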
Expectation of a function of a continuous random variable • Let g(X) be a function of a continuous random variable X. Then the expectation of g(X) is given by E[g(X)] = ∫ g(x) f(x) dx, with the integral taken over the range of X. A special case is the variance, given on the next slide.
Variance and standard deviation for the continuous random variable • Variance and standard deviation for a continuous random variable are defined as σX² = Var(X) = E[(X − μX)²] = ∫ (x − μX)² f(x) dx, with the integral taken over the range of X, and σX = √Var(X).
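Continuing the same illustrative sketch, the variance and standard deviation of the client-arrival example follow directly from these defining integrals:

```python
# Variance and standard deviation of the uniform client-arrival example.
from math import sqrt
from scipy.integrate import quad

f = lambda x: 0.5            # uniform density on [0, 2]
x_min, x_max = 0.0, 2.0

mu, _ = quad(lambda x: x * f(x), x_min, x_max)               # E(X) = 1
var, _ = quad(lambda x: (x - mu) ** 2 * f(x), x_min, x_max)  # Var(X) = 1/3
print(var, sqrt(var))  # about 0.333 and 0.577
```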
Expected Value for a Continuous Random Variable -Exercise- • Continue using the daily revenue example, with the probability density function for X given in that exercise. Compute E(X) and SD(X).
Linear Functions of Variables Let X be a continuous random variable with mean μX and variance σX², and let a and b be any constant fixed numbers. Define the random variable W as W = a + bX. Then the mean and variance of W are μW = E(a + bX) = a + bμX and σW² = Var(a + bX) = b²σX², and the standard deviation of W is σW = |b| σX.
Linear Functions of Variables-Exercise- • Continue using the daily sales example. Suppose that the manager’s daily salary in thousand dollars is determined in the following way. W=4+0.5X Ex: Compute E(W) and SD(W)
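Below is a sketch of how the rules from the previous slide apply to this salary rule. The daily-sales density is not reproduced in these notes, so the values of E(X) and SD(X) in the code are placeholders, not the exercise's answer; substitute the values you compute.

```python
# Linear-function rules applied to W = 4 + 0.5 X. mean_X and sd_X are
# illustrative placeholders, not values given in the lecture note.
mean_X = 3.0   # assumed E(X), in thousand dollars (placeholder)
sd_X = 1.0     # assumed SD(X), in thousand dollars (placeholder)

a, b = 4.0, 0.5            # the salary rule W = 4 + 0.5 X
mean_W = a + b * mean_X    # E(W)  = a + b E(X)
sd_W = abs(b) * sd_X       # SD(W) = |b| SD(X)

print(mean_W, sd_W)  # 5.5 and 0.5 with the placeholder values above
```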
Linear Functions of Variables An important special case of the previous results is the standardized random variable Z = (X − μX)/σX, which has mean 0 and variance 1.
Normal Distribution • A random variable X is said to be a normal random variable with mean μ and variance σ² if X has the following probability density function: f(x) = (1 / (σ√(2π))) e^(−(x − μ)² / (2σ²)) for −∞ < x < ∞, where e and π are mathematical constants, e = 2.71828. . . and π = 3.14159. . .
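As a quick sanity check (not part of the note), the density formula above can be compared against scipy.stats.norm.pdf; the particular values of μ, σ, and x below are arbitrary illustrations.

```python
# Comparing the normal density formula with scipy's built-in implementation.
from math import exp, pi, sqrt
from scipy.stats import norm

mu, sigma, x = 60.0, 15.0, 85.0   # arbitrary illustrative values

by_formula = exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))
by_scipy = norm.pdf(x, loc=mu, scale=sigma)
print(by_formula, by_scipy)  # the two values agree (about 0.0066)
```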
Probability Density Function for a Normal Distribution [Figure: the bell-shaped normal density curve]
A note about the normal distribution • Normal distributions approximate many types of distributions. • Normal distributions have been applied in many areas of study, such as economics, finance, and operations management.
Normal Distribution approximates many types of distributions.
The shape of the normal distribution • The shape of the normal density function changes with the mean and the standard deviation
The shape of the normal distribution -Exercise- • Open "Normal Density Function Exercise". The Excel function NORMDIST(x, μ, σ, FALSE) computes the normal density function with mean μ and standard deviation σ, evaluated at x. EX1: Plot the normal density functions for μ=0, σ=1 and for μ=1, σ=1. You will see how a change in the mean affects the shape of the density (without changing the standard deviation). EX2: Plot the normal density functions for μ=0, σ=1 and for μ=0, σ=2. You will see how a change in the standard deviation affects the shape (without changing the mean).
Answer • Changing the mean without changing the standard deviation shifts the density horizontally. • Increasing the standard deviation (without changing the mean) makes the shape flatter, with fatter tails.
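For readers working outside Excel, the same comparison can be sketched in Python with scipy.stats.norm and matplotlib; the parameter choices mirror the exercise.

```python
# Plotting normal densities to see how changing the mean and the standard
# deviation affects the shape (Python alternative to the Excel exercise).
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.linspace(-6, 6, 400)
plt.plot(x, norm.pdf(x, loc=0, scale=1), label="mean 0, sd 1")
plt.plot(x, norm.pdf(x, loc=1, scale=1), label="mean 1, sd 1 (shifted right)")
plt.plot(x, norm.pdf(x, loc=0, scale=2), label="mean 0, sd 2 (flatter, fatter tails)")
plt.legend()
plt.xlabel("x")
plt.ylabel("f(x)")
plt.show()
```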
Properties of the Normal Distribution Suppose that the random variable X follows the normal distribution given on the previous slides. Then • The mean of the random variable is E(X) = μ, • The variance of the random variable is Var(X) = σ², • The following notation means that the random variable X has the normal distribution with mean μ and variance σ²: X ~ N(μ, σ²).
Cumulative Distribution Function of the Normal Distribution Suppose that X is a normal random variable with mean μ and variance σ²; that is, X ~ N(μ, σ²). Then the cumulative distribution function is F(x0) = P(X ≤ x0). This is the area under the normal probability density function to the left of x0, as illustrated in the figure on the next slide.
Shaded Area is the Probability that X does not Exceed x0 for a Normal Random Variable [Figure: the normal density curve with the area to the left of x0 shaded]
Range Probabilities for Normal Random Variables Let X be a normal random variable with cumulative distribution function F(x), and let a and b be two possible values of X, with a < b. Then P(a < X < b) = F(b) − F(a). The probability is the area under the corresponding probability density function between a and b.
Range Probabilities for Normal Random Variables (Figure 6.12) [Figure: the normal density curve with the area between a and b shaded]
The Standard Normal Distribution The standard normal distribution is the normal distribution with mean 0 and variance 1. We often use Z to denote a standard normal random variable; if Z has this distribution, we write Z ~ N(0, 1) and say that Z follows the standard normal distribution.
Computing the cumulative distribution and a range probability for the standard normal distribution • We often need to compute the cumulative distribution and a range probability for a standard normal random variable. • Since the standard normal distribution is used so frequently, most textbooks provide a table of its cumulative distribution function. See p. 837 of our textbook.
Computing the cumulative distribution and a range probability for the standard normal distribution -Exercise- • Compute the following probabilities: P(Z≤1.72), P(1.25<Z<1.72), P(Z≤−1.25), P(−1.25<Z<1.72).
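One way to check the table lookups (not the textbook's table method) is scipy.stats.norm.cdf, which evaluates the standard normal cumulative distribution function directly:

```python
# Standard normal probabilities via scipy's cumulative distribution function.
from scipy.stats import norm

print(norm.cdf(1.72))                    # P(Z <= 1.72)        ~ 0.9573
print(norm.cdf(1.72) - norm.cdf(1.25))   # P(1.25 < Z < 1.72)  ~ 0.0629
print(norm.cdf(-1.25))                   # P(Z <= -1.25)       ~ 0.1056
print(norm.cdf(1.72) - norm.cdf(-1.25))  # P(-1.25 < Z < 1.72) ~ 0.8516
```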
Finding Range Probabilities for Normally Distributed Random Variables Let X be a normally distributed random variable with mean μ and variance σ². Then the random variable Z = (X − μ)/σ has a standard normal distribution: Z ~ N(0, 1). It follows that if a and b are any numbers with a < b, then P(a < X < b) = P((a − μ)/σ < Z < (b − μ)/σ) = F((b − μ)/σ) − F((a − μ)/σ), where Z is the standard normal random variable and F(z) denotes its cumulative distribution function.
Computing Normal Probabilities(Example 6.6) A very large group of students obtains test scores that are normally distributed with mean 60 and standard deviation 15. If you randomly choose a student, what is the probability that the test score of the student is between 85 and 95?
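A numerical sketch of this example, using the standardization rule from the previous slide; the slide itself leaves the computation to the reader.

```python
# Example 6.6: X ~ N(60, 15^2), find P(85 < X < 95) by standardizing.
from scipy.stats import norm

mu, sigma = 60.0, 15.0
a, b = 85.0, 95.0

z_a = (a - mu) / sigma   # (85 - 60) / 15 ~ 1.67
z_b = (b - mu) / sigma   # (95 - 60) / 15 ~ 2.33
print(norm.cdf(z_b) - norm.cdf(z_a))  # P(85 < X < 95) ~ 0.038
```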
Joint Cumulative Distribution Functions Let X1, X2, . . ., Xk be continuous random variables. • Their joint cumulative distribution function F(x1, x2, . . ., xk) defines the probability that simultaneously X1 is less than x1, X2 is less than x2, and so on; that is, F(x1, x2, . . ., xk) = P(X1 < x1 ∩ X2 < x2 ∩ . . . ∩ Xk < xk). • The cumulative distribution functions F1(x1), F2(x2), . . ., Fk(xk) of the individual random variables are called their marginal distribution functions. For any i, Fi(xi) is the probability that the random variable Xi does not exceed the specific value xi. • The random variables are independent if and only if F(x1, x2, . . ., xk) = F1(x1)F2(x2) · · · Fk(xk).
Joint Cumulative Distribution Functions • Joint cumulative distribution functions often have complex forms, especially when the random variables are not independent. • Some simple joint distribution functions are presented on the next slide.
Joint distribution functions -Example, optional- • X and Y are said to have an independent joint uniform distribution if X and Y have the following distribution function: F(x,y) = xy, 0≤x≤1, 0≤y≤1. • X and Y are said to have an independent joint exponential distribution if the joint distribution function is given by F(x,y) = (1 − e^(−x))(1 − e^(−y)), 0<x<∞, 0<y<∞.
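As a tiny illustration (not from the note), the joint uniform example satisfies the independence condition F(x, y) = F1(x)F2(y), which can be checked directly:

```python
# Checking the independence factorization for the joint uniform example.
def F_joint(x, y):
    """Joint CDF of the independent joint uniform distribution, 0 <= x, y <= 1."""
    return x * y

def F_marginal(t):
    """Marginal CDF of each variable: F1(x) = F(x, 1) = x, and likewise for Y."""
    return t

x, y = 0.3, 0.8
print(F_joint(x, y), F_marginal(x) * F_marginal(y))  # the two values agree (0.24)
```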
Expectation of a function of two random variables • Let g(X,Y) be a function of two continuous random variables X and Y, and let f(x,y) be their joint density function. Then the expectation of g(X,Y) is defined as E[g(X,Y)] = ∫∫ g(x,y) f(x,y) dx dy, with the double integral taken over all possible values of X and Y. A special case is the covariance, given on the next slide.
Covariance Let X and Y be a pair of continuous random variables, with respective means μX and μY. The expected value of (X − μX)(Y − μY) is called the covariance between X and Y. That is, Cov(X, Y) = E[(X − μX)(Y − μY)]. An alternative but equivalent expression can be derived as Cov(X, Y) = E(XY) − μXμY. If the random variables X and Y are independent, then the covariance between them is 0. However, the converse is not true.
Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is ρ = Corr(X, Y) = Cov(X, Y) / (σX σY).
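To make the covariance and correlation formulas concrete, here is a hedged sketch that evaluates them by double numerical integration. The joint density f(x, y) = x + y on the unit square is an illustrative assumption, not a distribution taken from the lecture.

```python
# Covariance and correlation by double numerical integration, using the
# assumed joint density f(x, y) = x + y on 0 <= x <= 1, 0 <= y <= 1.
from math import sqrt
from scipy.integrate import dblquad

def density(x, y):
    """Assumed joint density on the unit square (integrates to 1)."""
    return x + y

def expect(g):
    """E[g(X, Y)] as a double integral of g(x, y) * density(x, y)."""
    # dblquad expects the integrand as a function of (y, x).
    value, _ = dblquad(lambda y, x: g(x, y) * density(x, y),
                       0, 1, lambda x: 0, lambda x: 1)
    return value

mu_x, mu_y = expect(lambda x, y: x), expect(lambda x, y: y)
cov = expect(lambda x, y: x * y) - mu_x * mu_y           # Cov(X,Y) = E(XY) - muX*muY
sd_x = sqrt(expect(lambda x, y: x ** 2) - mu_x ** 2)
sd_y = sqrt(expect(lambda x, y: y ** 2) - mu_y ** 2)
corr = cov / (sd_x * sd_y)                               # Corr(X,Y) = Cov / (sdX * sdY)

print(cov, corr)  # about -0.0069 and -0.091: nonzero, since X and Y are not independent here
```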