Chapter 3 Random Variables and Probability Distributions
Chapter 3.1 Concept of a Random Variable • A random variable is a function that associates a real number with each element in the sample space. • In other words, a random variable is a numerical description of the outcome of an experiment, where each outcome is assigned a numerical value. • A capital letter, say X, is used to denote a random variable, and the corresponding lowercase letter, x in this case, denotes one of its values.
Chapter 3.1 Concept of a Random Variable • The sample space giving a detailed description of each possible outcome when three electronic components are tested (D = defective, N = non-defective) may be written as S = {NNN, NND, NDN, DNN, NDD, DND, DDN, DDD}. One is concerned with the number of defectives that occur, so each point in the sample space is assigned a numerical value of 0, 1, 2, or 3. The random variable X then assumes the value 2 for all elements in the subset E = {DDN, DND, NDD}. • Two balls are drawn in succession without replacement from an urn containing 4 red balls and 3 black balls. The possible outcomes and the values y of the random variable Y, where Y is the number of red balls, are RR (y = 2), RB (y = 1), BR (y = 1), and BB (y = 0).
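As a quick illustration, here is a minimal Python sketch (using the D/N and R/B shorthand above) that lists each outcome and the value the random variable assigns to it:

```python
from itertools import product

# Sample space for testing three components: D = defective, N = non-defective.
sample_space = ["".join(outcome) for outcome in product("DN", repeat=3)]

# The random variable X assigns to each outcome its number of defectives.
X = {outcome: outcome.count("D") for outcome in sample_space}
print(X)
print([s for s, x in X.items() if x == 2])   # the subset where X assumes the value 2

# Urn example: two balls drawn without replacement; Y counts the red balls.
Y = {outcome: outcome.count("R") for outcome in ["RR", "RB", "BR", "BB"]}
print(Y)                                     # {'RR': 2, 'RB': 1, 'BR': 1, 'BB': 0}
```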
Chapter 3.1 Concept of a Random Variable Sample Space and Random Variable • If a sample space contains a finite number of possibilities or an unending sequence with as many elements as there are whole numbers, it is called a discrete sample space. • If a sample space contains an infinite number of possibilities equal to the number of points on a line segment, it is called a continuous sample space. • A random variable is called a discrete random variable if its set of possible outcomes is countable. • A random variable is called a continuous random variable if it can take on values on a continuous scale. If X is the random variable assigned to the waiting time, in minutes, for a bus at a bus stop, then X may take on all values of waiting time x, x ≥ 0. In this case, X is a continuous random variable.
Chapter 3.2 Discrete Probability Distributions • Frequently, it is convenient to represent all the probabilities of a random variable X by a formula. • Such a formula is necessarily a function of the numerical values x, denoted by f(x), g(x), r(x), and so forth; for example, f(x) = P(X = x). • The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x, (1) f(x) ≥ 0, (2) Σ_x f(x) = 1, and (3) P(X = x) = f(x).
Chapter 3.2 Discrete Probability Distributions • In the experiment of tossing a fair coin twice, the random variable X represents the number of heads that turn up. The possible values x of X and their probabilities can be summarized as f(0) = 1/4, f(1) = 1/2, f(2) = 1/4.
Chapter 3.2 Discrete Probability Distributions • A shipment of 20 similar laptop computers to a retail outlet contains 3 that are defective. If a school makes a random purchase of 2 of these computers, find the probability distribution for the number of defectives. Let X be a random variable whose values x are the possible numbers of defective computers purchased by the school. By hypergeometric counting, f(x) = C(3, x) C(17, 2 − x) / C(20, 2) for x = 0, 1, 2, giving f(0) = 68/95, f(1) = 51/190, and f(2) = 3/190.
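A short Python sketch of this calculation, using math.comb for the binomial coefficients (the hypergeometric form follows directly from the counting argument above):

```python
from math import comb

# P(X = x) = C(3, x) * C(17, 2 - x) / C(20, 2): x defectives among the 2 purchased.
def f(x):
    return comb(3, x) * comb(17, 2 - x) / comb(20, 2)

for x in range(3):
    print(x, f(x))                      # f(0) = 68/95, f(1) = 51/190, f(2) = 3/190
print(sum(f(x) for x in range(3)))      # sanity check: the probabilities sum to 1
```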
Chapter 3.2 Discrete Probability Distributions • There are many problems where we may wish to compute the probability that the observed value of a random variable X will be less than or equal to some real number x. • The cumulative distribution F(x) of a discrete random variable X with probability distribution f(x) is F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t), for −∞ < x < ∞. • Graphically, a discrete probability distribution can be displayed as a probability histogram, and its cumulative distribution as a step function.
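For the coin-toss example above, the cumulative distribution can be sketched in Python by accumulating the probability mass function:

```python
from itertools import accumulate

# X = number of heads in two tosses of a fair coin.
support = [0, 1, 2]
pmf = [0.25, 0.50, 0.25]

# F(x) = P(X <= x) is the running total of f(t) for t <= x.
for x, F in zip(support, accumulate(pmf)):
    print(f"F({x}) = {F}")              # F(0) = 0.25, F(1) = 0.75, F(2) = 1.0
```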
Chapter 3.3 Continuous Probability Distributions • When the sample space is continuous, there is an unlimited number of possible values for the random variable. • Thus, it is more meaningful to deal with an interval rather than a point value of the random variable. • For example, it does not make sense to ask for the probability of selecting a person at random who is exactly 164 cm tall; it is more useful to talk about the probability of selecting a person who is at least 163 cm but not more than 165 cm tall. • We shall concern ourselves now with computing probabilities for various intervals of continuous random variables, such as P(a < X < b), P(W ≥ c), P(U ≤ d), and so forth. • Note that when X is continuous, the probability of any single point value is zero, that is, P(X = a) = 0, so for example P(a < X ≤ b) = P(a < X < b).
Chapter 3.3 Continuous Probability Distributions • In dealing with continuous variables, the notation commonly used is f(x), and it is usually called the probability density function, or simply the density function, of X. • For most practical applications, density functions are continuous and differentiable. • Their graphs may take any form, but since a density function is used to represent probabilities, it cannot fall below the x axis. (Figures: three possible shapes of density curves.)
Chapter 3.3 Continuous Probability Distributions Continuous Probability Distributions • A probability density function is constructed so that the area under its curve bounded by the x axis is equal to 1 when computed over the range of X for which f(x) is defined. • In the figure below, the probability that X assumes a value between a and b is equal to the shaded area under the density function between the ordinates at x=a and x=b.
Chapter 3.3 Continuous Probability Distributions • The function f(x) is a probability density function for the continuous random variable X, defined over the set of real numbers R, if (1) f(x) ≥ 0 for all x in R, (2) ∫_{−∞}^{∞} f(x) dx = 1, and (3) P(a < X < b) = ∫_a^b f(x) dx.
Chapter 3.3 Continuous Probability Distributions Suppose that the error in the reaction temperature, in °C, for a controlled laboratory experiment is a continuous random variable X having the probability density function f(x). Verify that f(x) satisfies the conditions of a density function, and find P(0 < X ≤ 1).
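The density itself is not reproduced above; as a sketch, assuming the commonly used form f(x) = x²/3 for −1 < x < 2 (and 0 elsewhere), both parts can be checked symbolically with sympy:

```python
import sympy as sp

x = sp.Symbol("x")
f = x**2 / 3          # assumed density: f(x) = x**2/3 on -1 < x < 2, 0 elsewhere

print(sp.integrate(f, (x, -1, 2)))    # total area = 1, so f is a valid density
print(sp.integrate(f, (x, 0, 1)))     # P(0 < X <= 1) = 1/9
```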
Chapter 3.3 Continuous Probability Distributions • The cumulative distribution F(x) of a continuous random variable X with density function f(x) is F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt, for −∞ < x < ∞. For the density function in the last example, find F(x) and use it to evaluate P(0 < X ≤ 1).
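Continuing with the same assumed density f(x) = x²/3 on −1 < x < 2, a sympy sketch of F(x) and of the requested probability:

```python
import sympy as sp

x, t = sp.symbols("x t")
# Assumed density, as above: f(t) = t**2/3 on -1 < t < 2, 0 elsewhere.
F = sp.integrate(t**2 / 3, (t, -1, x))     # F(x) for -1 <= x < 2

print(sp.expand(F))                        # x**3/9 + 1/9
print(F.subs(x, 1) - F.subs(x, 0))         # P(0 < X <= 1) = F(1) - F(0) = 1/9
```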
Chapter 3.4 Joint Probability Distributions • If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of the random variables X and Y. • Such a function is referred to as the joint probability distribution of X and Y. • The function f(x, y) is a joint probability function, or joint probability distribution, of the discrete random variables X and Y if (1) f(x, y) ≥ 0 for all (x, y), (2) Σ_x Σ_y f(x, y) = 1, and (3) P(X = x, Y = y) = f(x, y). • For any region A in the xy plane, P[(X, Y) ∈ A] = Σ Σ_{(x,y) ∈ A} f(x, y).
Chapter 3.4 Joint Probability Distributions • Two ballpoint pens are selected at random from a box that contains 3 blue pens, 2 red pens, and 3 green pens. If X is the number of blue pens selected and Y is the number of red pens selected, find • the joint probability function f(x, y) • P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}
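A sketch in Python: the joint probabilities follow from counting the ways to pick x blue and y red pens out of the C(8, 2) = 28 equally likely pairs:

```python
from math import comb

# f(x, y) = C(3, x) * C(2, y) * C(3, 2 - x - y) / C(8, 2), for x + y <= 2.
def f(x, y):
    return comb(3, x) * comb(2, y) * comb(3, 2 - x - y) / comb(8, 2)

joint = {(x, y): f(x, y) for x in range(3) for y in range(3) if x + y <= 2}
print(sum(joint.values()))                                    # 1.0

# P[(X, Y) in A] with A = {(x, y) : x + y <= 1}
print(sum(p for (x, y), p in joint.items() if x + y <= 1))    # 9/14 ≈ 0.6429
```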
Chapter 3.4 Joint Probability Distributions • The function f(x, y) is a joint probability density function of the continuous random variables X and Y if (1) f(x, y) ≥ 0 for all (x, y), (2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1, and (3) P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy, for any region A in the xy plane.
Chapter 3.4 Joint Probability Distributions • A privately owned business operates both a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and the walk-in facilities are in use, and suppose that the joint density function f(x, y) of these random variables is given. • Verify that f(x, y) is a joint density function. • Find P[(X, Y) ∈ A], where A is {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
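The joint density for this example is not shown above; the sketch below assumes the standard form f(x, y) = (2/5)(2x + 3y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 (0 elsewhere) and evaluates both parts with sympy:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.Rational(2, 5) * (2 * x + 3 * y)   # assumed density on the unit square

# Condition: the density must integrate to 1 over its range.
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))                        # 1

# P[(X, Y) in A] with A = {0 < x < 1/2, 1/4 < y < 1/2}
print(sp.integrate(f, (x, 0, sp.Rational(1, 2)),
                      (y, sp.Rational(1, 4), sp.Rational(1, 2))))   # 13/160
```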
Chapter 3.4 Joint Probability Distributions Marginal Probability Distributions • The marginal distributions of X alone and of Y alone are g(x) = Σ_y f(x, y) and h(y) = Σ_x f(x, y) for the discrete case, and g(x) = ∫_{−∞}^{∞} f(x, y) dy and h(y) = ∫_{−∞}^{∞} f(x, y) dx for the continuous case. • The term marginal is used here because, in the discrete case, the values of g(x) and h(y) are just the marginal totals of the respective columns and rows when the values of f(x, y) are displayed in a rectangular table.
Chapter 3.4 Joint Probability Distributions Marginal Probability Distributions • Show that the column and row totals from the "ballpoint pens" example give the marginal distribution of X alone and of Y alone. • The values of g(x) are just the column totals of the joint probability table. • In a similar manner, the values of h(y) are given by the row totals.
Chapter 3.4 Joint Probability Distributions Marginal Probability Distributions Find g(x) and h(y) for the joint density function of the "drive-in walk-in facility" example.
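Under the same assumed joint density f(x, y) = (2/5)(2x + 3y) on the unit square, the marginals are obtained by integrating out the other variable:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.Rational(2, 5) * (2 * x + 3 * y)   # assumed joint density, as before

g = sp.integrate(f, (y, 0, 1))   # marginal of X: g(x) = (4x + 3)/5 for 0 <= x <= 1
h = sp.integrate(f, (x, 0, 1))   # marginal of Y: h(y) = (2 + 6y)/5 for 0 <= y <= 1
print(sp.simplify(g), sp.simplify(h))
```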
Chapter 3.4 Joint Probability Distributions Conditional Probability Distributions • Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable Y, given that X = x, is f(y|x) = f(x, y) / g(x), provided g(x) > 0. Similarly, the conditional distribution of the random variable X, given that Y = y, is f(x|y) = f(x, y) / h(y), provided h(y) > 0.
Chapter 3.4 Joint Probability Distributions Conditional Probability Distributions • If one wishes to find the probability that the discrete random variable X falls between a and b when it is known that the discrete variable Y = y, we evaluate P(a < X < b | Y = y) = Σ_{a < x < b} f(x|y), where the summation extends over all values of X between a and b. • When X and Y are continuous, we find the probability that X lies between a and b by evaluating P(a < X < b | Y = y) = ∫_a^b f(x|y) dx.
Chapter 3.4 Joint Probability Distributions Conditional Probability Distributions Referring back to the “ballpoint pens” example, find the conditional distribution of X, given that Y=1, and use it to determine P(X=0|Y=1).
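A Python sketch with exact fractions, reusing the joint probabilities of the pens example:

```python
from math import comb
from fractions import Fraction

# Joint pmf of the "ballpoint pens" example, as exact fractions.
def f(x, y):
    return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

h1 = sum(f(x, 1) for x in range(2))       # h(1): only x = 0, 1 are possible when y = 1
cond = {x: f(x, 1) / h1 for x in range(2)}

print(h1)                                 # 3/7
print(cond[0])                            # P(X = 0 | Y = 1) = 1/2
```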
Chapter 3.4 Joint Probability Distributions Conditional Probability Distributions Given the joint density function find g(x), h(y), f(x|y), and evaluate P(1/4<X<1/2|Y=1/3).
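The joint density is again not reproduced above; assuming the usual form for this example, f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1 (0 elsewhere), a sympy sketch is:

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = x * (1 + 3 * y**2) / 4        # assumed density on 0 < x < 2, 0 < y < 1

g = sp.integrate(f, (y, 0, 1))    # g(x) = x/2
h = sp.integrate(f, (x, 0, 2))    # h(y) = (1 + 3y**2)/2
f_cond = sp.simplify(f / h)       # f(x|y) = x/2, which does not depend on y

# P(1/4 < X < 1/2 | Y = 1/3)
print(sp.integrate(f_cond.subs(y, sp.Rational(1, 3)),
                   (x, sp.Rational(1, 4), sp.Rational(1, 2))))    # 3/64
```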
Chapter 3.4 Joint Probability Distributions Statistical Independence • Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if f(x, y) = g(x) h(y) for all (x, y) within their range.
Chapter 3.4 Joint Probability Distributions Statistical Independence Consider the following joint probability density function of random variables X and Y. • Find the marginal density functions of X and Y. • Are X and Y statistically independent? • Find P(X > 2 | Y = 2).
Chapter 3.4 Joint Probability Distributions Statistical Independence • Are X and Y statistically independent? X and Y are not statistically independent, since f(x, y) ≠ g(x)h(y). • Find P(X > 2 | Y = 2).
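The joint density of this particular example is not reproduced above, so as a sketch the same independence check is shown on the earlier "ballpoint pens" distribution, which is also not independent:

```python
from math import comb
from fractions import Fraction

# Joint pmf of the "ballpoint pens" example.
def f(x, y):
    return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

xs, ys = range(3), range(3)
support = [(x, y) for x in xs for y in ys if x + y <= 2]
g = {x: sum(f(x, y) for y in ys if x + y <= 2) for x in xs}   # marginal of X
h = {y: sum(f(x, y) for x in xs if x + y <= 2) for y in ys}   # marginal of Y

# Independent iff f(x, y) == g(x) * h(y) at every point of the support.
print(all(f(x, y) == g[x] * h[y] for (x, y) in support))      # False
print(f(0, 1), g[0] * h[1])                                   # 3/14 vs 15/98
```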
Chapter 3.4 Joint Probability Distributions Statistical Independence • Let X1, X2, ..., Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, ..., xn) and marginal distributions f1(x1), f2(x2), ..., fn(xn), respectively. The random variables X1, X2, ..., Xn are said to be mutually statistically independent if and only if f(x1, x2, ..., xn) = f1(x1) f2(x2) ··· fn(xn) for all (x1, x2, ..., xn) within their range.
Chapter 3.4 Joint Probability Distributions Statistical Independence Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given. Let X1, X2, and X3 represent the shelf lives for three of these containers selected independently, and find P(X1 < 2, 1 < X2 < 3, X3 > 2).
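The shelf-life density is not reproduced above; assuming the familiar form f(x) = e^(−x) for x > 0 (0 elsewhere), mutual independence lets the probability factor into a product of three single-variable integrals:

```python
import sympy as sp

x = sp.Symbol("x", positive=True)
f = sp.exp(-x)                          # assumed shelf-life density for x > 0

p1 = sp.integrate(f, (x, 0, 2))         # P(X1 < 2)     = 1 - e**-2
p2 = sp.integrate(f, (x, 1, 3))         # P(1 < X2 < 3) = e**-1 - e**-3
p3 = sp.integrate(f, (x, 2, sp.oo))     # P(X3 > 2)     = e**-2
print(sp.N(p1 * p2 * p3))               # ≈ 0.0372
```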
Probability and Statistics Homework 3
1. A game is played with the rule that a counter will move forward one, two, or four places according to whether the scores on the two dice rolled differ by three or more, differ by one or two, or are equal. Here we define a random variable M, the number of places moved, which can take the values 1, 2, or 4. Determine the probability distribution of M. (Sou.04.E1 s.2)
2. Let the random variable X denote the time until a computer server connects to your notebook (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each of these random variables measures the wait from a common starting time. Assume that the joint probability density function for X and Y is as given. Show that X and Y are independent, and determine P(X > 1000, Y < 1000). (Mo.E5.20)