
Probability and Probability Distributions

This chapter explores the concept of probability and probability distributions. It covers the basics of probability, including how to calculate probabilities, simulate experiments, and estimate proportions. It also introduces discrete random variables and their probability distributions.

Presentation Transcript


  1. Part III Probability and Probability Distributions

  2. Chapter 5 Probability

  3. Chapter 5 • When we talk about probability, we are talking about a (mathematical) measure of how likely it is for some particular thing to happen • Probability deals with chance behavior • We study outcomes, or results of experiments • Each time we conduct an experiment, we may get a different result • Individual (short-term) results vary, but probability models the long-term behavior of experiments

  4. Chapter 5 – Section 1 • Rule – the probability of any event must be greater than or equal to 0 and less than or equal to 1 • It does not make sense to say that there is a –30% chance of rain • It does not make sense to say that there is a 140% chance of rain • Note – probabilities can be written as decimals (0, 0.3, 1.0), or as percents (0%, 30%, 100%), or as fractions (3/10)

  5. Chapter 5 – Section 1 • If we do not know the probability of a certain event E, we can conduct a series of experiments and approximate it by the relative frequency P(E) ≈ (number of times E occurred) / (number of trials) • This becomes a good approximation for P(E) if we have a large number of trials (the law of large numbers)

  6. Chapter 5 – Section 1 • Sometimes probabilities are difficult to calculate, but the experiment can be simulated on a computer • If we simulate the experiment multiple times, then this is similar to the situation for the empirical method • We can use the relative frequency of the simulated outcomes as an estimate of the probability
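
A minimal sketch of this simulation idea, assuming Python with only the standard library (the event used here, exactly two heads in three tosses of a fair coin, and the function name are chosen only for illustration):

```python
import random

def estimate_probability(trials=100_000):
    """Estimate P(exactly 2 heads in 3 tosses of a fair coin) by simulation."""
    successes = 0
    for _ in range(trials):
        heads = sum(random.randint(0, 1) for _ in range(3))  # 1 = head, 0 = tail
        if heads == 2:
            successes += 1
    return successes / trials  # relative frequency approximates P(E)

print(estimate_probability())  # close to the exact value 3/8 = 0.375 when trials is large
```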

  7. Chapter 5 – Section 1 • Example • We wish to determine what proportion of students at a certain school have type A blood • We perform an experiment (a simple random sample!) with 100 students • If 29 of those students have type A blood, then we would estimate that the proportion of students at this school with type A blood is 29%

  8. Chapter 5 • Descriptive statistics, describing and summarizing data, deals with data as it is • Probability, modeling data, deals with data as it is predicted to be • The combination of the two will let us do our inferential statistics techniques in Part IV

  9. Chapter 6 Discrete Probability Distributions

  10. Overview • These are probability distributions that are designed to model discrete variables • Many of the discrete probability distributions model “counts”

  11. Chapter 6 Sections • Sections in Chapter 6 • 6.1 Discrete Random Variables • 6.2 The Binomial Probability Distribution

  12. Chapter 6 – Section 1: Discrete Random Variables

  13. Chapter 6 – Section 1 • Learning objectives • Distinguish between discrete and continuous random variables • Identify discrete probability distributions • Construct probability histograms • Compute and interpret the mean of a discrete random variable • Interpret the mean of a discrete random variable as an expected value • Compute the variance and standard deviation of a discrete random variable

  15. Chapter 6 – Section 1 • A random variable is a numeric measure of the outcome of a probability experiment • Random variables reflect measurements that can change as the experiment is repeated • Random variables are denoted with capital letters, typically using X (and Y and Z …) • Values are usually written with lower case letters, typically using x (and y and z ...)

  16. Chapter 6 – Section 1 • Examples • Tossing four coins and counting the number of heads • The number could be 0, 1, 2, 3, or 4 • The number could change when we toss another four coins • Measuring the heights of students • The heights could change from student to student
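
As a brief sketch of the first example (Python standard library; the function name is only illustrative), each repetition of the experiment produces a possibly different value of the random variable X:

```python
import random

def number_of_heads():
    """One repetition of the experiment: toss four fair coins and count the heads."""
    return sum(random.randint(0, 1) for _ in range(4))

# X takes one of the values 0, 1, 2, 3, 4 and may change on each repetition
print([number_of_heads() for _ in range(5)])  # e.g. [2, 1, 3, 2, 4]
```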

  17. Chapter 6 – Section 1 • A discrete random variable is a random variable that has either a finite or a countable number of values • A finite number of values such as {0, 1, 2, 3, 4} • A countable number of values such as {1, 2, 3, …} • Discrete random variables are designed to model discrete variables (see section 1.2) • Discrete random variables are often “counts of …”

  18. Chapter 6 – Section 1 • An example of a discrete random variable • The number of heads in tossing 3 coins (a finite number of possible values) • There are four possible values – 0 heads, 1 head, 2 heads, and 3 heads • A finite number of possible values – a discrete random variable • This fits our general concept that discrete random variables are often “counts of …”

  19. Chapter 6 – Section 1 • Other examples of discrete random variables • The possible rolls when rolling a pair of dice • A finite number of possible pairs, ranging from (1,1) to (6,6) • The number of pages in statistics textbooks • A countable number of possible values • The number of visitors to the White House in a day • A countable number of possible values

  20. Chapter 6 – Section 1 • A continuous random variable is a random variable that has an infinite, and more than countable, number of values • The values are any number in an interval • Continuous random variables are designed to model continuous variables (see section 1.1) • Continuous random variables are often “measurements of …”

  21. Chapter 6 – Section 1 • An example of a continuous random variable • The possible temperature in Chicago at noon tomorrow, measured in degrees Fahrenheit • The possible values (assuming that we can measure temperature to great accuracy) are in an interval • The interval may be something like (–20, 110) • This fits our general concept that continuous random variables are often “measurements of …”

  22. Chapter 6 – Section 1 • Other examples of continuous random variables • The height of a college student • A value in an interval between 3 and 8 feet • The length of a country and western song • A value in an interval between 1 and 15 minutes • The number of bytes of storage used on an 80 GB (80 billion bytes) hard drive • Although this is discrete, it is more reasonable to model it as a continuous random variable between 0 and 80 GB

  23. Chapter 6 – Section 1 • The probability distribution of a discrete random variable X relates the values of X with their corresponding probabilities • A distribution could be • In the form of a table • In the form of a graph • In the form of a mathematical formula

  24. Chapter 6 – Section 1 • If X is a discrete random variable and x is a possible value for X, then we write P(x) as the probability that X is equal to x • Examples • In tossing one coin, if X is the number of heads, then P(0) = 0.5 and P(1) = 0.5 • In rolling one die, if X is the number rolled, then P(1) = 1/6

  25. Chapter 6 – Section 1 • Properties of P(x) • Since the P(x)’s form a probability distribution, they must satisfy the rules of probability • 0 ≤ P(x) ≤ 1 • ΣP(x) = 1 • In the second rule, the Σ sign means to add up the P(x)’s for all the possible x’s
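
These two rules can be checked mechanically. A small Python sketch (the helper name is illustrative; the valid distribution is the one used later for the mean calculation, and the invalid one mirrors the non-example on slide 27):

```python
def is_valid_distribution(dist, tol=1e-9):
    """Check the two rules: every P(x) is between 0 and 1, and the P(x)'s sum to 1."""
    probs = dist.values()
    return all(0 <= p <= 1 for p in probs) and abs(sum(probs) - 1) < tol

valid = {1: 0.2, 2: 0.6, 5: 0.1, 6: 0.1}    # the distribution used for the mean example
invalid = {1: 0.7, 5: -0.2, 6: 0.4}         # P(5) is negative and the sum is 0.9, not 1

print(is_valid_distribution(valid))    # True
print(is_valid_distribution(invalid))  # False
```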

  26. Chapter 6 – Section 1 • An example of a discrete probability distribution • All of the P(x) values are positive and they add up to 1

  27. Chapter 6 – Section 1 • An example that is not a probability distribution • Two things are wrong • P(5) is negative • The P(x)’s do not add up to 1

  28. Chapter 6 – Section 1 • A probability histogram is a histogram where • The horizontal axis corresponds to the possible values of X (i.e. the x’s) • The vertical axis corresponds to the probabilities for those values (i.e. the P(x)’s) • A probability histogram is very similar to a relative frequency histogram

  29. Chapter 6 – Section 1 • An example of a probability histogram • The histogram is drawn so that the height of the bar is the probability of that value
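
A minimal sketch of drawing such a histogram, assuming matplotlib is available; the distribution plotted is the one used in the mean calculation a few slides below:

```python
import matplotlib.pyplot as plt

x_values = [1, 2, 5, 6]
probabilities = [0.2, 0.6, 0.1, 0.1]

# Bar heights are the probabilities P(x); the horizontal axis shows the possible x's
plt.bar(x_values, probabilities, width=1.0, edgecolor="black")
plt.xlabel("x")
plt.ylabel("P(x)")
plt.title("Probability histogram")
plt.show()
```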

  30. Chapter 6 – Section 1 • The mean of a probability distribution can be thought of in this way: • There are various possible values of a discrete random variable • The values that have the higher probabilities are the ones that occur more often • The values that occur more often should have a larger role in calculating the mean • The mean is the weighted average of the values, weighted by the probabilities

  31. Chapter 6 – Section 1 • The mean of a discrete random variable is μX = Σ [ x • P(x) ] • In this formula • the x’s are the possible values of X • P(x) is the probability that x occurs • Σ means to add up these terms for all the possible values x

  32. Chapter 6 – Section 1 • Example of a calculation for the mean • Multiply each value by its probability: 1 • 0.2 = 0.2, 2 • 0.6 = 1.2, 5 • 0.1 = 0.5, 6 • 0.1 = 0.6 • Add: 0.2 + 1.2 + 0.5 + 0.6 = 2.5 • The mean of this discrete random variable is 2.5

  33. Chapter 6 – Section 1 • The calculation for this problem written out: μX = Σ [ x • P(x) ] = [1 • 0.2] + [2 • 0.6] + [5 • 0.1] + [6 • 0.1] = 0.2 + 1.2 + 0.5 + 0.6 = 2.5 • The mean of this discrete random variable is 2.5
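
The same calculation written as a short Python sketch, using the distribution from these slides (x = 1, 2, 5, 6 with probabilities 0.2, 0.6, 0.1, 0.1):

```python
distribution = {1: 0.2, 2: 0.6, 5: 0.1, 6: 0.1}

# mu_X = sum over all possible x of x * P(x)
mean = sum(x * p for x, p in distribution.items())
print(mean)  # 0.2 + 1.2 + 0.5 + 0.6 = 2.5 (up to floating-point rounding)
```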

  34. Chapter 6 – Section 1 • The mean can also be thought of this way (as in the Law of Large Numbers) • If we repeat the experiment many times • If we record the result each time • If we calculate the mean of the results (this is just a mean of a group of numbers) • Then this mean of the results gets closer and closer to the mean of the random variable
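
This long-run interpretation can be illustrated by simulation. A sketch using Python's standard library and the same distribution: as the number of repetitions grows, the ordinary average of the recorded results drifts toward μX = 2.5.

```python
import random

values = [1, 2, 5, 6]
weights = [0.2, 0.6, 0.1, 0.1]

for trials in (100, 10_000, 1_000_000):
    results = random.choices(values, weights=weights, k=trials)  # repeat the experiment
    print(trials, sum(results) / trials)  # the average of the results approaches 2.5
```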

  35. Chapter 6 – Section 1 • The expected value of a random variable is another term for its mean • The term “expected value” illustrates the long term nature of the experiments – as we perform more and more experiments, the mean of the results of those experiments gets closer to the “expected value” of the random variable

  36. Chapter 6 – Section 1 • The variance of a discrete random variable is computed similarly to the mean • The mean is the weighted sum of the values μX = Σ [ x • P(x) ] • The variance is the weighted sum of the squared differences from the mean σX² = Σ [ (x – μX)² • P(x) ] • The standard deviation, as we’ve seen before, is the square root of the variance … σX = √ σX²

  37. Chapter 6 – Section 1 • The variance formula σX² = Σ [ (x – μX)² • P(x) ] can involve calculations with many decimals or fractions • An equivalent formula is σX² = Σ [ x² • P(x) ] – μX² • This formula is often easier to compute
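
Both forms can be computed directly. A short Python sketch with the same distribution, verifying that the two formulas agree and taking the square root for the standard deviation:

```python
import math

distribution = {1: 0.2, 2: 0.6, 5: 0.1, 6: 0.1}
mu = sum(x * p for x, p in distribution.items())           # 2.5

# Definition: weighted sum of squared deviations from the mean
var_definition = sum((x - mu) ** 2 * p for x, p in distribution.items())

# Equivalent computational form: sum of x^2 * P(x), minus mu^2
var_shortcut = sum(x ** 2 * p for x, p in distribution.items()) - mu ** 2

print(var_definition, var_shortcut)   # both 2.45 (up to floating-point rounding)
print(math.sqrt(var_definition))      # standard deviation, about 1.565
```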

  38. Chapter 6 – Section 1 • For variables and samples (section 3.2), we had the concept of a population variance (for the entire population) and a sample variance (for a sample from that population) • These probability distributions model the complete population • These are population variance formulas • There is no analogy for sample variance here

  39. Chapter 6 – Section 1 • The variance can be calculated by hand, but the calculation is very tedious • Whenever possible, use technology (calculators, software programs, etc.) to calculate variances and standard deviations

  40. Summary: Chapter 6 – Section 1 • Discrete random variables are measures of outcomes that have discrete values • Discrete random variables are specified by their discrete probability distributions • The mean of a discrete random variable can be interpreted as the long-term average of repeated independent experiments • The variance of a discrete random variable measures its dispersion from its mean

  41. Chapter 6 – Section 2: The Binomial Probability Distribution

  42. Chapter 6 – Section 2 • A binomial experiment has the following structure • The first test is performed … the result is either a success or a failure • The second test is performed … the result is either a success or a failure. This result is independent of the first and the chance of success is the same • A third test is performed … the result is either a success or a failure. The result is independent of the first two and the chance of success is the same

  43. Chapter 6 – Section 2 • Example • A card is drawn from a deck. A “success” is for that card to be a heart … a “failure” is for any other suit • The card is then put back into the deck • A second card is drawn from the deck with the same definition of success • The second card is put back into the deck • We continue for 10 cards

  44. Chapter 6 – Section 2 • A binomial experiment is an experiment with the following characteristics • The experiment is performed a fixed number of times, each time called a trial • The trials are independent • Each trial has two possible outcomes, usually called a success and a failure • The probability of success is the same for every trial

  45. Chapter 6 – Section 2 • Notation used for binomial distributions • The number of trials is represented by n • The probability of a success is represented by p • The total number of successes in n trials is represented by X • Because there cannot be a negative number of successes, and because there cannot be more than n successes (out of n attempts), 0 ≤ X ≤ n

  46. Chapter 6 – Section 2 • In our card drawing example • Each trial is the experiment of drawing one card • The experiment is performed 10 times, so n = 10 • The trials are independent because the drawn card is put back into the deck • Each trial has two possible outcomes, a “success” of drawing a heart and a “failure” of drawing anything else • The probability of success is 0.25, the same for every trial, so p = 0.25 • X, the number of successes, is between 0 and 10
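
As a sketch of where this is heading (Python 3.8+ for math.comb), the standard binomial formula P(x) = nCx • p^x • (1 – p)^(n – x), which the next slides build up from a small case, gives the whole distribution for this card example:

```python
from math import comb

n, p = 10, 0.25   # 10 draws with replacement, P(heart) = 0.25 on each draw

# Binomial formula: P(x) = (n choose x) * p^x * (1 - p)^(n - x)
for x in range(n + 1):
    print(x, comb(n, x) * p ** x * (1 - p) ** (n - x))
```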

  47. Chapter 6 – Section 2 • We would like to calculate the probabilities of X, i.e. P(0), P(1), P(2), …, P(n) • Do a simpler example first • For n = 3 trials • With p = 0.4 probability of success • Calculate P(2), the probability of 2 successes

  48. Chapter 6 – Section 2 • For 3 trials, the possible ways of getting exactly 2 successes are • S S F • S F S • F S S • The probabilities for each (using the multiplication rule) are • 0.4 • 0.4 • 0.6 = 0.096 • 0.4 • 0.6 • 0.4 = 0.096 • 0.6 • 0.4 • 0.4 = 0.096

  49. Chapter 6 – Section 2 • The total probability is P(2) = 0.096 + 0.096 + 0.096 = 0.288 • But there is a pattern • Each way had the same probability … the probability of 2 successes (0.4 times 0.4) times the probability of 1 failure (0.6) • The probability for each case is 0.4² • 0.6¹

  50. Chapter 6 – Section 2 • There are 3 cases • S S F could represent choosing a combination of 2 out of 3 … choosing the first and the second • S F S could represent choosing a second combination of 2 out of 3 … choosing the first and the third • F S S could represent choosing a third combination of 2 out of 3 • These are the 3 = 3C2 ways to choose 2 out of 3
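
A short check of this pattern in Python: enumerating all sequences of three trials confirms that the three arrangements each contribute 0.4² • 0.6, so P(2) = 3C2 • 0.4² • 0.6 = 0.288.

```python
from itertools import product
from math import comb

p = 0.4  # probability of success on each trial

# Enumerate every sequence of 3 trials and add the probabilities of those with exactly 2 successes
total = sum(
    p ** seq.count("S") * (1 - p) ** seq.count("F")
    for seq in product("SF", repeat=3)
    if seq.count("S") == 2
)
print(total)                           # 0.288 (up to floating-point rounding)
print(comb(3, 2) * p ** 2 * (1 - p))   # the same value via the 3C2 counting argument
```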
