
Introduction to Stochastic Models GSLM 54100



1. Introduction to Stochastic Models GSLM 54100

2. Outline
• conditional probability & binomial
• recursive relationships
• examples of similar random phenomena

3. Conditional Probability

4. A Former Mid-Term Question (compare Example 3.4 in Ross)
• a bakery sells three types of cream cakes: chocolate, mango, and strawberry
• each customer chooses chocolate, mango, and strawberry w.p. 1/2, 1/3, and 1/6, respectively, independent of everything else
• the profit from each chocolate, mango, and strawberry cake is $3, $2, and $1, respectively
• 4 cream cakes were sold on a particular day
• (a) Let Xc be the number of chocolate cream cakes sold on that day. Find the distribution of Xc.
• (b) Find the expected total profit of the day from the 4 cream cakes.
• (c) Given that no chocolate cream cake is sold on that day, find the variance of the total profit of the day.

5. A Former Mid-Term Question (compare Example 3.4 in Ross)
• (a) Xc = the number of chocolate cream cakes sold on that day; Xc ~ Bin(4, 1/2)
• (b) with Xm ~ Bin(4, 1/3) and Xs ~ Bin(4, 1/6), E(total profit of the day) = E(3Xc + 2Xm + Xs) = 3E(Xc) + 2E(Xm) + E(Xs) = 6 + (8/3) + (2/3) = 28/3

6. A Former Mid-Term Question (compare Example 3.4 in Ross)
• (c) Given that no chocolate cream cake is sold on that day, find the variance of the total profit of the day.
• given Xc = 0, each cake is mango with probability 2/3 and strawberry with probability 1/3
• (Xm|Xc = 0) ~ Bin(4, 2/3) and (Xs|Xc = 0) ~ Bin(4, 1/3)
• V(Xm|Xc = 0) = V(Xs|Xc = 0) = 4(2/3)(1/3) = 8/9
• the total profit (Y|Xc = 0) = 2(Xm|Xc = 0) + (Xs|Xc = 0)
• since (Xm|Xc = 0) + (Xs|Xc = 0) = 4, (Y|Xc = 0) = 4 + (Xm|Xc = 0)
• V(Y|Xc = 0) = V(4 + Xm|Xc = 0) = V(Xm|Xc = 0) = 8/9 (a simulation sketch follows)
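A minimal Monte Carlo sketch (added here, not part of the original slides) that checks the answers to (b) and (c); the use of NumPy and the variable names are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 200_000

# each of the 4 cakes sold is chocolate / mango / strawberry w.p. 1/2, 1/3, 1/6
counts = rng.multinomial(4, [1/2, 1/3, 1/6], size=n_days)   # columns: Xc, Xm, Xs
profit = counts @ np.array([3, 2, 1])                        # 3*Xc + 2*Xm + 1*Xs

print(profit.mean())                    # ~ 28/3 ≈ 9.33  -> part (b)
no_choc = counts[:, 0] == 0             # condition on Xc = 0
print(profit[no_choc].var())            # ~ 8/9 ≈ 0.89   -> part (c)
```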

7. Recursive Relationships

8. Two Innocent Equations
• A and B: two events
• P(A) = P(A|B)P(B) + P(A|B^c)P(B^c)
• generalization: for events B1, B2, … with ∪j Bj = Ω and Bi ∩ Bj = ∅ for i ≠ j, P(A) = Σj P(A|Bj)P(Bj)
• these equations give useful recursions for finding probabilities and expectations (restated below)
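For reference, the same identity together with its expectation analogue, written out in LaTeX; the expectation form is not printed on the slide but is the standard version used in the recursive arguments that follow.

```latex
% law of total probability / total expectation over a partition {B_j} of the sample space
\[
  P(A) = \sum_{j} P(A \mid B_j)\, P(B_j),
  \qquad
  E(X) = \sum_{j} E(X \mid B_j)\, P(B_j).
\]
```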

9. Recursive Relationship
• a special property of some random phenomena: they change back into themselves, or into something related
• example: flipping a coin until getting the first head
• if the first flip = H, the experiment ends
• if the first flip = T, what remains is again "flipping a coin until getting the first head", plus the one flip already made

10. Recursive Relationship
• (diagram) a random phenomenon yields outcomes of types 1, …, k, k+1; outcomes of types 1, …, k lead to simple problems with simple analyses, while the type k+1 outcome leads to a difficult problem
• the difficult problem may become easy if the type k+1 problem is related to the original random phenomenon

11. More Recursive Relationships
• (diagram) random phenomena A and B each yield an outcome with a simple problem and an outcome whose difficult problem is related to the other phenomenon
• the two problems for random phenomena A and B may be solved easily if they evolve into each other

12. About Recursive Relationships
• more forms exist, possibly involving more than 2 random phenomena
• identifying the relationships among random phenomena is an art, not necessarily a science

13. Examples of Recursive Relationships (the identification of a similar structure is an art)

14. Exercise 3.1.2 of Notes
• n contractors bid for m projects (n ≥ m), one project for each contractor
• all projects are equally profitable
• contractors place their bids randomly and independently
• Ai = {project i is bid by at least one contractor}, i ≤ m
• (a) Find …
• (b) Find P(A1)
• (c) Find …
• (d) Find P(A2|A1)

15. Exercise 3.1.2 of Notes
• random bids by n contractors on m projects (n ≥ m), one project per contractor
• Ai = {project i is bid by at least one contractor}, i ≤ m
• (a) …
• (b) P(A1) = 1 − ((m − 1)/m)^n, since each contractor independently misses project 1 w.p. (m − 1)/m
• (c) …
• (d) to find P(A2|A1), note that P(A2|A1) = P(A1 ∩ A2)/P(A1) (a simulation sketch follows)
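A simulation sketch (added) of this exercise under the reading that each contractor bids on one of the m projects uniformly at random; the sizes n = 6, m = 4 are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, reps = 6, 4, 200_000                 # illustrative sizes with n >= m

# each contractor bids on one of the m projects uniformly and independently
bids = rng.integers(0, m, size=(reps, n))
a1 = (bids == 0).any(axis=1)               # project 1 receives at least one bid
a2 = (bids == 1).any(axis=1)               # project 2 receives at least one bid

print(a1.mean(), 1 - ((m - 1) / m) ** n)   # estimate vs. formula for P(A1)
print((a1 & a2).mean() / a1.mean())        # estimate of P(A2 | A1)
```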

16. E(X) = E[E(X|Y)]
• take By = {Y = y} as the partitioning events
• then P(A) = Σy P(A|Y = y)P(Y = y)
• define the indicator variable 1A = 1 if A occurs and 0 otherwise
• E(1A) = P(A)

17. E(X) = E[E(X|Y)]
• let X = 1A; then E(X) = P(A) and E(X|Y) = P(A|Y)
• the expressions are true in general:
• E(X) = E[E(X|Y)]
• P(A) = E[P(A|Y)]

18. E(X) = E[E(X|Y)]
• discrete X and Y
• E(X|Y = y) = Σx x P(X = x|Y = y)
• E[E(X|Y)] = Σy E(X|Y = y)P(Y = y) (the full computation is written out below)
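Spelling out the interchange-of-sums step behind the identity (a standard derivation, added here for completeness):

```latex
\[
\begin{aligned}
E[E(X \mid Y)]
  &= \sum_{y} E(X \mid Y = y)\, P(Y = y)
   = \sum_{y} \sum_{x} x\, P(X = x \mid Y = y)\, P(Y = y) \\
  &= \sum_{x} x \sum_{y} P(X = x,\, Y = y)
   = \sum_{x} x\, P(X = x)
   = E(X).
\end{aligned}
\]
```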

19. c.f. Example 3.11 of Ross (Expectation of a Random Sum)
• a tailor shop is equally likely to sell 0, 1, or 2 suits per day
• the net profit from each sold suit is uniform between $300 and $800
• customer arrivals and prices are all independent
• mean profit per day = ?
• P(net profit in a day > 7 hundred dollars) = ?

20. c.f. Example 3.11 of Ross (Expectation of a Random Sum)
• N = number of suits sold in a day, equally likely to be 0, 1, or 2
• Xj = profit from the jth sold suit, j = 1, 2, uniform on 3 to 8 (hundreds)
• all random variables are independent of each other
• S = total profit in a day, S = X1 + … + XN (S = 0 when N = 0)

21. c.f. Example 3.11 of Ross (Expectation of a Random Sum)
• the distribution of S is awkward to write down, so it is hard to find E(S) from the distribution of S
• instead, find E(S) without using its distribution, by conditioning on N

22. c.f. Example 3.11 of Ross (Expectation of a Random Sum)
• E(S|N = 0) = 0, E(S|N = 1) = 5.5, E(S|N = 2) = 11 (hundreds)
• E(S) = E[E(S|N)] = (0 + 5.5 + 11)/3 = 5.5
• another way: E(S) = E(N)E(X1) = 1 × 5.5 = 5.5

23. c.f. Example 3.11 of Ross (Expectation of a Random Sum)
• to find P(S > 7)
• P(S > 7) = E[P(S > 7|N)] = (1/3)[P(S > 7|N = 0) + P(S > 7|N = 1) + P(S > 7|N = 2)]
• P(S > 7|N = 0) = 0, P(S > 7|N = 1) = P(X1 > 7), P(S > 7|N = 2) = P(X1 + X2 > 7) (a simulation sketch follows)
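A Monte Carlo sketch of this computation (not from the slides); it reads "uniform 3 to 8 (hundreds)" as a continuous uniform profit, which is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 500_000

# N: suits sold per day, equally likely 0, 1, or 2
n_sold = rng.integers(0, 3, size=n_days)
# per-suit profits, assumed continuous uniform on (3, 8) hundreds
x = rng.uniform(3, 8, size=(n_days, 2))
# S = X1 + ... + XN (0 when no suit is sold)
s = np.where(n_sold >= 1, x[:, 0], 0.0) + np.where(n_sold == 2, x[:, 1], 0.0)

print(s.mean())        # ~ 5.5 hundreds, matching E(S) = E(N)E(X1)
print((s > 7).mean())  # Monte Carlo estimate of P(S > 7), ~ 0.39 under this reading
```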

24. Example 3.12 of Ross
• X ~ geo(p): the number of flips of a coin (P(H) = p) needed to get the first head
• E(X) = E(X|X = 1)P(X = 1) + E(X|X > 1)P(X > 1)
• E(X|X = 1) = 1
• E(X|X > 1) = 1 + E(X) (after a first tail, the experiment starts afresh)
• E(X) = (1)p + (1 + E(X))(1 − p), i.e. E(X) = 1/p

25. Example 3.4.3 of Notes
• find E(X) for X = max{Y, Z}, where Y & Z are i.i.d. unif[0, 1]
• E(X) = E[E(X|Y)] (the remaining computation is sketched below)
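The slide stops at the conditioning identity; a worked version of the remaining steps (standard, added here for completeness):

```latex
\[
\begin{aligned}
E(X \mid Y = y) &= E[\max\{y, Z\}]
  = \int_0^y y \, dz + \int_y^1 z \, dz
  = y^2 + \frac{1 - y^2}{2} = \frac{1 + y^2}{2}, \\
E(X) &= E[E(X \mid Y)]
  = \int_0^1 \frac{1 + y^2}{2} \, dy
  = \frac{1}{2} + \frac{1}{6} = \frac{2}{3}.
\end{aligned}
\]
```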

26. Example 3.13 of Ross
• A miner is trapped in a mine containing three doors. The first door leads to a tunnel that takes him to safety after two hours of travel. The second door leads to a tunnel that returns him to the mine after three hours of travel. The third door leads to a tunnel that returns him to the mine after five hours. Assuming that the miner is at all times equally likely to choose any one of the doors, what is the expected length of time until the miner reaches safety?

27. Example 3.13 of Ross
• (diagram) each time, the miner is equally likely to choose any of the three doors: one leads to safety after 2 hr, the other two lead back to the mine after 3 hr and 5 hr

28. Example 3.13 of Ross
• X = time taken to reach safety
• Y = door initially chosen
• E[X|Y = 1] = 2
• E[X|Y = 2] = 3 + E[X]
• E[X|Y = 3] = 5 + E[X]
• E[X] = (2 + 3 + E[X] + 5 + E[X])/3, so E[X] = 10 (a simulation sketch follows)
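A small simulation sketch (added, not part of the slides) that checks E[X] = 10; the door probabilities and travel times are exactly those of the example.

```python
import random

def time_to_safety(rng: random.Random) -> float:
    """Simulate one trapped-miner episode and return the total travel time."""
    total = 0.0
    while True:
        door = rng.randint(1, 3)          # each door chosen with probability 1/3
        if door == 1:
            return total + 2              # door 1: safety after 2 hours
        total += 3 if door == 2 else 5    # doors 2 and 3: back to the mine

rng = random.Random(0)
n = 200_000
print(sum(time_to_safety(rng) for _ in range(n)) / n)   # ~ 10 hours
```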

29. Example 3.5.4 of Notes: Comparing Two Exponential Random Variables
• X ~ exp(λ) and Y ~ exp(μ), independent
• P(X > Y) = E[P(X > Y|Y)]
• P(X > Y|Y = y) = P(X > y) = e^(−λy)
• P(X > Y) = E[P(X > Y|Y)] = E[e^(−λY)] = ∫0^∞ e^(−λy) μe^(−μy) dy = μ/(λ + μ) (a numerical check follows)
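A quick numerical check (added); the rate parameterization λ, μ is the reconstruction used above, and the rate values are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 500_000            # example rates

x = rng.exponential(1 / lam, size=n)      # X ~ exp(lam): mean 1/lam
y = rng.exponential(1 / mu, size=n)       # Y ~ exp(mu):  mean 1/mu
print((x > y).mean(), mu / (lam + mu))    # both ~ 0.6
```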

30. Example 3.5.3 of Notes: Random Partition of Poisson
• Z ~ Poisson(λ)
• each item is type 1 w.p. p, 0 < p < 1, and type 2 otherwise, independent of everything else
• X = number of type 1 items among the Z items
• P(X = k) = E[P(X = k|Z)] = e^(−λp)(λp)^k/k!, i.e. X ~ Poisson(λp) (derivation below)
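The computation behind the last line, written out (the standard Poisson-thinning derivation, added for completeness):

```latex
\[
\begin{aligned}
P(X = k) &= \sum_{n \ge k} P(X = k \mid Z = n)\, P(Z = n)
  = \sum_{n \ge k} \binom{n}{k} p^k (1-p)^{n-k}\, e^{-\lambda} \frac{\lambda^n}{n!} \\
 &= e^{-\lambda} \frac{(\lambda p)^k}{k!}
    \sum_{n \ge k} \frac{(\lambda (1-p))^{n-k}}{(n-k)!}
  = e^{-\lambda p} \frac{(\lambda p)^k}{k!}.
\end{aligned}
\]
```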

31. Ex. #4 of WS#10
• #1. (Solve Exercise #4 of Worksheet #10 by conditioning, not by direct computation.) Let X and Y be two independent random variables ~ geo(p).
• (a) Find P(X = Y).
• (b) Find P(X > Y).
• (c) Find P(min(X, Y) > k) for k ∈ {1, 2, …}.
• (d) From (c) or otherwise, find E[min(X, Y)].
• (e) Show that max(X, Y) + min(X, Y) = X + Y. Hence find E[max(X, Y)].

32. Ex. #4 of WS#10
• (a) different ways to solve the problem
• by direct computation:
• P(X = Y) = Σk≥1 P(X = k)P(Y = k) = Σk≥1 [(1 − p)^(k−1) p]² = p²/[1 − (1 − p)²] = p/(2 − p)

33. Ex. #4 of WS#10
• by conditioning:
• P(X = Y|X = 1, Y = 1) = 1
• P(X = Y|X = 1, Y > 1) = 0
• P(X = Y|X > 1, Y = 1) = 0
• P(X = Y|X > 1, Y > 1) = P(X = Y) (by the memoryless property of the geometric)
• P(X = Y) = P(X = 1, Y = 1)(1) + P(X > 1, Y > 1)P(X = Y)
• P(X = Y) = p² + (1 − p)² P(X = Y), i.e., P(X = Y) = p/(2 − p)

34. Ex. #4 of WS#10
• (b) P(X > Y)
• by symmetry:
• P(X > Y) + P(X = Y) + P(X < Y) = 1
• P(X > Y) = P(X < Y)
• P(X > Y) = [1 − P(X = Y)]/2 = [1 − p/(2 − p)]/2 = (1 − p)/(2 − p)

35. Ex. #4 of WS#10
• by direct computation:
• P(X > Y) = Σk≥1 P(Y = k)P(X > k) = Σk≥1 (1 − p)^(k−1) p · (1 − p)^k = p(1 − p)/[1 − (1 − p)²] = (1 − p)/(2 − p)

36. Ex. #4 of WS#10
• by conditioning:
• P(X > Y) = E[P(X > Y|Y)]
• P(X > Y|Y = y) = P(X > y) = (1 − p)^y
• E[P(X > Y|Y)] = E[(1 − p)^Y] = Σy≥1 (1 − p)^y (1 − p)^(y−1) p = (1 − p)/(2 − p)

37. Ex. #4 of WS#10
• yet another way of conditioning:
• P(X > Y|X = 1, Y = 1) = 0
• P(X > Y|X = 1, Y > 1) = 0
• P(X > Y|X > 1, Y = 1) = 1
• P(X > Y|X > 1, Y > 1) = P(X > Y)
• P(X > Y) = P(X > 1, Y = 1) + P(X > 1, Y > 1)P(X > Y) = (1 − p)p + (1 − p)² P(X > Y)
• P(X > Y) = (1 − p)/(2 − p) (a simulation sketch follows)
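A short simulation sketch (added) checking P(X = Y) = p/(2 − p) and P(X > Y) = (1 − p)/(2 − p); the value p = 0.3 is just an example.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 0.3, 500_000

# geometric on {1, 2, ...}: number of trials until the first success
x = rng.geometric(p, size=n)
y = rng.geometric(p, size=n)

print((x == y).mean(), p / (2 - p))        # both ~ 0.176
print((x > y).mean(), (1 - p) / (2 - p))   # both ~ 0.412
```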

38. Ex. #5 of WS#10
• In the sea battle of Cape of No Return, two cruisers of country Landpower (unluckily) ran into two battleships of country Seapower. With artilleries of shorter range, the two cruisers had no choice other than to receive rounds of bombardment from the two battleships. Suppose that in each round of bombardment a battleship aimed at only one cruiser, and it sank that cruiser with probability p, 0 < p < 1, independent of everything else. The two battleships fired simultaneously in each round.

39. Ex. #5 of WS#10
• (b) Now suppose that initially the two battleships aimed at different cruisers. A battleship helped the other only if its own targeted cruiser was sunk before the other one.
• (i) What is the probability that the two cruisers were sunk at the same time (i.e., after the same number of rounds of bombardment)?

40. Ex. #5 of WS#10
• (b)(i) Let Ni be the number of rounds taken to sink the ith cruiser; Ni ~ Geo(p), and N1 and N2 are independent.
• ps = P(the 2 cruisers are sunk in the same round) = P(N1 = N2) = p/(2 − p), as computed before

41. Stochastic Modeling
• (the battleship problem of slides 38–39 is restated on this slide as the running illustration)
• given a problem statement
• formulate the problem by defining events, random variables, etc.
• understand the stochastic mechanism
• deduce the quantities of interest (probabilities, means, variances, etc.)
• identify special structure and properties of the stochastic mechanism

42. Examples of Ross in Chapter 3
• Examples 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.11, 3.12, 3.13

43. Exercises of Ross in Chapter 3
• Exercises 3.1, 3.3, 3.5, 3.7, 3.8, 3.14, 3.21, 3.23, 3.24, 3.25, 3.27, 3.29, 3.30, 3.34, 3.37, 3.40, 3.41, 3.44, 3.49, 3.51, 3.54, 3.61, 3.62, 3.63, 3.64, 3.66
