Assignment (Tugas)
By: Indri Rivani Purwanti (10990), Gempur Safar (10877), Windu Pramana Putra Barus (10835), Adhiarsa Rakhman (11063)
Statistics, UGM Yogyakarta
Lecturer: Prof. Dr. Sri Haryatmi Kartiko, S.Si., M.Sc.
Introduction to Mathematical Statistics 2
Introduction to Mathematical Statistics (IMS) can be applied throughout statistics, in subjects such as:
Statistical Methods I and II
Introduction to Probability Models
Maximum Likelihood Estimation
Waiting Times Theory
Analysis of Life-Testing Models
Introduction to Reliability
Nonparametric Statistical Methods
etc.
Statistical Methods
In Statistical Methods, Introduction to Mathematical Statistics is used to:
introduce and explain random variables, probability models, and the kinds of problems each probability model is suited to solve;
show how to determine the mean (expected value), variance, and covariance of random variables;
determine confidence intervals for certain random variables (a small sketch follows below);
etc.
Lee J. Bain & Max Engelhardt
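As an illustration of the confidence-interval point above, here is a minimal sketch of a t-based interval for a mean; the data values are invented for the example and are not from the slides.

```python
import numpy as np
from scipy.stats import t

# 95% confidence interval for a population mean (illustrative data, not from the slides).
x = np.array([10.2, 9.8, 11.1, 10.5, 9.6, 10.9, 10.0, 10.7])
n, xbar, s = len(x), x.mean(), x.std(ddof=1)

half_width = t.ppf(0.975, df=n - 1) * s / np.sqrt(n)   # t-quantile times standard error
print(f"mean = {xbar:.3f}, 95% CI = ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```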
Probability Models
Mathematical statistics also describes the probability models discussed by statisticians. IMS is used to help students master how to choose the right probability model for a given random variable.
Lee J. Bain & Max Engelhardt
Introduction to Reliability
The most basic concept is the reliability function, which gives the probability that failure occurs after time t. If a random variable X represents the lifetime until failure of a unit, then the reliability of the unit at time t is defined to be
R(t) = P(X > t) = 1 - F_X(t)
Lee J. Bain & Max Engelhardt
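A minimal sketch of this definition for an exponential lifetime; the rate 0.01 (mean lifetime 100) and the evaluation times are assumed illustrative values, not numbers given on the slide.

```python
import math

def reliability(t, rate=0.01):
    """R(t) = P(X > t) = 1 - F_X(t) for an exponential lifetime X.

    The rate 0.01 (mean lifetime 100) is an illustrative assumption.
    """
    return math.exp(-rate * t)          # 1 - F(t), where F(t) = 1 - exp(-rate * t)

print(reliability(50))    # probability the unit survives past t = 50
print(reliability(200))   # probability the unit survives past t = 200
```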
Maximum Likelihood Estimation
IMS introduces us to the MLE. Let L(θ) = f(x1, ..., xn; θ), θ ∈ Ω, be the joint pdf of X1, ..., Xn. For a given set of observations (x1, ..., xn), a value of θ in Ω at which L(θ) is a maximum is called the maximum likelihood estimate of θ. That is, the maximum likelihood estimate θ̂ is a value of θ that satisfies
f(x1, ..., xn; θ̂) = max over θ ∈ Ω of f(x1, ..., xn; θ)
Lee J. Bain & Max Engelhardt
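A small sketch of this definition in practice, maximizing L(θ) numerically for an exponential model with mean θ; the data values are invented for illustration, and the closed-form answer (the sample mean) is printed as a check.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical lifetimes (not from the slides); model: X_i ~ EXP(theta), mean theta.
x = np.array([12.0, 35.5, 8.2, 50.1, 27.4, 19.9])

def neg_log_likelihood(theta):
    """-log L(theta), where L(theta) = prod (1/theta) * exp(-x_i / theta)."""
    return len(x) * np.log(theta) + x.sum() / theta

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1e3), method="bounded")
print("numerical MLE:", res.x)        # value of theta that maximizes L(theta)
print("sample mean  :", x.mean())     # known closed-form MLE for this model
```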
Analysis of Life-Testing Models
Most of the statistical analysis for parametric life-testing models has been developed for the exponential and Weibull models. The exponential model is generally easier to analyze because of the simplicity of its functional form. The Weibull model is more flexible, and thus provides a more realistic model in many applications, particularly those involving wear-out and aging. (A small comparison of the two reliability functions is sketched below.)
Lee J. Bain & Max Engelhardt
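To make the flexibility point concrete, this sketch compares Weibull reliability functions with shape β = 1 (which reduces to the exponential model) and β = 2 (a wear-out pattern); the scale 100 and the time points are assumed illustrative values, not numbers from the slide.

```python
import math

def weibull_reliability(t, beta, theta):
    """R(t) = exp(-(t/theta)**beta); beta = 1 gives the exponential model."""
    return math.exp(-(t / theta) ** beta)

# Illustrative parameters (assumed): same scale theta = 100, different shapes.
for beta in (1.0, 2.0):                       # beta > 1 models wear-out / aging
    values = [round(weibull_reliability(t, beta, 100.0), 3) for t in (50, 100, 200)]
    print(f"beta = {beta}: R(50), R(100), R(200) = {values}")
```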
Nonparametric Statistical Methods
IMS also introduces us to nonparametric methods of solving statistical problems, such as:
one-sample sign test
binomial test
two-sample sign test
Wilcoxon paired-sample signed-rank test
Wilcoxon and Mann-Whitney tests
correlation tests / tests of independence
Wald-Wolfowitz runs test
etc.
(A small sketch of the one-sample sign test follows below.)
Lee J. Bain & Max Engelhardt
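A minimal sketch of the first method on the list, the one-sample sign test, using the binomial null distribution; the data and the hypothesized median are invented for the example.

```python
from scipy.stats import binom

# One-sample sign test (illustrative data, not from the slides):
# H0: median = 20  vs  H1: median > 20.
data = [23, 25, 18, 30, 27, 22, 26, 19, 28, 31]
m0 = 20

plus = sum(x > m0 for x in data)        # observations above the hypothesized median
n = sum(x != m0 for x in data)          # ties with m0 are dropped
p_value = binom.sf(plus - 1, n, 0.5)    # P[B >= plus] under H0, with B ~ BIN(n, 1/2)

print(f"{plus} of {n} signs positive, p-value = {p_value:.4f}")
```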
RELATIONSHIPS AMONG CONVERGENCE CONCEPTS
Example
We consider the sequence of "standardized" variables, with the simplified notation shown below, and use a series expansion in which d(n) → 0 as n → ∞.
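The slide's own formulas were not recoverable here, so the following is a hedged sketch of the standard argument via moment generating functions that such an example usually presents; all symbols (Z_n, m(t), μ, σ) are my own choices, not taken from the original.

```latex
% Sketch (assumed notation): X_1, ..., X_n i.i.d. with mean \mu and variance \sigma^2,
% and m(t) the mgf of the standardized summand (X_i - \mu)/\sigma.
\[
  Z_n \;=\; \frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma\sqrt{n}},
  \qquad
  m_{Z_n}(t) \;=\; \Bigl[\, m\!\bigl(\tfrac{t}{\sqrt{n}}\bigr) \Bigr]^{n}
  \;=\; \Bigl[\, 1 + \frac{t^{2}}{2n} + \frac{d(n)}{n} \Bigr]^{n},
\]
% where d(n) collects the higher-order terms of the series expansion and d(n) -> 0
% as n -> infinity, so
\[
  m_{Z_n}(t) \;\longrightarrow\; e^{t^{2}/2},
  \qquad\text{hence } Z_n \xrightarrow{\;d\;} Z \sim N(0,1).
\]
```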
Approximation for the Binomial Distribution
Example: A certain type of weapon has probability p of working successfully. We test n weapons, and the stockpile is replaced if the number of failures, X, is at least one. How large must n be to have P[X ≥ 1] = 0.99 when p = 0.95? Use the normal approximation.
X: number of failures
p: probability of working successfully = 0.95
q: probability of failure = 0.05
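One way to carry out this computation (the continuity correction and the simple search over n are my own choices, not necessarily how the slide solved it): X ~ BIN(n, q) with q = 0.05, and we look for the smallest n with P[X ≥ 1] ≥ 0.99 under the normal approximation, with the exact binomial answer printed as a check.

```python
import math
from scipy.stats import norm

q = 0.05   # failure probability per weapon

def p_at_least_one(n):
    """Normal approximation (with continuity correction) to P[X >= 1], X ~ BIN(n, q)."""
    mu, sigma = n * q, math.sqrt(n * q * (1 - q))
    return norm.sf((0.5 - mu) / sigma)   # P[Z >= (0.5 - n*q) / sigma]

n = 1
while p_at_least_one(n) < 0.99:
    n += 1
print("smallest n by normal approximation:", n)

# Exact check: P[X >= 1] = 1 - (1 - q)**n >= 0.99.
print("smallest n exactly:", math.ceil(math.log(0.01) / math.log(1 - q)))
```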
Asymptotic Normal Distributions
If Y1, Y2, ... is a sequence of random variables and m and c are constants such that sqrt(n)·(Yn − m)/c converges in distribution to Z ~ N(0, 1) as n → ∞, then Yn is said to have an asymptotic normal distribution with asymptotic mean m and asymptotic variance c²/n.
Example: A random sample involves n = 40 lifetimes of electrical parts, Xi ~ EXP(100). By the CLT, the sample mean X̄ has an asymptotic normal distribution with mean m = 100 and variance c²/n = 100²/40 = 250.
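A quick simulation check of this example (the number of replications and the seed are arbitrary choices): the mean and variance of the simulated sample means should be close to 100 and 250.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 40, 100_000

# Sample mean of n = 40 lifetimes, X_i ~ EXP(mean 100), repeated many times.
xbar = rng.exponential(scale=100.0, size=(reps, n)).mean(axis=1)

print("simulated mean of Xbar    :", xbar.mean())   # close to the asymptotic mean 100
print("simulated variance of Xbar:", xbar.var())    # close to c^2 / n = 100^2 / 40 = 250
```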
Asymptotic Distribution of Central Order Statistics
Theorem: Let X1, ..., Xn be a random sample from a continuous distribution with a pdf f(x) that is continuous and nonzero at the pth percentile, xp, for 0 < p < 1. If k/n → p (with k − np bounded), then the sequence of kth order statistics, Xk:n, is asymptotically normal with mean xp and variance c²/n, where c² = p(1 − p)/[f(xp)]².
Example: Let X1, ..., Xn be a random sample from an exponential distribution, Xi ~ EXP(1), so that f(x) = e^(−x) and F(x) = 1 − e^(−x); x > 0. For odd n, let k = (n + 1)/2, so that Yk = Xk:n is the sample median. If p = 0.5, then the median is x0.5 = −ln(0.5) = ln 2, and c² = (0.5)(0.5)/[f(ln 2)]² = 0.25/(0.5)² = 1. Thus, Xk:n is asymptotically normal with asymptotic mean x0.5 = ln 2 and asymptotic variance c²/n = 1/n.
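A quick simulation check of the example (the sample size and seed are arbitrary choices): sample medians from EXP(1) should average about ln 2 ≈ 0.693, with n times their variance close to c² = 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 101, 100_000   # odd n, so the sample median is the k-th order statistic, k = (n+1)/2

med = np.median(rng.exponential(scale=1.0, size=(reps, n)), axis=1)

print("simulated mean of the median:", med.mean())     # close to x_0.5 = ln 2 ~ 0.693
print("n * simulated variance      :", n * med.var())  # close to c^2 = p(1-p)/f(x_p)^2 = 1
```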
Theorem: If the sequence Y1, Y2, ... converges in probability to Y, then it also converges in distribution to Y.
For the special case Y = c, the limiting distribution is the degenerate distribution P[Y = c] = 1; this was the condition we initially used to define stochastic convergence.
Theorem: If Yn converges in probability to a constant c, then for any function g(y) that is continuous at c, g(Yn) converges in probability to g(c).
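A small simulation sketch of the last theorem (the choice of Yn as a uniform sample mean, g(y) = 1/y, and ε = 0.2 are all illustrative assumptions): as n grows, P[|g(Yn) − g(c)| > ε] shrinks toward 0.

```python
import numpy as np

rng = np.random.default_rng(0)

# Y_n = mean of n Uniform(0, 1) draws converges in probability to c = 0.5,
# so g(Y_n) = 1 / Y_n converges in probability to g(c) = 2 (g is continuous at 0.5).
eps, reps = 0.2, 10_000
for n in (10, 100, 1000):
    y_n = rng.uniform(size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(1.0 / y_n - 2.0) > eps)
    print(f"n = {n:5d}: estimated P[|g(Yn) - 2| > {eps}] = {prob:.4f}")
```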
Theorem: If Xn and Yn are two sequences of random variables such that Xn converges in distribution to X and Yn converges in probability to a constant c, then:
Xn + Yn converges in distribution to X + c,
Xn·Yn converges in distribution to cX, and
Xn/Yn converges in distribution to X/c (if c ≠ 0).
Example: Suppose that Y ~ BIN(n, p). Then Y/n converges in probability to p, and by the CLT (Y/n − p)/sqrt[p(1 − p)/n] converges in distribution to Z ~ N(0, 1). Thus it follows that (Y/n − p)/sqrt[(Y/n)(1 − Y/n)/n] also converges in distribution to Z ~ N(0, 1).
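A simulation check of the example (n, p, the number of replications, and the seed are arbitrary choices): the studentized proportion, scaled by the estimated rather than the true variance, should still look standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 500, 0.3, 50_000

y = rng.binomial(n, p, size=reps)
phat = y / n

# The unknown p(1-p) in the CLT scaling is replaced by the consistent estimate phat(1-phat);
# by the theorem the limiting distribution is still N(0, 1).
z = (phat - p) / np.sqrt(phat * (1 - phat) / n)
print("mean (should be near 0)    :", z.mean())
print("variance (should be near 1):", z.var())
```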
Slutsky's Theorem: If Xn and Yn are two sequences of random variables such that Xn converges in probability to a constant c and Yn converges in distribution to Y, then Xn + Yn converges in distribution to c + Y and Xn·Yn converges in distribution to cY. Note that as a special case Xn could be an ordinary numerical sequence such as Xn = n/(n − 1).
Theorem: If Yn is asymptotically normal with asymptotic mean m and asymptotic variance c²/n, then for any function g(y) that is continuous at m and has a nonzero derivative at y = m, g(Yn) is asymptotically normal with asymptotic mean g(m) and asymptotic variance [g′(m)]²·c²/n.
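A simulation sketch of the second statement (the exponential model, g(y) = ln y, and the sample sizes are illustrative assumptions): with Xbar asymptotically N(θ, θ²/n) and g′(θ) = 1/θ ≠ 0, ln(Xbar) should be asymptotically N(ln θ, 1/n).

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 100.0, 200, 50_000

xbar = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)

# Xbar is asymptotically N(theta, theta^2 / n); with g(y) = ln(y), g'(theta) = 1/theta != 0,
# so g(Xbar) is asymptotically N(ln(theta), (1/theta)^2 * theta^2 / n) = N(ln(theta), 1/n).
g = np.log(xbar)
print("mean of ln(Xbar) (near ln 100 ~ 4.605):", g.mean())
print("n * variance of ln(Xbar) (near 1)     :", n * g.var())
```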
Thanks for your attention. The End.