Analysis of RT distributions with R
Emil Ratko-Dehnert
WS 2010/2011
Session 04 – 30.11.2010
Last time ... • Random variables (RVs) • Definition and examples (-> mapping from Ω to R) • Calculus and distributions (-> additivity, scalar multiplication, ...) • Characterization of RVs • By moments and descriptives (-> Mean, Var, Mode, Median)
II Random Variables & their characterization
II Recap: E(X), Var(X) • "What is the expected (long-term) outcome of X?" • "How much do the values of a RV X vary around its mean value E(X)?"
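A quick way to see both quantities in R (a simulation sketch, not from the slides): estimate E(X) and Var(X) from a large sample.

x <- rnorm(n = 100000, mean = 5, sd = 2)  # X ~ N(5, sd = 2)
mean(x)  # approx. 5 -> estimate of E(X)
var(x)   # approx. 4 -> estimate of Var(X)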
II Calculus for E(X) • E( X + c ) = E( X ) + c (scalar additivity) • E( X + Y ) = E( X ) + E( Y ) (linearity) • E( a*X ) = a*E( X ) (scalar multiplication) However(!): • E( X * Y ) ≠ E( X ) * E( Y ) in general (non-multiplicativity)
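A minimal simulation sketch (not from the slides) checking these rules; the dependent case Y = X shows why multiplicativity fails in general.

X <- rnorm(100000, mean = 2)  # E(X) = 2, Var(X) = 1
mean(X + 10)       # approx. E(X) + 10 = 12 (scalar additivity)
mean(3 * X)        # approx. 3 * E(X) = 6 (scalar multiplication)
mean(X * X)        # approx. Var(X) + E(X)^2 = 5 ...
mean(X) * mean(X)  # ... but E(X)^2 approx. 4: E(X*Y) != E(X)*E(Y) for Y = X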
II Calculus for Var(X) • Var( X ) = E( X² ) – ( E( X ) )² (alternative formula) • Var( a*X + b ) = a² * Var( X ) (scalar "additivity") • Var( X + Y ) = Var( X ) + Var( Y ) + 2 * Cov( X, Y ) • Var( Σ Xᵢ ) = Σ Var( Xᵢ ) for uncorrelated Xᵢ (Bienaymé)
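Again as a simulation sketch (sample variances, so the results are only approximate):

X <- rnorm(100000)  # Var(X) = 1
Y <- rnorm(100000)  # independent of X, hence uncorrelated
var(2 * X + 7)         # approx. 2^2 * Var(X) = 4; the shift b = 7 drops out
var(X + Y)             # approx. Var(X) + Var(Y) = 2 (Bienaymé, Cov = 0)
mean(X^2) - mean(X)^2  # alternative formula, approx. equal to var(X)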
II Further properties of RVs • Covariance Cov(X, Y) • Correlation Corr(X, Y) • Independence of RVs • Identical distribution of RVs -> IID
II Covariance Cov(X, Y) • "The covariance of RVs X and Y is a measure of how much they change together": Cov( X, Y ) = E[ ( X – E(X) ) * ( Y – E(Y) ) ] • The standardized covariance is the correlation Corr(X, Y)
II Correlation Corr(X, Y) • One way of standardizing leads to Pearson's correlation coefficient: Corr( X, Y ) = Cov( X, Y ) / ( σ_X * σ_Y ), where σ denotes the standard deviation • Don't confuse correlation with causality! • And don't confuse it with linearity!
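In R, cov() and cor() compute the sample versions; a small sketch (hypothetical data) showing that the correlation is just the standardized covariance:

x <- rnorm(1000)
y <- 2 * x + rnorm(1000)     # linear relation plus noise
cov(x, y)                    # sample covariance, approx. 2
cor(x, y)                    # Pearson's r, approx. 0.9
cov(x, y) / (sd(x) * sd(y))  # identical to cor(x, y)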
II Example: Correlations (1) [scatter plots not shown]
II Example: Correlations (2) [scatter plots not shown] • Corr(X, Y) = 0.816 for all data sets
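The value 0.816 shared by very differently shaped point clouds matches Anscombe's quartet, which R ships as the built-in data set anscombe; assuming that is what the missing figures showed, it can be reproduced like this:

data(anscombe)
cor(anscombe$x1, anscombe$y1)   # approx. 0.816 (roughly linear cloud)
cor(anscombe$x2, anscombe$y2)   # approx. 0.816 (clearly curved)
cor(anscombe$x3, anscombe$y3)   # approx. 0.816 (linear with one outlier)
cor(anscombe$x4, anscombe$y4)   # approx. 0.816 (one extreme point)
plot(anscombe$x1, anscombe$y1)  # compare the four scatter plots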
II Independence of RVs • Two RVs X and Y are said to be independent if their joint distribution factorizes: P( X ≤ x, Y ≤ y ) = P( X ≤ x ) * P( Y ≤ y ) for all x, y • Then the expectations factorize as well: E( X * Y ) = E( X ) * E( Y )
II Independence and Covariance • If X, Y are independent, their covariance is zero: Cov( X, Y ) = 0 • Warning! The converse is generally not true: X, Y can have Cov(X, Y) = 0 and still not be independent.
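A classic counterexample as an R sketch: Y = X² is completely determined by X, yet their covariance vanishes.

x <- rnorm(100000)
y <- x^2   # fully dependent on x
cov(x, y)  # approx. 0, since E(X^3) = 0 for a symmetric X
cor(x, y)  # also approx. 0: uncorrelated, but not independent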
II Identical distribution • Two RVs X and Y are said to be identically distributed if they share the same distribution. E.g.: X ~ N(0,1) and Y ~ N(0,1) are identically distributed • Ergo: each observation can be treated as if it were drawn from exactly the same distribution as the others
II Examples: IID RVs • a sequence of outcomes of spins of a roulette wheel is IID • a sequence of dice rolls is IID • a sequence of coin flips is IID • RTs are often treated as IID events, though this is rarely checked and frequently violated
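Such IID sequences are easy to simulate in R (a sketch):

rolls <- sample(1:6, size = 100, replace = TRUE)          # 100 IID dice rolls
flips <- sample(c("H", "T"), size = 100, replace = TRUE)  # 100 IID coin flips
table(rolls)  # empirical frequencies of the six outcomes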
II Significance of IID • The IID assumption is important for many reasons • In our case, most importantly: identical conditions across trials should have the same effect (no intertrial or position effects!) • Required for the law of large numbers and the central limit theorem • Required for many statistical tests (e.g. the z-test)
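The law of large numbers can be illustrated directly (a sketch using IID dice rolls, where E(X) = 3.5):

rolls <- sample(1:6, size = 10000, replace = TRUE)
running.mean <- cumsum(rolls) / (1:10000)  # mean after 1, 2, ..., n rolls
plot(running.mean, type = "l")             # settles down towards 3.5
abline(h = 3.5, lty = 2)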
Creating own functions

new.fun <- function(arg1, arg2, arg3) {  # "inputs"
  x <- exp(arg1)                         # algorithm of the function
  y <- sin(arg2)
  z <- mean(c(arg2, arg3))               # mean of arg2 and arg3; the original mean(arg2, arg3) would silently treat arg3 as the trim argument
  result <- x + y + z
  result                                 # "output": the last expression is the return value
}

A <- new.fun(12, 0.4, -4)                # usage of new.fun
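Arguments can also be passed by name (standard R; the order then no longer matters):

A <- new.fun(arg3 = -4, arg1 = 12, arg2 = 0.4)  # same call as above
A  # print the returned value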
Using Loops in R

result <- matrix(NA, nrow = 20, ncol = 10)  # create an empty matrix to be filled by the loop
for (i in 1:10) {                           # running index of the loop
  xx <- rnorm(n = 20, mean = i, sd = 4)     # n = 20 to match the 20 rows (the original n = 10 would be silently recycled)
  result[, i] <- xx                         # fill column i of result with xx
}
result
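As a design note: the same matrix can be built without an explicit loop, e.g. with sapply (a sketch):

result <- sapply(1:10, function(i) rnorm(n = 20, mean = i, sd = 4))
dim(result)  # 20 x 10, as before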