An introduction to matrix algebra and why it matters: scalars, vectors and matrices, basic operations, Matlab usage, and applications such as solving simultaneous equations, neural networks and SPM. Covers determinants, inverses and neural connections.
Matrix Algebra (and why it’s important!) Methods for Dummies FIL October 2007 Steve Fleming & Verity Leeson
Sources and further information • Jon Machtynger & Jen Marchant’s slides! • Human Brain Function textbook (for GLM) • SPM course http://www.fil.ion.ucl.ac.uk/spm/course/ • Web Guides • http://mathworld.wolfram.com/LinearAlgebra.html • http://www.maths.surrey.ac.uk/explore/emmaspages/option1.html • http://www.inf.ed.ac.uk/teaching/courses/fmcs1/ (Formal Modelling in Cognitive Science course) • http://www.wikipedia.org
Scalars, vectors and matrices • Scalar: Variable described by a single number – e.g. image intensity (pixel value) • Vector: Variable described by magnitude and direction – e.g. image intensity at a particular time point • Matrix: Rectangular array of numbers defined by its number of rows and columns – e.g. square (3 x 3) or rectangular (3 x 2); element d(r,c) sits in the rth row and cth column, row before column (remember “Roman Catholic”)
Matrices in Matlab • Vector formation: [1 2 3] • Matrix formation: X = [1 2 3; 4 5 6; 7 8 9] – ‘;’ is used to signal the end of a row, ‘:’ is used to signify all rows or columns • Subscripting – each element of a matrix can be addressed with a pair of numbers, row first, column second (“Roman Catholic”), e.g. X(2,3) = 6, X(3,:) = [7 8 9], X([2 3],2) = [5; 8] • “Special” matrix commands: zeros(3,1) (a 3 x 1 column of zeros), ones(2) (a 2 x 2 matrix of ones), magic(3) (a 3 x 3 magic square) … more to come
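A short Matlab sketch of the commands above (X is the matrix from the slide; results are shown as comments):

X = [1 2 3; 4 5 6; 7 8 9];   % ';' ends each row
X(2,3)                       % element in row 2, column 3  -> 6
X(3,:)                       % the whole 3rd row           -> 7 8 9
X([2 3],2)                   % rows 2 and 3 of column 2    -> 5; 8
zeros(3,1)                   % 3 x 1 column of zeros
ones(2)                      % 2 x 2 matrix of ones
magic(3)                     % 3 x 3 magic square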
Matrix addition • Addition requires matrices of the same size and is performed element by element • Commutative: A + B = B + A • Associative: (A + B) + C = A + (B + C) • Subtraction can be treated as the addition of a negative matrix: A – B = A + (–B)
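For illustration, a small Matlab example of addition and subtraction (the 2 x 2 matrices here are made up for the example):

A = [1 2; 3 4];
B = [5 6; 7 8];
A + B            % element-wise sum               -> [6 8; 10 12]
A + B == B + A   % commutativity check: all ones (true)
A - B            % subtraction = addition of -B   -> [-4 -4; -4 -4]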
Matrix multiplication • Scalar multiplication: every element of the matrix is multiplied by the scalar • Rule for multiplication of vectors/matrices: “When A is an m x n matrix and B is a k x l matrix, AB is only viable if n = k. The result will be an m x l matrix”
Multiplication method • Each element of the product is a sum over the products of the respective row of A and column of B: c(r,c) = a(r,1)b(1,c) + a(r,2)b(2,c) + … + a(r,n)b(n,c) • Matlab does all this for you! Simply type: C = A * B • N.B. If you want to do element-wise multiplication, use: A .* B
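A Matlab illustration of the dimension rule and the difference between matrix and element-wise multiplication (the matrices here are made up for the example):

A = [1 2 3; 4 5 6];    % 2 x 3 (m x n)
B = [1 0; 0 1; 1 1];   % 3 x 2 (k x l); n = k, so A*B is defined
C = A * B              % 2 x 2 result (m x l)            -> [4 5; 10 11]
D = [1 1 1; 2 2 2];    % same size as A
A .* D                 % element-wise product, needs equal sizes -> [1 2 3; 8 10 12]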
Transposition – rows become columns and columns become rows: the (r,c) element of the transpose equals the (c,r) element of the original matrix • In Matlab: the transpose of A is A’
Outer and inner products of vectors • Inner product: a (1 x n) row vector times an (n x 1) column vector gives a (1 x 1) scalar • Outer product: an (n x 1) column vector times a (1 x n) row vector gives an (n x n) matrix
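In Matlab, with two column vectors (made-up values), the transpose gives both products:

a = [1; 2; 3];
b = [4; 5; 6];
a' * b    % inner product: (1 x n)(n x 1) -> scalar, here 32
a * b'    % outer product: (n x 1)(1 x n) -> 3 x 3 matrix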
Identity matrices • Is there a matrix which plays a similar role to the number 1 in ordinary multiplication? Consider the n x n identity matrix In, with ones on the diagonal and zeros elsewhere • A square n x n matrix A has one identity: A In = In A = A • An n x m matrix A has two!! In A = A and A Im = A • Worked example: A I3 = A for a 3 x 3 matrix • In Matlab: eye(n) produces the n x n identity matrix (eye(r, c) gives an r x c matrix with ones on its main diagonal)
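A quick Matlab check of the two identities for a non-square matrix (A is just an example):

A = [1 2; 3 4; 5 6];     % 3 x 2
isequal(eye(3) * A, A)   % left identity is 3 x 3  -> 1 (true)
isequal(A * eye(2), A)   % right identity is 2 x 2 -> 1 (true)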
Inverse matrices • Definition: a matrix A is nonsingular or invertible if there exists a matrix B such that AB = BA = I • Common notation for the inverse of a matrix A is A-1 • The inverse matrix A-1 is unique when it exists • If A is invertible, A-1 is also invertible, and A is the inverse matrix of A-1 • Matrix division: A/B = AB-1 • If A is an invertible matrix, then (AT)-1 = (A-1)T • In Matlab: A-1 = inv(A)
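A small Matlab check (the matrix below is an arbitrary invertible example):

A = [2 1; 5 3];   % det(A) = 1, so A is invertible
Ainv = inv(A)     % -> [3 -1; -5 2] (up to rounding)
A * Ainv          % gives the 2 x 2 identity (up to rounding error)
A / A             % matrix "division" A*inv(A), also the identity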
Determinants • The determinant is a function: its input is an n x n matrix, its output is a single number (real or complex) called the determinant • A matrix A has an inverse matrix A-1 if and only if det(A) ≠ 0 (see next slide) • In Matlab: det(A)
Calculation of inversion using determinants • For a 2 x 2 matrix A = [a b; c d], det(A) = ad – bc and, provided det(A) ≠ 0, A-1 = 1/det(A) x [d –b; –c a] • Or you can just type inv(A)! • More complex matrices can be inverted using methods such as Gauss–Jordan elimination, Gaussian elimination or LU decomposition
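As a check, here is the 2 x 2 determinant formula alongside Matlab’s inv, using the matrix from the worked example that follows:

A = [2 3; 1 -2];                                    % det(A) = -7, so the inverse exists
d = det(A);
Ainv = (1/d) * [A(2,2) -A(1,2); -A(2,1) A(1,1)]     % swap diagonal, negate off-diagonal -> [2/7 3/7; 1/7 -2/7]
inv(A)                                              % same result from Matlab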
• SEM http://www.maths.soton.ac.uk/~jav/soton/MATH1007/workbook_8/8_2_inv_mtrx_sim_lin_eqnpdf.pdf • Neural Networks http://csn.beckman.uiuc.edu/k12/nn_matrix.pdf • SPM http://imaging.mrc-cbu.cam.ac.uk/imaging/PrinciplesStatistics
Solving simultaneous equations • For one linear equation ax = b, where the unknown is x and a and b are constants, there are 3 possibilities: • a ≠ 0: a unique solution, x = b/a • a = 0, b ≠ 0: no solution • a = 0, b = 0: infinitely many solutions (any x satisfies the equation)
With >1 equation and >1 unknown • Can extend the single-equation approach to a system of equations • For example, two equations in two unknowns: 2x + 3y = b1 and x – 2y = b2 • In matrix form AX = B, with A = [2 3; 1 -2], X = [x; y] and B the column of constants [b1; b2]
Need to find the determinant of matrix A (because X = A-1B) • From earlier: det(A) = (2 x -2) – (3 x 1) = -4 – 3 = -7 • So the determinant is -7 • To find A-1, use the 2 x 2 formula: A-1 = 1/-7 x [-2 -3; -1 2] = [2/7 3/7; 1/7 -2/7]
• Then, for the given column of constants B, the solution is simply X = A-1B
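A Matlab sketch of the worked example; the right-hand side B below is an illustrative value (not necessarily the one on the original slide):

A = [2 3; 1 -2];   % coefficient matrix from the worked example
B = [8; -3];       % hypothetical right-hand side, chosen for illustration
X = inv(A) * B     % solve via the inverse, as on the slide -> x = 1, y = 2
X = A \ B          % same answer; backslash is the preferred Matlab idiom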
Neural Networks • Neural networks create a mathematical model of the connections in a neural system • Connections are the excitatory and inhibitory synapses between neurons (figure: excitatory and inhibitory connections between input and output neurons)
Scenario 1 (figure: a single input neuron connected to an output neuron – whether the output is active depends on whether the input is active)
Scenario 2 • The combination of both an active excitatory and an active inhibitory input will cancel out • No net activity
Matrix Representations of Neural Connections – Scenario 2 again • Excitatory = positive influence on the postsynaptic cell, Inhibitory = negative influence • With the synapses labelled (1–3) and activity levels specified (figure: neuron #3 receives an excitatory input, weight +1, from neuron #1 and an inhibitory input, weight -1, from neuron #2), we can translate this information into a set of vectors (1-row matrices)
• Input vector = (1 1), the activity of neurons (#1 #2) • Weight vector = (1 -1), the connection weights from (#1 #2) • Activity of neuron 3 = input x weight = (1 x 1) + (1 x -1) = 0 • With varying input (activity) and weights, neuron 3 can take on a wide range of values
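Expressed in Matlab, the output of neuron 3 is simply the inner product of the input and weight vectors given on the slide:

activity = [1 1];             % activity of neurons #1 and #2
weights  = [1 -1];            % excitatory (+1) and inhibitory (-1) connections
output   = activity * weights'   % inner product: 1*1 + 1*(-1) = 0, i.e. no net activity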
How are matrices relevant to fMRI data? • Consider that the data measured include: • A response variable, e.g. the BOLD signal at a particular voxel – many scalars (one per time point) for this one voxel • Explanatory variables – these are assumed to be measured without error; they may be continuous, or dummy variables indicating the levels of an experimental factor
With a single explanatory variable: Y = X . β + ε Observed = Predictors x Parameters + Error BOLD = Design Matrix x Betas + Error
Y = X . β + ε (figure: image intensity over time for one voxel, after preprocessing, forms the column Y) • Y is a matrix of BOLD signals • Each column represents a single voxel sampled at successive time points • Each voxel is considered as an independent observation • Analysis is of individual voxels over time, not groups over space
Design Matrix: Y = X . β + ε (figure: Y shown next to the design matrix with predictor columns X1 and X2; most negative values nearest black, most positive nearest white) • Columns: different predictors – each column holds the values of X for a single predictor over time • Rows: successive observations (one per scan)
Solve the equation for β – this tells us how much of the BOLD signal is explained by X • A more complex version (figure: data vector for one voxel = design matrix x parameter vector + error vector, i.e. Y = X β + ε, with parameters β3–β9 for multiple predictors)
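A minimal Matlab sketch of estimating β by ordinary least squares; the design matrix and data below are simulated for illustration only and are not SPM’s actual estimation code:

nScans = 100;
X = [randn(nScans,2) ones(nScans,1)];   % toy design matrix: 2 predictors + constant column
betaTrue = [2; -1; 0.5];                % "true" parameters used to simulate data
Y = X*betaTrue + 0.1*randn(nScans,1);   % simulated BOLD time course for one voxel
betaHat = pinv(X) * Y                   % least-squares estimate of beta (pseudoinverse)
e = Y - X*betaHat;                      % residual (error) vector

In practice SPM builds X from the experimental design and estimates β at every voxel, but the linear algebra is the same as in this sketch.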
The End… Any (easy) questions?!