Generating random correlation matrices based on regular vines and the extended Onion method
Daniel Lewandowski, Dorota Kurowicka, Harry Joe
Bayesian Belief Networks Workshop, Delft
Our objective
• Generate random correlation matrices uniformly from the set of semi-positive definite correlation matrices.
• The method is based on an appropriate transformation of partial correlations on a regular vine to Pearson's product moment correlations.
• Extend the method of Harry Joe based on the D-vine.
• Simplify and extend the method of Ghosh and Henderson, the so-called Onion method.
• Show the equivalence of both methods.
The transformation
• [formula not reproduced], where
• f_d(r) – the joint density of all product moment correlations in a correlation matrix of size d × d,
• C_d – normalizing constant,
• q_i – partial correlations on a regular vine,
• g – probability density of the q_i's,
• J_d – Jacobian matrix.
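As a rough sketch of the missing formula (the generic change-of-variables identity the bullets describe, not the exact expression from the slide; here J_d is taken to be the Jacobian of the map from the correlations r to the partial correlations q):

$$ f_d(r) \;=\; C_d \,\prod_{i} g\bigl(q_i(r)\bigr)\,\bigl|\det J_d\bigr|, $$

with the q_i(r) viewed as functions of the product moment correlations.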
Prerequisites
Brief introduction to vines
Partial correlation
• The partial correlation of random variables X1 and X2 with X3, …, Xn held constant is [formula not reproduced],
• where Cij denotes the (i,j)th cofactor of the n-dimensional correlation matrix R, that is, the determinant of the submatrix obtained by removing row i and column j.
• Partial correlations can be calculated recursively.
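Both relations referenced above are standard; a sketch consistent with the cofactor definition on the slide (not copied verbatim from it):

$$ \rho_{12;3,\dots,n} \;=\; \frac{-\,C_{12}}{\sqrt{C_{11}\,C_{22}}}, \qquad
\rho_{12;3,\dots,n} \;=\; \frac{\rho_{12;4,\dots,n}-\rho_{13;4,\dots,n}\,\rho_{23;4,\dots,n}}
{\sqrt{\bigl(1-\rho_{13;4,\dots,n}^{2}\bigr)\bigl(1-\rho_{23;4,\dots,n}^{2}\bigr)}}. $$

The second identity is the recursion mentioned in the last bullet.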
Partial and multiple correlations
• Partial correlations can be assigned to the edges of a regular vine such that the conditioned and conditioning sets of the edges coincide with those of the partial correlations.
• The multiple correlation R_{n{n-1,…,1}} of variable Xn with respect to Xn-1, …, X1 is given by [formula not reproduced].
• The multiple correlation satisfies [relation not reproduced].
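A plausible reconstruction of the two missing relations, based on standard results rather than on the slide itself:

$$ 1 - R^{2}_{n\{n-1,\dots,1\}} \;=\; \frac{\det(R)}{C_{nn}}, \qquad
1 - R^{2}_{n\{n-1,\dots,1\}} \;=\; \prod_{k=1}^{n-1}\bigl(1-\rho^{2}_{n,k;1,\dots,k-1}\bigr), $$

where C_{nn} is the (n,n)th cofactor of R and the factors in the product are partial correlations of the kind assigned to a regular vine (for k = 1 the conditioning set is empty).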
Multiple correlation
[Figure: a regular vine on five variables; tree 1 edges ρ12, ρ23, ρ24, ρ45; tree 2 edges ρ13;2, ρ14;2, ρ25;4; tree 3 edges ρ34;12, ρ15;24; tree 4 edge ρ35;124.]
Vine method
Transformation
Jacobian matrix
[Figure: the same five-variable vine with edges ρ12, ρ23, ρ24, ρ45, ρ13;2, ρ14;2, ρ25;4, ρ34;12, ρ15;24, ρ35;124.]
1. Let r_{ij;L} = r_{34;12}.
2. Let r_{st;De} = r_{15;24}.
Jacobian matrix – cont.
Jacobian matrix – cont.
Vines properties
• For each edge e of the vine we define:
• the constraint set Ue – the set of variables reachable from edge e,
• the conditioned set {C1e, C2e} – the symmetric difference of the constraint sets of the two edges in the previous tree joined by e,
• the conditioning set De – the intersection of the constraint sets of the two edges in the previous tree joined by e.
• {L|K} denotes a constraint set with conditioned set L and conditioning set K.
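As an illustration of these definitions on the five-variable vine from the earlier figures (the worked example is ours, not taken from the slides): for the edge labeled ρ15;24, the constraint set is Ue = {1, 2, 4, 5}, the conditioned set is {C1e, C2e} = {1, 5}, and the conditioning set is De = {2, 4}, so the constraint set is written {15|24}.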
Vines properties – cont.
[Figure: the same five-variable vine.]
1. The conditioning set of e in T4 is {1, 2, 4}.
2. Let {s, t} = {1, 4}.
3. There exists f in T2 such that {C1f, C2f} = {1, 4}.
Calculations
Calculations – cont.
Jacobian of the transformation
There exists a relationship between the form of the determinant of the correlation matrix R and the determinant of the Jacobian matrix J.
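A key identity behind this relationship, stated here as background from the vine literature rather than copied from the slide: for partial correlations ρ_e assigned to the edges of any regular vine on d variables,

$$ \det(R) \;=\; \prod_{e \in E(\mathcal{V})} \bigl(1-\rho_e^{2}\bigr). $$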
Finally
Finally
Algorithm for generating random correlation matrices with C-vines
• Density proportional to [det(R)]^(η−1), η ≥ 1.
• Initialization: β = η + (n−1)/2.
• Loop for k = 1, …, n−1:
• β ← β − 1/2;
• Loop for i = k+1, …, n:
• Generate r_{k,i;1,…,k−1} ~ Beta(β, β) on (−1, 1);
• Use the recursive formula for partial correlations on r_{k,i;1,…,k−1} to get r_{k,i} = r_{i,k}.
• Return R, an n × n correlation matrix.
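A minimal Python sketch of this C-vine algorithm, assuming NumPy; the function name sample_corr_cvine, the argument names, and the storage layout are ours, not from the slides:

```python
import numpy as np

def sample_corr_cvine(n, eta=1.0, rng=None):
    """Draw one n x n correlation matrix with density proportional to det(R)^(eta-1)."""
    rng = np.random.default_rng(rng)
    beta = eta + (n - 1) / 2.0
    # P[k, i] stores the partial correlation of variables k and i
    # given variables 0..k-1 (0-based indices).
    P = np.zeros((n, n))
    R = np.eye(n)
    for k in range(n - 1):
        beta -= 0.5
        for i in range(k + 1, n):
            # Beta(beta, beta) rescaled to (-1, 1)
            P[k, i] = 2.0 * rng.beta(beta, beta) - 1.0
            # Invert the recursive formula to convert the partial
            # correlation into a product moment correlation r_{k,i}.
            p = P[k, i]
            for m in range(k - 1, -1, -1):
                p = (p * np.sqrt((1.0 - P[m, i] ** 2) * (1.0 - P[m, k] ** 2))
                     + P[m, i] * P[m, k])
            R[k, i] = R[i, k] = p
    return R
```

For η = 1 the exponent vanishes, so the construction samples uniformly over the set of positive definite correlation matrices, which is the objective stated at the start of the talk.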
Correlation matrices
Onion method
Onion method
Any correlation matrix R_{n+1} of size (n+1) × (n+1) can be partitioned into an n × n correlation matrix R_n, a column vector q, and a unit diagonal entry. From standard results on conditional multivariate normal distributions we have [formula not reproduced]. Let Q be a random vector with realization q. Q has an elliptically contoured distribution with correlation matrix R_n.
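The partition and the determinant identity it leads to can be sketched as follows (the block form and Schur-complement identity are standard facts; the exact expression on the slide is not reproduced):

$$ R_{n+1} \;=\; \begin{pmatrix} R_n & q \\ q^{\mathsf T} & 1 \end{pmatrix}, \qquad
\det(R_{n+1}) \;=\; \det(R_n)\,\bigl(1 - q^{\mathsf T} R_n^{-1} q\bigr). $$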
Onion method – cont.
Let R_n have density proportional to [expression not reproduced], and let Q given R_n = r_n have density proportional to [expression not reproduced]. Then R_{n+1} has density proportional to the product of these two factors. Setting the parameter as on the slide, the density of R_{n+1} is proportional to [expression not reproduced].
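A sketch of the factorization behind this step, with generic exponents a and b standing in for the slide's (unreproduced) parameters, using the determinant identity above:

$$ \det(R_n)^{a}\,\bigl(1 - q^{\mathsf T} R_n^{-1} q\bigr)^{b}
\;=\; \det(R_n)^{a-b}\,\det(R_{n+1})^{b}, $$

so matching the two exponents leaves a density that depends on R_{n+1} only through a power of det(R_{n+1}).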
Algorithm for generating random correlation matrices with the onion method
• Density proportional to [det(R)]^(η−1), η ≥ 1.
• Initialization: β = η + (n−2)/2; generate r_12 = 2u − 1, where u ~ Beta(β, β).
• Set R to the 2 × 2 correlation matrix with off-diagonal entry r_12.
• Loop for k = 2, …, n−1:
• β ← β − 1/2;
• Generate y ~ Beta(k/2, β);
• Generate u = (u_1, …, u_k)^T uniform on the surface of the k-dimensional hypersphere;
• Set w ← y^(1/2) u; obtain A such that A A^T = R; set q ← A w;
• Extend R with q as its new last column and row (unit diagonal).
• Return R, an n × n correlation matrix.
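A minimal Python sketch of this onion algorithm, again assuming NumPy; the function name sample_corr_onion and the use of a Cholesky factor for A (so that A A^T = R) are our choices, not from the slides:

```python
import numpy as np

def sample_corr_onion(n, eta=1.0, rng=None):
    """Draw one n x n correlation matrix with density proportional to det(R)^(eta-1)."""
    rng = np.random.default_rng(rng)
    beta = eta + (n - 2) / 2.0
    # start from a random 2 x 2 correlation matrix
    r12 = 2.0 * rng.beta(beta, beta) - 1.0
    R = np.array([[1.0, r12], [r12, 1.0]])
    for k in range(2, n):
        beta -= 0.5
        y = rng.beta(k / 2.0, beta)
        u = rng.standard_normal(k)
        u /= np.linalg.norm(u)            # uniform on the unit k-sphere
        w = np.sqrt(y) * u
        A = np.linalg.cholesky(R)         # A @ A.T == R
        q = A @ w                         # correlations of the new variable
        # grow R by one row and column ("onion" layer)
        R = np.block([[R, q[:, None]],
                      [q[None, :], np.ones((1, 1))]])
    return R
```

The Cholesky step guarantees q^T R^{-1} q = y < 1, so every layer of the onion preserves positive definiteness.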
A simple analysis
Proportion of correlation matrices
Computational time analysis