
Generating random correlation matrices based on regular vines and the extended Onion method


Presentation Transcript


  1. Generating random correlation matrices based on regular vines and the extended Onion method
  Daniel Lewandowski, Dorota Kurowicka, Harry Joe
  Bayesian Belief Networks Workshop, Delft

  2. Our objective
  • Generate random correlation matrices uniformly from the set of positive semidefinite correlation matrices.
  • The method is based on an appropriate transformation of partial correlations on a regular vine to Pearson's product moment correlations.
  • Extend the method of Harry Joe based on the D-vine.
  • Simplify and extend the method of Ghosh and Henderson, the so-called Onion method.
  • Show the equivalence of both methods.

  3. The transformation
  • The transformation expresses the joint density of the correlations in terms of partial correlations on a regular vine (a reconstructed sketch follows this slide), where
  • f_d(r) – the joint density of all product moment correlations in a correlation matrix of size d x d,
  • C_d – the normalizing constant,
  • q_i – the partial correlations on a regular vine,
  • g – the probability density of the q_i's,
  • J_d – the Jacobian matrix of the transformation.
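A minimal sketch of the displayed relation, assuming the standard change-of-variables identity between the correlation density and the densities of the partial correlations on the vine (the exact arrangement on the slide may differ):

```latex
% Change of variables r -> (q_1, ..., q_m), m = d(d-1)/2, with the q_i
% assumed independent with common density g; C_d is the normalizing constant.
f_d(r) \;=\; C_d \prod_{i=1}^{m} g\bigl(q_i(r)\bigr)\,\bigl|\det J_d\bigr|,
\qquad
J_d = \frac{\partial(q_1,\dots,q_m)}{\partial(r_{12},\dots,r_{d-1,d})}.
```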

  4. Prerequisites

  5. Brief introduction to vines

  6. Partial correlation
  • The partial correlation of random variables X1 and X2 with X3, …, Xn held constant can be written as a ratio of cofactors (see the sketch after this slide),
  • where Cij denotes the (i,j)th cofactor of the n-dimensional correlation matrix R, that is, the signed determinant of the submatrix obtained by removing row i and column j.
  • It can also be calculated recursively.
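For reference, a sketch of the cofactor formula and of the standard recursion for partial correlations (the slide's own displayed versions are assumed to be equivalent to these):

```latex
% Cofactor form: C_{ij} = (-1)^{i+j} times the minor of R obtained by
% deleting row i and column j.
\rho_{12;3,\dots,n} \;=\; \frac{-\,C_{12}}{\sqrt{C_{11}\,C_{22}}}\,,
\qquad
\rho_{12;3,\dots,n} \;=\;
\frac{\rho_{12;3,\dots,n-1}-\rho_{1n;3,\dots,n-1}\,\rho_{2n;3,\dots,n-1}}
     {\sqrt{\bigl(1-\rho_{1n;3,\dots,n-1}^{2}\bigr)\bigl(1-\rho_{2n;3,\dots,n-1}^{2}\bigr)}}\,.
```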

  7. Partial and multiple correlations
  • Partial correlations can be assigned to the edges of a regular vine, such that the conditioned and conditioning sets of the edges coincide with those of the partial correlations.
  • Multiple correlation
  • The multiple correlation R_{n{n-1,…,1}} of variable Xn with respect to Xn-1, …, X1 is given by a ratio of determinants, and it satisfies a product identity in the partial correlations (a sketch of both follows this slide).
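A sketch of the two standard facts referred to above (notation adapted; C_{nn} denotes the (n,n) cofactor of R):

```latex
1 - R_{n\{n-1,\dots,1\}}^{2} \;=\; \frac{\det(R)}{C_{nn}},
\qquad
1 - R_{n\{n-1,\dots,1\}}^{2} \;=\;
\bigl(1-\rho_{n,n-1}^{2}\bigr)\bigl(1-\rho_{n,n-2;n-1}^{2}\bigr)\cdots\bigl(1-\rho_{n,1;2,\dots,n-1}^{2}\bigr).
```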

  8. Multiple correlation
  [Figure: a regular vine on the five variables 1–5 with partial correlations assigned to its edges: ρ12, ρ23, ρ24, ρ45 in tree T1; ρ13;2, ρ14;2, ρ25;4 in T2; ρ34;12, ρ15;24 in T3; ρ35;124 in T4.]

  9. Vine method

  10. Transformation

  11. Jacobian matrix
  1. Let r_{ij;L} = r_{34;12}.
  2. Let r_{st;De} = r_{15;24}.
  [Figure: the regular vine on variables 1–5 from slide 8.]

  12. Jacobian matrix – cont.

  13. Jacobian matrix – cont.

  14. Vine properties
  • For each edge e of the vine we define:
  • The constraint set Ue – the variables reachable from edge e.
  • The conditioned set {C1e, C2e} – the symmetric difference of the constraint sets of the two edges in the previous tree joined by e.
  • The conditioning set De – the intersection of the constraint sets of the two edges in the previous tree joined by e.
  • {L|K} denotes a constraint set with conditioned set L and conditioning set K.
  • A small worked example follows this slide.
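A small illustrative sketch (not from the slides): computing the conditioned and conditioning sets of a vine edge from the constraint sets of the two edges it joins; the helper name and the example values, taken from the five-variable vine of slide 8, are my own.

```python
def edge_sets(U_f: set, U_g: set):
    """Conditioned and conditioning sets of the edge e joining two edges
    of the previous tree with constraint sets U_f and U_g."""
    conditioned = U_f ^ U_g    # symmetric difference -> {C1e, C2e}
    conditioning = U_f & U_g   # intersection -> De
    return conditioned, conditioning

# The T3 edge joining the T2 edges with constraint sets {1,2,4} (rho_14;2)
# and {2,4,5} (rho_25;4) carries rho_15;24:
print(edge_sets({1, 2, 4}, {2, 4, 5}))   # ({1, 5}, {2, 4})
```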

  15. Vine properties – cont.
  1. The conditioning set of the edge e in T4 (carrying ρ35;124) is {1, 2, 4}.
  2. Let {s,t} = {1,4}.
  3. There exists an edge f in T2 such that {C1f, C2f} = {1, 4} (the edge carrying ρ14;2).
  [Figure: the regular vine on variables 1–5 from slide 8.]

  16. Calculations

  17. Calculations – cont.

  18. Jacobian of the transformation
  There is a relationship between the determinant of the correlation matrix R and the determinant of the Jacobian matrix J (a sketch of the underlying identity follows this slide).
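One standard identity behind this relationship, assuming each edge e of the regular vine carries the partial correlation ρ_e with the corresponding conditioning set:

```latex
% Product over the set E of all edges of the regular vine.
\det(R) \;=\; \prod_{e \in E} \bigl(1 - \rho_e^{2}\bigr).
```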

  19. Finally

  20. Finally

  21. Algorithm for generating random correlation matrices with C-vines
  • Density proportional to [det(r)]^(η−1), η ≥ 1.
  • Initialization: β ← η + (n−1)/2.
  • Loop for k = 1, …, n−1:
  • β ← β − 1/2;
  • Loop for i = k+1, …, n:
  • Generate r_{k,i;1,…,k−1} ~ Beta(β, β) on (−1, 1);
  • Use the recursive formula for partial correlations on r_{k,i;1,…,k−1} to get r_{k,i} = r_{i,k}.
  • Return r, an n x n correlation matrix.
  A runnable sketch of this algorithm follows this slide.
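A minimal runnable sketch of this C-vine algorithm in Python (function and variable names are my own; NumPy assumed):

```python
import numpy as np

def random_corr_cvine(n, eta=1.0, rng=None):
    """Sample an n x n correlation matrix with density proportional to
    det(R)**(eta - 1) using partial correlations on a C-vine."""
    rng = np.random.default_rng(rng)
    P = np.zeros((n, n))        # P[k, i] = partial correlation r_{k,i;1..k-1}
    R = np.eye(n)
    beta = eta + (n - 1) / 2.0
    for k in range(n - 1):
        beta -= 0.5
        for i in range(k + 1, n):
            P[k, i] = 2.0 * rng.beta(beta, beta) - 1.0   # Beta(beta, beta) on (-1, 1)
            # Invert the recursion for partial correlations to obtain the
            # product-moment correlation r_{k,i}, removing the conditioning
            # variables k-1, ..., 1 one at a time.
            p = P[k, i]
            for l in range(k - 1, -1, -1):
                p = p * np.sqrt((1.0 - P[l, i] ** 2) * (1.0 - P[l, k] ** 2)) \
                    + P[l, i] * P[l, k]
            R[k, i] = R[i, k] = p
    return R

# Example: one 5 x 5 matrix drawn uniformly (eta = 1)
print(random_corr_cvine(5, eta=1.0, rng=42))
```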

  22. Correlation matrices

  23. Onion method

  24. Onion method
  • Any correlation matrix r_{n+1} of size (n+1) x (n+1) can be partitioned into an n x n block r_n, a column q and a unit diagonal entry (see the sketch after this slide).
  • From standard results on conditional multivariate normal distributions, the conditional variance of the new variable given the first n is 1 − q^T r_n^{-1} q, which yields the determinant factorization sketched after this slide.
  • Let Q be a random vector with realization q; Q has an elliptically contoured distribution with correlation matrix r_n.
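A sketch of the partition and of the determinant factorization it implies (standard Schur-complement algebra; the slide's displayed version is assumed equivalent):

```latex
r_{n+1} \;=\;
\begin{pmatrix}
  r_n      & q \\
  q^{\top} & 1
\end{pmatrix},
\qquad
\det(r_{n+1}) \;=\; \det(r_n)\,\bigl(1 - q^{\top} r_n^{-1} q\bigr).
```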

  25. Onion method – cont.
  • Let R_n have a density proportional to a power of det(r_n).
  • Let Q given R_n = r_n have a density proportional to a power of (1 − q^T r_n^{-1} q).
  • Then the density of R_{n+1} is proportional to the product of the two.
  • If the exponents are matched appropriately, the density of R_{n+1} is proportional to a power of det(r_{n+1}) (see the sketch after this slide).
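A sketch of the bookkeeping with generic placeholder exponents a and b (the slide's own parameter values are not reproduced here):

```latex
% If  R_n  has density proportional to det(r_n)^a  and
% Q | R_n = r_n  has density proportional to (1 - q' r_n^{-1} q)^b,  then
f\bigl(r_{n+1}\bigr) \;\propto\; \det(r_n)^{a}\,\bigl(1 - q^{\top} r_n^{-1} q\bigr)^{b},
% which, by det(r_{n+1}) = det(r_n)(1 - q' r_n^{-1} q), equals det(r_{n+1})^a when a = b.
```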

  26. Algorithm for generating random correlation matrices with the onion method
  • Density proportional to [det(r)]^(η−1), η ≥ 1.
  • Initialization: β ← η + (n−2)/2; generate r_12 ← 2u − 1, where u ~ Beta(β, β).
  • Set r to the 2 x 2 correlation matrix with off-diagonal entry r_12.
  • Loop for k = 2, …, n−1:
  • β ← β − 1/2;
  • Generate y ~ Beta(k/2, β);
  • Generate u = (u_1, …, u_k)^T uniform on the surface of the k-dimensional hypersphere;
  • w ← y^(1/2) u; obtain A such that A^T A = r; set q ← A^T w;
  • Append q to r as the new row and column, with diagonal entry 1.
  • Return r, an n x n correlation matrix.
  A runnable sketch of this algorithm follows this slide.
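A minimal runnable sketch of this extended onion method in Python (names are mine; the code uses the Cholesky factor A with A Aᵀ = r and q = A w, which is equivalent to the slide's factorization):

```python
import numpy as np

def random_corr_onion(n, eta=1.0, rng=None):
    """Sample an n x n correlation matrix with density proportional to
    det(R)**(eta - 1) via the extended onion method."""
    rng = np.random.default_rng(rng)
    beta = eta + (n - 2) / 2.0
    r12 = 2.0 * rng.beta(beta, beta) - 1.0
    R = np.array([[1.0, r12], [r12, 1.0]])
    for k in range(2, n):                     # R is currently k x k
        beta -= 0.5
        y = rng.beta(k / 2.0, beta)           # squared length of the new column
        u = rng.standard_normal(k)
        u /= np.linalg.norm(u)                # uniform on the unit sphere in R^k
        w = np.sqrt(y) * u
        A = np.linalg.cholesky(R)             # A @ A.T == R
        q = A @ w                             # new correlations; q' R^{-1} q = y < 1
        R = np.block([[R, q[:, None]],
                      [q[None, :], np.ones((1, 1))]])
    return R

# Example: one 5 x 5 matrix drawn uniformly (eta = 1)
print(random_corr_onion(5, eta=1.0, rng=7))
```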

  27. A simple analysis

  28. Proportion of correlation matrices

  29. Computational time analysis
