====== Cholesky Decomposition ======
  
//Q: I have an estimate of a variance-covariance matrix and I need the Cholesky decomposition of it. Does anybody have any suggestions on how to program this in GAMS?//
  
There are multiple ways to calculate the Cholesky factorization in GAMS: implement the algorithm directly in GAMS, solve a model with a suitable solver, or use the ''cholesky'' tool. All three methods are demonstrated in the model below. The number of observations is set to 15 so that the randomly generated covariance matrix is clearly positive definite; with only 10 observations it is very close to singular. Thanks to Oyvind Hoveid and Arne Drud for contributing to this model.
<code>
Sets
     j          'variables'                /j1*j10/
     n          'observations'             /n1*n15/
     b_l(j,j)   'lower triangular matrix';
  
Alias (j,jj,jjj);
  
b_l(j,jj) $ (ord(j) >= ord(jj)) = yes;

Parameters
     jn1(j,n)   'variable observations'
     b1(j,jj)   'covariance matrix';
  
* forming a covariance matrix (the diagonal of b1 temporarily holds the sample means)
jn1(j,n) = normal(0,1);
b1(j,j)  = sum(n, jn1(j,n))/card(n);
b1(j,jj) = sum(n, (jn1(j,n) - b1(j,j))*(jn1(jj,n) - b1(jj,jj)))/card(n);
  
* Symbols for the Cholesky decomposition algorithm
Sets
    l(j,j)  'defines the lower triangle'
    r(j)    'the pivot row'                  / #j /
    npr(j)  'rows not yet used for pivot'    / #j /
    npc(j)  'columns not yet used for pivot' / #j /;
Parameters
    w(j,j)  'factor of b'
    ww(j,j) 'part of b not yet factored'
    big     'the diagonal pivot at each point in the factorization'
    piv     'the square root of the pivot'
    tol     'pivot tolerance' / 1e-10 /;
  
* Cholesky decomposition algorithm in GAMS
ww(j,jj) = b1(j,jj);
loop(r,
   big = ww(r,r);
   if (big >= tol,
      piv = sqrt(big);
      l(npr,r) = yes;
* compute the pivot column of the factor, then update the remaining submatrix
      w(npr,r)    = ww(npr,r)/piv;
      npr(r)      = no;
      npc(r)      = no;
      ww(npr,npc) = ww(npr,npc) - w(npr,r)*w(npc,r);
      ww(npr,r)   = 0;
      ww(r,npc) = 0;
      ww(r,r)   = 0;
   );
);
display w;
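
* optional check (illustrative addition, not part of the original post):
* w should reproduce b1, i.e. b1(j,jj) = sum(jjj, w(j,jjj)*w(jj,jjj))
Parameter chkerr 'largest absolute factorization error';
chkerr = smax((j,jj), abs(b1(j,jj) - sum(jjj, w(j,jjj)*w(jj,jjj))));
display chkerr;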

* Using a CNS model to determine the Cholesky factors
Variables
   BB(j,jj) 'Cholesky factors';
Equations
   b1_eq(j,jj) 'Cholesky factorization';

b1_eq(b_l(j,jj))..
  b1(j,jj) =e= sum(jjj, (BB(j,jjj)$b_l(j,jjj))*(BB(jj,jjj)$b_l(jj,jjj)));

Model cholesky / b1_eq /;
BB.l(j,j) = 1;

option limrow=0, limcol=0, solprint=silent;
Solve cholesky using cns;

display bb.l;
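
* optional check (illustrative addition): the CNS solution equals w up to
* per-column sign flips, so compare absolute values on the lower triangle
Parameter dev 'largest deviation between |BB.l| and |w|';
dev = smax(b_l(j,jj), abs(abs(BB.l(j,jj)) - abs(w(j,jj))));
display dev;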
  
* Using the cholesky tool to determine the Cholesky factors
execute_unload 'a.gdx', j, b1;
execute 'cholesky a.gdx j b1 b.gdx w';
execute_load 'b.gdx', w;
display w;
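
* optional check (illustrative addition): apply the same test to the tool output
chkerr = smax((j,jj), abs(b1(j,jj) - sum(jjj, w(j,jjj)*w(jj,jjj))));
display chkerr;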
</code>