The MASM Forum Archive 2004 to 2012

General Forums => The Laboratory => Topic started by: Neo on April 18, 2010, 05:43:23 AM

Title: Diagonalization and Matrix Exponential
Post by: Neo on April 18, 2010, 05:43:23 AM
This is a long shot, but does anyone have anything for finding the largest several (~50) eigenvalues and eigenvectors of large (65536x65536), sparse (16 to 32 elements per row) matrices, given good initial guesses of the eigenvalues and eigenvectors?

While I'm asking about long shots, it'd also be useful to have something for a matrix exponential on a 2400x2400 dense complex matrix (though most of the entries are usually almost zero).

In order to beat matlab, it has to do both of those, plus the easy parts (which matlab is really slow at), in maybe 30 seconds.  Ideally, the entire thing would also be kept under 500MB of memory so that it can run on volunteer computers, unlike the ~8GB matlab takes.  Gotta love quantum physics calculations.  :wink
Title: Re: Diagonalization and Matrix Exponential
Post by: hutch-- on April 18, 2010, 06:26:03 AM
Neil,

Is there any way to predict the distribution pattern of the data? 500 meg is a lot of data, but there may be a trick with multicore: have different cores read different parts of the same memory, with the read/write addresses far enough apart that they don't influence each other.
Title: Re: Diagonalization and Matrix Exponential
Post by: Neo on April 18, 2010, 08:17:24 AM
There are enough of these calculations, each independent of the others, to completely parallelize it in matlab too, so parallelism alone doesn't count for relative speedup; but yes, the app will be multi-threaded.  :wink

As for predicting distributions of data: in the main version, each row r of the 2^16 x 2^16 matrix will have known, non-zero elements only in columns r (the diagonal), r XOR 2^0, r XOR 2^1, r XOR 2^2, ..., r XOR 2^15.  This ends up forming 32 diagonals in the matrix, each with half of its elements zero.  Because of this structure, matrix multiplications can be done without ever constructing the matrix, which is one of the reasons 500MB of memory would be way more than enough for that part.  It does lend itself toward a custom vectorization I could describe, but I was hoping someone might have something already.
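The matrix-free product can be sketched like this (a minimal sketch, not the actual code: the `diag` and `offdiag` value arrays are placeholders for whatever the physics actually fills in, and the column index is computed with the same XOR rule described above):

```python
import numpy as np

def matvec(x, diag, offdiag, nbits):
    """Matrix-free y = A @ x for a (2^nbits x 2^nbits) matrix whose
    row r has nonzeros only at columns r and r XOR 2^k, k = 0..nbits-1.
    diag[r] is the diagonal entry of row r; offdiag[k][r] is the entry
    at (r, r XOR 2^k).  The matrix itself is never built."""
    n = 1 << nbits
    idx = np.arange(n)
    y = diag * x
    for k in range(nbits):
        # column r XOR 2^k is just a permutation of the index vector
        y += offdiag[k] * x[idx ^ (1 << k)]
    return y
```

Each of the nbits+1 terms is a plain elementwise multiply plus a gather by XOR-permuted indices, so the whole product is O(n * nbits) and vectorizes well.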

Also, the diagonalization needs to be done many times on a sequence of matrices between which the eigenvalues and eigenvectors usually change slowly or predictably, which is why I think it should be feasible to exploit good initial guesses to diagonalize faster than matlab's repeated random vector guesses.
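One way to exploit those good initial guesses is subspace iteration seeded with the previous step's eigenvectors instead of random vectors; when the eigenvectors change slowly between matrices, only a few iterations are needed.  A minimal sketch, assuming a symmetric matrix supplied as a matvec that accepts a block of column vectors:

```python
import numpy as np

def warm_subspace_iteration(matvec, V0, iters=50):
    """Subspace iteration warm-started from V0 (e.g. the previous
    matrix's eigenvectors).  Converges to the largest-magnitude
    eigenpairs; few iterations suffice when V0 is already close."""
    V, _ = np.linalg.qr(V0)           # orthonormalize the initial guess
    for _ in range(iters):
        V, _ = np.linalg.qr(matvec(V))  # apply A, re-orthonormalize
    # Rayleigh-Ritz: eigenpairs of the small projected matrix
    T = V.T @ matvec(V)
    evals, S = np.linalg.eigh(T)
    return evals, V @ S
```

A Lanczos or block-Davidson method with the same warm start would converge faster still; this is just the simplest illustration of replacing matlab's random starting vectors with the previous solution.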

The matrix exponential has me a bit stumped.  matlab literally uses a degree-13 rational approximation to the matrix exponential.  The known structure here is that if the matrix is n^2 x n^2, there are n^2 particular elements that are usually much larger in magnitude than all other values in the matrix.  I'll be looking into whether it's safe to set the other values to zero and just use an n x n matrix instead, but there are definitely cases where it's not safe to do so.
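For reference, the standard scheme behind that rational approximation is scaling and squaring: scale A down until its norm is small, approximate exp of the scaled matrix, then square back up.  A minimal sketch using a truncated Taylor series in place of the degree-13 Pade approximant matlab uses (simpler, less accurate, but shows the structure; works for real or complex matrices):

```python
import numpy as np

def expm_ss(A, order=12):
    """Matrix exponential by scaling and squaring with a truncated
    Taylor series (a simple stand-in for the Pade-based scheme)."""
    norm = np.linalg.norm(A, 1)
    # scale so that ||A / 2^s|| <= 1, making the Taylor series accurate
    s = max(0, int(np.ceil(np.log2(norm)))) if norm > 0 else 0
    B = A / (2.0 ** s)
    n = A.shape[0]
    E = np.eye(n, dtype=B.dtype)
    term = np.eye(n, dtype=B.dtype)
    for k in range(1, order + 1):
        term = term @ B / k       # B^k / k!
        E = E + term
    for _ in range(s):            # undo the scaling: exp(A) = exp(B)^(2^s)
        E = E @ E
    return E
```

If most entries really are near zero, the n x n reduction mentioned above (dropping the small entries) would shrink the matrix products in this loop dramatically; this sketch just shows the dense baseline.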

The irony is that matlab takes the bulk of the time just constructing the matrix whose exponential is to be taken, which is the easy part.   :lol