Diagonalization and Matrix Exponential

Started by Neo, April 18, 2010, 05:43:23 AM


Neo

This is a long shot, but anyone have anything for finding the largest several (~50) eigenvalues and eigenvectors of large (65536x65536), sparse (16 to 32 elements per row) matrices with good initial guesses of the eigenvalues and eigenvectors?

While I'm asking about long shots, it'd also be useful to have something for a matrix exponential on a 2400x2400 dense complex matrix (though most of the entries are usually almost zero).

In order to beat matlab, it has to do both of those, plus the easy parts matlab is really slow at, in maybe 30 seconds.  Ideally, the whole thing would also be kept under 500MB of memory so that it can be run on volunteer computers, unlike the ~8GB matlab takes.  Gotta love quantum physics calculations.  :wink

hutch--

Neil,

Is there any way to predict the distribution pattern of the data? 500 meg is a lot of data, but there may be a trick with multicore: reading different parts of the same memory with the current read/write addresses far enough apart not to influence each other.

Neo

Enough of these calculations are independent that the whole thing can be completely parallelized in matlab as well, so parallelism alone doesn't count for relative speedup, but yes, the app will be multi-threaded.  :wink

As for predicting distributions of data: in the main version, each row, r, of the 2^16 x 2^16 matrix will have known non-zero elements in columns r (the diagonal), r XOR 2^0, r XOR 2^1, ..., r XOR 2^15.  This structure forms 32 diagonals in the matrix, each with half its elements zero.  Because of it, matrix multiplications can be done without ever constructing the matrix, which is one of the reasons 500MB of memory would be way more than enough for that part.  It also lends itself to a custom vectorization I could describe, but I was hoping someone might have something already.
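The matrix-free multiply described above can be sketched roughly like this (Python for clarity, not performance; the `diag`/`offdiag` storage layout is my assumption, not something from the post). Row r touches only x[r] and x[r XOR 2^k], so no matrix is ever built:

```python
# Hedged sketch of a matrix-free y = A*x for the XOR structure above:
# A[r][r] = diag[r], and A[r][r ^ (1 << k)] = offdiag[k][r] for each bit k.
# 'diag' and 'offdiag' are an assumed storage scheme, not the author's code.
def xor_matvec(diag, offdiag, x, nbits):
    n = 1 << nbits
    # diagonal contribution
    y = [diag[r] * x[r] for r in range(n)]
    # one pass per bit: column index is just the row index with bit k flipped
    for k in range(nbits):
        bit = 1 << k
        for r in range(n):
            y[r] += offdiag[k][r] * x[r ^ bit]
    return y
```

At nbits = 16 this is 17 vectors of length 2^16, under 10MB in doubles, which is why the 500MB budget is no problem for this part. (A symmetric matrix would additionally need offdiag[k][r] == offdiag[k][r ^ bit]; the sketch doesn't enforce that.)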

Also, the diagonalization needs to be done many times on a sequence of matrices whose eigenvalues and eigenvectors usually change slowly or predictably between steps, which is why I think it should be feasible to exploit good initial guesses and diagonalize faster than matlab's repeated random starting vectors.
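To show where a warm start plugs in, here is a minimal power-iteration sketch that accepts the previous step's eigenvector as v0. This only finds the dominant eigenpair; getting ~50 eigenpairs would need something like block Lanczos with deflation, so treat this purely as an illustration of the warm-start idea, not a proposed method:

```python
# Hedged sketch: power iteration seeded with an initial guess v0.
# 'matvec' is any function computing A*v (e.g. the matrix-free XOR multiply).
# A v0 already close to the eigenvector needs far fewer iterations than
# a random restart.
import math

def power_iteration(matvec, v0, tol=1e-10, max_iter=1000):
    # normalize the warm-start vector
    norm = math.sqrt(sum(x * x for x in v0))
    v = [x / norm for x in v0]
    lam_old = 0.0
    for _ in range(max_iter):
        w = matvec(v)
        # Rayleigh quotient estimate of the eigenvalue
        lam = sum(vi * wi for vi, wi in zip(v, w))
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        if abs(lam - lam_old) < tol:
            break
        lam_old = lam
    return lam, v
```

Convergence rate depends on the eigenvalue gap, but the iteration count scales with how far v0 is from the answer, which is exactly what slowly-changing eigenvectors buy you.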

The matrix exponential has me a bit stumped.  matlab literally uses a degree-13 rational approximation to the matrix exponential.  The known structure here is that if the matrix is n^2 x n^2, there are n^2 particular elements that are usually much larger in magnitude than all other values in the matrix.  I'll be looking into whether it's safe to set the other values to zero and just use an n x n matrix instead, but there are definitely cases where it's not safe to do so.

The irony is that matlab takes the bulk of the time just constructing the matrix whose exponential is to be taken, which is the easy part.   :lol