
Inverse of square block diagonal matrix


Lam

May 1, 2013, 4:59:10 AM
I am trying to speed up the calculation of the inverse of a square block-diagonal matrix.

Currently I am taking the inverse of the blocks one by one. Here is the code:

function [ Ainv ] = blkinv( A, m )
% Invert a block-diagonal matrix A built from m-by-m blocks, one block at a time.
n = size(A, 1);   % note: rows(A) is Octave-only; size(A, 1) works in MATLAB too
C = cell(n/m, 1);
k = 1;
for i = 1:m:n
    C{k} = inv(full(A(i:i+m-1, i:i+m-1)));
    k = k + 1;
end
% Seeding blkdiag with an empty sparse matrix keeps the result sparse.
Ainv = blkdiag(sparse(0,0), C{:});
end
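The idea the function relies on — that the inverse of a block-diagonal matrix is the block-diagonal matrix of the blockwise inverses — can be checked with a small sketch. This is in Python/NumPy rather than MATLAB, purely for illustration; all names and sizes here are made up:

```python
import numpy as np
from scipy.linalg import block_diag  # plays the role of MATLAB's blkdiag

rng = np.random.default_rng(0)
m, k = 3, 4  # block size and number of blocks (illustrative values)
# Diagonally dominant blocks, so every block is safely invertible.
blocks = [rng.random((m, m)) + m * np.eye(m) for _ in range(k)]
A = block_diag(*blocks)

# Invert block by block, then reassemble: far cheaper than inverting A whole.
Ainv_blockwise = block_diag(*[np.linalg.inv(B) for B in blocks])

assert np.allclose(Ainv_blockwise, np.linalg.inv(A))
```

The blockwise route costs O(k·m³) instead of O((k·m)³) for a dense inverse of the full matrix.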

I wonder how I can compute these inverses in parallel.
Is there a "multi-threaded function" that could help?

Lam

May 1, 2013, 5:59:08 AM
I just made another attempt:

function [ R ] = blkinv2( A, m )
% Vectorized version: extract every m-by-m diagonal block of A in one shot.
% Assumes each block is fully populated, so nonzeros(A) lists the block
% entries column by column.
n = size(A, 1);   % note: rows(A) is Octave-only
B = mat2cell(full(reshape(nonzeros(A), m, [])), m, ones(n/m, 1) * m)';
D = cellfun(@(X) sparse(inv(X)), B, 'UniformOutput', false);
R = blkdiag(D{:});
end

This is much faster, but still not parallel...
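For comparison, the same "invert all blocks in one call" idea can be sketched outside MATLAB in Python/NumPy (illustrative only): `np.linalg.inv` broadcasts over a stacked (k, m, m) array, which is the closest analogue of the `cellfun` version, and the vendor BLAS underneath may already multithread it:

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
m, k = 3, 4  # illustrative sizes
# Stack the diagonal blocks into one (k, m, m) array.
blocks = np.stack([rng.random((m, m)) + m * np.eye(m) for _ in range(k)])

# np.linalg.inv broadcasts over the leading axis: one call inverts every block.
inv_blocks = np.linalg.inv(blocks)

A = block_diag(*blocks)
assert np.allclose(block_diag(*inv_blocks), np.linalg.inv(A))
```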

"Lam" wrote in message <klqlgu$n97$1...@newscl01ah.mathworks.com>...

Bruno Luong

May 1, 2013, 7:07:09 AM

Lam

May 1, 2013, 8:21:07 AM
Thank you for your information. It seems it will take me some time to study it.

By the way, do you know of any other free toolboxes for parallel computation?
It would be wonderful if there were a parallel version of "cellfun".

"Bruno Luong" <b.l...@fogale.findmycountry> wrote in message <klqt0t$ckj$1...@newscl01ah.mathworks.com>...

Steven_Lord

May 1, 2013, 9:41:24 AM


"Lam " <lam....@gmail.com> wrote in message
news:klqlgu$n97$1...@newscl01ah.mathworks.com...
> I am trying to speed up the calculation of inverse of square block
> diagonal matrix.

Why do you need to compute the inverse? If you're doing it to solve a system
of equations, stop and use backslash instead.
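The distinction can be sketched in a few lines (Python/NumPy used here only as a neutral illustration; `np.linalg.solve` plays the role of MATLAB's backslash, and the matrix is made up):

```python
import numpy as np

rng = np.random.default_rng(2)
# A well-conditioned test matrix (diagonally dominant), illustrative only.
A = rng.random((50, 50)) + 50 * np.eye(50)
b = rng.random(50)

x_solve = np.linalg.solve(A, b)  # analogue of MATLAB's A \ b: factor and solve
x_inv = np.linalg.inv(A) @ b     # the discouraged explicit-inverse route

# Both agree here, but solve() is cheaper and more accurate in general,
# especially as the matrix becomes ill-conditioned.
assert np.allclose(x_solve, x_inv)
assert np.allclose(A @ x_solve, b)
```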

*snip*

--
Steve Lord
sl...@mathworks.com
To contact Technical Support use the Contact Us link on
http://www.mathworks.com

Lam

May 1, 2013, 9:54:09 AM
Running time of full matrix multiplication would be of order O(n^3).
If it is sparse, it would be much less;
I guess it would be around O(n^2) for my case.

I am not sure how fast MATLAB's backslash is for my case.
For full-matrix Gaussian elimination, the running time would be of order O(n^3).

If I calculate the inverse first, then the running time can be reduced in the long run.
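A back-of-the-envelope comparison of the costs involved (plain Python arithmetic, illustrative only: the exponents are the usual dense-algebra operation-count estimates, not measured timings, and the sizes are invented):

```python
# An n-by-n block-diagonal matrix built from k = n/m blocks of size m.
n, m = 1200, 6
k = n // m

dense_solve = n**3        # Gaussian elimination on the full matrix, per solve
blockwise_inv = k * m**3  # invert each m-by-m block separately, done once
apply_inv = k * m**2      # multiply the stored block inverses by a vector

# Exploiting the block structure is the real win; applying a stored
# inverse is then only a (block) matrix-vector product per step.
assert apply_inv < blockwise_inv < dense_solve
```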

"Steven_Lord" <sl...@mathworks.com> wrote in message <klr623$af9$1...@newscl01ah.mathworks.com>...

Steven_Lord

May 1, 2013, 10:11:34 AM


"Lam " <lam....@gmail.com> wrote in message
news:klr6q1$d19$1...@newscl01ah.mathworks.com...
> Running time of full matrix multiplication would be order O(n^3)
> If it is sparse, it would be much less.
> I guess it would be around O(n^2) for my case.
>
> I am not sure about how fast matlab do the backslash for my case.
> For full matrix gaussian elimination, the running time would be order
> O(n^3)
>
> If I calculate the inverse first, then the running time can be reduced in
> the long run.

http://www.mathworks.com/help/matlab/ref/inv.html

"In practice, it is seldom necessary to form the explicit inverse of a
matrix. A frequent misuse of inv arises when solving the system of linear
equations Ax = b. One way to solve this is with x = inv(A)*b. A better way,
from both an execution time and numerical accuracy standpoint, is to use the
matrix division operator x = A\b. This produces the solution using Gaussian
elimination, without forming the inverse. See mldivide (\) for further
information."

If you're solving many systems (which I'm guessing is the case from your
statement about "the long run") then there are other alternatives that I
would explore before going to INV.

1) Concatenate all your right-hand side vectors together into a matrix and
solve using \ with your coefficient matrix and the matrix of right-hand
sides all at once.
2) Use one of the iterative solvers listed in the last section of this page.
This has the benefit that you may not even need to explicitly construct the
coefficient matrix, if you can write a function to compute A*x without it.

http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
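Option 2 can be sketched matrix-free: the iterative solver only ever needs a routine that computes A*x. This sketch uses Python/SciPy purely for illustration (MATLAB's PCG accepts a function handle the same way); the tridiagonal operator here is an invented example:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 100
diag = np.full(n, 4.0)

def matvec(x):
    # Apply A*x without ever forming A; here A is tridiag(-1, 4, -1),
    # which is symmetric positive definite, so CG applies.
    y = diag * x
    y[:-1] -= x[1:]
    y[1:] -= x[:-1]
    return y

A_op = LinearOperator((n, n), matvec=matvec)
b = np.ones(n)
x, info = cg(A_op, b)  # info == 0 means the iteration converged

assert info == 0
assert np.allclose(matvec(x), b, atol=1e-3)
```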

3) Factor the matrix first (CHOL, LU, QR, ILU, etc.) and solve the two
triangular systems either with \ or LINSOLVE (telling LINSOLVE the matrices
are triangular so it goes right to the triangular solver.)

http://www.mathworks.com/help/matlab/matrix-decomposition.html
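Option 3 — pay for the factorization once, then do only cheap triangular solves — can be sketched as follows (Python/SciPy for illustration; `lu_factor`/`lu_solve` stand in for MATLAB's LU plus triangular backslash, and the matrix is made up):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(4)
A = rng.random((30, 30)) + 30 * np.eye(30)  # illustrative well-conditioned matrix

lu, piv = lu_factor(A)  # O(n^3) work, paid exactly once

for _ in range(3):      # each subsequent solve is only O(n^2)
    b = rng.random(30)
    x = lu_solve((lu, piv), b)
    assert np.allclose(A @ x, b)
```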

In general, you SHOULD NOT INVERT your coefficient matrix unless you know
that you specifically need the inverse and you know that your matrix is
well-conditioned.

Lam

May 1, 2013, 11:45:10 AM
Thank you for your comprehensive information.

However, it seems that:
1) the iterative methods don't work for my situation
2) after testing, factoring the matrix doesn't help;
e.g. performing U\(L\y) takes almost the same time as A\y (strange!)

Actually I am writing a time-stepping PDE solver,
so my solution is something like (A^-N)(y), and N is very large.

Fortunately the condition number isn't very large.

Lam


"Steven_Lord" <sl...@mathworks.com> wrote in message <klr7qm$gon$1...@newscl01ah.mathworks.com>...