news:klr6q1$d19$1...@newscl01ah.mathworks.com...
> Running time of full matrix multiplication would be order O(n^3)
> If it is sparse, it would be much less.
> I guess it would be around O(n^2) for my case.
>
> I am not sure how fast MATLAB does the backslash for my case.
> For full-matrix Gaussian elimination, the running time would be order
> O(n^3).
>
> If I calculate the inverse first, then the running time can be reduced in
> the long run.
http://www.mathworks.com/help/matlab/ref/inv.html
"In practice, it is seldom necessary to form the explicit inverse of a
matrix. A frequent misuse of inv arises when solving the system of linear
equations Ax = b. One way to solve this is with x = inv(A)*b. A better way,
from both an execution time and numerical accuracy standpoint, is to use the
matrix division operator x = A\b. This produces the solution using Gaussian
elimination, without forming the inverse. See mldivide (\) for further
information."
If you're solving many systems (which I'm guessing is the case from your
statement about "the long run"), then there are other alternatives that I
would explore before going to INV:
1) Concatenate all your right-hand side vectors together into a matrix and
solve using \ with your coefficient matrix and the matrix of right-hand
sides all at once (see the first example below this list).
2) Use one of the iterative solvers listed in the last section of the page
below. This has the benefit that you may not even need to explicitly
construct the coefficient matrix, if you can write a function to compute
A*x without it (see the second example below this list).
http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
3) Factor the matrix first (CHOL, LU, QR, ILU, etc.) and solve the two
triangular systems either with \ or LINSOLVE, telling LINSOLVE the matrices
are triangular so it goes right to the triangular solver (see the third
example below this list).
http://www.mathworks.com/help/matlab/matrix-decomposition.html
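For option 1, a sketch with hypothetical right-hand sides b1, b2, b3
(replace these with however you collect yours):

  B = [b1, b2, b3];     % b1, b2, b3: your right-hand side column vectors
  X = A\B;              % column k of X solves A*x = B(:,k);
                        % A is factored only once for all of them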
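For option 2, a sketch using GMRES with a function handle; applyA is a
hypothetical routine of yours that returns A*x without ever storing A (if
A is symmetric positive definite, PCG would be the natural choice instead):

  afun = @(x) applyA(x);              % computes the product A*x
  x = gmres(afun, b, [], 1e-8, 200);  % no restart, tol 1e-8, up to 200 iterations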
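For option 3, a sketch using LU with a row permutation and LINSOLVE's
triangular options. This assumes a full (dense) A, since LINSOLVE wants
full matrices; with a sparse A, just use \ on the factors, which detects
the triangular structure anyway. For symmetric positive definite A, CHOL
would be the usual pick instead of LU.

  [L, U, p] = lu(A, 'vector');    % factor once: A(p,:) = L*U
  optsL.LT = true;                % tell LINSOLVE L is lower triangular
  optsU.UT = true;                % tell LINSOLVE U is upper triangular
  y = linsolve(L, b(p), optsL);   % forward substitution
  x = linsolve(U, y, optsU);      % back substitution

Reuse L, U and p for every new right-hand side; only the two cheap
triangular solves are repeated.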
In general, you SHOULD NOT INVERT your coefficient matrix unless you know
that you specifically need the inverse and you know that your matrix is
well-conditioned.
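If you want a quick sanity check on conditioning before deciding, CONDEST
gives a cheap estimate that also works for sparse matrices (the 1e12
threshold below is just an arbitrary illustration, not a hard rule):

  c = condest(A);       % 1-norm condition number estimate
  if c > 1e12
      warning('A looks ill-conditioned (condest ~ %g).', c);
  end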