Preconditioners not lowering the number of GMRES iterations


Lucas Myers

Apr 25, 2023, 1:29:31 PM4/25/23
to deal.II User Group
Hi everyone,

TL;DR:
I'm trying to precondition my system, which can be solved with GMRES (though with an iteration count that grows with problem size), but the standard preconditioners either increase the number of iterations or cause the solver not to converge.

Details:
I'm trying to solve a phase-field crystal system, whose time evolution equation I'm attaching as `pfc_time_evolution.png`. By introducing auxiliary variables (`auxiliary_variables.png`), it can be written as a second-order coupled equation (`reduced_pfc_time_evolution.png`). This can be discretized in time and linearized such that the linear operator looks like `block_continuous_operator.png` with dt the timestep and theta a time-stepping discretization parameter. Finally, this can be discretized in space so that the block-form of the relevant linear operator is given by `block_discrete_operator.png`. The M blocks are mass matrices and the L blocks are Laplacian operators. The rest are some sum of Laplacians, mixed mass matrices, and nonlinear terms which couple to the fields.

As a first pass, I'm trying to solve the entire matrix in one go, and this works fine with the GMRES method. However, the number of GMRES iterations increases as the number of DoFs increases, so I would like to precondition to mitigate this. It's my understanding that the AMG preconditioner is a good black-box choice for most problems, but when I try to use it the GMRES solver does not converge (I've set the maximum number of iterations to n_dofs). Additionally, a simple Jacobi preconditioner increases the number of iterations relative to no preconditioner for two different grid sizes. Finally, the ILU preconditioner decreases the number of GMRES iterations for a small grid, but then fails to converge for a large grid. What could be going on here?

This is for the entire matrix, but as an aside: if it is somehow helpful to use the block structure of the matrix for preconditioning (it's 3x3, so I can't use a Schur complement, I think), I'd be interested if anyone has any tips.

Thanks so much for any help!

- Lucas
reduced_pfc_time_evolution.png
block_continuous_operator.png
auxiliary_variables.png
block_discrete_operator.png
pfc_time_evolution.png

Wolfgang Bangerth

Apr 25, 2023, 7:11:30 PM4/25/23
to dea...@googlegroups.com

Lucas,

> I'm trying to precondition my system, which can be solved with GMRES
> (though with an iteration count that grows with problem size), but the
> standard preconditioners either increase the number of iterations or
> cause the solver not to converge.

Preconditioner design is difficult. This is why I recorded so many
lectures on it :-)

The best approach to solving block systems *efficiently* is to use block
preconditioners. They can have multiple levels of Schur complementing --
in each step, you reduce the size of the problem by one block row and
column. You can also call a 2x2 part of the matrix a block in itself --
for example, for your matrix you might consider splitting it as

[B X]
[Y A]

where
X = [C, D]
Y = [L_psi, 0]^T
A = [M_chi, 0; L_chi, M_phi]

Then you apply the Schur complement reduction, which should be
relatively straightforward because A is invertible. In fact, because A
is block lower triangular, applying its inverse only requires solving
two mass matrix systems, each of which is cheap. Then you'd construct a
preconditioner in the same way as discussed in the "Possibilities for
extensions" section of step-22, which uses the fact that you can form a
Schur complement.

Whether that results in a good preconditioner is a separate question,
and one on which it is possible to spend a year or two. But it's
probably worth investigating.

Separately, there is of course also the possibility of using a direct
solver. If you're running in 2d, that would be my first choice -- up to
~200k DoFs, direct solvers are very competitive.

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/