Error with PETScWrappers::PreconditionILU


Hermes Sampedro
Mar 15, 2022, 12:39:26 PM
to deal.II User Group

Dear all, 

I am getting a runtime error that I cannot really understand when calling PETScWrappers::PreconditionILU preconditioner(system_matrix). However, it seems to work with PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix).

May I ask for help understanding/solving this issue? The solver function and the error that I get are the following:

void LaplaceProblem<dim>::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                             mpi_communicator);

  SolverControl cn(completely_distributed_solution.size(),
                   1e-8 * system_rhs.l2_norm());
  PETScWrappers::SolverGMRES solver(cn, mpi_communicator);

  PETScWrappers::PreconditionILU preconditioner(system_matrix);

  solver.solve(system_matrix, completely_distributed_solution,
               system_rhs, preconditioner);

  constraints.distribute(completely_distributed_solution);
  locally_relevant_solution = completely_distributed_solution;
}


[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------

[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers

[0]PETSC ERROR: Could not locate a solver package for factorization type ILU and matrix type mpiaij.

[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.

[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020 

[0]PETSC ERROR: ./waveLaplaceSolver on a  named gbarlogin1 by hsllo Fri Mar 11 11:05:23 2022

[0]PETSC ERROR: Configure options --prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0 --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90 --with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib --with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --download-scalapack=1 --download-mumps=1

[0]PETSC ERROR: #1 MatGetFactor() line 4492 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/mat/interface/matrix.c

[0]PETSC ERROR: #2 PCSetUp_ILU() line 133 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/impls/factor/ilu/ilu.c

[0]PETSC ERROR: #3 PCSetUp() line 894 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c

----------------------------------------------------

Exception on processing: 

--------------------------------------------------------

An error occurred in line <431> of file </zhome/32/9/115503/dealii-candi/tmp/unpack/deal.II-v9.3.1/source/lac/petsc_precondition.cc> in function

    void dealii::PETScWrappers::PreconditionILU::initialize(const dealii::PETScWrappers::MatrixBase&, const dealii::PETScWrappers::PreconditionILU::AdditionalData&)

The violated condition was: 

    ierr == 0

Additional information: 

    deal.II encountered an error while calling a PETSc function.

    The description of the error provided by PETSc is "See

    https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for

    possible LU and Cholesky solvers".

    The numerical value of the original error code is 92.




Thank you very much

Regards, 

H

Timo Heister
Mar 15, 2022, 1:45:29 PM
to dea...@googlegroups.com
Hi,

> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
> [0]PETSC ERROR: Could not locate a solver package for factorization type ILU and matrix type mpiaij.

This means that ILU is not available for parallel matrices (mpiaij).
You need to pick a different preconditioner. You can try
PreconditionBlockJacobi, which is a blocked ILU and probably what you
wanted to do.
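A minimal sketch of the change, based on the solve() function posted above (same variable names, assuming the same class members): only the preconditioner line differs.

```cpp
// Sketch: replace PreconditionILU with PreconditionBlockJacobi, which
// works on parallel (mpiaij) matrices by applying an incomplete
// factorization to the diagonal block owned by each MPI process.
PETScWrappers::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                           mpi_communicator);

SolverControl cn(completely_distributed_solution.size(),
                 1e-8 * system_rhs.l2_norm());
PETScWrappers::SolverGMRES solver(cn, mpi_communicator);

// Parallel-safe replacement for PETScWrappers::PreconditionILU:
PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix);

solver.solve(system_matrix, completely_distributed_solution,
             system_rhs, preconditioner);
```

On one MPI process this behaves like plain ILU(0); with more processes the preconditioner weakens somewhat because couplings between processes are dropped, so iteration counts may grow with the process count.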



--
Timo Heister
http://www.math.clemson.edu/~heister/

Hermes Sampedro
Mar 18, 2022, 11:01:35 AM
to deal.II User Group
Dear Timo, 

Thank you for your answer. Is there any other option for ILU with parallel matrices apart from block Jacobi?

Thank you
Regards

Timo Heister
Mar 19, 2022, 9:33:04 PM
to dea...@googlegroups.com
PreconditionBlockJacobi is a parallel ILU. I am not sure what you want
to do instead.
(Also take a look at the documentation:
https://www.dealii.org/developer/doxygen/deal.II/classPETScWrappers_1_1PreconditionBlockJacobi.html
)
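If block-ILU converges poorly as the process count grows, one alternative available here is algebraic multigrid: the PETSc configure line posted above includes --download-hypre=1, and deal.II wraps hypre's BoomerAMG as PETScWrappers::PreconditionBoomerAMG. A hedged sketch (the AdditionalData settings are illustrative defaults, not tuned values):

```cpp
// Sketch: BoomerAMG as a drop-in replacement preconditioner for the
// GMRES solve above. Requires PETSc built with hypre, which the
// configure options in this thread include (--download-hypre=1).
PETScWrappers::PreconditionBoomerAMG::AdditionalData amg_data;
amg_data.symmetric_operator = false; // set to true for a symmetric problem

PETScWrappers::PreconditionBoomerAMG preconditioner(system_matrix, amg_data);

solver.solve(system_matrix, completely_distributed_solution,
             system_rhs, preconditioner);
```

Unlike block-ILU, AMG's effectiveness does not degrade just because the matrix is split across more processes, which is why it is the usual choice for large elliptic problems (see step-40 in the deal.II tutorials).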
