Configuration of PETSc


Konrad Simon

Sep 20, 2019, 4:16:05 AM
to deal.II User Group
Dear deal.II community,

I am using deal.II with PETSc and Trilinos. However, when I use PETSc's PreconditionILU I get an error suggesting that a solver package is missing (with Trilinos it works). PETSc's PreconditionAMG works fine (although not very efficiently for my problem).
Do I need to do any special configuration steps for PETSc? I followed the instructions documented on the deal.II page on how to configure PETSc: https://www.dealii.org/current/external-libs/petsc.html
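
For reference, this is the configure line I used (as echoed in the error output below):

./configure --with-shared-libraries=1 --with-x=0 --with-mpi=yes --download-hypre=yes --with-64-bit-indices --with-debugging=yes --with-hypre=yes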

Best,
Konrad

This is the error:

Running using PETSc. 
Number of active cells: 262144 
Total number of cells: 24865 (on 7 levels) 
Number of degrees of freedom: 1609920 (811200+798720) 
[0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- 
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers 
[0]PETSC ERROR: Could not locate a solver package. Perhaps you must ./configure with --download-<package> 
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.9.4, Sep, 11, 2018  
[0]PETSC ERROR: main on a x86_64 named thunder5 by u290231 Fri Sep 20 10:00:23 2019 
[0]PETSC ERROR: Configure options --with-shared-libraries=1 --with-x=0 --with-mpi=yes --download-hypre=yes --with-64-bit-indices --with-debugging=yes --with-hypre=yes 
[0]PETSC ERROR: #1 MatGetFactor() line 4328 in /scratch/cen/numgeo/lib/petsc-3.9.4/src/mat/interface/matrix.c 
[0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in /scratch/cen/numgeo/lib/petsc-3.9.4/src/ksp/pc/impls/factor/ilu/ilu.c 
[0]PETSC ERROR: #3 PCSetUp() line 923 in /scratch/cen/numgeo/lib/petsc-3.9.4/src/ksp/pc/interface/precon.c 
--------------------------------------------------------- 
TimerOutput objects finalize timed values printed to the 
screen by communicating over MPI in their destructors. 
Since an exception is currently uncaught, this 
synchronization (and subsequent output) will be skipped to 
avoid a possible deadlock. 
--------------------------------------------------------- 
WARNING! There are options you set that were not used! 
WARNING! could be spelling mistake, etc! 
Option left: name:-p value: ../MsFEComplex/parameter.in 
ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping MPI_Finalize() to avoid a deadlock. 


---------------------------------------------------- 
Exception on processing:  

-------------------------------------------------------- 
An error occurred in line <421> of file </scratch/cen/numgeo/lib_compile/dealii-9.1.1/source/lac/petsc_precondition.cc> in function 
   void dealii::PETScWrappers::PreconditionILU::initialize(const dealii::PETScWrappers::MatrixBase&, const dealii::PETScWrappers::PreconditionILU::AdditionalData&) 
The violated condition was:  
   ierr == 0 
Additional information:  
deal.II encountered an error while calling a PETSc function. 
The description of the error provided by PETSc is "See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers". 
The numerical value of the original error code is 92. 
-------------------------------------------------------- 

Aborting! 
---------------------------------------------------- 
-------------------------------------------------------------------------- 
Primary job  terminated normally, but 1 process returned 
a non-zero exit code. Per user-direction, the job has been aborted. 
-------------------------------------------------------------------------- 
-------------------------------------------------------------------------- 
mpirun detected that one or more processes exited with non-zero status, thus causing 
the job to be terminated. The first process to do so was: 

 Process name: [[37149,1],0] 
 Exit code:    1 
--------------------------------------------------------------------------

Toni Vidal

Sep 20, 2019, 6:52:23 AM
to deal.II User Group
Hello Konrad,

I think PETSc's PCILU does not work in parallel. You could use the preconditioner PETScWrappers::PreconditionBlockJacobi, which applies an ILU preconditioner within each per-process block.
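
For example, a minimal sketch, assuming the usual deal.II setup with an assembled parallel matrix and vectors (the names system_matrix, solution, system_rhs and mpi_communicator are placeholders):

#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/solver_control.h>

using namespace dealii;

// Block Jacobi: each MPI process factorizes (ILU) only the diagonal
// block of the matrix rows it owns, so it runs in parallel.
PETScWrappers::PreconditionBlockJacobi preconditioner;
preconditioner.initialize(system_matrix);

SolverControl solver_control(1000, 1e-12);
PETScWrappers::SolverCG solver(solver_control, mpi_communicator);
solver.solve(system_matrix, solution, system_rhs, preconditioner);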


You could also use HYPRE's parallel ILU through PETSc if you have it installed.
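
A sketch of that route (assuming your PETSc build includes Hypre, as your configure line shows, and that the preconditioner choice is left to PETSc's options database; Euclid is Hypre's parallel ILU, and my_program is a placeholder):

mpirun -np 4 ./my_program -pc_type hypre -pc_hypre_type euclid

Note that deal.II's PETScWrappers solvers attach the preconditioner object you pass them, so these command-line options only take effect where the PC is not set explicitly in code.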


Or any other preconditioner that works in parallel.

On Friday, September 20, 2019 at 10:16:05 AM UTC+2, Konrad Simon wrote:

Konrad Simon

Sep 20, 2019, 7:19:36 AM
to deal.II User Group
Hi Toni,

Seems like I missed that little note in the documentation. Thank you :-)

Best,
Konrad