Dear Bruno,
Thank you again for your answer.
I have now managed to solve a system with 3.5 million DoFs using the same solver I posted above, SparseDirectMUMPS. In release mode, assembly takes a few minutes instead of hours; however, the solve function takes approximately 1.5 h per frequency iteration using 40 MPI processes in parallel (similar to step-40).
I was expecting better performance when running in parallel with 40 processes, especially since I need to solve for several frequencies. Would you also expect it to be faster? Would switching to the Krylov solver you suggested help?
Thank you
Regards,
H
To view this discussion on the web visit https://groups.google.com/d/msgid/dealii/b78563e3-8746-4107-b1d3-95f4ff8289c9n%40googlegroups.com.
Dear Bruno,
I have been reading the examples and documents you pointed out. I tried to use SolverGMRES with PreconditionILU. However, I get a runtime error that I cannot really understand when calling PETScWrappers::PreconditionILU preconditioner(system_matrix), whereas it works with PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix). The solve function and the error I get are the following:
template <int dim>
void LaplaceProblem<dim>::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                             mpi_communicator);

  SolverControl cn(completely_distributed_solution.size(),
                   1e-8 * system_rhs.l2_norm());
  PETScWrappers::SolverGMRES solver(cn, mpi_communicator);

  PETScWrappers::PreconditionILU preconditioner(system_matrix); // <- fails here
  solver.solve(system_matrix, completely_distributed_solution,
               system_rhs, preconditioner);

  constraints.distribute(completely_distributed_solution);
  locally_relevant_solution = completely_distributed_solution;
}
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
[0]PETSC ERROR: Could not locate a solver package for factorization type ILU and matrix type mpiaij.
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020
[0]PETSC ERROR: ./waveLaplaceSolver on a named gbarlogin1 by hsllo Fri Mar 11 11:05:23 2022
[0]PETSC ERROR: Configure options --prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0 --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90 --with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib --with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --download-scalapack=1 --download-mumps=1
[0]PETSC ERROR: #1 MatGetFactor() line 4492 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/mat/interface/matrix.c
[0]PETSC ERROR: #2 PCSetUp_ILU() line 133 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/impls/factor/ilu/ilu.c
[0]PETSC ERROR: #3 PCSetUp() line 894 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c
----------------------------------------------------
Exception on processing:
--------------------------------------------------------
An error occurred in line <431> of file </zhome/32/9/115503/dealii-candi/tmp/unpack/deal.II-v9.3.1/source/lac/petsc_precondition.cc> in function
void dealii::PETScWrappers::PreconditionILU::initialize(const dealii::PETScWrappers::MatrixBase&, const dealii::PETScWrappers::PreconditionILU::AdditionalData&)
The violated condition was:
ierr == 0
Additional information:
deal.II encountered an error while calling a PETSc function.
The description of the error provided by PETSc is "See
https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
possible LU and Cholesky solvers".
The numerical value of the original error code is 92.
Could you please help me understand what is happening?
Thank you again for your help