Parallel Eigenproblems


bjpal...@gmail.com

Apr 27, 2016, 5:42:50 PM
to deal.II User Group
Hi,

I'm trying to implement a parallel version of the simple quantum problem described in step-36 (assuming this is even possible). At present, I'm attempting to evaluate the minimum and maximum values of the spurious eigenvalues generated by the boundary conditions. I think the easiest way to do this would be a function that extracts the diagonal from a distributed sparse matrix and copies it into a distributed vector. Does such a function exist in deal.II?

Bruce

Wolfgang Bangerth

Apr 28, 2016, 7:36:51 AM
to dea...@googlegroups.com
It doesn't, but it shouldn't be very difficult to write something that, on
every processor, extracts the diagonal elements of those rows of the matrices
that are stored on that processor. You can access individual matrix elements
through the deal.II wrappers for PETSc and Trilinos matrices.
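
Untested, but a minimal sketch along these lines should work for the PETSc
wrappers (the Trilinos case is analogous). local_range() gives the half-open
interval of rows owned by the current processor, and el(i,i) returns zero
rather than throwing an error if an entry is not in the sparsity pattern:

  // Copy the diagonal of a distributed matrix into a distributed vector,
  // visiting only the locally owned rows on each processor.
  void extract_diagonal (const PETScWrappers::MPI::SparseMatrix &matrix,
                         PETScWrappers::MPI::Vector             &diagonal)
  {
    const std::pair<types::global_dof_index, types::global_dof_index>
      range = matrix.local_range();
    for (types::global_dof_index i = range.first; i < range.second; ++i)
      diagonal(i) = matrix.el(i,i);
    // finalize the insertions before the vector is used elsewhere
    diagonal.compress (VectorOperation::insert);
  }

This assumes the vector is partitioned the same way as the matrix rows.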

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@math.tamu.edu
www: http://www.math.tamu.edu/~bangerth/

bjpal...@gmail.com

Apr 29, 2016, 3:17:10 PM
to deal.II User Group
I'm trying to run the Krylov-Schur solver in parallel. This is the same eigensolver as in step-36, but I'm getting errors from the solver about how the problem is set up. If I run on 1 processor, I get the error

[0]PETSC ERROR: MatSolverPackage petsc does not support matrix type mpiaij

and if I try to run on more than 1 processor I get

[0]PETSC ERROR: You chose to solve linear systems with a factorization, but in parallel runs you need to select an external package; see the users guide for details

I'm using the same setup and solver as in step-36. Do I need to specify additional attributes on the solver to get it to work? I've successfully run step-36 on 1 processor, so the build itself appears to be okay.

Bruce

Tobi Young

Apr 29, 2016, 5:18:58 PM
to dealii
Odd! :-)

What version of PETSc are you using, and how did you compile it?
What system are you trying to solve, and what does your mass matrix
look like? Is that system supported by SLEPc (i.e., is it symmetric)?

Best,
Toby

bjpal...@gmail.com

May 2, 2016, 11:15:09 AM
to deal.II User Group


Tobi,

I'm currently using PETSc 3.6.0 and SLEPc 3.6.3. The configuration script for PETSc is:

python ./config/configure.py \
  PETSC_ARCH=linux-openmpi-gnu-cxx-real-opt \
  --with-prefix=./ \
  --with-mpi=1 \
  --with-cc=mpicc \
  --with-fc=mpif90 \
  --with-cxx=mpicxx \
  --with-c++-support=1 \
  --with-c-support=0 \
  --with-c-language=C++ \
  --with-fortran=0 \
  --with-scalar-type=real \
  --with-fortran-kernels=generic \
  --download-superlu_dist \
  --download-superlu \
  --download-parmetis \
  --download-metis \
  --download-f2cblaslapack=1 \
  --download-suitesparse \
  --download-hypre \
  --with-clanguage=c++ \
  --with-mpirun=mpirun \
  --with-mpiexec=mpiexec \
  --with-debugging=0

I've attached a tarball that contains a stripped-down version of the application code that reproduces the errors I've been seeing. I may not be handling the boundary conditions correctly, but I don't think that would explain these errors, which seem to be related to the underlying structure of the matrices I'm setting up.

This system represents a rectangular domain with a deeper square well in the center. The mass matrix is just the overlap matrix of the finite element basis functions, so it should be similar to the one in step-36. I'm aiming to have the wave function go to zero on the boundary.
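
For what it's worth, the boundary treatment I'm aiming for is the step-36 one,
i.e. homogeneous Dirichlet values collected into the constraint object
(schematically; the attached code may differ in detail):

  // Enforce psi = 0 on the whole boundary, following step-36: interpolate
  // homogeneous Dirichlet values into the constraint object used later
  // when distributing local contributions into the global matrices.
  ConstraintMatrix constraints;
  DoFTools::make_hanging_node_constraints (dof_handler, constraints);
  VectorTools::interpolate_boundary_values (dof_handler,
                                            0,              // boundary id
                                            ZeroFunction<dim>(),
                                            constraints);
  constraints.close ();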

Bruce


 
Attachment: testq.tar

Denis Davydov

May 3, 2016, 12:15:20 AM
to deal.II User Group
Hi Bruce,


On Friday, April 29, 2016 at 9:17:10 PM UTC+2, bjpal...@gmail.com wrote:

> [0]PETSC ERROR: You chose to solve linear systems with a factorization, but in parallel runs you need to select an external package; see the users guide for details

AFAIK, this is exactly the reason: by default you do something like Krylov-Schur with a spectral transformation that needs to invert K - \tau M (shift-and-invert maps eigenvalues \lambda to 1/(\lambda - \tau), so every iteration has to solve a linear system with K - \tau M), and PETSc complains that you need a parallel solver for a parallel matrix. You do that by adding extra arguments:

-st_ksp_type cg -st_pc_type gamg -st_ksp_rtol 1e-12

or, IMHO preferably, by using the deal.II interfaces to set up the solver and preconditioner.
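
A minimal, untested sketch of the latter, using the step-36 variable names
(stiffness_matrix, mass_matrix, eigenvalues, eigenfunctions) plus an
mpi_communicator, and assuming the deal.II 8.4 interfaces
PETScWrappers::SolverBase::initialize() and
SLEPcWrappers::TransformationBase::set_solver() (check the documentation of
your version); BoomerAMG comes from the hypre you already built:

  // Replace the default (sequential) factorization inside the spectral
  // transformation by CG preconditioned with algebraic multigrid.
  SolverControl linear_solver_control (dof_handler.n_dofs(), 1e-12);
  PETScWrappers::SolverCG linear_solver (linear_solver_control,
                                         mpi_communicator);
  PETScWrappers::PreconditionBoomerAMG preconditioner (mpi_communicator);
  linear_solver.initialize (preconditioner);

  // Shift-and-invert about tau = 0 targets the eigenvalues closest to
  // the origin; each outer iteration solves a system with K - tau*M.
  SLEPcWrappers::TransformationShiftInvert::AdditionalData data (0.);
  SLEPcWrappers::TransformationShiftInvert shift (mpi_communicator, data);
  shift.set_solver (linear_solver);

  SolverControl solver_control (100, 1e-9);
  SLEPcWrappers::SolverKrylovSchur eigensolver (solver_control,
                                                mpi_communicator);
  eigensolver.set_transformation (shift);
  eigensolver.solve (stiffness_matrix, mass_matrix,
                     eigenvalues, eigenfunctions, eigenfunctions.size());

The command-line flags above work as well, since step-36 hands argc/argv to
SLEPc at initialization, e.g.
mpirun -np 4 ./your_program -st_ksp_type cg -st_pc_type gamg -st_ksp_rtol 1e-12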

Regards,
Denis. 

bjpal...@gmail.com

May 3, 2016, 4:12:16 PM
to deal.II User Group


Denis,

Modifying the preconditioner and linear solver did the trick, although it was pretty difficult to track down the documentation on how to do it through the deal.II interface. The code had a number of other bugs once I got past the eigensolver, but it seems to work now. I've attached a working version of the reproducer in case anyone is interested.

Bruce

 
Attachment: testq.tar

Denis Davydov

May 4, 2016, 1:49:53 AM
to deal.II User Group
Hi Bruce,


On Tuesday, May 3, 2016 at 10:12:16 PM UTC+2, bjpal...@gmail.com wrote:


> Modifying the preconditioner and linear solver did the trick, although it
> was pretty difficult to track down the documentation on how to do it
> through the deal.II interface.

If you have an idea on how to improve this from a user's perspective (adding this information to the documentation of some classes/namespaces, or a short code snippet somewhere), please feel free to create a pull request on GitHub: https://github.com/dealii/dealii/pulls .
Or you can post a patch here and I will create a PR with it.

Regards,
Denis