Complex-valued distributed matrices in dealii


Pascal Kraft

Jul 23, 2020, 12:42:47 PM
to deal.II User Group
Dear Deal.II devs and users,

In the latest release a lot of (great) work has been done to make complex numbers more of a first-class citizen in deal, which has made my code a lot more readable. Currently, I am stuck with one problem, though. Are there any distributed datatypes for matrices that accept complex numbers?

The deal.II SparseMatrix implementation is a template and accepts complex numbers; however, that implementation has no MPI functionality, which I need.
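For a serial computation, something like the following is possible (just a sketch, assuming complex-number support is enabled in the deal.II build; setting up the SparsityPattern and assembling entries is omitted):

#include <deal.II/lac/sparse_matrix.h>
#include <deal.II/lac/sparsity_pattern.h>
#include <deal.II/lac/vector.h>

#include <complex>

// Serial only: the templated deal.II classes instantiated for complex scalars.
dealii::SparsityPattern                    sparsity_pattern;
dealii::SparseMatrix<std::complex<double>> system_matrix;  // reinit with sparsity_pattern before use
dealii::Vector<std::complex<double>>       solution;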

The PETSc and Trilinos sparse matrix classes are not templates. In the types header I found the declaration of TrilinosScalar as double, but changing it and recompiling deal.II with the modified header led to compile errors.

I have Trilinos compiled with support for complex numbers and have also searched through the LinearAlgebra documentation.

I require GMRES as a solver (which should be possible, because the GMRES versions all use a templated vector class that can take complex components) and MPI distribution of a sparse system. So far, I have only seen FullMatrix accept complex numbers.

Can anyone give me a pointer on what is possible?

Kind regards,
Pascal Kraft

Pascal Kraft

Jul 23, 2020, 12:58:40 PM
to deal.II User Group
Some additional information: if I try to compile deal.II with TrilinosScalar = std::complex<double>, I get many errors like this:

[  5%] Building CXX object source/numerics/CMakeFiles/obj_numerics_release.dir/data_postprocessor.cc.o
[  5%] Building CXX object source/numerics/CMakeFiles/obj_numerics_release.dir/dof_output_operator.cc.o
In file included from install_dir/dealii/dealii-source/include/deal.II/lac/trilinos_parallel_block_vector.h:27,
                 from install_dir/dealii/dealii-source/source/numerics/dof_output_operator.cc:26:
install_dir/dealii/dealii-source/include/deal.II/lac/trilinos_vector.h: In member function ‘dealii::TrilinosWrappers::MPI::Vector::value_type* dealii::TrilinosWrappers::MPI::Vector::begin()’:
install_dir/dealii/dealii-source/include/deal.II/lac/trilinos_vector.h:1525:25: error: cannot convert ‘double*’ to ‘dealii::TrilinosWrappers::MPI::Vector::iterator’ {aka ‘std::complex<double>*’} in return
 1525 |       return (*vector)[0];

suggesting a hard-coded dependency on double somewhere else.

Daniel Arndt

Jul 23, 2020, 1:32:49 PM
to dea...@googlegroups.com
Pascal,

The wrapped Trilinos matrices are based on Epetra, which only supports double AFAICT. That's why you can't replace TrilinosScalar easily.
On the other hand, you should be able to compile PETSc with a complex scalar type and use that with MPI.
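As an untested sketch of what that might look like on the deal.II side: with PETSc configured using --with-scalar-type=complex, PetscScalar (and therefore the value type of the wrapper classes below) becomes std::complex<double>. The index set and sparsity pattern arguments are placeholders for whatever your application provides.

#include <deal.II/base/index_set.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/petsc_sparse_matrix.h>
#include <deal.II/lac/petsc_vector.h>
#include <deal.II/lac/solver_control.h>

using namespace dealii;

void solve_complex_system(const IndexSet &              locally_owned_dofs,
                          const DynamicSparsityPattern &sparsity_pattern,
                          MPI_Comm                      mpi_communicator)
{
  // Distributed matrix and vectors; their value_type is PetscScalar.
  PETScWrappers::MPI::SparseMatrix system_matrix;
  system_matrix.reinit(locally_owned_dofs,
                       locally_owned_dofs,
                       sparsity_pattern,
                       mpi_communicator);

  PETScWrappers::MPI::Vector solution(locally_owned_dofs, mpi_communicator);
  PETScWrappers::MPI::Vector system_rhs(locally_owned_dofs, mpi_communicator);

  // ... assemble system_matrix and system_rhs with complex entries ...

  // PETSc's own GMRES handles the complex arithmetic internally; the Jacobi
  // preconditioner is just a placeholder.
  SolverControl                     solver_control(1000, 1e-12);
  PETScWrappers::SolverGMRES        solver(solver_control, mpi_communicator);
  PETScWrappers::PreconditionJacobi preconditioner(system_matrix);
  solver.solve(system_matrix, solution, system_rhs, preconditioner);
}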

Best,
Daniel


Pascal Kraft

Jul 23, 2020, 5:10:51 PM
to deal.II User Group
Hi Daniel,

Oh, I'm really sorry for asking before trying that. I had seen that neither the PETSc nor the Trilinos sparse matrices are templated and assumed that if the more modern package (Trilinos) doesn't work with complex numbers, trying PETSc wouldn't be very promising either. But you are right, I will try that and report back. If it works, I will see whether the documentation can be updated somewhere.

Kind regards and thanks for the super fast response, 
Pascal

Wolfgang Bangerth

Jul 25, 2020, 7:43:44 PM
to dea...@googlegroups.com
On 7/23/20 10:42 AM, Pascal Kraft wrote:
>
> I have Trilinos compiled with support for complex numbers and have also
> searched through the LinearAlgebra documentation.

I don't think I knew that one can compile Trilinos with complex numbers. How
do you do that?

It does not greatly surprise me that we use TrilinosScalar and double
interchangeably. If Trilinos can indeed be compiled with complex numbers, then
we ought to find a way to (i) make TrilinosScalar dependent on whatever
Trilinos was compiled for, (ii) ensure that all of the places that currently
don't compile because we use double in place of TrilinosScalar are fixed.
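Schematically, point (i) might end up looking like the following in the types header; the macro name is made up purely for illustration and would have to be defined by deal.II's CMake configuration after inspecting the Trilinos installation:

#include <complex>

// Hypothetical sketch: make TrilinosScalar follow the scalar type Trilinos was
// configured with. DEAL_II_TRILINOS_WITH_COMPLEX_SCALAR does not exist today.
#ifdef DEAL_II_TRILINOS_WITH_COMPLEX_SCALAR
using TrilinosScalar = std::complex<double>;
#else
using TrilinosScalar = double;
#endif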

Patches are, as always, very welcome!


> I require GMRES as a solver (which should be possible, because the GMRES
> versions all use a templated vector class that can take complex components)
> and MPI distribution of a sparse system. So far, I have only seen FullMatrix
> accept complex numbers.

I believe that GMRES could indeed be made to work for complex-valued problems,
but I'm not sure any of us has ever tried. When writing step-58, I toyed
with the idea of looking up in the literature what one would need for a
complex GMRES, but in the end decided to just make SparseDirectUMFPACK work
instead. The issue is that for every matrix-vector and vector-vector operation
that happens inside GMRES, you have to think about whether one or the other
operand needs to be complex-conjugated. I'm certain that is possible, but
would require an audit of a few hundred lines. It would probably be simpler to
just use PETSc's (or Trilinos') GMRES implementation.
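To give one concrete example of the kind of operation I mean (a sketch, not deal.II's actual implementation): the inner product inside the Arnoldi process has to conjugate one of its arguments,

#include <complex>
#include <cstddef>
#include <vector>

// <u,v> = sum_i conj(u_i) * v_i. Dropping the conjugation -- which a purely
// real-valued code can do -- gives wrong Hessenberg entries and wrong
// residual norms for complex vectors.
std::complex<double>
inner_product(const std::vector<std::complex<double>> &u,
              const std::vector<std::complex<double>> &v)
{
  std::complex<double> sum = 0.;
  for (std::size_t i = 0; i < u.size(); ++i)
    sum += std::conj(u[i]) * v[i];
  return sum;
}

and the same question has to be asked for every other vector-vector operation in the algorithm.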

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

Pascal Kraft

Jul 26, 2020, 4:57:36 AM
to deal.II User Group
Hi Wolfgang,

here is what I found out about the topic: 
Originally, I only knew Trilinos because I used the distributed matrices and vectors in my application. I also knew that there is a configuration option of Trilinos to make complex numbers available in all packages that support it.
However, from what I can tell, that only affects Tpetra datatypes, not Epetra, and from what I have seen, the deal.II wrappers only use Epetra. An interesting detail is the Komplex package, which is described as an Epetra-based solver for complex systems: it wraps Epetra matrices and stores the real and imaginary parts as blocks (see https://docs.trilinos.org/dev/packages/komplex/doc/html/index.html).
On GitHub I can see that project 4 deals with adding Tpetra support, which would make complex numbers in Tpetra usable in deal.II if the interface is built to support them.
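For reference, the standard equivalent-real block form (which I believe is essentially what Komplex assembles, though I have not checked the exact block ordering it uses) is:

\begin{pmatrix} A_r & -A_i \\ A_i & A_r \end{pmatrix}
\begin{pmatrix} x_r \\ x_i \end{pmatrix}
=
\begin{pmatrix} b_r \\ b_i \end{pmatrix},
\qquad A = A_r + i A_i, \quad x = x_r + i x_i, \quad b = b_r + i b_i.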

About GMRES: I will be using PETSc GMRES to solve my system, but if possible I will try to also solve it with dealii::SolverGMRES and let you know what happens.

Kind regards,
Pascal

Pascal Kraft

Jul 26, 2020, 5:06:32 AM
to deal.II User Group
The documentation states that Tpetra supports
- MPI
- shared-memory parallelization (OpenMP, CUDA, POSIX threads)

and says the following about the Scalar template parameter:

"Scalar: A Scalar is the type of values in the sparse matrix or dense vector. This is the type most likely to be changed by many users. The most common use cases are float, double, std::complex<float> and std::complex<double>."

See https://docs.trilinos.org/dev/packages/tpetra/doc/html/index.html
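As a quick sketch of what that enables (the local/global ordinal and node template parameters are left at Tpetra's defaults):

#include <complex>

#include <Tpetra_CrsMatrix.hpp>
#include <Tpetra_Vector.hpp>

// Sketch only: Tpetra's Scalar template parameter accepts complex types.
using Scalar        = std::complex<double>;
using ComplexMatrix = Tpetra::CrsMatrix<Scalar>;
using ComplexVector = Tpetra::Vector<Scalar>;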

Wolfgang Bangerth

Jul 26, 2020, 9:21:23 PM
to dea...@googlegroups.com

Pascal,

> Originally, I only knew Trilinos because I used the distributed matrices and
> vectors in my application. I also knew that there is a configuration option
> of Trilinos to make complex numbers available in all packages that support it.
> However, from what I can tell, that only affects Tpetra datatypes, not
> Epetra, and from what I have seen, the deal.II wrappers only use Epetra. An
> interesting detail is the Komplex package, which is described as an
> Epetra-based solver for complex systems: it wraps Epetra matrices and stores
> the real and imaginary parts as blocks (see
> https://docs.trilinos.org/dev/packages/komplex/doc/html/index.html).
> On GitHub I can see that project 4 deals with adding Tpetra support, which
> would make complex numbers in Tpetra usable in deal.II if the interface is
> built to support them.

Yes, Tpetra is the way to go in the long run. deal.II actually has Tpetra
wrappers already, in namespace LinearAlgebra::TpetraWrappers. It is definitely
our long-term goal to move from the Epetra interfaces to the Tpetra
interfaces. The issue -- for several years already -- is that not all of the
packages we use in Trilinos (for example for solvers, preconditioners, etc)
support Tpetra. In other words, at least every time we looked, we could not
replace our Epetra interfaces without losing functionality.

That said, it is definitely true that there are Trilinos packages for solvers
and preconditioners that do support Tpetra, and for which we don't (yet) have
any interfaces. So, if you are willing to help out with this situation, one
approach worth pursuing would be to explore what Trilinos functionality you
need, and then ask here or on github which pieces are already available and
which still need to be written. None of the interfaces are very large, and
writing more interfaces is not a very difficult task because there are already
good examples to start from. We would certainly appreciate any help!


> About GMRES: I will be using PETSc GMRES to solve my system, but if possible I
> will try to also solve it with dealii::SolverGMRES and let you know what happens.

Yes, feedback is welcome!