Contributing back a small feature enhancement (MGTransferPrebuilt parallel PETSc support)


Alexander Knieps

Mar 19, 2018, 6:39:07 AM
to deal.II User Group
Hello everyone,

I have written a small feature enhancement that allows MGTransferPrebuilt to be used with parallel PETSc matrices. If possible, I would like to contribute this change back to the main development branch. I guess that I should submit a pull request, but I made some changes that I would like to discuss before submission (or are these things usually discussed in the pull requests on GitHub?).
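
In practice, the change enables code along these lines (just a minimal sketch: header names and the exact name/signature of the build call vary between deal.II versions, and dof_handler is assumed to be set up with distributed multigrid levels):

    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/lac/petsc_parallel_vector.h>
    #include <deal.II/multigrid/mg_transfer.h>

    using namespace dealii;

    // The vector type selects the matrix backend used internally for the
    // prebuilt level transfer matrices -- with this patch a parallel PETSc
    // matrix instead of a Trilinos one.
    template <int dim>
    void setup_transfer(const DoFHandler<dim> &dof_handler)
    {
      MGTransferPrebuilt<PETScWrappers::MPI::Vector> transfer;
      transfer.build_matrices(dof_handler); // may be called build() in newer releases
    }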

- The Trilinos wrapper has an optional argument that allows the matrix to be initialized with an incomplete sparsity pattern and lets the Trilinos wrapper distribute it by itself. The PETSc wrapper does not offer this functionality, so I had to distribute the sparsity pattern manually. Since dynamic sparsity patterns have no copy constructor and I also couldn't find a copy method, this means I had to change the argument type from "const SparsityPatternType&" to "SparsityPatternType&". The only call site of this function clears the sparsity pattern immediately after use, but I would still like to make sure that the library authors are aware of and OK with this change.
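
The general idea is the following (a minimal sketch, not the actual patch -- I use SparsityTools::distribute_sparsity_pattern for illustration, and the exact overload and parameter list depend on the deal.II version; the helper name is made up):

    #include <deal.II/base/index_set.h>
    #include <deal.II/base/mpi.h>
    #include <deal.II/lac/dynamic_sparsity_pattern.h>
    #include <deal.II/lac/sparsity_tools.h>

    #include <vector>

    using namespace dealii;

    // The pattern is taken by non-const reference because it is completed
    // in place before the PETSc matrix is created from it.
    void complete_pattern_for_petsc(
      DynamicSparsityPattern &dsp, // was: const SparsityPatternType &
      const std::vector<DynamicSparsityPattern::size_type> &rows_per_process,
      const IndexSet &locally_owned_rows,
      const MPI_Comm &communicator)
    {
      // Each process only added entries for the rows it touched while
      // building the transfer matrix; this call ships every entry to the
      // row's owning process, modifying dsp.
      SparsityTools::distribute_sparsity_pattern(dsp,
                                                 rows_per_process,
                                                 communicator,
                                                 locally_owned_rows);
      // The PETSc matrix can then be reinit()ed from the completed pattern.
    }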

- The PETSc matrix wrapper has an AssertThrow statement that prohibits the number of locally stored columns from being 0 if the number of locally stored rows is nonzero (source/lac/petsc_parallel_sparse_matrix.cc, line 385). If there is a coarse level whose number of DoFs is smaller than the process count and another level where that is not the case, then this assertion is guaranteed to fire. Is there a bug in PETSc which requires this assertion to be present in the first place? I could not find a rule in the PETSc documentation prohibiting this use case.
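
To illustrate with made-up numbers: take 4 MPI ranks, a fine level with 8 DoFs and a coarse level with 2 DoFs. The prolongation matrix between the two levels has fine-level rows and coarse-level columns, so with an even partition of each level one ends up with

    rank | local rows (fine) | local columns (coarse)
    -----+-------------------+-----------------------
      0  |         2         |          1
      1  |         2         |          1
      2  |         2         |          0   <- nonzero rows, zero columns
      3  |         2         |          0   <- the AssertThrow fires here

even though the matrix itself seems perfectly well-defined.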

Additionally, I noticed that MGTransferPrebuilt is only instantiated for the cases where dim = spacedim. Should I change this?

Best regards,

Alexander Knieps.

Wolfgang Bangerth

Mar 20, 2018, 12:07:08 AM
to dea...@googlegroups.com, Alexander Knieps

Alexander,

> I have written a small feature enhancement that allows MGTransferPrebuilt to
> be used with parallel PETSc matrices. If possible I would like to contribute
> this change back to the main development branch. I guess that I should submit
> a pull request but I made changes that I would like to discuss before
> submission (or are these things usually discussed with the pull requests on
> github?).

Yes, we typically discuss these things with actual code, in github pull
requests. If you want to make it easier for everyone, then break things into
separate pull requests for logically separate issues.


> - The trilinos wrapper has an optional argument that allows the matrix to be
> initialized with an incomplete sparsity pattern and have the trilinos wrapper
> distribute it by itself. The PETSc wrapper does not offer this functionality
> so I had to manually distribute the sparsity pattern. Since dynamic sparsity
> patterns have no copy constructor and I also couldn't find a copy method, that
> means that I had to change the argument type from "const SparsityPatternType&"
> to "SparsityPatternType&". The only call site to this function clears the
> sparsity pattern immediately after use, but I would still make sure that the
> library authors are aware of and OK with this change.

I have to admit that without seeing the code that requires this, it's
difficult for me to follow why this is necessary. Can you put the change up on
github? Then we can discuss the details.


> - The PETSc matrix wrapper has an AssertThrow statement that prohibits the
> number of locally stored columns from being 0 if the number of locally stored
> rows is nonzero (source/lac/petsc_parallel_sparse_matrix.cc - line 385). If
> there is a coarse level where the # of DoFs is smaller than the process count
> and one where that is not the case, then this is guaranteed to fire an
> exception. Is there a bug in PETSc which requires this assertion to be present
> in the first place? I could not find a rule in the PETSc documentation
> prohibiting this use case.

Good question -- this line may have been there since 2003, so who knows. The
PETSc documentation is also often rather brief.

Same here: put the change up on github as a pull request. I'm sure someone
will ask for a test case that tries creating such a matrix with no rows on one
processor. If you know how to write one, then great -- otherwise, we'll walk
you through the process on github.
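
In case it helps, such a test could look roughly like the following (treat the details as a sketch: the reinit() overload and header names are from memory, and the numbers assume exactly two MPI processes):

    #include <deal.II/base/index_set.h>
    #include <deal.II/base/mpi.h>
    #include <deal.II/lac/dynamic_sparsity_pattern.h>
    #include <deal.II/lac/petsc_parallel_sparse_matrix.h>

    using namespace dealii;

    // Run on exactly 2 MPI processes: both own rows, but only process 0 owns
    // the single column -- the situation a prolongation matrix between a very
    // small coarse level and the next finer level ends up in.
    void test(const MPI_Comm comm)
    {
      const unsigned int rank = Utilities::MPI::this_mpi_process(comm);

      IndexSet rows(4), cols(1);
      rows.add_range(2 * rank, 2 * rank + 2); // each process owns two rows
      if (rank == 0)
        cols.add_range(0, 1);                 // process 1 owns no columns
      rows.compress();
      cols.compress();

      DynamicSparsityPattern dsp(4, 1);
      for (unsigned int i = 0; i < rows.n_elements(); ++i)
        dsp.add(rows.nth_index_in_set(i), 0);

      PETScWrappers::MPI::SparseMatrix matrix;
      matrix.reinit(rows, cols, dsp, comm); // currently trips the AssertThrow
    }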


> Additionally, I noticed that MGTransferPrebuilt is only instantiated for the
> cases where dim = spacedim, should I change this?

I think that makes sense. Are the other MG classes all instantiated for dim !=
spacedim?

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

Alexander Knieps

Mar 20, 2018, 12:43:01 PM
to deal.II User Group
Dear Wolfgang,

thank you for your response. I will post the pull request on GitHub by the end of this week so that it can be reviewed. I will give a detailed reasoning for each modification there.


> I think that makes sense. Are the other MG classes all instantiated for dim !=
> spacedim?

After looking a bit more into this, here is the answer per .inst.in file in source/multigrid:
- mg_transfer_prebuilt.inst.in - No
- mg_transfer_matrix_free.inst.in - No (and the classes only take dim, but not spacedim as a template argument)
- mg_transfer_internal.inst.in - Partially (Yes where possible, but internal::MGTransfer::setup_transfer(...) only accepts dim, but not spacedim, as a template parameter)
- mg_transfer_component.inst.in - No
- mg_transfer_block.inst.in - No
- mg_level_global_transfer.inst.in - No
- mg_base.inst.in - dim / spacedim not used

Some classes (e.g. MGTransferMatrixFree, PreconditionMG) do not take a spacedim argument. Since PreconditionMG itself does not support that, I am not sure whether instantiating the rest for dim != spacedim even has a point.
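
For reference, the pattern the other .inst.in files use to also cover dim != spacedim looks roughly like this (the class name below is only a placeholder; the loop variables and the guard are the ones used throughout deal.II's instantiation files):

    for (deal_II_dimension : DIMENSIONS; deal_II_space_dimension : SPACE_DIMENSIONS)
      {
    #if deal_II_dimension <= deal_II_space_dimension
        // placeholder name -- the real files would instantiate the respective
        // transfer classes and their member function templates
        template class SomeTransferClass<deal_II_dimension, deal_II_space_dimension>;
    #endif
      }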

Best regards,

Alexander.

Wolfgang Bangerth

Mar 20, 2018, 12:46:19 PM
to dea...@googlegroups.com
On 03/20/2018 10:43 AM, Alexander Knieps wrote:
>
> I think that makes sense. Are the other MG classes all instantiated
> for dim !=
> spacedim?
>
>
> After looking a bit more into this, answer by .inst.in file in
> source/multigrid:
> - mg_transfer_prebuilt.inst.in - No
> - mg_transfer_matrix_free.inst.in - No (and the classes only take dim,
> but not spacedim as a template argument)
> - mg_transfer_internal.inst.in - Partially (Yes where possible, but
> internal::MGTransfer::setup_transfer(...) only accepts dim, but not
> spacedim as template parameter)
> - mg_transfer_component.inst.in - No
> - mg_transfer_block_inst.in - No
> - mg_level_global_transfer.inst.in - No
> - mg_base.inst.in - dim / spacedim not used
>
> Some classes (e.g. MGTransferMatrixFree, PreconditionMG) do not take a
> spacedim argument. Since PreconditionMG itself does not support that, I
> am not sure whether instantiating the rest for dim != spacedim even has
> a point.

It's probably worth making that happen at some point, if necessary by
adding the second template argument to all of the classes that don't
have it yet. But this may be a bit of work.