FETools::interpolate with PETScWrappers::MPI::Vector


Matteo Malvestiti

Jan 17, 2025, 6:09:47 AM
to deal.II User Group
Good morning everybody,
I'm writing a PETSc distributed version of step-14.

In 
void WeightedResidual<dim>::output_solution()
I need to interpolate the solution of the dual solver (FE_Q(2)) onto the primal space (FE_Q(1)); otherwise I wouldn't be able to show them in the same output file.
Moreover, further interpolations between the two spaces are needed later for the dual-weighted error estimation.

This is my current code.

// Define the new dual_solution vector, which will house the interpolated values
PETScWrappers::MPI::Vector completely_distributed_dual_solution_on_primal(
     PrimalSolver<dim>::locally_owned_dofs,
     PrimalSolver<dim>::mpi_communicator); // NO GHOST elements

FETools::interpolate(DualSolver<dim>::dof_handler,
                           DualSolver<dim>::completely_distributed_solution,
                           PrimalSolver<dim>::dof_handler,
                           PrimalSolver<dim>::linear_system_ptr->hanging_node_constraints,
                           completely_distributed_dual_solution_on_primal);

 It works with 1 rank, but not with more than one.

---------------------------
I've read in another discussion here on the user group that interpolate should take ghosted vectors, but I get an error as soon as I try that.
Specifically, if I do:

PETScWrappers::MPI::Vector
locally_relevant_dual_solution_on_primal(
   PrimalSolver<dim>::locally_owned_dofs,
   PrimalSolver<dim>::locally_relevant_dofs,
   PrimalSolver<dim>::mpi_communicator); // WITH ghost elements

FETools::interpolate(DualSolver<dim>::dof_handler,
                           DualSolver<dim>::locally_relevant_solution,
                           PrimalSolver<dim>::dof_handler,
                           PrimalSolver<dim>::linear_system_ptr->hanging_node_constraints,
                          locally_relevant_dual_solution_on_primal);

I then get the following error (even just in serial execution):

An error occurred in line <941> of file </var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.6.0-rc1-3efkxbxl3aseiejk73xqv2r4jrvgva4l/spack-spc/include/deal.II/lac/petsc_vector_base.h> in function

     const VectorReference dealii::PETScWrappers::internal::VectorReference::operator+=(const PetscScalar &) const

The violated condition was:

     !vector.has_ghost_elements()

Additional information:

You are trying an operation on a vector that is only allowed if the vector has no ghost elements, but the vector you are operating on does have ghost elements.

Specifically, there are two kinds of operations that are typically not allowed on vectors with ghost elements. First, vectors with ghost elements are read-only and cannot appear in operations that write into these vectors. Second, reduction operations (such as computing the norm of a vector, or taking dot products between vectors) are not allowed to ensure that each vector element is counted only once (as opposed to once for the owner of the element plus once for each process on which the element is stored as a ghost copy).

See the glossary entry on 'Ghosted vectors' for more information.


Can someone more expert than me help me understand how I should handle these interpolations?
Thanks a lot in advance!

Kind regards,
Matteo Malvestiti

Wolfgang Bangerth

Jan 17, 2025, 7:40:36 PM
to dea...@googlegroups.com
Matteo,
Thanks for reporting this!

It is entirely possible that this function has simply never been tried with
parallel triangulations. I know that it was written several years before we
ever started implementing parallel computations.

A good first step would be to create a "minimal working example" -- a
complete, compilable code that only sets up data structures and then calls the
function in question to demonstrate the error. The content of the vectors you
pass in is not important; they might as well be zero vectors.

Would you be willing to come up with such a piece of code and submit it as a
bug to the github repository? I can't promise that anyone has the capacity to
actually fix the issue, but we can at least tell you where one might have to
look to fix it!

Best
Wolfgang

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/


Matteo Malvestiti

Jan 18, 2025, 5:44:18 PM
to deal.II User Group
Dear Prof. Bangerth,
I thank you for your kind and quick response.
I see!
Here is a minimal, self-contained example to recreate the issue:
  • Import a rectangular mesh
  • Define two dof_handlers
  • Attach FE_Q(1) and FE_Q(2) elements, respectively
  • Define two PETScWrappers::MPI::Vector WITH ghost values, all zeros
  • Call FETools::interpolate
  • Observe the error message quoted in my previous post
I provided a CMakeLists.txt, the mesh.msh, and a minimal README with compile instructions as well.
You can find everything in the attached zip file.
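
In code, the steps above boil down to roughly the following. This is only a condensed sketch, not the exact contents of the attached files: the GridGenerator call stands in for importing mesh.msh, and the variable names are placeholders.

#include <deal.II/base/mpi.h>
#include <deal.II/distributed/tria.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/fe/fe_q.h>
#include <deal.II/fe/fe_tools.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/lac/affine_constraints.h>
#include <deal.II/lac/petsc_vector.h>

using namespace dealii;

int main(int argc, char *argv[])
{
  Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv, 1);
  const MPI_Comm mpi_communicator = MPI_COMM_WORLD;

  // A generated rectangle stands in for the imported mesh.msh.
  parallel::distributed::Triangulation<2> triangulation(mpi_communicator);
  GridGenerator::hyper_rectangle(triangulation, Point<2>(0., 0.), Point<2>(1., 1.));
  triangulation.refine_global(3);

  // Two dof_handlers with FE_Q(1) and FE_Q(2) elements, respectively.
  const FE_Q<2> fe_primal(1);
  const FE_Q<2> fe_dual(2);
  DoFHandler<2> dof_primal(triangulation);
  DoFHandler<2> dof_dual(triangulation);
  dof_primal.distribute_dofs(fe_primal);
  dof_dual.distribute_dofs(fe_dual);

  const IndexSet owned_primal    = dof_primal.locally_owned_dofs();
  const IndexSet owned_dual      = dof_dual.locally_owned_dofs();
  const IndexSet relevant_primal = DoFTools::extract_locally_relevant_dofs(dof_primal);
  const IndexSet relevant_dual   = DoFTools::extract_locally_relevant_dofs(dof_dual);

  // Two PETScWrappers::MPI::Vector WITH ghost values, all zeros.
  PETScWrappers::MPI::Vector dual_solution(owned_dual, relevant_dual, mpi_communicator);
  PETScWrappers::MPI::Vector dual_on_primal(owned_primal, relevant_primal, mpi_communicator);

  AffineConstraints<double> constraints;
  constraints.close();

  // Writing into the ghosted output vector triggers the assertion quoted above.
  FETools::interpolate(dof_dual, dual_solution, dof_primal, constraints, dual_on_primal);
}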

I will now also try to submit it as a bug on the GitHub repo.

I would enormously appreciate any suggestion on how to either:
  • fix the issue, or
  • (most likely) work around it
until somebody more expert than me fixes it in a future release.


With kind regards from Heidelberg,
Matteo Malvestiti



FEToolsInterpolate_Bug.zip

Matteo Malvestiti

Feb 3, 2025, 8:20:11 AM
to deal.II User Group
Solved.
Problem on my side.

Input vector must be GHOSTED
Output vector must be NON-GHOSTED

It might be nice to add this to the documentation for clarity.
According to a user on the related GitHub issue, this holds not only for PETSc but also for Trilinos vectors.
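
Concretely, combining my two earlier attempts, the working call looks roughly like this (a sketch using the same member names as in my first post):

// Input: the GHOSTED dual solution (readable on all locally relevant DoFs).
// Output: a NON-GHOSTED, writable vector on the primal space.
PETScWrappers::MPI::Vector dual_solution_on_primal(
     PrimalSolver<dim>::locally_owned_dofs,
     PrimalSolver<dim>::mpi_communicator);

FETools::interpolate(DualSolver<dim>::dof_handler,
                     DualSolver<dim>::locally_relevant_solution,  // ghosted input
                     PrimalSolver<dim>::dof_handler,
                     PrimalSolver<dim>::linear_system_ptr->hanging_node_constraints,
                     dual_solution_on_primal);                    // non-ghosted output

If ghost entries of the interpolated field are needed afterwards (e.g. for output), the result can be copied into a ghosted vector after the call.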

Wolfgang Bangerth

Feb 3, 2025, 12:31:29 PM
to dea...@googlegroups.com
On 2/3/25 06:20, Matteo Malvestiti wrote:
> Problem on my side.
>
> Input vector must be GHOSTED
> Output vector must be NON-GHOSTED
>
> It might be nice to add this to the documentation for clarity.
> According to a user on the related GitHub issue, this holds not only for PETSc
> but also for Trilinos vectors.

Matteo:
Yes, indeed -- would you like to write a patch that puts this kind of
information into the documentation? The file in which, I believe, the function
is documented is include/deal.II/fe/fe_tools.h. I think it would make for a
nice patch to better describe this requirement!

Best
Wolfgang

Matteo Malvestiti

Feb 3, 2025, 4:54:08 PM
to deal.II User Group
Yes, very gladly.
As soon as I have a moment and figure out how to do this, I will gladly contribute, even if it is such a small thing.
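
Something along these lines, perhaps (just a first draft of the wording I would propose adding to include/deal.II/fe/fe_tools.h):

/**
 * @note When this function is used with parallel vectors (for example,
 * PETScWrappers::MPI::Vector or TrilinosWrappers::MPI::Vector), the input
 * vector @p u1 needs to have ghost elements, i.e., it must be readable on
 * all locally relevant degrees of freedom, whereas the output vector
 * @p u2 must be a writable vector without ghost elements.
 */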

Best regards,
Matteo