Transferring solutions in distributed computing


Junchao Zhang

Jul 20, 2016, 6:22:40 PM7/20/16
to deal.II User Group
Hello,
  I am new to deal.II. I want to learn how to solve a simple convection-diffusion equation with adaptive mesh refinement using deal.II on distributed-memory machines. I would like everything to be distributed: DoFs, matrices, vectors, and the triangulation. The boundary value function and the initial value function of the PDE are known. During each time step, I want to refine/coarsen the mesh multiple times.
   I am stuck transferring the old solution from the old mesh to the new mesh. In each refinement step I have the following code:

        // old_locally_relevant_solution and locally_relevant_solution are the
        // (ghosted) solution vectors for the previous and current time steps.

        SolutionTransfer<dim, PETScWrappers::MPI::Vector> soltrans(dof_handler);
        PETScWrappers::MPI::Vector previous_solution = old_locally_relevant_solution;
        triangulation.prepare_coarsening_and_refinement();
        triangulation.execute_coarsening_and_refinement ();
        setup_system();
        soltrans.interpolate(previous_solution, old_locally_relevant_solution);
        constraints.distribute (old_locally_relevant_solution);

I get a runtime error at the soltrans.interpolate(...) line. It seems I should use vectors without ghost entries there, but how do I do that? Is there a deal.II example close to my problem? I could not find one.

Thank you!

--Junchao Zhang


Daniel Arndt

Jul 21, 2016, 5:00:20 AM7/21/16
to deal.II User Group
Junchao,

You want to use parallel::distributed::SolutionTransfer instead if you are working on a parallel::distributed::Triangulation.
Executing
$ grep -r "parallel::distributed::SolutionTransfer" .
in the examples folder tells me that this class is used in step-32, step-42, and step-48.
Have a look, for example, at how this is done in step-42::refine_grid [1].
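For reference, a minimal sketch of that pattern, adapted to the vector names from your snippet. This assumes your program also has locally_owned_dofs and mpi_communicator members (as in step-40/step-42); it is not a tested drop-in replacement:

```cpp
// Sketch only (deal.II 8.4-era API): the parallel variant registers the
// old solution *before* the mesh changes and interpolates afterwards.
parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
  soltrans(dof_handler);

triangulation.prepare_coarsening_and_refinement();
// The input vector must be ghosted (locally relevant):
soltrans.prepare_for_coarsening_and_refinement(old_locally_relevant_solution);
triangulation.execute_coarsening_and_refinement();

setup_system(); // redistributes DoFs, rebuilds constraints and vectors

// Interpolation writes into a non-ghosted (fully distributed) vector:
PETScWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                mpi_communicator);
soltrans.interpolate(distributed_solution);
constraints.distribute(distributed_solution);

// Copy back into the ghosted vector; this also imports the ghost values:
old_locally_relevant_solution = distributed_solution;
```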

Best,
Daniel

[1] https://www.dealii.org/8.4.0/doxygen/deal.II/step_42.html#PlasticityContactProblemrefine_grid

Junchao Zhang

Jul 21, 2016, 11:50:56 AM7/21/16
to dea...@googlegroups.com
Daniel, 

The link you provided is very helpful, thanks. In the code, I see

solution_transfer.interpolate(distributed_solution);
constraints_hanging_nodes.distribute(distributed_solution);
solution = distributed_solution;

I am confused by the postprocessing. As I understand it, distributed_solution does not have ghost entries. So does "solution = distributed_solution" import ghost values into solution? The Postprocessing section of http://www.dealii.org/8.4.1/doxygen/deal.II/group__distributed.html says

"To postprocess stuff therefore means that we have to tell PETSc or Trilinos that it should also import ghost elements, ... Both the PETScWrappers::MPI::Vector and TrilinosWrappers::MPI::Vector class support specifying this information (see step-40 and step-32, respectively) through the PETScWrappers::MPI::Vector::update_ghost_values() function or, in the case of Trilinos, construction of a vector with the locally relevant degrees of freedom index set"

I used PETScWrappers::MPI::Vector and got a compilation error: class "dealii::PETScWrappers::MPI::Vector" has no member "update_ghost_values".

--Junchao Zhang


Daniel Arndt

Jul 21, 2016, 12:12:52 PM7/21/16
to deal.II User Group
Junchao,

It seems the documentation is outdated on this point.
In fact, neither PETScWrappers::MPI::Vector nor TrilinosWrappers::MPI::Vector
has an update_ghost_values() member.
What you should do is exactly what is done in the few lines of step-42 you referenced:
"solution = distributed_solution" imports the ghost values while assigning the local values.
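In code, the two kinds of vectors and the assignment look roughly like this (a sketch using the names from step-42; the two-IndexSet constructor is what makes a vector ghosted):

```cpp
// Writable, non-ghosted vector: stores only the locally owned entries.
PETScWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                mpi_communicator);
// ... solve / interpolate into distributed_solution, then
constraints.distribute(distributed_solution);

// Read-only ghosted vector: additionally stores the locally relevant
// (ghost) entries needed for postprocessing and output.
PETScWrappers::MPI::Vector solution(locally_owned_dofs,
                                    locally_relevant_dofs,
                                    mpi_communicator);

// operator= copies the owned entries and performs the ghost exchange,
// so no explicit update_ghost_values() call is needed.
solution = distributed_solution;
```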

Best,
Daniel