Getting the norm of a ghosted vector in a distributed adaptive refinement code


Marek Čapek

Dec 8, 2017, 11:14:22 AM
to deal.II User Group
Hello,

I am developing a distributed code along the lines of step-40, with
adaptive mesh refinement as in step-42.
I have been using ghosted Trilinos MPI vectors such as
  LA::MPI::Vector solution_vel_n;


initialized by

solution_vel_n.reinit (locally_owned_dofs_vel,
                       locally_relevant_dofs_vel,
                       MPI_COMM_WORLD);

I was able to assign to them from the output of the solve
procedures in the following manner
(as in step-40, https://www.dealii.org/8.5.0/doxygen/deal.II/step_40.html#LaplaceProblemsolve ):

    solution_vel_n = completely_distributed_solution;

where

LA::MPI::Vector completely_distributed_solution (locally_owned_dofs_vel,
                                                 MPI_COMM_WORLD);

// ... GMRES solve fills completely_distributed_solution ...

constraint_matrix_vel.distribute (completely_distributed_solution);
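
For completeness, the elided solver call looks roughly like this in my code (system_matrix_vel and system_rhs_vel are only placeholder names for my actual velocity matrix and right-hand side; this is just a sketch):

    SolverControl solver_control (1000, 1e-8);
    TrilinosWrappers::SolverGMRES solver (solver_control);

    TrilinosWrappers::PreconditionAMG preconditioner;
    preconditioner.initialize (system_matrix_vel);

    // solve into the non-ghosted vector ...
    solver.solve (system_matrix_vel, completely_distributed_solution,
                  system_rhs_vel, preconditioner);

    // ... resolve constraints on it, and copy into the ghosted vector
    constraint_matrix_vel.distribute (completely_distributed_solution);
    solution_vel_n = completely_distributed_solution;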



I am running a fixed-point iteration. It converges on meshes without
adaptive refinement; after adaptive refinement, however, the convergence of the residual stalls.

I have therefore tried to mimic step-42 as closely as possible:
https://www.dealii.org/8.5.0/doxygen/deal.II/step_42.html#PlasticityContactProblemrefine_grid

Here is the relevant piece of code from step-42:

if (transfer_solution)
  {
    TrilinosWrappers::MPI::Vector distributed_solution (locally_owned_dofs,
                                                        mpi_communicator);
    solution_transfer.interpolate (distributed_solution);

    constraints_hanging_nodes.distribute (distributed_solution);
    solution = distributed_solution;
    ...

where solution is a TrilinosWrappers::MPI::Vector initialized by

solution.reinit(locally_relevant_dofs, mpi_communicator);
My refinement-related code:

    parallel::distributed::SolutionTransfer<dim, LA::MPI::Vector>
      sol_trans_vel (dof_handler_vel);

    sol_trans_vel.prepare_for_coarsening_and_refinement (solution_vel_n);

    triang.execute_coarsening_and_refinement ();

    // setupSystem() re-distributes the DoFs, resizes the vector
    // solution_vel_n, the matrix and the right-hand side, applies the
    // boundary conditions, and rebuilds the constraint matrix
    nsSystem.setupSystem (solution_vel_n);

    LA::MPI::Vector distributed_solution_vel (locally_owned_dofs_vel,
                                              MPI_COMM_WORLD);

    sol_trans_vel.interpolate (distributed_solution_vel);

    constraint_matrix_vel.distribute (distributed_solution_vel);

    solution_vel_n_helper.reinit (locally_relevant_dofs_vel,
                                  MPI_COMM_WORLD);
    solution_vel_n_helper = distributed_solution_vel;

    solution_vel_n = solution_vel_n_helper;
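
To be explicit about setupSystem(): I am not copying the real function here, but for the velocity part it does essentially the following (fe_vel stands for my velocity finite element; condensed sketch):

    dof_handler_vel.distribute_dofs (fe_vel);

    locally_owned_dofs_vel = dof_handler_vel.locally_owned_dofs ();
    DoFTools::extract_locally_relevant_dofs (dof_handler_vel,
                                             locally_relevant_dofs_vel);

    // ghosted vector: owned plus relevant index sets
    solution_vel_n.reinit (locally_owned_dofs_vel,
                           locally_relevant_dofs_vel,
                           MPI_COMM_WORLD);

    constraint_matrix_vel.clear ();
    constraint_matrix_vel.reinit (locally_relevant_dofs_vel);
    DoFTools::make_hanging_node_constraints (dof_handler_vel,
                                             constraint_matrix_vel);
    // ... boundary condition constraints ...
    constraint_matrix_vel.close ();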

I tried to do this in my code; however, almost all of my data exchange is in the
form of distributed ghosted TrilinosWrappers::MPI::Vector objects,
from which I sometimes need to compute an l2_norm.
I tried this (it works for the non-refined case):

    parallel::distributed::Vector<double> vel (locally_owned_dofs_vel,
                                               locally_relevant_dofs_vel,
                                               MPI_COMM_WORLD);
    vel = crate.solution_vel_n;
    this->pcout << "    NORM tentative velocity  :" << vel.l2_norm()
                << std::endl;

I got this error:

--------------------------------------------------------
An error occurred in line <1099> of file </home/mcapek/candis/candi_8_5/tmp/unpack/deal.II-v8.5.0/include/deal.II/lac/trilinos_vector_base.h> in function
    dealii::IndexSet dealii::TrilinosWrappers::VectorBase::locally_owned_elements() const
The violated condition was:
    owned_elements.size()==size()
Additional information:
    The locally owned elements have not been properly initialized! This happens for example if this object has been initialized with exactly one overlapping IndexSet.

Stacktrace:
-----------
#0  /home/mcapek/candis/candi_8_5/deal.II-v8.5.0/lib/libdeal_II.g.so.8.5.0: dealii::TrilinosWrappers::VectorBase::locally_owned_elements() const
#1  /home/mcapek/candis/candi_8_5/deal.II-v8.5.0/lib/libdeal_II.g.so.8.5.0: dealii::LinearAlgebra::ReadWriteVector<double>::import(dealii::TrilinosWrappers::MPI::Vector const&, dealii::VectorOperation::values, std::shared_ptr<dealii::LinearAlgebra::CommunicationPatternBase const>)
#2  /home/mcapek/candis/candi_8_5/deal.II-v8.5.0/lib/libdeal_II.g.so.8.5.0: dealii::LinearAlgebra::distributed::Vector<double>::operator=(dealii::TrilinosWrappers::MPI::Vector const&)
#3  ./main: NSSystem<3>::assemble_and_solve_system(SolutionCrate&, SolutionCrate&, double)
#4  ./main: NSSystem<3>::compute_solution(SolutionCrate&, SolutionCrate&, double, double)
#5  ./main: NSSystem<3>::compute_solution_get_dt(SolutionCrate&, SolutionCrate&, double, double)
#6  ./main: Main<3>::run()
#7  ./main: main
--------------------------------



Could you please tell me how to compute the norm of such a ghosted vector?
Maybe I made some faulty initialization in the refinement procedure. However, when I read values
from the already interpolated vector during assembly (via get_function_values()), deal.II does not complain.
Alternatively, could you recommend some other way of exchanging data into ghosted vectors?
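
One workaround I have been considering, although I am not sure it is the intended way: keep an extra vector built on only the locally owned DoFs, assign the ghosted vector to it (which, if I understand the Trilinos wrappers correctly, copies the locally owned range), and take the norm of that copy:

    // owned_copy is just an illustration name; it has no ghost entries
    LA::MPI::Vector owned_copy (locally_owned_dofs_vel, MPI_COMM_WORLD);
    owned_copy = crate.solution_vel_n;
    this->pcout << "    NORM tentative velocity  :" << owned_copy.l2_norm ()
                << std::endl;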


Thank You


Marek Capek

gandalfhaha

Dec 8, 2017, 12:06:32 PM
to dea...@googlegroups.com
In the latter (step-42), the order is

    distribute_dofs, DoFTools::make_hanging_node_constraints, interpolate.

Could that make the difference?
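
A rough sketch of that order, just from reading step-42 (the names are the ones from the snippet quoted above, fe being the finite element):

    dof_handler.distribute_dofs (fe);

    locally_owned_dofs = dof_handler.locally_owned_dofs ();
    DoFTools::extract_locally_relevant_dofs (dof_handler, locally_relevant_dofs);

    constraints_hanging_nodes.clear ();
    constraints_hanging_nodes.reinit (locally_relevant_dofs);
    DoFTools::make_hanging_node_constraints (dof_handler,
                                             constraints_hanging_nodes);
    constraints_hanging_nodes.close ();

    // only then interpolate into a vector built on the new owned index set
    TrilinosWrappers::MPI::Vector distributed_solution (locally_owned_dofs,
                                                        mpi_communicator);
    solution_transfer.interpolate (distributed_solution);
    constraints_hanging_nodes.distribute (distributed_solution);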

Just an idea

Marek



Wolfgang Bangerth

Dec 8, 2017, 3:49:12 PM
to dea...@googlegroups.com
Marek -- I don't actually think that your problem happens when computing
the norm. In fact, as the backtrace above shows, it happens in your
assemble_and_solve_system() function in a location where you are
assigning a Trilinos vector to a LinearAlgebra::distributed::Vector. The
error message to me would suggest that one of the two vectors does not
have the correct size after mesh refinement.

It is often useful to carefully read the error message and stack trace.
It is also often useful to run a program in a debugger, because you can
then inspect the state of the vectors at the place where the problem
happens, and infer which variable may have been wrongly initialized.
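
(For instance, printing something along the following lines right before the
failing assignment would show whether the sizes and ownership ranges still
match after mesh refinement; this is only meant as an illustration of what to
look at.)

    // right before "vel = crate.solution_vel_n;"
    std::cout << "trilinos vector: size = " << crate.solution_vel_n.size ()
              << ", local_size = " << crate.solution_vel_n.local_size ()
              << std::endl;
    std::cout << "owned / relevant dofs: "
              << locally_owned_dofs_vel.n_elements () << " / "
              << locally_relevant_dofs_vel.n_elements () << std::endl;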

Best
W.


--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/