Segmentation fault after resetting elements of an LA::MPI::Vector


Wasim Niyaz Munshi ce21d400

Apr 30, 2023, 2:44:01 PM
to deal.II User Group
Hello everyone.
I am solving two equations iteratively in an MPI framework. The first is the equilibrium equation (slightly modified) solved in step-8; the second (the damage equation) is very similar to the Laplace equation solved in step-3.
The solution of the equilibrium equation depends on the damage solution vector and vice versa. To begin with, the damage solution vector is created as follows:

locally_relevant_solution_damage.reinit(locally_owned_dofs_damage,
                                        locally_relevant_dofs_damage,
                                        mpi_communicator);

Next, for the first iteration, we need to set the damage solution to 1 at certain nodes, as shown:
for (const auto &cell : dof_handler_damage.active_cell_iterators())
  {
    if (cell->is_locally_owned())
      {
        for (const auto vertex_number : cell->vertex_indices())
          {
            const auto vert = cell->vertex(vertex_number);
            const types::global_dof_index a =
              cell->vertex_dof_index(vertex_number, 0);
            if (condition is satisfied by the vertex)
              {
                locally_relevant_solution_damage[a] = 1;
              }
            else
              {
                locally_relevant_solution_damage[a] = 0;
              }
          }
      }
  }
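For reference, my understanding is that deal.II treats ghosted vectors as read-only, so the usual pattern is to write into a fully distributed (non-ghosted) vector, compress it, and then copy it into the ghosted one. A sketch under that assumption (the temporary vector `distributed_solution_damage` is a name I made up; the condition is left as a placeholder):

```cpp
// Hypothetical sketch: write into a vector without ghost entries, ...
LA::MPI::Vector distributed_solution_damage;
distributed_solution_damage.reinit(locally_owned_dofs_damage,
                                   mpi_communicator);

for (const auto &cell : dof_handler_damage.active_cell_iterators())
  if (cell->is_locally_owned())
    for (const auto vertex_number : cell->vertex_indices())
      {
        const types::global_dof_index a =
          cell->vertex_dof_index(vertex_number, 0);
        if (/* condition is satisfied by the vertex */ true)
          distributed_solution_damage[a] = 1;
        else
          distributed_solution_damage[a] = 0;
      }

// ... exchange the written values between processes, ...
distributed_solution_damage.compress(VectorOperation::insert);

// ... and then copy into the ghosted vector; the assignment
// also updates the ghost entries on each process.
locally_relevant_solution_damage = distributed_solution_damage;
```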

However, I am getting a segmentation fault when I try to access any entry of locally_relevant_solution_damage.

I first thought that I needed to call compress(), but then I read that compress() does not apply to vectors with ghost elements. Now I am unable to figure out what is causing this segmentation fault.
Error message:
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node wasim-OptiPlex-5080 exited on signal 11 (Segmentation fault).

Wolfgang Bangerth

Apr 30, 2023, 6:48:34 PM
to dea...@googlegroups.com

Wasim,
you are asking the people on this mailing list to help you with questions for
which (i) you have not put in enough work yourself, and (ii) you are not
providing enough information.

As for the latter point, nobody here can say for sure whether the problem is
in the code you show, or elsewhere -- the segmentation fault might well be
happening anywhere in your code, but you neither show the complete code, nor
have you assessed in detail where the fault actually happens.

As for the former point, you need to learn to use the tools that are well
described both on the internet at large (including the deal.II documentation
and my video lectures) as well as in many posts on this forum. In your
specific case, the approach needs to be (i) to see whether running the code on
one process (instead of multiple MPI processes) works, and (ii) to run your
program under a debugger to see where exactly the problem happens. Knowing
where it happens is the first step in figuring out why it happens.
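(For instance, assuming your executable is called ./my_program, those two checks might look like the following; the program name is a placeholder:)

```shell
# (i) Check whether the fault already occurs with a single process:
mpirun -np 1 ./my_program

# (ii) Run that single process under gdb to locate the fault:
gdb --args ./my_program
# inside gdb: type "run", and after the crash "backtrace"
# to see the call stack at the point of the segmentation fault
```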

Best
WB



--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

