KellyErrorEstimator failure when running multiple processes


mrjonm...@gmail.com

Jun 28, 2018, 3:11:13 AM
to deal.II User Group
I'm trying to adapt the elasticity setup from step-17 to use a parallel distributed triangulation with PETSc. My modified code runs fine on one process, but when I run on two processes I get an error as soon as I call KellyErrorEstimator<dim>::estimate.

I've been combing the examples that use a parallel distributed triangulation for ideas, and I have the feeling that my problem is somewhere in setup_system. I've tried some fixes there, including changing locally_relevant_solution.reinit (locally_owned_dofs, mpi_communicator) to locally_relevant_solution.reinit (locally_owned_dofs, locally_relevant_dofs, mpi_communicator), and using locally_owned_dofs_per_processor in place of n_locally_owned_dofs_per_processor in the SparsityTools::distribute_sparsity_pattern step, but I just end up causing errors earlier in the code.
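
For context, the step-40-style initialization I believe setup_system should end up with looks roughly like this (written here from memory as an illustration; the variable names mirror the tutorial and may not match my attached code exactly):

    locally_owned_dofs = dof_handler.locally_owned_dofs ();
    DoFTools::extract_locally_relevant_dofs (dof_handler, locally_relevant_dofs);

    // Ghosted vector: read-only access to locally owned *and* ghost entries.
    locally_relevant_solution.reinit (locally_owned_dofs,
                                      locally_relevant_dofs,
                                      mpi_communicator);
    // Non-ghosted vector: locally owned entries only, writable.
    system_rhs.reinit (locally_owned_dofs, mpi_communicator);

    DynamicSparsityPattern dsp (locally_relevant_dofs);
    DoFTools::make_sparsity_pattern (dof_handler, dsp, constraints, false);
    SparsityTools::distribute_sparsity_pattern (dsp,
                                                dof_handler.n_locally_owned_dofs_per_processor (),
                                                mpi_communicator,
                                                locally_relevant_dofs);
    system_matrix.reinit (locally_owned_dofs, locally_owned_dofs,
                          dsp, mpi_communicator);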

I've attached an MWE that reproduces the problem.

The error seems to be related to accessing a part of a vector that is not available on the local process. The full message is:


--------------------------------------------------------

An error occurred in line <1216> of file </mnt/beegfs/app/dealii/sources/dealii-9.0.0/include/deal.II/lac/petsc_vector_base.h> in function

    void dealii::PETScWrappers::VectorBase::extract_subvector_to(ForwardIterator, ForwardIterator, OutputIterator) const [with ForwardIterator = const unsigned int*; OutputIterator = double*]

The violated condition was: 

    index>=static_cast<unsigned int>(begin) && index<static_cast<unsigned int>(end)

Additional information: 

    This exception -- which is used in many places in the library -- usually indicates that some condition which the author of the code thought must be satisfied at a certain point in an algorithm, is not fulfilled. An example would be that the first part of an algorithm sorts elements of an array in ascending order, and a second part of the algorithm later encounters an element that is not larger than the previous one.


There is usually not very much you can do if you encounter such an exception since it indicates an error in deal.II, not in your own program. Try to come up with the smallest possible program that still demonstrates the error and contact the deal.II mailing lists with it to obtain help.


Stacktrace:

-----------

#0  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::PETScWrappers::VectorBase::extract_subvector_to<unsigned int const*, double*>(unsigned int const*, unsigned int const*, double*) const

#1  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::DoFCellAccessor<dealii::DoFHandler<2, 2>, false>::get_dof_values<dealii::PETScWrappers::MPI::Vector, double*>(dealii::PETScWrappers::MPI::Vector const&, double*, double*) const

#2  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::DoFCellAccessor<dealii::DoFHandler<2, 2>, false>::get_interpolated_dof_values<dealii::PETScWrappers::MPI::Vector, double>(dealii::PETScWrappers::MPI::Vector const&, dealii::Vector<double>&, unsigned int) const

#3  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: dealii::FEValuesBase<2, 2>::CellIterator<dealii::TriaIterator<dealii::DoFCellAccessor<dealii::DoFHandler<2, 2>, false> > >::get_interpolated_dof_values(dealii::PETScWrappers::MPI::Vector const&, dealii::Vector<double>&) const

#4  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::FEValuesBase<2, 2>::get_function_gradients<dealii::PETScWrappers::MPI::Vector>(dealii::PETScWrappers::MPI::Vector const&, std::vector<std::vector<dealii::Tensor<1, 2, dealii::PETScWrappers::MPI::Vector::value_type>, std::allocator<dealii::Tensor<1, 2, dealii::PETScWrappers::MPI::Vector::value_type> > >, std::allocator<std::vector<dealii::Tensor<1, 2, dealii::PETScWrappers::MPI::Vector::value_type>, std::allocator<dealii::Tensor<1, 2, dealii::PETScWrappers::MPI::Vector::value_type> > > > >&) const

#5  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: 

#6  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: 

#7  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: 

#8  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::KellyErrorEstimator<2, 2>::estimate<dealii::PETScWrappers::MPI::Vector, dealii::DoFHandler<2, 2> >(dealii::Mapping<2, 2> const&, dealii::DoFHandler<2, 2> const&, dealii::hp::QCollection<1> const&, dealii::FunctionMap<2, dealii::PETScWrappers::MPI::Vector::value_type>::type const&, std::vector<dealii::PETScWrappers::MPI::Vector const*, std::allocator<dealii::PETScWrappers::MPI::Vector const*> > const&, std::vector<dealii::Vector<float>*, std::allocator<dealii::Vector<float>*> >&, dealii::ComponentMask const&, dealii::Function<2, double> const*, unsigned int, unsigned int, unsigned int, dealii::KellyErrorEstimator<2, 2>::Strategy)

#9  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::KellyErrorEstimator<2, 2>::estimate<dealii::PETScWrappers::MPI::Vector, dealii::DoFHandler<2, 2> >(dealii::Mapping<2, 2> const&, dealii::DoFHandler<2, 2> const&, dealii::Quadrature<1> const&, dealii::FunctionMap<2, dealii::PETScWrappers::MPI::Vector::value_type>::type const&, std::vector<dealii::PETScWrappers::MPI::Vector const*, std::allocator<dealii::PETScWrappers::MPI::Vector const*> > const&, std::vector<dealii::Vector<float>*, std::allocator<dealii::Vector<float>*> >&, dealii::ComponentMask const&, dealii::Function<2, double> const*, unsigned int, unsigned int, unsigned int, dealii::KellyErrorEstimator<2, 2>::Strategy)

#10  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::KellyErrorEstimator<2, 2>::estimate<dealii::PETScWrappers::MPI::Vector, dealii::DoFHandler<2, 2> >(dealii::Mapping<2, 2> const&, dealii::DoFHandler<2, 2> const&, dealii::Quadrature<1> const&, dealii::FunctionMap<2, dealii::PETScWrappers::MPI::Vector::value_type>::type const&, dealii::PETScWrappers::MPI::Vector const&, dealii::Vector<float>&, dealii::ComponentMask const&, dealii::Function<2, double> const*, unsigned int, unsigned int, unsigned int, dealii::KellyErrorEstimator<2, 2>::Strategy)

#11  /mnt/beegfs/app/dealii/9.0.0/lib/libdeal_II.g.so.9.0.0: void dealii::KellyErrorEstimator<2, 2>::estimate<dealii::PETScWrappers::MPI::Vector, dealii::DoFHandler<2, 2> >(dealii::DoFHandler<2, 2> const&, dealii::Quadrature<1> const&, dealii::FunctionMap<2, dealii::PETScWrappers::MPI::Vector::value_type>::type const&, dealii::PETScWrappers::MPI::Vector const&, dealii::Vector<float>&, dealii::ComponentMask const&, dealii::Function<2, double> const*, unsigned int, unsigned int, unsigned int, dealii::KellyErrorEstimator<2, 2>::Strategy)

#12  ./PoroMinWorking: DistributedElasticity::ElasticProblem<2>::refine_grid()

#13  ./PoroMinWorking: DistributedElasticity::ElasticProblem<2>::run()

#14  ./PoroMinWorking: main

--------------------------------------------------------


If anyone has any ideas where to look next, I would appreciate it. 
Thanks,
Jonathan

PoroMinWorking.cc

Daniel Arndt

Jun 28, 2018, 8:14:23 AM
to deal.II User Group
Jonathan,

there are basically two issues in your code.
1.) You are first initializing locally_relevant_solution via
  locally_relevant_solution.reinit (locally_owned_dofs, locally_relevant_dofs, mpi_communicator);
and immediately after that you overwrite it with
  locally_relevant_solution.reinit (locally_owned_dofs, mpi_communicator);

The error you are observing comes from the fact that locally_relevant_solution then does not contain the necessary ghost values. Just deleting the second reinitialization fixes this problem (see the sketch below).

2.) You are calling MatrixTools::apply_boundary_values on locally_relevant_solution, which now contains ghost values and therefore does not allow write access to individual elements.
Here you need a separate, fully distributed object, just like the one you are using in solve(). In general, I would not mix a ConstraintMatrix (or AffineConstraints) for the hanging node constraints with
MatrixTools::apply_boundary_values for the boundary values. Just incorporate all of that information into the ConstraintMatrix (or AffineConstraints) object, roughly as sketched below.
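
In code, the two fixes could look roughly like the following (an untested sketch; I am reusing step-40-style names, and the boundary id 0 and the zero boundary function are only placeholders for whatever your problem actually needs):

    // 1.) Keep exactly one, ghosted reinit of the read-only solution vector
    //     and do not overwrite it afterwards:
    locally_relevant_solution.reinit (locally_owned_dofs,
                                      locally_relevant_dofs,
                                      mpi_communicator);

    // 2.) For anything you write into, use a separate fully distributed
    //     vector, just like in solve():
    PETScWrappers::MPI::Vector distributed_solution (locally_owned_dofs,
                                                     mpi_communicator);
    // ... solve into distributed_solution ...
    locally_relevant_solution = distributed_solution; // also updates ghost values

    // Preferably, put hanging-node *and* boundary constraints into one object
    // instead of calling MatrixTools::apply_boundary_values:
    constraints.clear ();
    constraints.reinit (locally_relevant_dofs);
    DoFTools::make_hanging_node_constraints (dof_handler, constraints);
    VectorTools::interpolate_boundary_values (dof_handler,
                                              0,
                                              Functions::ZeroFunction<dim> (dim),
                                              constraints);
    constraints.close ();
    // ...and then use constraints.distribute_local_to_global() during assembly.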

Best,
Daniel

mrjonm...@gmail.com

Jun 28, 2018, 1:59:12 PM
to deal.II User Group
Thank you. 

I don't know how I missed item 1. That's a bit embarrassing. 

Your first suggestion on item 2, using a distributed_solution object, works as a quick fix, but I will read up on how to put everything into the ConstraintMatrix the way you described.
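
For completeness, the quick fix looks roughly like this (paraphrased rather than copied from my code; boundary_values is the std::map filled by VectorTools::interpolate_boundary_values as in step-17):

    PETScWrappers::MPI::Vector distributed_solution (locally_owned_dofs,
                                                     mpi_communicator);
    MatrixTools::apply_boundary_values (boundary_values,
                                        system_matrix,
                                        distributed_solution,
                                        system_rhs,
                                        false);
    // ... solve into distributed_solution ...
    locally_relevant_solution = distributed_solution;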

I still have a lot to learn, but I wanted to say how much I appreciate the effort that has gone into not just the deal.II library itself, but also the tutorials, the video lectures, and the help given on this mailing list. It makes a huge difference in usability for those of us learning how to use it.

Cheers,
Jonathan