Could not find partitioner that fits vector


yy.wayne

May 6, 2023, 3:22:23 AM
to deal.II User Group
Hello my friends,

I'm testing a matrix-free code preconditioned by a geometric multigrid.
The code is not stable: it fails for some fe_degree + coarse_level + total_levels
combinations.
For example, any number of MPI processes with 1st-order basis elements runs successfully.
But for a 2nd-order basis, min_level=0 + total_levels=3 (0, 1 and 2)
or min_level=1 + total_levels=5 (0, 1, 2, 3 and 4) fails during SolverGMRES.solve().
The error in debug mode is:
    
An error occurred in line <3403> of file </home/ubuntu/deal.II/installed/include/deal.II/matrix_free/matrix_free.h> in function
    unsigned int dealii::internal::VectorDataExchange<dim, Number, VectorizedArrayType>::find_vector_in_mf(const VectorType&, bool) const [with VectorType = dealii::LinearAlgebra::distributed::Vector<double>; int dim = 3; Number = double; VectorizedArrayType = dealii::VectorizedArray<double, 2>]
The violated condition was:
    false
Additional information:
    Could not find partitioner that fits vector

It might be my own coding mistake. I'm using deal.II 9.3.0; might upgrading to 9.4.0 help?


yy.wayne

May 6, 2023, 4:22:58 AM
to deal.II User Group
It might fail for 1st basis order too ...
But all failures occur with more than 2 MPI processes. If I run with only 2 MPI processes, no error shows.

Peter Munch

May 6, 2023, 4:41:40 AM
to deal.II User Group

Hi Wayne,

My best guess is that the partitioners in the multigrid operator and the matrix-free operators do not match, with the result that a vector with wrong ghosting coming from the multigrid operator is used as src/dst in the matrix-free loop, resulting in the assert you show. Previously, one would need to fix the ghosting in the operator (see https://github.com/dealii/dealii/blob/a091f25c5ff77687fbd8a65f5b336b235f201100/include/deal.II/matrix_free/operators.h#L489-L498), but nowadays one can pass external partitioners to MG: see, e.g., https://github.com/peterrum/dealii-multigrid/blob/c50581883c0dbe35c83132699e6de40da9b1b255/multigrid_throughput.cc#L1802. Similarly, one can do this in the context of global-coarsening MG.
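The approach Peter describes can be sketched roughly as follows. This is an untested fragment, not a definitive implementation: it assumes a deal.II setup where mg_matrices holds per-level matrix-free operators (as in the linked example), and that each level's MatrixFree object exposes its partitioner via get_vector_partitioner().

```cpp
// Collect the partitioners the level operators actually use ...
std::vector<std::shared_ptr<const Utilities::MPI::Partitioner>>
  external_partitioners;
for (unsigned int level = min_level; level <= max_level; ++level)
  external_partitioners.push_back(
    mg_matrices[level].get_matrix_free()->get_vector_partitioner());

// ... and hand them to the transfer, so that the level vectors set up
// by MG carry exactly the ghosting the level operators expect:
MGTransferMatrixFree<dim, Number> mg_transfer(mg_constrained_dofs);
mg_transfer.build(dof_handler, external_partitioners);
```

With matching partitioners, no vector with foreign ghosting reaches the matrix-free loop, which is what triggers the "Could not find partitioner that fits vector" assert.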

Hope this helps!

Peter

yy.wayne

May 7, 2023, 2:18:37 AM
to deal.II User Group
Thank you Peter,

Currently I'm using MGTransferBlockMatrixFree, where building with external partitioners is not supported.
But before moving on to modify the block MG transfers, I wonder: can the unmatched partitioners occur even if the code is correct?
My problem is preconditioned with global-coarsening h-MG, with 6 dofs per node for the real and imaginary vectors in 3D.
BlockVectors are used everywhere except for the Trilinos direct solver on the coarsest level, where only Vector rather than
BlockVector is supported. The transfer between Vector and BlockVector is a potential risk, though.

Best,
Wayne

Peter Munch

May 7, 2023, 3:23:12 AM
to deal.II User Group
> I wonder: can the unmatched partitioners occur even if the code is correct?

Yes. The requirements of MG and of the level operators are not identical. If you don't do anything special, the vectors are set up for MG and not for the level operators. With the help of external partitioners you can fix this "problem".

I guess you could introduce a new function MGTransferBlockMatrixFree::build(). It shouldn't be too much work, since all you need to do is delegate the task to MGTransferMatrixFree::build(). Happy to accept a patch!

Peter

yy.wayne

May 7, 2023, 3:27:34 AM
to deal.II User Group
I'll try adding a build() function to MGTransferBlockMatrixFree then.
Thank you.

Best,
Wayne

Peter Munch

May 7, 2023, 3:37:24 AM
to deal.II User Group
Thanks!
Peter

yy.wayne

May 15, 2023, 8:57:45 AM
to deal.II User Group
The modification is easy:
1) On line 518 of include/deal.II/multigrid/mg_transfer_matrix_free.h, change the first build() declaration to:

void
build(const DoFHandler<dim, dim> &dof_handler,
      const std::vector<std::shared_ptr<const Utilities::MPI::Partitioner>>
        &external_partitioners =
          std::vector<std::shared_ptr<const Utilities::MPI::Partitioner>>());

2) On line 768 of source/multigrid/mg_transfer_matrix_free.cc, change the first build() definition to:

template <int dim, typename Number>
void
MGTransferBlockMatrixFree<dim, Number>::build(
  const DoFHandler<dim, dim> &dof_handler,
  const std::vector<std::shared_ptr<const Utilities::MPI::Partitioner>>
    &external_partitioners)
{
  AssertDimension(this->matrix_free_transfer_vector.size(), 1);
  this->matrix_free_transfer_vector[0].build(dof_handler,
                                             external_partitioners);
}

I'm not familiar with GitHub pull requests, so I just list the modification here. I tested the build() function
both with and without external_partitioners, and both show no error. By introducing external_partitioners,
the former bug disappears.

I have another suggestion: mg_constrained_dofs::make_zero_boundary() seems to accept only one
boundary_ids set + ComponentMask combination. So if there are 2 Dirichlet BCs that mask different components
in vector problems, more make_zero_boundary functions might be needed. Not a big problem, though.

Best,
Wayne

Daniel Arndt

May 15, 2023, 9:24:58 AM
to dea...@googlegroups.com
Wayne,

> I have another suggestion: mg_constrained_dofs::make_zero_boundary() seems to accept only one
> boundary_ids set + ComponentMask combination. So if there are 2 Dirichlet BCs that mask different components
> in vector problems, more make_zero_boundary functions might be needed. Not a big problem, though.


[...]
This function can be called multiple times to allow considering different sets of boundary_ids for different components.
[...]
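The pattern the documentation describes might look like the following sketch. This is illustrative only: the boundary ids and the 6-component real/imaginary masks are made-up values matching Wayne's problem description, and it assumes deal.II's MGConstrainedDoFs::make_zero_boundary_constraints(dof_handler, boundary_ids, component_mask) interface.

```cpp
// Two Dirichlet boundaries constraining different components of a
// 6-component (real + imaginary) vector problem: call the function
// once per boundary_ids set / ComponentMask combination.
MGConstrainedDoFs mg_constrained_dofs;
mg_constrained_dofs.initialize(dof_handler);

// boundary_id 0: constrain only the first three (real) components
mg_constrained_dofs.make_zero_boundary_constraints(
  dof_handler, {0},
  ComponentMask({true, true, true, false, false, false}));

// boundary_id 1: constrain only the last three (imaginary) components
mg_constrained_dofs.make_zero_boundary_constraints(
  dof_handler, {1},
  ComponentMask({false, false, false, true, true, true}));
```

Each call adds constraints for its own boundary-id set and mask, so no new overloads are needed.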

Best,
Daniel


yy.wayne

May 15, 2023, 9:43:41 AM
to deal.II User Group
Thank you, Daniel.
That's great, my mistake.

Best, 
Wayne
