MatrixCreator::create_mass_matrix on parallel::distributed::Triangulation


mrjonm...@gmail.com

Jul 5, 2018, 1:36:44 PM
to deal.II User Group
I noticed that the documentation for MatrixCreator::create_mass_matrix claims the following:

"If the library is configured to use multithreading, this function works in parallel."

It seems like this is not set up to work with a distributed triangulation, though. Is that correct?

I can build my own assembly loop, but I want to use ready-made tools everywhere I can.

Thanks,
Jonathan

Bruno Turcksin

Jul 5, 2018, 1:45:26 PM
to deal.II User Group
Jonathan,


On Thursday, July 5, 2018 at 1:36:44 PM UTC-4, mrjonm...@gmail.com wrote:
> I noticed that the documentation for MatrixCreator::create_mass_matrix claims the following:
>
> "If the library is configured to use multithreading, this function works in parallel."
>
> It seems like this is not set up to work with a distributed triangulation, though. Is that correct?
That's correct. You will need to assemble the mass matrix yourself.
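
For a parallel::distributed::Triangulation the loop is fairly short. A minimal sketch, assuming a DoFHandler named dof_handler with finite element fe, an AffineConstraints<double> object named constraints, and a PETScWrappers::MPI::SparseMatrix named mass_matrix that was reinit'ed with a properly distributed sparsity pattern, could look like this:

// Needs <deal.II/fe/fe_values.h> and <deal.II/lac/full_matrix.h>.
QGauss<dim>   quadrature(fe.degree + 1);
FEValues<dim> fe_values(fe, quadrature, update_values | update_JxW_values);

const unsigned int dofs_per_cell = fe.dofs_per_cell;
const unsigned int n_q_points    = quadrature.size();

FullMatrix<double>                   cell_matrix(dofs_per_cell, dofs_per_cell);
std::vector<types::global_dof_index> local_dof_indices(dofs_per_cell);

for (const auto &cell : dof_handler.active_cell_iterators())
  if (cell->is_locally_owned())
    {
      fe_values.reinit(cell);
      cell_matrix = 0;

      // M(i,j) = sum_q phi_i(x_q) * phi_j(x_q) * JxW(q)
      for (unsigned int q = 0; q < n_q_points; ++q)
        for (unsigned int i = 0; i < dofs_per_cell; ++i)
          for (unsigned int j = 0; j < dofs_per_cell; ++j)
            cell_matrix(i, j) += fe_values.shape_value(i, q) *
                                 fe_values.shape_value(j, q) *
                                 fe_values.JxW(q);

      cell->get_dof_indices(local_dof_indices);
      constraints.distribute_local_to_global(cell_matrix,
                                             local_dof_indices,
                                             mass_matrix);
    }

mass_matrix.compress(VectorOperation::add);

The important pieces for the distributed case are the is_locally_owned() check and the final compress() call.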

Best,

Bruno

Alex Quinlan

Mar 17, 2025, 11:26:31 AM
to deal.II User Group
Just checking if this has changed at all over the past few years.  Is there a way to run MatrixCreator::create_mass_matrix with MPI? 

It runs fine for me with -np 1, but -np 2 or more causes a crash. I don't see any way to pass in the locally_owned_dofs, so I'm assuming not, but I thought I'd double-check in case there's another recommended approach.

Thanks,
Alex

Daniel Arndt

Mar 17, 2025, 12:05:42 PM
to dea...@googlegroups.com
Alex,

I would expect that to work assuming that the input matrix is initialized correctly. What does the error/crash in debug mode look like?

Best,
Daniel

Alex Quinlan

Mar 17, 2025, 12:24:58 PM
to deal.II User Group
Thanks Daniel,


The error is:


[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Inserting a new nonzero at global row/column (18, 75) into matrix
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.12.4, Feb, 04, 2020
[0]PETSC ERROR: ./elasticRect on a  named quinlan-Precision-5820-Tower by quinlan Fri Mar 14 11:12:54 2025
[0]PETSC ERROR: Configure options --build=x86_64-linux-gnu --prefix=/usr --includedir=${prefix}/include --mandir=${prefix}/share/man --infodir=${prefix}/share/info --sysconfdir=/etc --localstatedir=/var --with-silent-rules=0 --libdir=${prefix}/lib/x86_64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --shared-library-extension=_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lumfpack -lamd -lcholmod -lklu" --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lptscotcherr" --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/x86_64-linux-gnu/hdf5/openmpi -L/usr/lib/openmpi/lib -lhdf5 -lmpi" --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-hypre=1 --with-hypre-include=/usr/include/hypre --with-hypre-lib=-lHYPRE_core --prefix=/usr/lib/petscdir/petsc3.12/x86_64-linux-gnu-real --PETSC_ARCH=x86_64-linux-gnu-real CFLAGS="-g -O2 -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-Bsymbolic-functions -Wl,-z,relro -fPIC" MAKEFLAGS=w
[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 646 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 843 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/mat/impls/aij/mpi/mpiaij.c
[0]PETSC ERROR: #3 MatAssemblyEnd() line 5268 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/mat/interface/matrix.c
terminate called after throwing an instance of 'dealii::LACExceptions::ExcPETScError'
  what():  
--------------------------------------------------------
An error occurred in line <280> of file </home/quinlan/GM/WMU-dealii/dealii/source/lac/petsc_matrix_base.cc> in function
    void dealii::PETScWrappers::MatrixBase::compress(dealii::VectorOperation::values)
The violated condition was:
    ierr == 0
Additional information:
    deal.II encountered an error while calling a PETSc function.
    The description of the error provided by PETSc is "Argument out of
    range".
    The numerical value of the original error code is 63.
--------------------------------------------------------


and the code that triggers it is:


    thermal_dof_handler.distribute_dofs(thermal_fe);
    locally_owned_thermal_dofs = thermal_dof_handler.locally_owned_dofs();
    locally_relevant_thermal_dofs = DoFTools::extract_locally_relevant_dofs(thermal_dof_handler);
    thermal_constraints.clear();
    DoFTools::make_hanging_node_constraints(thermal_dof_handler, thermal_constraints);
    thermal_constraints.close();

    DynamicSparsityPattern dsp(locally_relevant_thermal_dofs);
    DoFTools::make_sparsity_pattern(thermal_dof_handler,
                                    dsp,
                                    thermal_constraints,
                                    /*keep_constrained_dofs = */ true);

    MassMatrixFunc<dim> *massmat_function = new MassMatrixFunc<dim>;
    mass_matrix.reinit(locally_owned_thermal_dofs, locally_owned_thermal_dofs, dsp, mpi_communicator);



    MatrixCreator::create_mass_matrix(
                                      thermal_dof_handler,          // DoFHandler<dim>
                                      QGauss<dim>(fe.degree + 1),
                                      mass_matrix,                  // LA::MPI::SparseMatrix
                                      massmat_function,             // defined below
                                      thermal_constraints);         // AffineConstraints<double>

where the mass function is defined as


  template <int dim>
  class MassMatrixFunc : public Function<dim>
  {
  public:
    MassMatrixFunc () :  Function<dim>()
    {}
    virtual double value(const Point<dim>  &p, const unsigned int component = 0) const override;
  };

  template <int dim>
  double MassMatrixFunc<dim>::value(const Point<dim> & p,
                                    const unsigned int component) const
  {
    Assert(component == 0, ExcIndexRange(component, 0, 1));
    const double rho = 1861.6; // kg/m^3
    const double Cp = 853; // J/kgK  (avg of E-glass and Resin)
    return rho*Cp;
  }
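
As an aside, since the coefficient here is just the constant rho*Cp, deal.II's built-in Functions::ConstantFunction could stand in for the hand-written class; a minimal sketch (massmat_coefficient is a made-up name):

  // Functions::ConstantFunction is declared in <deal.II/base/function.h>.
  // rho * Cp = 1861.6 kg/m^3 * 853 J/(kg K)
  Functions::ConstantFunction<dim> massmat_coefficient(1861.6 * 853.0);

  MatrixCreator::create_mass_matrix(thermal_dof_handler,
                                    QGauss<dim>(fe.degree + 1),
                                    mass_matrix,
                                    &massmat_coefficient,
                                    thermal_constraints);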



Do you see anything wrong with my matrix initialization or other set-up work?
Thanks,
Alex




Alex Quinlan

Mar 20, 2025, 12:07:51 PM
to deal.II User Group
I think I found the issue.  I was missing this call before re-initializing the mass matrix:


SparsityTools::distribute_sparsity_pattern(dsp,
                                           locally_owned_thermal_dofs,
                                           mpi_communicator,
                                           locally_relevant_thermal_dofs);
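
With that call in place, the relevant part of the setup reads roughly as follows (reusing the variable names from my earlier message; the key point is that the sparsity pattern is distributed after make_sparsity_pattern and before mass_matrix.reinit):

DynamicSparsityPattern dsp(locally_relevant_thermal_dofs);
DoFTools::make_sparsity_pattern(thermal_dof_handler,
                                dsp,
                                thermal_constraints,
                                /*keep_constrained_dofs = */ true);

// Make sure each process also knows about entries that other processes
// will write into its locally owned rows, so compress() does not fail
// with "inserting a new nonzero" errors.
SparsityTools::distribute_sparsity_pattern(dsp,
                                           locally_owned_thermal_dofs,
                                           mpi_communicator,
                                           locally_relevant_thermal_dofs);

mass_matrix.reinit(locally_owned_thermal_dofs,
                   locally_owned_thermal_dofs,
                   dsp,
                   mpi_communicator);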


Thanks for the encouragement, Daniel; I probably would have given up and done manual assembly if you hadn't said something.

Thanks,
Alex

Daniel Arndt

Mar 20, 2025, 1:37:44 PM
to dea...@googlegroups.com
You're welcome. :-)

Best, Daniel