Incompatibility of MatrixFreeTools::compute_matrix() with PETScWrappers::MPI::SparseMatrix

Davit Gyulamiryan
Nov 28, 2025, 6:19:23 PM
to deal.II User Group
Hi everyone,

I am using the MatrixFree framework with geometric multigrid in deal.II. I am trying to implement a matrix-based coarse-grid solver that uses PETSc's AMG as a preconditioner. However, I have encountered a problem: under certain conditions (described below), the MatrixFreeTools::compute_matrix() function fails to compute the coarse system matrix. I reproduced the same error with a modified version of the step-75 tutorial code. The error is as follows:

```
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Argument out of range
[3]PETSC ERROR: Inserting a new nonzero at global row/column (30, 35) into matrix
[3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.21.4, Jul 30, 2024
[3]PETSC ERROR: step-75 on a arch-linux-c-opt named paragon by dg615 Fri Nov 28 22:32:54 2025
[3]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-debugging=0 COPTFLAGS="-O3 -march=native -mtune=native" CXXOPTFLAGS="-O3 -march=native -mtune=native" FOPTFLAGS="-O3 -march=native -mtune=native" --with-blaslapack-dir=/opt/intel/mkl --download-superlu_dist --download-superlu --download-metis --download-parmetis --download-mumps --download-scalapack --download-hypre --download-hdf5 --download-fftw --download-suitesparse --download-cmake
[3]PETSC ERROR: #1 MatSetValues_MPIAIJ() at /home/dg615/proj/petsc/petsc-3.21.4/src/mat/impls/aij/mpi/mpiaij.c:592
[3]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() at /home/dg615/proj/petsc/petsc-3.21.4/src/mat/impls/aij/mpi/mpiaij.c:777
[3]PETSC ERROR: #3 MatAssemblyEnd() at /home/dg615/proj/petsc/petsc-3.21.4/src/mat/interface/matrix.c:5820
```

The block of code that produces the error is the following:

```
  template <int dim, typename number>
  const PETScWrappers::MPI::SparseMatrix &
  LaplaceOperator<dim, number>::get_system_matrix() const
  {
    if (system_matrix.m() == 0 && system_matrix.n() == 0)
      {
        const auto &dof_handler = this->matrix_free.get_dof_handler();

        // MODIFIED starts
        // =======================================================================
        const IndexSet &owned_dofs = dof_handler.locally_owned_dofs();
        const IndexSet &relevant_dofs = DoFTools::extract_locally_relevant_dofs(dof_handler);
        const IndexSet &used_dofs = owned_dofs; // using relevant_dofs here gives an error
        DynamicSparsityPattern dsp(used_dofs);

        DoFTools::make_sparsity_pattern(dof_handler, dsp, this->constraints);

        SparsityPattern sp;
        sp.copy_from(dsp);
        system_matrix.reinit(
          used_dofs,
          used_dofs,
          sp,
          dof_handler.get_triangulation().get_mpi_communicator()
        );
        // =======================================================================
        // MODIFIED ends

        // The error is thrown from the function below
        MatrixFreeTools::compute_matrix(
          matrix_free,
          constraints,
          system_matrix,
          &LaplaceOperator::do_cell_integral_local,
          this);
      }

    return this->system_matrix;
  }
```
I have also attached the modified step-75.cc file; you can find the modified areas of the code by searching for the MODIFIED keyword. I also changed the code to run with PETSc (it originally used Trilinos). Note: if you paste this step-75.cc into the tutorial folder and do not have Trilinos installed (I don't), you may have to modify the CMakeLists.txt file so that it does not require Trilinos.

Error condition: in my code the coarse grid is a cuboid with some number of global refinements, and the error occurs when the number of those refinements is greater than 2. In the modified step-75 code, the error occurs when the variable `minLevel` is set to anything above 0.

I have tried using locally_relevant_dofs instead of locally_owned_dofs, but in that case I get a different error:

```
An error occurred in line <203> of file </home/dg615/proj/petsc/dealii-9.7.0/source/lac/petsc_parallel_sparse_matrix.cc> in function
    void dealii::PETScWrappers::MPI::SparseMatrix::do_reinit(MPI_Comm, const dealii::IndexSet&, const dealii::IndexSet&, const SparsityPatternType&) [with SparsityPatternType = dealii::SparsityPattern; MPI_Comm = ompi_communicator_t*]
The violated condition was:
    local_rows.is_contiguous() && local_columns.is_contiguous()
Additional information:
    PETSc only supports contiguous row/column ranges
```

Can someone please let me know how I can fix this issue? Is it internal to deal.II/PETSc, or is it something I have missed? I know I could assemble the coarse matrices using the standard matrix-based assembly logic, but I was wondering whether it can be done entirely within the matrix-free (MF) infrastructure.

Thank you.

Best wishes,
Davit Gyulamiryan

Attachment: step-75.cc

Davit Gyulamiryan
Nov 29, 2025, 10:00:16 AM
to deal.II User Group
I found the problem: the setup of the sparsity pattern was incorrect. The pattern has to be built on the locally relevant (ghosted) DoFs and then handed to SparsityTools::distribute_sparsity_pattern(), which sends the entries of non-owned rows to their owning processes. Building it on the locally owned DoFs alone drops the couplings that other processes generate for those rows, which is why PETSc complained about inserting a new nonzero during assembly. The fixed code is below:

```
  template <int dim, typename number>
  const PETScWrappers::MPI::SparseMatrix &
  LaplaceOperator<dim, number>::get_system_matrix() const
  {
    if (system_matrix.m() == 0 && system_matrix.n() == 0)
      {
        const auto &dof_handler = this->matrix_free.get_dof_handler();

        // MODIFIED starts
        // =======================================================================
        const IndexSet &owned_dofs = dof_handler.locally_owned_dofs();
        const IndexSet  relevant_dofs =
          DoFTools::extract_locally_relevant_dofs(dof_handler);

        // Build the pattern on the ghosted (locally relevant) index set so
        // that couplings generated on ghost cells are recorded, ...
        DynamicSparsityPattern dsp(relevant_dofs);

        DoFTools::make_sparsity_pattern(dof_handler, dsp, this->constraints);

        // ... then send the entries of non-owned rows to their owners.
        SparsityTools::distribute_sparsity_pattern(
          dsp,
          owned_dofs,
          dof_handler.get_triangulation().get_mpi_communicator(),
          relevant_dofs);

        SparsityPattern sp;
        sp.copy_from(dsp);
        system_matrix.reinit(
          owned_dofs,
          sp,
          dof_handler.get_triangulation().get_mpi_communicator());
        // =======================================================================
        // MODIFIED ends

        // With the distributed sparsity pattern, this call now succeeds.
        MatrixFreeTools::compute_matrix(
          matrix_free,
          constraints,
          system_matrix,
          &LaplaceOperator::do_cell_integral_local,
          this);
      }

    return this->system_matrix;
  }
```
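
For completeness, here is a minimal sketch of how the assembled coarse matrix could then be fed into PETSc's AMG (hypre's BoomerAMG), which was the original goal. This is hypothetical usage, not taken from the attached file, and it assumes deal.II was configured with PETSc and hypre (as in the configure line above); `solve_coarse` and its signature are made up for illustration:

```
#include <deal.II/lac/petsc_precondition.h>
#include <deal.II/lac/petsc_solver.h>
#include <deal.II/lac/petsc_vector.h>
#include <deal.II/lac/solver_control.h>

using namespace dealii;

// Hypothetical helper: solve the coarse system A x = b with PETSc's CG,
// preconditioned by hypre's BoomerAMG through the deal.II wrappers.
void solve_coarse(const PETScWrappers::MPI::SparseMatrix &coarse_matrix,
                  PETScWrappers::MPI::Vector             &solution,
                  const PETScWrappers::MPI::Vector       &rhs)
{
  SolverControl           solver_control(1000, 1e-12);
  PETScWrappers::SolverCG cg(solver_control);

  PETScWrappers::PreconditionBoomerAMG::AdditionalData data;
  data.symmetric_operator = true; // the Laplace operator is SPD

  PETScWrappers::PreconditionBoomerAMG amg(coarse_matrix, data);

  cg.solve(coarse_matrix, solution, rhs, amg);
}
```

Inside the multigrid cycle, such a solve could be wrapped in MGCoarseGridIterativeSolver, analogous to what step-75 does with its Trilinos AMG coarse solver.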

Best wishes,
Davit