Assembly of PETSc Block Matrix

Artur Safin

Jan 19, 2016, 3:03:53 AM
to deal.II User Group
Hello all,

I am working on a PDE problem that involves a two-way coupling of four equations in one domain with two equations in a second domain. To do this, I will need a custom block preconditioner, and I also need to be able to solve in parallel.

What I am not very confident about is how to approach the assembly of such a system. In particular, I have the following issues:

1) Is there a way to organize the dofs so that the top left 4x4 blocks of the matrix correspond to the first domain, the bottom right 2x2 blocks correspond to the second domain, and the remaining blocks correspond to the coupling?

2) What is the proper way to initialize a PETSc BlockSparseMatrix? To start, just for the upper 4x4 blocks, I have tried:

________________________________________________________________________________________________________

GridTools::partition_triangulation(n_mpi_processes, triangulation);
dof_handler.distribute_dofs(fe_basisfunctions);
DoFRenumbering::component_wise(dof_handler);

std::vector<types::global_dof_index> dofs_per_component(4);
DoFTools::count_dofs_per_component(dof_handler, dofs_per_component);
const unsigned int n_dofs = dofs_per_component[0]; // all four components have the same size

// Set up the block sparsity pattern
BlockDynamicSparsityPattern BlockDSP(4, 4);
for (unsigned int i = 0; i < 4; i++)
  for (unsigned int j = 0; j < 4; j++)
    BlockDSP.block(i, j).reinit(dofs_per_component[i], dofs_per_component[j]);
BlockDSP.collect_sizes();
DoFTools::make_sparsity_pattern(dof_handler, BlockDSP);
BlockDSP.compress();

// Initialize the system matrix A
IndexSet local_dofs = dof_handler.locally_owned_dofs();
std::vector<IndexSet> local_dofs_per_block;
for (unsigned int i = 0; i < 4; i++)
  local_dofs_per_block.push_back(local_dofs.get_view(n_dofs * i, n_dofs * (i + 1)));
A.reinit(local_dofs_per_block, BlockDSP, mpi_communicator);
________________________________________________________________________________________________________

This seems to work in serial, but fails in parallel at the last line:

--------------------------------------------------------
An error occurred in line <260> of file </home/artur/Rorsrach/Packages/dealii-8.3.0/source/lac/petsc_matrix_base.cc> in function
    void dealii::PETScWrappers::MatrixBase::compress(dealii::VectorOperation::values)
The violated condition was: 
    ierr == 0
The name and call sequence of the exception was:
    ExcPETScError(ierr)
Additional Information: 
An error with error number 73 occurred while calling a PETSc function
--------------------------------------------------------

Best,

Artur

Timo Heister

Jan 22, 2016, 11:55:52 AM
to dea...@googlegroups.com
> 1) Is there a way to organize the dofs so that the top left 4x4 blocks of
> the matrix correspond to the first domain, bottom right 2x2 correspond to
> the second domain, and other blocks correspond to coupling?

How are you defining your two "domains"? You should be able to use
DoFRenumbering::cell_wise() but I would start with a small test
program to see if that works correctly.
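
Something along these lines is what I have in mind (untested, and assuming
the two domains are marked on the cells somehow, say with material ids 0 and
1 like step-46 does, and a 2d DoFHandler called dof_handler; it needs
deal.II/dofs/dof_renumbering.h, which you already use):

// order all cells of the first domain before those of the second,
// then renumber the dofs in that cell order
std::vector<DoFHandler<2>::active_cell_iterator> ordered_cells;
for (unsigned int domain = 0; domain < 2; ++domain)
  for (DoFHandler<2>::active_cell_iterator cell = dof_handler.begin_active();
       cell != dof_handler.end(); ++cell)
    if (cell->material_id() == domain)
      ordered_cells.push_back(cell);
DoFRenumbering::cell_wise(dof_handler, ordered_cells);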

> 2) What is the proper way to initialize a PETSc BlockSparseMatrix? To start,
> just for the upper 4x4 block, I have tried

this looks okay to me...

> dealii::PETScWrappers::MatrixBase::compress(dealii::VectorOperation::values)

Are you doing anything in between the compress() call and the code you showed?

You might need to construct a minimal example that shows the problem
and post it so that we can take a look.

--
Timo Heister
http://www.math.clemson.edu/~heister/

Artur Safin

Jan 25, 2016, 9:28:47 PM
to deal.II User Group
Timo,

> How are you defining your two "domains"? You should be able to use
> DoFRenumbering::cell_wise() but I would start with a small test
> program to see if that works correctly.

Thanks, this should be helpful. The problem that I am solving is quite similar to step-46: the two domains are distinguished by different subdomain_id values.
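
In other words, each cell carries an id saying which domain it belongs to, set when the mesh is built, roughly like this (just a sketch; step-46 itself uses material_id for this, and the geometric criterion here is only a placeholder):

for (Triangulation<2>::active_cell_iterator cell = triangulation.begin_active();
     cell != triangulation.end(); ++cell)
  if (cell->center()[0] < 0.5)    // first domain
    cell->set_material_id(0);
  else                            // second domain
    cell->set_material_id(1);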

> Are you doing anything in between the compress() call and the code you showed?

Nope, nothing.
 
> You might need to construct a minimal example that shows the problem
> and post it so that we can take a look.

I am attaching a MWE. The error output below may be more indicative of why the program is failing:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Nonconforming object sizes
[0]PETSC ERROR: Sum of local lengths 50 does not equal global length 25, my local length 25
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.0, Jun, 09, 2015 
[0]PETSC ERROR: ./PETScTest on a x86_64 named artur-ubuntu by artur Mon Jan 25 20:17:45 2016
[0]PETSC ERROR: Configure options 
[0]PETSC ERROR: #1 PetscSplitOwnership() line 93 in /home/artur/Packages/petsc-3.6.0/src/sys/utils/psplit.c

Thanks,

Artur

Attachments: error, PETScTest.cpp, CMakeLists.txt

Timo Heister

Jan 26, 2016, 9:05:57 AM
to dea...@googlegroups.com
Dear Artur,

Your matrix construction is correct, but there are several problems
that came together:

1. You are trying to do parallel computations, but you are using a
normal Triangulation, so dof_handler.locally_owned_dofs() does not do
what you want. You have several options:
    a) use DoFTools::locally_owned_dofs_per_subdomain(...) instead, but
       then you need to renumber by subdomain (see attached code), like
       step-17;
    b) use the new parallel::shared::Triangulation, in which case
       dof_handler.locally_owned_dofs() works correctly (see step-18 and
       the sketch below);
    c) use a parallel::distributed::Triangulation as described in step-40.
2. GridTools::partition_triangulation(n_mpi_processes, triangulation)
behaves weirdly with a small number of cells (I just fixed this in
https://github.com/dealii/dealii/pull/2117).
3. If you use 1a), you need to be careful with
locally_owned_dofs_per_subdomain() if one processor has no cells
(see my hack in the attached code).
4. The error message coming out of PETSc was quite bad; with
https://github.com/dealii/dealii/pull/2120 it will soon be:

An error occurred in line <411> of file
</ssd/deal-git/source/lac/petsc_parallel_sparse_matrix.cc> in function
    void dealii::PETScWrappers::MPI::SparseMatrix::do_reinit(const
    dealii::IndexSet&, const dealii::IndexSet&, const SparsityPatternType&)
    [with SparsityPatternType = dealii::DynamicSparsityPattern]
The violated condition was:
    row_owners == sparsity_pattern.n_rows()
Additional Information:
    Each row has to be owned by exactly one owner (n_rows()=25 but
    sum(local_rows.n_elements())=75)
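
For option b), a rough and untested sketch of how the initialization part of
your snippet would change (the sparsity pattern setup stays the same, and I
am using the same variable names as in your snippet):

#include <deal.II/distributed/shared_tria.h>

// the shared triangulation partitions the cells among the processes itself,
// so no explicit call to GridTools::partition_triangulation() is needed
parallel::shared::Triangulation<2> triangulation(mpi_communicator);
// ... create the mesh and build the DoFHandler on this triangulation ...

dof_handler.distribute_dofs(fe_basisfunctions);
DoFRenumbering::component_wise(dof_handler);

std::vector<types::global_dof_index> dofs_per_component(4);
DoFTools::count_dofs_per_component(dof_handler, dofs_per_component);

// locally_owned_dofs() now returns only the dofs this process owns, so the
// per-block views add up to the global size across all processes
IndexSet local_dofs = dof_handler.locally_owned_dofs();
std::vector<IndexSet> local_dofs_per_block;
types::global_dof_index start = 0;
for (unsigned int b = 0; b < 4; ++b)
  {
    local_dofs_per_block.push_back(
      local_dofs.get_view(start, start + dofs_per_component[b]));
    start += dofs_per_component[b];
  }
A.reinit(local_dofs_per_block, BlockDSP, mpi_communicator);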

I hope that helps!

Best,
Timo
Attachment: PETScTest.cpp