Hello all,
I am working on a PDE problem that involves the two-way coupling of four equations in one domain with two equations in a second domain. To solve it, I will need a custom block preconditioner, and the solver has to run in parallel.
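Schematically (the letters are just my notation for this message), the global system I am aiming for has the block layout

K = \begin{pmatrix}
      A_{11} & \cdots & A_{14} & B_{11} & B_{12} \\
      \vdots & \ddots & \vdots & \vdots & \vdots  \\
      A_{41} & \cdots & A_{44} & B_{41} & B_{42} \\
      C_{11} & \cdots & C_{14} & D_{11} & D_{12} \\
      C_{21} & \cdots & C_{24} & D_{21} & D_{22}
    \end{pmatrix}

where the A_{ij} are the 4x4 blocks of the first domain, the D_{ij} the 2x2 blocks of the second domain, and the B_{ij}, C_{ij} the two-way coupling blocks.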
What I am not very confident about is how to approach the assembly of such a system. In particular, I have the following questions:
1) Is there a way to organize the dofs so that the top-left 4x4 blocks of the matrix correspond to the first domain, the bottom-right 2x2 blocks correspond to the second domain, and the remaining blocks carry the coupling terms? (At the end of this message I sketch what I had in mind.)
2) What is the proper way to initialize a parallel PETScWrappers::MPI::BlockSparseMatrix? To get started, just for the top-left 4x4 block structure (i.e., only the first domain), I have tried the following:
________________________________________________________________________________________________________
GridTools::partition_triangulation(n_mpi_processes, triangulation);
dof_handler.distribute_dofs(fe_basisfunctions);
DoFRenumbering::component_wise(dof_handler);

// Count the dofs belonging to each of the four components; all components
// use the same basis here, so they all have the same number of dofs.
std::vector<types::global_dof_index> dofs_per_component(4);
DoFTools::count_dofs_per_component(dof_handler, dofs_per_component);
const unsigned int n_dofs = dofs_per_component[0];

// Set up the 4x4 block sparsity pattern
BlockDynamicSparsityPattern BlockDSP(4, 4);
for (unsigned int i = 0; i < 4; ++i)
  for (unsigned int j = 0; j < 4; ++j)
    BlockDSP.block(i, j).reinit(dofs_per_component[i], dofs_per_component[j]);
BlockDSP.collect_sizes();
DoFTools::make_sparsity_pattern(dof_handler, BlockDSP);
BlockDSP.compress();

// Initialize A: one IndexSet of locally owned dofs per block
IndexSet local_dofs = dof_handler.locally_owned_dofs();
std::vector<IndexSet> local_dofs_per_block;
for (unsigned int i = 0; i < 4; ++i)
  local_dofs_per_block.push_back(local_dofs.get_view(n_dofs * i, n_dofs * (i + 1)));
A.reinit(local_dofs_per_block, BlockDSP, mpi_communicator);
________________________________________________________________________________________________________
This seems to work fine in serial, but in parallel it fails at the last line (the A.reinit call) with:
--------------------------------------------------------
An error occurred in line <260> of file </home/artur/Rorsrach/Packages/dealii-8.3.0/source/lac/petsc_matrix_base.cc> in function
void dealii::PETScWrappers::MatrixBase::compress(dealii::VectorOperation::values)
The violated condition was:
ierr == 0
The name and call sequence of the exception was:
ExcPETScError(ierr)
Additional Information:
An error with error number 73 occurred while calling a PETSc function
--------------------------------------------------------
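Regarding question 1, the following is only a sketch of what I had in mind, and it relies on an assumption that may not hold for my setup: that both domains are covered by a single dof_handler carrying a six-component FESystem, with components 0-3 being the first-domain variables and components 4-5 the second-domain variables. In that case I would expect a component-wise renumbering to give the 4+2 layout, and a target_component/target_block vector to give a coarser domain-wise 2x2 layout:
________________________________________________________________________________________________________
// Sketch only -- assumes a single dof_handler with a six-component FESystem,
// components 0..3 for the first domain and components 4..5 for the second.

// Sort dofs by component: all dofs of component 0 first, then component 1,
// and so on. With the first-domain components listed first in the FESystem,
// the top-left 4x4 blocks should then belong to the first domain, the
// bottom-right 2x2 blocks to the second domain, and the off-diagonal
// rectangles to the coupling terms.
DoFRenumbering::component_wise(dof_handler);

std::vector<types::global_dof_index> dofs_per_component(6);
DoFTools::count_dofs_per_component(dof_handler, dofs_per_component);

// If instead a domain-wise 2x2 block structure were enough, I think the
// components could be grouped into two blocks like this:
std::vector<unsigned int> block_component(6, 0);
block_component[4] = 1;
block_component[5] = 1;
DoFRenumbering::component_wise(dof_handler, block_component);

std::vector<types::global_dof_index> dofs_per_block(2);
DoFTools::count_dofs_per_block(dof_handler, dofs_per_block, block_component);
________________________________________________________________________________________________________
I am not sure this carries over to the situation where the two domains have separate triangulations, though, so any advice on the standard way to set this up would be appreciated.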
Best,
Artur