Marking cells at interface between subdomains


jose.a...@gmail.com

Mar 21, 2023, 5:47:14 AM3/21/23
to deal.II User Group
Hello dealii community,

I am working on a problem with several subdomains. A boundary integral is to be evaluated at the interface between them. I identify the interface by comparing the material_id of neighboring cells (or their active_fe_index, as I use a different FESystem per subdomain). To speed up the search during assembly, a Vector<float> is first filled with 1.0 at the cells where the material_id/active_fe_index differ. This approach works in serial, but in parallel the material_id() call on a neighbor cell outside the locally owned subdomain always returns 0 (an assertion seems to be missing here). As a result, not only the interface between the material subdomains is marked, but also the interfaces between the locally owned (MPI) subdomains, as shown in the attached picture.

My question is: is there an equivalent of locally owned and locally relevant DoFs for cells, i.e., locally owned and relevant cells, such that the material_id/active_fe_index of a neighboring cell outside the locally owned subdomain can be read? Alternatively, is there a built-in method/member that serves this purpose, or has someone already solved this through another approach?

Also attached is the MWE used to obtain the screenshot.

Cheers,
Jose
Screenshot at 2023-03-21 10-36-27.png
mark_interface_test.cc

Peter Munch

Mar 21, 2023, 6:00:45 AM3/21/23
to deal.II User Group

Hi Jose,

not sure. You could use Triangulation::global_active_cell_index_partitioner() to initialize a distributed vector, access it via CellAccessor::global_active_cell_index(), and update the ghost values.

Peter

jose.a...@gmail.com

Mar 21, 2023, 1:50:13 PM3/21/23
to deal.II User Group
Hi Peter,

Thanks a lot for the suggestion. With it I think I managed to achieve the desired effect. First, I populated a LinearAlgebra::distributed::Vector<float> with the material ids and then called its update_ghost_values() method. In a second loop over the active cell iterators I used this vector and the global_active_cell_index() of each cell pair for the boolean comparison of the material ids. I could not verify the result visually, because the assertion data_vector.size() == triangulation->n_active_cells() inside the add_data_vector() method fails when DataVectorType::type_cell_data is passed as third argument, i.e.,

data_out.add_data_vector(
  cell_is_at_interface,
  "cell_is_at_interface",
  dealii::DataOut_DoFData<dealii::DoFHandler<dim, dim>, dim, dim>::DataVectorType::type_cell_data);

The assertion does not hold for a distributed vector, since n_global_active_cells > n_active_cells. Maybe this is not the correct method for a LinearAlgebra::distributed::Vector<float>? Nevertheless, I instead counted the number of times the boolean comparison was true, which matches what is expected from the number of global refinements of the unit square (with 3 global refinements there are a total of 16 cells at the interface), and I checked the x-coordinate of the cell centers to verify that they indeed lie at the interface.
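For reference, the approach described above could be sketched roughly as follows. This is an untested sketch, assuming a parallel::distributed::Triangulation named triangulation, a mesh without hanging nodes at the interface, and a recent deal.II version (exact signatures, e.g. of global_active_cell_index_partitioner(), may differ between releases):

// Sketch: store each cell's material_id in a vector partitioned by
// global active cell index, then make the ghost values available.
LinearAlgebra::distributed::Vector<float> material_ids;
material_ids.reinit(
  triangulation.global_active_cell_index_partitioner().lock());

for (const auto &cell : triangulation.active_cell_iterators())
  if (cell->is_locally_owned())
    material_ids[cell->global_active_cell_index()] = cell->material_id();

material_ids.update_ghost_values();

// Mark locally owned cells whose neighbor carries a different material_id.
for (const auto &cell : triangulation.active_cell_iterators())
  if (cell->is_locally_owned())
    for (const unsigned int f : cell->face_indices())
      if (!cell->face(f)->at_boundary() &&
          material_ids[cell->global_active_cell_index()] !=
            material_ids[cell->neighbor(f)->global_active_cell_index()])
        {
          // cell is at the subdomain interface
        }

Note that with local refinement, cell->neighbor(f) may not be active, in which case one would have to descend to the neighbor children (e.g. via neighbor_child_on_subface()) instead.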

It seems that, when working in parallel, of the methods CellAccessor::material_id(), CellAccessor::active_fe_index() and CellAccessor::global_active_cell_index(), only the latter returns the correct value when called on a neighbor cell outside the locally owned subdomain. I observed this by printing the global_active_cell_index, active_fe_index and material_id pairs each time the boolean comparison was true. Here are the results in serial

    Global active cell index pair ( 5, 16) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair ( 7, 18) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (13, 24) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (15, 26) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (16,  5) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (18,  7) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (24, 13) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (26, 15) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (37, 48) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (39, 50) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (45, 56) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (47, 58) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (48, 37) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (50, 39) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (56, 45) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (58, 47) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)

and in parallel (-np 3)

    Global active cell index pair (24, 13) with active_fe_index pair ( 2,  0) and material_id pair ( 2,  0)
    Global active cell index pair (26, 15) with active_fe_index pair ( 2,  0) and material_id pair ( 2,  0)
    Global active cell index pair (37, 48) with active_fe_index pair ( 1,  0) and material_id pair ( 1,  0)
    Global active cell index pair (45, 56) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (47, 58) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (48, 37) with active_fe_index pair ( 2,  0) and material_id pair ( 2,  0)
    Global active cell index pair (50, 39) with active_fe_index pair ( 2,  0) and material_id pair ( 2,  0)
    Global active cell index pair (56, 45) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (58, 47) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair ( 5, 16) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair ( 7, 18) with active_fe_index pair ( 1,  2) and material_id pair ( 1,  2)
    Global active cell index pair (13, 24) with active_fe_index pair ( 1,  0) and material_id pair ( 1,  0)
    Global active cell index pair (15, 26) with active_fe_index pair ( 1,  0) and material_id pair ( 1,  0)
    Global active cell index pair (16,  5) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (18,  7) with active_fe_index pair ( 2,  1) and material_id pair ( 2,  1)
    Global active cell index pair (39, 50) with active_fe_index pair ( 1,  0) and material_id pair ( 1,  0)

If the neighbor cell is outside the locally owned subdomain, the material_id() and active_fe_index() methods return zero.
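As a side note on the add_data_vector() assertion: for cell data, DataOut expects a plain Vector<float> of length triangulation.n_active_cells(), indexed by the cell's process-local active_cell_index(), rather than a distributed vector over global active cell indices. An untested sketch of that pattern, where cell_is_interface_cell() stands in for whatever interface predicate is used:

// Cell-data output in parallel: one entry per locally stored active cell.
Vector<float> cell_is_at_interface(triangulation.n_active_cells());

for (const auto &cell : triangulation.active_cell_iterators())
  if (cell->is_locally_owned())
    cell_is_at_interface[cell->active_cell_index()] =
      cell_is_interface_cell(cell) ? 1.0 : 0.0; // hypothetical predicate

data_out.add_data_vector(cell_is_at_interface,
                         "cell_is_at_interface",
                         DataOut<dim>::type_cell_data);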

Attached is the updated MWE with which the above terminal output was obtained and which reproduces the assertion failure in the add_data_vector() method.

Cheers,
Jose
mark_interface_test.cc

Wolfgang Bangerth

Mar 21, 2023, 4:10:12 PM3/21/23
to dea...@googlegroups.com

Jose,

> I am working on a problem with several subdomains. At the interface
> between them a boundary integral is to be evaluated. I am identifying
> the interface by comparing the material_id of neighboring cells (or
> their active_fe_index as I am using a different FESystem per subdomain).
> In order to speed up the search during assembly, a Vector<float> is
> previously filled with 1.0 at the cells where the
> material_id/active_fe_index differ. This approach works in serial but in
> parallel the material_id() call of a neighbor cell outside the locally
> owned subdomain always returns 0 (An assertion is missing here). As
> such, not only the interface between subdomains is marked but also the
> interface between locally owned subdomains, as shown in the attached picture

If I understand you right, then you want the correct material_id to be set
also on ghost cells? If so, take a look at this function:

https://dealii.org/developer/doxygen/deal.II/namespaceGridTools.html#a9565dbf2f8e45fee28e40806870e2c98

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

jose.a...@gmail.com

Mar 22, 2023, 11:01:50 AM3/22/23
to deal.II User Group
Hi Wolfgang,

The function achieves the desired result. Thanks a lot!
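For completeness: the linked function appears to be GridTools::exchange_cell_data_to_ghosts(). Assuming that is the one, an untested sketch of the pattern for copying material ids onto ghost cells could look like this (the exact pack/unpack callback signatures may differ slightly between deal.II versions):

// Pack the material_id on locally owned cells; unpack it on the
// corresponding ghost cells of the other processes.
GridTools::exchange_cell_data_to_ghosts<types::material_id>(
  triangulation,
  [](const auto &cell) { return cell->material_id(); },
  [](const auto &cell, const types::material_id id) {
    cell->set_material_id(id);
  });

After this exchange, comparing cell->material_id() against neighbor->material_id() should also work across the boundaries of the locally owned subdomains.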

Cheers,
Jose