Restriction on number of processors


Santanu Das

Oct 10, 2025, 2:27:20 AM
to basilisk-fr
Dear all,

I am currently simulating 3D rising bubbles on a multigrid, using a uniform grid in all directions without adaptive meshing. When running the code with MPI, I get an error stating that the number of MPI processes must be equal to 8^i, which means I can only use processor counts of 8, 64, 512, and so on.

Is there a way to utilize an arbitrary number of processors? While running on a cluster, I often have many processors left unused. Is there a slab decomposition method or any other approach to maximize cluster use? Any tips would be greatly appreciated.

Best,
Santanu Das

j.a.v...@gmail.com

Oct 10, 2025, 4:14:31 AM
to basilisk-fr

Hello Santanu,

There is no slab decomposition, and I believe there is no direct solution to your exact issue in Basilisk. However, you could consider the following suggestions to avoid being bound by the 8^i requirement.

1) The octree can run with an arbitrary number of processors. It can also run in "fixed-grid" mode, mimicking the multigrid, with only a speed and memory-usage penalty.
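A minimal sketch of this option (assuming the rest of your setup stays the same; the resolution 128 is just an example):

   /* Select the octree instead of the multigrid; the grid stays
      uniform as long as adapt_wavelet() is never called. */
   #include "grid/octree.h"

   int main() {
     init_grid (128);  /* fixed, uniform 128^3 grid */
     /* ... rising-bubble setup as before ... */
   }

The executable can then be launched with any process count, e.g. mpirun -np 40.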

2) If you are willing to change the resolution, you may be able to find decompositions with n^3 processors. E.g. for 3^3 = 27 processors you can set N = 96 = 3*32:
   dimensions (nx = 3, ny = 3, nz = 3);  /* 3 x 3 x 3 = 27 MPI processes */
   init_grid (3*32);                     /* N = 96, i.e. a 32^3 block per process */
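The compilation and launch would then look something like this (a typical Basilisk MPI invocation; "bubble.c" is a hypothetical file name, adapt to your own setup):

   CC99='mpicc -std=c99' qcc -Wall -O2 -D_MPI=1 bubble.c -o bubble -lm
   mpirun -np 27 ./bubble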

3) If you are willing to vary the dimensions (aspect ratio) of the domain, you may be able to tune even further, as in the sketch below.
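For instance (a sketch only; I assume here that, as in option 2, the domain then follows the process layout, giving an elongated box):

   dimensions (nx = 4, ny = 2, nz = 2);  /* 4 x 2 x 2 = 16 MPI processes */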

Antoon
On Friday, October 10, 2025 at 08:27:20 UTC+2, santanu...@gmail.com wrote:

Santanu Das

Oct 12, 2025, 12:19:14 AM
to basilisk-fr
Hi Antoon,

Thank you so much for the clarification and the suggestions. Unfortunately, we cannot change the aspect ratio of the domain at this time. Instead, I will try the octree with a fixed grid and compare its performance against the multigrid method.

Best,
Santanu
