Hi Stephane and everyone,
I have just gotten around to testing the new version of Basilisk with improved memory allocation. I don't have any quantitative data yet but qualitatively it works really well!! Thank you!
I am studying the Stokes flow in the gap between a sphere and a wall, which requires both a large domain (L0/d = 256) and a high maximum level of refinement (> 13) to place enough cells in the gap. So this is in a way a worst-case scenario for memory allocation.
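To make the resolution requirement concrete, here is a quick back-of-the-envelope sketch (plain Python, nothing Basilisk-specific; the function name and the L0/d = 256 default are just illustrative). On an octree the smallest cell size is Delta = L0 / 2^level, so the number of cells spanning the sphere diameter d grows as 2^level / (L0/d):

```python
def cells_per_diameter(level: int, box_to_diameter: int = 256) -> float:
    """Number of cells spanning the sphere diameter d at a given
    maximum refinement level, for a cubic domain of size
    L0 = box_to_diameter * d. The smallest octree cell size is
    Delta = L0 / 2**level, so d / Delta = 2**level / box_to_diameter."""
    return 2**level / box_to_diameter

# With L0/d = 256: level 13 gives ~32 cells across the diameter,
# level 18 gives ~1024.
for level in (13, 14, 18):
    print(level, cells_per_diameter(level))
```

So with a domain this large, level 13 only buys about 32 cells across the sphere, which is why the gap itself pushes the maximum level well beyond that.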
With the previous version of Basilisk, there seemed to be a hard ceiling in 3D at a maximum refinement level of 14, beyond which memory consumption would grow dramatically and simulations would crash. I tried adding more procs, but that did not solve the problem (maybe I didn't add enough). I don't know if anyone else experienced the same thing? I think the high memory cost was not linked to the number of cells actually used (which was not that high), but rather to the inherent memory cost of the data structure needed to support 14 levels of refinement.
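For a sense of scale (this is only a back-of-the-envelope illustration, not a description of Basilisk's actual internal data structures): any per-level bookkeeping whose size tracks the *dense* grid at the finest level becomes hopeless at level 14 in 3D, regardless of how few cells are actually refined.

```python
GIB = 2**30  # one gibibyte in bytes

def dense_cells(level: int) -> int:
    """Cell count of a fully refined (dense) 3D grid at `level`:
    (2**level)**3 = 2**(3 * level) cells."""
    return (2**level) ** 3

# A dense level-14 3D grid holds 2**42 ~ 4.4e12 cells. Even at a
# single byte per cell, that is far beyond the ~182 GB of one node,
# so only a sparsely allocated tree can fit.
cells = dense_cells(14)
print(cells, cells / (182 * GIB))
```

So the crash is consistent with some part of the old allocation scaling with the level itself rather than with the number of adapted cells.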
With the new version of Basilisk, so far I have run all my tests on 1 node (48 procs), which gives me access to roughly 182 GB of memory. I have used maximum refinement levels from 13 to 18 for a domain L0/d = 256 in 3D, and so far all the simulations are running! This is a huge improvement compared to the previous version of Basilisk.
Arthur