Ways to reduce Dedalus memory for 2D Kelvin-Helmholtz simulation

Matthew Hudes

Dec 21, 2025, 5:17:06 PM
to Dedalus Users
Hello everyone,

I have recently started using Dedalus v3 to simulate the 2D Kelvin-Helmholtz instability. I used the 2D shear flow example script as a starting point (https://github.com/DedalusProject/dedalus/tree/master/examples/ivp_2d_shear_flow). The main differences in my code are that I have changed many of the parameters and switched to the vorticity formulation of Navier-Stokes, since this will be important for my application. I have attached a MWE here. The code works at lower resolutions, but I am attempting an 8192 by 8192 grid, which fails (out of memory) before the first time step completes. I am running on a node with 48 cores and 187.5 GB of memory.
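For context, the core of my setup looks roughly like the sketch below. The parameter values here are placeholders (the attached MWE has the actual ones), and the sign conventions and tau/gauge handling for the periodic Poisson solve are just how I understood they should be done in the v3 API:

```python
import numpy as np
import dedalus.public as d3

# Placeholder parameters; the real run uses Nx = Nz = 8192
Lx, Lz = 1, 2
Nx, Nz = 8192, 8192
nu = 1e-5
dealias = 3/2
dtype = np.float64

# Doubly periodic bases, as in the 2D shear flow example
coords = d3.CartesianCoordinates('x', 'z')
dist = d3.Distributor(coords, dtype=dtype)
xbasis = d3.RealFourier(coords['x'], size=Nx, bounds=(0, Lx), dealias=dealias)
zbasis = d3.RealFourier(coords['z'], size=Nz, bounds=(-Lz/2, Lz/2), dealias=dealias)

# Fields: vorticity, streamfunction, and a constant tau for the periodic Poisson gauge
om = dist.Field(name='om', bases=(xbasis, zbasis))
psi = dist.Field(name='psi', bases=(xbasis, zbasis))
tau_psi = dist.Field(name='tau_psi')

# Velocity from the streamfunction: u = -dz(psi), w = dx(psi), so om = lap(psi)
dx = lambda A: d3.Differentiate(A, coords['x'])
dz = lambda A: d3.Differentiate(A, coords['z'])
u = -dz(psi)
w = dx(psi)

# Vorticity form of incompressible Navier-Stokes, diffusion treated implicitly
problem = d3.IVP([om, psi, tau_psi], namespace=locals())
problem.add_equation("dt(om) - nu*lap(om) = -(u*dx(om) + w*dz(om))")
problem.add_equation("lap(psi) + tau_psi = om")
problem.add_equation("integ(psi) = 0")

solver = problem.build_solver(d3.RK222)
solver.stop_sim_time = 20
```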

I have tried a few things to reduce memory usage, like reducing the number of tasks I save in snapshots and not storing the absolute value of vorticity as a flow property. However, I am not surprised these have had no effect, since the code runs out of memory before the first time step is complete. I did try moving the nu*lap(om) term to the RHS in an attempt to reduce the size of the LHS matrix (see the snippet below), but this did not seem to solve the problem. I also tried the SBDF2 timestepper, but I think it is equivalent memory-wise to RK222.
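Concretely, the change I tried was just moving the diffusion term across the equals sign so it is handled explicitly (equation names/variables as in my sketch above):

```python
# What I have been running: diffusion on the LHS (treated implicitly)
problem.add_equation("dt(om) - nu*lap(om) = -(u*dx(om) + w*dz(om))")

# What I tried instead, hoping to shrink the LHS matrices: diffusion on the RHS (treated explicitly)
problem.add_equation("dt(om) = nu*lap(om) - (u*dx(om) + w*dz(om))")
```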


I have seen various formulas around for calculating memory needs, such as

25 * Ncc * F^2 * Nx * Ny * Nz * 8 * 10^{-9} GB ≈ 335 GB
in my case, but I am unsure whether this formula applies to Dedalus v3 and my setup. If it does, what are possible strategies for reducing the memory requirements of the code?
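For reference, this is how I am evaluating that formula. I am guessing that F counts the problem variables and Ncc the non-constant coefficients; with F = 5 and Ncc = 1 the numbers work out to the ~335 GB above:

```python
# Rough evaluation of the memory estimate (my interpretation: F = number of variables,
# Ncc = number of non-constant coefficients; F = 5 and Ncc = 1 are assumptions on my part)
Ncc, F = 1, 5
Nx, Ny, Nz = 8192, 8192, 1
mem_GB = 25 * Ncc * F**2 * Nx * Ny * Nz * 8 * 1e-9
print(f"Estimated memory: {mem_GB:.1f} GB")  # -> 335.5 GB, well above the 187.5 GB on my node
```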

I would also be happy for any advice about proper utilization of Dedalus! I am also curious what people's thoughts are on the memory usage of Dedalus compared to other pseudospectral codes. Does it seem to be roughly equivalent, or perhaps more memory intensive than other packages (GHOST, for example: https://github.com/pmininni/GHOST/tree/master)?