Memory error while solving EVP despite having higher memory on the node


Suprabha Mukhopadhyay

Jun 19, 2025, 12:57:27 PM
to Dedalus Users
Hello,

I run into the memory error below while solving an EVP after increasing the resolution. However, a successful calculation at only slightly lower resolution used far less memory than the maximum available on the node it was submitted to, which has 2 TB of memory.

Memory usage:
ET 34:58.52 | CPU 99% | MEM 53496876 KB max

Error message:
2025-06-18 22:33:53,826 subsystems 0/1 INFO :: Building subproblem matrices 1/1 (~100%) Elapsed: 26m 07s, Remaining: 0s, Rate: 6.4e-04/s
Not enough memory to perform factorization.
Traceback (most recent call last):
  File "/scratch/seismo/mukhopadhyay/dedalus_tests/sun_trial/batch_cluster/EVP/organized/RZ_benchmark2.py", line 712, in <module>
    # for m in range(0,1):
    ^^^^^^^^
  File "/scratch/seismo/mukhopadhyay/dedalus_tests/sun_trial/batch_cluster/EVP/organized/RZ_benchmark2.py", line 327, in solve
    target = targets[m]
    ^^^^^^^^^^^^^^^^^^^^
  File "/data/seismo/mukhopadhyay/opt/miniconda3/envs/dedalus3_test/lib/python3.12/site-packages/dedalus/core/solvers.py", line 268, in solve_sparse
    eig_output = scipy_sparse_eigs(A=A, B=B, left=left, N=N, target=target, matsolver=self.matsolver, v0=v0, **kw)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/seismo/mukhopadhyay/opt/miniconda3/envs/dedalus3_test/lib/python3.12/site-packages/dedalus/tools/array.py", line 424, in scipy_sparse_eigs
    solver = matsolver(C)
             ^^^^^^^^^^^^
  File "/data/seismo/mukhopadhyay/opt/miniconda3/envs/dedalus3_test/lib/python3.12/site-packages/dedalus/libraries/matsolvers.py", line 141, in __init__
    self.LU = spla.splu(matrix.tocsc(),
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/seismo/mukhopadhyay/opt/miniconda3/envs/dedalus3_test/lib/python3.12/site-packages/scipy/sparse/linalg/_dsolve/linsolve.py", line 438, in splu
    return _superlu.gstrf(N, A.nnz, A.data, indices, indptr,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
MemoryError

Is there something I am missing that could cause the memory error even though much more memory is available on the node? Any help would be appreciated.
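For context, the call that fails in the traceback is scipy's `splu` (the SuperLU wrapper), which Dedalus's `scipy_sparse_eigs` uses to factorize the shift-inverted matrix before handing it to ARPACK. Below is a minimal sketch of that same code path on tiny stand-in matrices (`A`, `B`, and `target` here are hypothetical, not from my script); it runs fine at small sizes, and the failure only appears for very large matrices:

```python
# Sketch of the shift-invert eigensolve path that fails inside splu.
# A, B, and target are small hypothetical stand-ins for the EVP pencil.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 50
target = 0.5                                  # hypothetical shift (sigma)
A = sp.diags(np.arange(1.0, n + 1))           # stand-in stiffness matrix
B = sp.identity(n)                            # stand-in mass matrix

C = (A - target * B).tocsc()
lu = spla.splu(C)                             # <-- the call that raises MemoryError

# Shift-invert operator for ARPACK: x -> (A - sigma*B)^{-1} B x
op = spla.LinearOperator((n, n), matvec=lambda x: lu.solve(B @ x))
mu, evecs = spla.eigs(op, k=4)                # eigenvalues mu = 1/(lambda - sigma)
evals = target + 1 / mu                       # undo the shift-invert map
```

For this diagonal example the four converged eigenvalues are the ones closest to the target, i.e. 1, 2, 3, 4.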

Best regards,
Sup

Calum Skene

Jun 19, 2025, 3:50:19 PM
to Dedalus Users
Hi Sup,
I think you are running into a known issue with scipy's SuperLU factorisation.
It is unclear to me whether it has been fixed; I get this error too with scipy 1.15.2.
Confusingly, the error does not mean there is not enough memory (there is) — it's just a bug in the SuperLU wrapper.
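While the bug persists, one way around it (if the problem still fits in memory as a dense matrix) is the dense path: Dedalus's eigenvalue solver also has a `solve_dense` method, which calls scipy's dense generalized eigensolver and bypasses SuperLU entirely, at a much higher memory and time cost at high resolution. A minimal sketch of the underlying scipy call, on small hypothetical matrices:

```python
# Dense generalized eigenproblem A @ v = lam * B @ v via LAPACK,
# avoiding SuperLU entirely. A and B are hypothetical stand-ins.
import numpy as np
import scipy.linalg as sla

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
B = np.eye(n)                     # trivial mass matrix for illustration

evals, evecs = sla.eig(A, B)

# Residual check: each pair should satisfy A v = lam * B v.
for lam, v in zip(evals, evecs.T):
    assert np.allclose(A @ v, lam * (B @ v), atol=1e-8)
```

This scales as O(n^3) in time and O(n^2) in dense storage, so it is only a stopgap for moderate matrix sizes.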
Hope this helps,
Calum

Suprabha Mukhopadhyay

Jun 19, 2025, 5:33:32 PM
to Dedalus Users
Thanks, Calum, for pointing out the bug in scipy — that was helpful. I upgraded to scipy 1.15.3 and still get the same error, so it's probably not fixed yet.

Best regards,
Sup
