Memory use experiences


Hakon

Mar 17, 2025, 4:15:30 AM
to pypsa
Hello,

I am trying to run PyPSA-Eur at higher resolutions, but I quickly run into the challenge of high memory use during the run (1-hour resolution, 64 clusters), which crashed due to insufficient RAM.

I just wanted to ask about people's experiences with memory usage when running 1-hour resolution models. Perhaps there are some config settings that you have found help reduce consumption.

Kind regards,
Hakon

Hakon

Mar 17, 2025, 9:58:25 AM
to pypsa
I realized I forgot to clarify this: I am curious about people's experiences with memory usage during the solving process, which is recorded in the "..._memory.log" files in the log folder of the results.

Fabian Neumann

Mar 27, 2025, 8:18:04 AM
to pypsa
For running PyPSA-Eur with all sectors as an overnight scenario at hourly resolution and 64 clusters, it is very likely that you need some high-performance computing infrastructure. Depending on your settings, this could easily take 100 GB of RAM.
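For reference, the main config knobs that drive memory use in a run like this look roughly as follows. This is a sketch only; the exact key names (e.g. `resolution_sector` under `clustering: temporal:`) vary between PyPSA-Eur versions, so check your version's `config.default.yaml`:

```yaml
# config/config.yaml (sketch; key names may differ across PyPSA-Eur versions)
scenario:
  clusters: [64]              # spatial resolution: number of clustered nodes

clustering:
  temporal:
    resolution_sector: "1H"   # temporal resolution; "3H" cuts memory roughly 3x

foresight: overnight          # "myopic" needs considerably more memory
```

Both spatial and temporal resolution scale the number of variables in the LP, so halving either roughly halves solver memory.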

Hakon

Mar 27, 2025, 8:32:03 AM
to pypsa
Thanks for the reply, Fabian!
I have been trying to run all countries, all sectors, with myopic foresight at hourly resolution and as many clusters as possible. Additionally, some config parameters have been changed from their defaults, which my testing has shown increases memory usage further. With all this, we have been limited to about 48 clusters with 300 GB of RAM allocated.

I am in the process of looking into what a suitable HPC system would be so I hope to better understand what PyPSA-Eur needs.
I read your paper on "The potential role of a hydrogen network in Europe" and found your testing of the model's sensitivity to temporal and geographic resolution interesting. Just out of curiosity, do you recall how much memory you had to allocate to run the hourly resolution with 181 clusters?

Kind regards,
Hakon

Fabian Neumann

Mar 28, 2025, 3:21:17 AM
to pypsa
Yes, myopic optimisation takes considerably more resources, as components inherited from previous planning horizons are included in each subsequent optimisation.

As a partial remedy, you could try out Koen's aggregation strategy: https://github.com/PyPSA/pypsa-eur/pull/1056

We have rarely been able to solve networks requiring more than 300 GB RAM for non-hardware reasons.

For HPC systems, a CPU with high clock speeds and sufficient memory on one node (e.g., 512 GB) would generally be suitable.

In the hydrogen network paper, we reduced the spatial resolution to 90 nodes for the sensitivity on temporal resolution, because it was impossible to solve hourly resolution with 181 regions.

The figure there also shows that going down to 3-hour resolution does not cost much in accuracy.

You can also try the time segmentation clustering approach.
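In recent PyPSA-Eur versions, time segmentation clustering (via the tsam package) is selected through the same temporal resolution option, using a segment count instead of a fixed interval. This is a sketch under that assumption; the exact key and value format may differ in your version:

```yaml
# config/config.yaml (sketch; assumes tsam-based segmentation support)
clustering:
  temporal:
    # "2920SEG": cluster the year into 2920 variable-length segments,
    # equivalent in problem size to 3-hourly resolution but with
    # segment boundaries placed adaptively where variability is high
    resolution_sector: "2920SEG"
```

Unlike a uniform "3H" averaging, segmentation keeps short snapshots around periods of high wind/solar/load variability and merges calm periods, so it typically loses less accuracy at the same problem size.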

Hakon

Mar 28, 2025, 9:45:58 AM
to pypsa
Thank you for the thorough response and for giving me a better idea of what infrastructure is suitable for running larger pan-European models.

Koen's aggregation strategy seems like a good fit too, so thanks for pointing it out!