Issue with MPI parallelization in reach-based routing configuration

zed li

Mar 12, 2026, 8:11:46 AM
to wrf-hydro_users
Hello everyone,
I recently ran the model with the reach-based routing configuration and ran into an MPI parallelization issue. The model runs only with 4 processes (mpirun -np 4 ./wrf_hydro.exe); with 8 or 16 processes it aborts with "Program received signal SIGABRT: Process abort signal". This appears to be tied to the parameter "Number of routing grid cells to define stream" used when generating the river connections: with a large value the river network is sparser and the model can run on more processes, while with a small value the network is denser and the model runs only on 4 or fewer processes. Can anyone explain why?
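For reference, the runs differ only in the -np value; everything else is the same setup:

  mpirun -np 4 ./wrf_hydro.exe     # completes normally
  mpirun -np 8 ./wrf_hydro.exe     # "Program received signal SIGABRT"
  mpirun -np 16 ./wrf_hydro.exe    # "Program received signal SIGABRT"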
Thank you very much for your time and help!
Best regards,
zed li

Soren Rasmussen

Mar 13, 2026, 11:56:56 AM
to wrf-hyd...@ucar.edu
Hi Zed,

This is a known bug in the reach-based routing configuration: increasing the number of MPI ranks can trigger a SIGABRT caused by an incorrect memory access. Unfortunately, I don't yet have a good explanation for why it occurs, but it is on our to-do list of things to fix. Thanks for reporting this; connecting the issue to the "Number of routing grid cells to define stream" parameter is helpful! I'll start taking a closer look at this.

If you have any other questions, please let me know! 

Thanks,
Soren


--
Soren Rasmussen, Ph.D.
Water Cycle Applications Program
Research Applications Laboratory
NSF National Center for Atmospheric Research (NCAR)
