Parallel MODFLOW 6 slower than serial for density-driven flow — expected?


harilal v

Apr 2, 2026, 6:04:06 AM
to MODFLOW Users Group

Hello everyone,

I am working on a parallel MODFLOW 6 simulation of a density-driven flow problem (seawater intrusion model) and have some questions about comparing serial and parallel performance.

Model setup:

  • 20,000 cells with adaptive time stepping (ATS)
  • 180 transient stress periods over 1 day total simulation time
  • Density-dependent flow (GWF + GWT coupled)
  • Domain split into 2 subdomains using FloPy's Mf6Splitter

Observed runtimes:

  • Pure serial, unsplit (IMS solver): 2m 49s
  • Split domain, 1 core (PETSc solver): 6m 26s
  • Split domain, 2 cores (PETSc solver): 4m 17s
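
Converting those runtimes to seconds makes the comparison explicit:

```python
# Speedup arithmetic from the runtimes above (values from this post).
def seconds(m, s):
    return 60 * m + s

ims_serial = seconds(2, 49)    # 169 s, unsplit, IMS
petsc_1core = seconds(6, 26)   # 386 s, split, PETSc, 1 core
petsc_2core = seconds(4, 17)   # 257 s, split, PETSc, 2 cores

# Like-for-like parallel speedup: same solver, 1 core vs 2 cores
print(f"PETSc 2-core speedup: {petsc_1core / petsc_2core:.2f}x")   # 1.50x
# Cross-solver comparison: 2-core PETSc relative to serial IMS
print(f"2-core PETSc vs serial IMS: {petsc_2core / ims_serial:.2f}x slower")  # 1.52x
```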

My questions:

  1. The 2-core parallel run is slower than pure serial. Is this expected for a problem of this size, and is 20,000 cells large enough to see meaningful speedup over communication overhead?
  2. The serial run uses IMS while the parallel run uses PETSc — these are fundamentally different solvers. I noticed the PETSc solver consistently takes more iterations per stress period than IMS. Is comparing these two directly a valid approach, or should I focus the comparison on 1-core vs 2-core PETSc runs only?
  3. Are there recommended PETSc solver settings that would make PETSc more competitive with IMS for this type of problem?
Thank you.

Joseph Hughes

Apr 3, 2026, 12:05:35 AM
to MODFLOW Users Group
A 20,000-cell problem is pretty small. A rule of thumb is that parallel simulations are not practical unless the model has at least 100,000 to 1,000,000 cells. The number of stress periods doesn't really factor into parallel performance, since each time step is solved to completion before proceeding to the next, just as in serial simulations.
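
The intuition behind that rule of thumb can be illustrated with a toy fixed-overhead model (the cost numbers here are made-up assumptions for illustration, not MODFLOW benchmarks): per-step communication cost is roughly fixed, so it only pays off once the per-core compute work dominates it.

```python
# Toy model: runtime ≈ compute work / cores + per-core communication
# overhead per step. Coefficients are illustrative assumptions.
def runtime(cells, cores, cost_per_cell=1e-5, overhead_per_core=0.5):
    return cells * cost_per_cell / cores + overhead_per_core * cores

for n in (20_000, 1_000_000):
    t1, t2 = runtime(n, 1), runtime(n, 2)
    print(f"{n:>9} cells: 2-core speedup = {t1 / t2:.2f}x")
```

With these (assumed) coefficients, the 20,000-cell case actually slows down on 2 cores, while the 1,000,000-cell case sees a real speedup.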

The only fair parallel runtime comparison is running the split model on 1 core with the PETSc solver. Although IMS and PETSc implement the same Krylov solvers, the implementations differ slightly. Also, IMS has been tuned over the years for groundwater problems (incorporating many of the best options from PCG2 and xMD).
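
On question 3: you can experiment with the linear solver and preconditioner through standard PETSc runtime options. As a sketch (assuming your parallel MODFLOW 6 build picks up a `.petscrc` options file in the simulation workspace; check the parallel documentation for your version), something like:

```
# .petscrc — PETSc runtime options (all are standard PETSc option names)
-ksp_type bcgs        # BiCGSTAB, as in IMS's linear_acceleration bicgstab
-pc_type asm          # additive Schwarz across subdomains
-sub_pc_type ilu      # ILU on each subdomain block
-ksp_rtol 1e-8        # relative convergence tolerance
-log_view             # print a performance/profiling summary at the end
```

Whether any of these beat the defaults is problem-dependent, so profile with `-log_view` rather than assuming.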