Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency


Hammond, Glenn E

Feb 10, 2025, 11:30:33 AM
to Wang, Yumeng, FM-sc, pflotran-dev (pflotran-dev@googlegroups.com)

Yumeng,

 

There are major changes to the PETSc Fortran interface after 3.21.X. The PETSc developers are still updating the interface, e.g.,

 

https://gitlab.com/petsc/petsc/-/merge_requests/7517,

 

and PFLOTRAN will not be updated until the PETSc devs have completed the task. Keep using 3.21.6.

 

Glenn

 

From: Wang, Yumeng <yu....@fz-juelich.de>
Date: Monday, February 10, 2025 at 2:37 AM
To: Hammond, Glenn E <glenn....@pnnl.gov>
Cc: FM-sc <s...@fz-juelich.de>
Subject: Fw: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency


 

Dear Glenn,

 

I hope this message finds you well, following our meeting in Fontainebleau.

 

I’m reaching out because I’ve encountered an issue while installing PFLOTRAN v6.0 on the Juelich Supercomputing Centre (JSC) system (https://www.fz-juelich.de/en/ias/jsc/systems/supercomputers/jureca). It appears that there may be an inconsistency between the current versions of PETSc and PFLOTRAN v6.0, as indicated by the error messages below. Despite our efforts with Filipe from JSC, we haven’t been able to resolve the issue and would appreciate any insight you might have regarding the cause and potential solutions.

 

Would you be able to offer any suggestions for resolving this problem? My discussions with Filipe so far are detailed below.

 

Best regards,

 

Yumeng

 


From: SC Support Team <s...@fz-juelich.de>
Sent: February 10, 2025 11:22 AM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

 

Dear Yumeng,

I have made some progress on the compilation, and I believe I see the problem(s). They seem to be in the PFLOTRAN code, and I don't believe PETSc changed enough between 3.21.5 and 3.22.1 to have caused this.

First of all, I see warnings of the type:
f951: Warning: Nonexistent include directory
that are caused by an incorrect compilation flag, "-I-Wl...": those options are not supposed to be combined. To solve this, you should remove both -I flags on line 125 of the makefile, to end up with:
ifdef have_hdf5
  MYFLAGS += $(HDF5_INCLUDE) $(HDF5_LIB) ${FC_DEFINE_FLAG}PETSC_HAVE_HDF5
endif 

The errors themselves are caused by incorrect use of the PETSc API. For example, this error:
simulation_aux.F90:202:58:

  202 |   if (aux%subsurf_to_geomechanics /= PETSC_NULL_VECSCATTER) then
      |                                                          1
Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type

is caused because the PETSc variable is named PETSC_NULL_VEC_SCATTER, not PETSC_NULL_VECSCATTER (note the underscore between VEC and SCATTER). There are similar issues in other files too.
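
As a minimal sketch, based only on the two statements quoted in the compiler output (not on the full PFLOTRAN sources), the affected lines in simulation_aux.F90 would become:

```
! sketch: apply the rename described above to the two statements
! from the compiler output (simulation_aux.F90:70 and :202)
aux%subsurf_to_geomechanics = PETSC_NULL_VEC_SCATTER

if (aux%subsurf_to_geomechanics /= PETSC_NULL_VEC_SCATTER) then
```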

Then there is another type of issue, a rank mismatch; for example:
petsc_utility.F90:55:36:

   55 |   call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &
      |                                    1
Error: Rank mismatch in argument ‘c’ at (1) (rank-1 and scalar)

is caused because the API (documented here: https://petsc.org/release/manualpages/Mat/MatSetValuesBlockedLocal/) cannot be called with scalars where it expects arrays. It should be called, for example, with:

call MatSetValuesBlockedLocal(A,1,[irow-1],1,[icol-1],ndof_mat,ADD_VALUES, &

(note the [...] around the 3rd and 5th arguments.)

The same happens with all PETSc functions that are called in preconditioner_cpr.F90 with PETSC_NULL_INTEGER and PETSC_NULL_SCALAR, which should be [PETSC_NULL_INTEGER] and [PETSC_NULL_SCALAR] instead. 
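
For instance, a minimal sketch of that change applied to the MatGetRow call flagged at preconditioner_cpr.F90:1699 (the arguments are taken from the error output above; the rest of the argument list is not shown in the log and is omitted here):

```
! sketch: wrap the null constants in [...] so that rank-1 arrays are passed
! where the Fortran interface expects array arguments
call MatGetRow(a,i,numcols,[PETSC_NULL_INTEGER],[PETSC_NULL_SCALAR], &
```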


There's also a warning:
slatec_pchip.F90:465:44:

  465 |          IF ( PCHST(DEL1,DEL2) )  42, 41, 45
      |                                            1
Warning: Fortran 2018 deleted feature: Arithmetic IF statement at (1)

This form of IF is a very old Fortran construct that is no longer used; the developers should change it to a more modern one.
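
For illustration, a minimal sketch of one possible modern replacement for the arithmetic IF at slatec_pchip.F90:465 is shown below. TMP is a hypothetical local variable, and the labels 42, 41 and 45 are the branch targets of the original statement (taken for negative, zero and positive values, respectively):

```
! sketch: replace the deleted-feature arithmetic IF with an IF / ELSE IF block
TMP = PCHST(DEL1,DEL2)
IF ( TMP < 0 ) THEN
   GO TO 42
ELSE IF ( TMP == 0 ) THEN
   GO TO 41
ELSE
   GO TO 45
END IF
```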



I'm still fixing the other issues that show up to see what else breaks, but before I continue I'd like to ask you: are you a developer of PFLOTRAN, or do you know some of the developers? I think they should be involved in the discussion - maybe there's a reason for that implementation, and I'd be curious to know whether these changes really happened so recently. (It could also be due to a more recent version of GCC that enforces the correctness of the code more strictly.)

Best regards,
Filipe Guimaraes

Juelich Supercomputing Support Team




02/07/2025 11:10 - Wang Yumeng wrote:

Dear Filipe,

 

I have tried to install PETSc v3.21.5 with EasyBuild in order to install PFLOTRAN.

 

However, the installation of PETSc v3.21.5 did not work.

 

Could you help me solve the problem, as we discussed last week?

 

Best regards,

 

Yumeng

 


From: SC Support Team <s...@fz-juelich.de>
Sent: February 7, 2025 5:02 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

 

Dear Yumeng,

I could reproduce the problem. Also, if I try to compile with Stages/2024, it tells me that the PETSc version is too old. That's a tricky issue. I will try to fix the issue and will let you know if I manage to make some progress, but probably only next week (I hope that is not a problem).

Have a nice weekend!

Best regards,
Filipe Guimaraes

Juelich Supercomputing Support Team




07/02/2025 11:10 - Wang Yumeng wrote:

Dear Filipe,

 

Here are the commands that I used to install PFLOTRAN.

 

" 

module purge
module load Stages/2025

module load GCC ParaStationMPI
module load PETSc

git clone https://bitbucket.org/pflotran/pflotran
cd pflotran/src/pflotran
Now you need to fix a small bug in the file `makefile`. Open it with your favorite editor and replace:
```
ifdef have_hdf5
LIBS +=  -L${HDF5_LIB} -lhdf5_fortran -lhdf5 -lz
endif
```
with
```
ifdef have_hdf5
  LIBS +=  -L${HDF5_LIB} -lhdf5_hl_fortran -lhdf5_hl -lhdf5_fortran -lhdf5 -lz
endif
```
Then run the build with
```
make -j20 have_hdf5=1 pflotran
```

"

 

The error occurs when I use the command "make -j20 have_hdf5=1 pflotran"

 

The error is

"

simulation_aux.F90:202:58:

 

  202 |   if (aux%subsurf_to_geomechanics /= PETSC_NULL_VECSCATTER) then

      |                                                          1

Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type

simulation_aux.F90:70:53:

 

   70 |   aux%subsurf_to_geomechanics = PETSC_NULL_VECSCATTER

      |                                                     1

Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type

make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: simulation_aux.o] Error 1

make: *** Waiting for unfinished jobs....

petsc_utility.F90:55:36:

 

   55 |   call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &

      |                                    1

Error: Rank mismatch in argument ‘c’ at (1) (rank-1 and scalar)

petsc_utility.F90:55:45:

 

   55 |   call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &

      |                                             1

Error: Rank mismatch in argument ‘e’ at (1) (rank-1 and scalar)

make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: petsc_utility.o] Error 1

preconditioner_cpr.F90:1699:31:

 

 1699 |     call MatGetRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &

      |                               1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1699:50:

 

 1699 |     call MatGetRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &

      |                                                  1

Error: Rank mismatch in argument ‘vals’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1704:35:

 

 1704 |     call MatRestoreRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &

      |                                   1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1704:54:

 

 1704 |     call MatRestoreRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &

      |                                                      1

Error: Rank mismatch in argument ‘vals’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1545:42:

 

 1545 |       call MatGetRow(a,firstRow+j,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                          1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1551:46:

 

 1551 |       call MatRestoreRow(a,firstRow+j,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                              1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1391:40:

 

 1391 |     call MatGetRow(a,firstRow+1,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                        1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1399:44:

 

 1399 |     call MatRestoreRow(a,firstRow+1,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                            1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1403:40:

 

 1403 |     call MatGetRow(a,firstRow+2,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                        1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1412:44:

 

 1412 |     call MatRestoreRow(a,firstRow+2,numcols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                            1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1214:42:

 

 1214 |     call MatGetRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                          1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1231:46:

 

 1231 |     call MatRestoreRow(a,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                              1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1026:42:

 

 1026 |     call MatGetRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                          1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

preconditioner_cpr.F90:1072:46:

 

 1072 |     call MatRestoreRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &

      |                                              1

Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)

make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: preconditioner_cpr.o] Error 1

"

 

 

 

Could you help me to fix the problem?

 

Best regards,

 

Yumeng
 

 


From: SC Support Team <s...@fz-juelich.de>
Sent: February 7, 2025 1:28 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

 

Dear Yumeng,

What is the problem you get using v3.22? Is that a known issue, or is it documented somewhere that newer versions would not work? Since that was just a minor version change, I wouldn't expect it to break the compilation. If you let me know the steps, I can also try to reproduce the problem myself and see if I can find a solution with that version.

Otherwise, I can suggest a couple of other potential solutions:
- Does it work only with v3.21.5, or also with older versions? On Stages/2024 we have PETSc/3.20.0. Hopefully PFLOTRAN will support v3.22 in the future and can then be installed on Stages/2025.
- If that is not an option and you need v3.21.5, I'm afraid you will have to install it yourself, as we don't install older versions (and we try to avoid having different versions on the same stage). That should not be too difficult, as you can use the UserInstallations described here:
https://apps.fz-juelich.de/jsc/hps/jureca/software-modules.html#installing-your-own-software-with-easybuild
You can adapt the easyconfig file from the one we currently have installed, which is here:
https://github.com/easybuilders/JSC/blob/2025/Golden_Repo/p/PETSc/PETSc-3.22.1-foss-2024a.eb
Hopefully changing only the version will be enough.

Best regards,
Filipe Guimaraes

Juelich Supercomputing Support Team




02/07/2025 11:10 - Wang Yumeng wrote:

Dear JSC staff,

 

I am Yumeng Wang, from IFN-2 at Forschungszentrum Juelich. I am trying to install PFLOTRAN (https://documentation.pflotran.org/user_guide/how_to/installation/installation.html) on the JURECA supercomputer for numerical modeling.

 

However, I have some problems installing it, probably due to an incompatibility between the available PETSc and PFLOTRAN. Based on https://documentation.pflotran.org/user_guide/how_to/installation/linux.html#linux-install, PETSc v3.21.5 is required for PFLOTRAN, but this older version is not available on Stages/2025. After my attempts, I found that the newer version, PETSc v3.22, is not consistent with the dependency requirements of PFLOTRAN.

 

Could you help me to solve this problem?

 

Thank you very much,

 

Yumeng

SC Support Team

Feb 10, 2025, 11:43:08 AM
to Hammond, Glenn E, pflotran-dev (pflotran-dev@googlegroups.com), Wang, Yumeng

Dear Glenn and Yumeng,

Thank you for the clarification. I couldn't find detailed release notes, and the one on their website is quite extensive, but it is interesting that they changed the API with a minor version increase.

@Yumeng, tomorrow I will try to install PETSc 3.21.5 or 3.21.6 and try the compilation again. I will keep you posted.

Best regards,
Filipe Guimaraes

Juelich Supercomputing Support Team




Hammond, Glenn E

Feb 10, 2025, 11:50:36 AM
to SC Support Team, pflotran-dev (pflotran-dev@googlegroups.com), Wang, Yumeng

PETSc does not follow strict semantic versioning. You will notice at https://petsc.org/release/changes/ that version 3 goes back to 2008. The API has changed during this period.

 

Glenn

Jed Brown

Feb 10, 2025, 12:34:18 PM
to 'Hammond, Glenn E' via pflotran-dev, SC Support Team, pflotran-dev (pflotran-dev@googlegroups.com), Wang, Yumeng
PETSc applies semver starting with the minor version (incremented in the feature releases every six months), while changes in subminor version should always be ABI and API compatible. The major version has only been incremented once since the 1990s. Those feature releases always have some API changes, though many apps are not impacted since most of the changes are to immature/experimental interfaces. There is a Fortran change coming up that will improve the experience with modern Fortran.

Hammond, Glenn E

Feb 11, 2025, 10:55:23 AM
to Wang, Yumeng, j...@jedbrown.org, FM-sc, pflotran-dev (pflotran-dev@googlegroups.com)

Yumeng,

 

The PETSc devs are still in the process of refactoring the Fortran interface. My hope is that we can upgrade PFLOTRAN to 3.23.X by its release in April.

 

Glenn

 

From: Wang, Yumeng <yu....@fz-juelich.de>
Date: Tuesday, February 11, 2025 at 7:36 AM
To: j...@jedbrown.org <j...@jedbrown.org>, Hammond, Glenn E <glenn....@pnnl.gov>
Cc: FM-sc <s...@fz-juelich.de>
Subject: Fw: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Glenn and Jed,

 

Thank you very much for the explanations and suggestions. 

 

Filipe and I are still working on installing PFLOTRAN on the supercomputer at Forschungszentrum Juelich. Unfortunately, the technical issue has not been solved on our end yet. Filipe will try alternative approaches and look into the problem tomorrow.

 

Also, could you let us know when PFLOTRAN will be adapted to newer versions of PETSc, e.g., v3.22.X?

 

Best regards,

 

Yumeng

 


From: SC Support Team <s...@fz-juelich.de>
Sent: February 11, 2025 3:42 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

 

Dear Yumeng,

Yes, I tried to install PETSc v3.21.5 today using the EasyBuild UserInstallations - since PFLOTRAN should be adapted to 3.22 soon, according to Glenn, it's not worth installing an older version of PETSc for everyone. After almost 1 hour, the installation failed on some of the tests it runs at the end. I asked the person responsible for the PETSc installation for help, and they will check it tomorrow.

In the meantime, I'm also trying v3.21.6, to see if the same error happens. I will keep you posted.

Best regards,

Filipe Guimaraes

Juelich Supercomputing Support Team



07/02/2025 11:10 - Wang Yumeng wrote:

 

Dear Filipe,

 

As Glenn mentioned yesterday, the issue seems to be related to the inconsistency between the latest versions of PETSc and PFLOTRAN.

 

Could you confirm whether PETSc v3.21.5 or v3.21.6 has been installed on Stages 2025 to facilitate the installation of PFLOTRAN? Additionally, any insights you could provide on this matter would be greatly appreciated.

 

Looking forward to your response.

Wang, Yumeng

Feb 13, 2025, 11:21:15 AM
to Hammond, Glenn E, FM-sc, pflotran-dev (pflotran-dev@googlegroups.com)

Dear Glenn,

My issue with installing PFLOTRAN on the Juelich Supercomputers has now been resolved by Filipe.

Our approach was to install PETSc v3.21.6 on the new system, as PFLOTRAN is well adapted to this version. Additionally, I have conducted very simple scalability tests (convection.in) by varying the number of cores from 4 to 32, 64, and 128, using the TH module for 2-dimensional free "density-driven" thermal convection. The scalability results are shown on the left-hand side of the attached figure.

I observed near-linear scaling up to 32 cores. However, the improvement from 32 to 64 cores was moderate, and from 64 to 128 cores the speedup was minimal. That said, I am aware that PFLOTRAN can perform significantly better, even on 32,768 cores, as demonstrated in the official benchmark figure on the right-hand side.

Could you help identify possible bottlenecks and suggest approaches to optimize the modeling speed?

Best regards,

Yumeng


From: Hammond, Glenn E <glenn....@pnnl.gov>
Sent: February 11, 2025 4:55 PM
To: Wang, Yumeng; j...@jedbrown.org
Cc: FM-sc; pflotran-dev (pflotr...@googlegroups.com)
core scaling.png
convection.in

SC Support Team

Feb 13, 2025, 11:36:23 AM
to Wang, Yumeng, pflotran-dev (pflotran-dev@googlegroups.com), Hammond, Glenn E

Dear Yumeng,

In the right-hand figure, the scale is log-log, so you should also use that for a better comparison (although from the values, I can see that they don't seem as good). Apart from that, it's not clear to me that the scaling from 4 to 32 cores is near-linear, as you don't have points in between.

Do you know how the code is parallelised? Does it use distributed or shared memory, MPI and/or OpenMP? If it's a hybrid code, you could try different configurations, e.g., OpenMP inside a node and MPI between nodes, or 8 tasks per node (1 per NUMA domain), each with 16 threads.

Hammond, Glenn E

Feb 14, 2025, 1:45:24 AM
to Wang, Yumeng, FM-sc, pflotran-dev (pflotran-dev@googlegroups.com)

Yumeng,

 

A couple comments:

 

  • For comparing strong scaling performance (increasing the number of processes with a fixed problem size), I recommend plotting the runtime and speedup on a log-log scale. Your runtime plot is log-linear.
  • I encourage you to read my 2014 paper on PFLOTRAN performance (https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2012WR013483). In that paper, I explain that one needs ~10,000 degrees of freedom (dofs) per process for good scalability. I also explain the memory contention that occurs in a NUMA node. You are running with 40,000 grid cells and 2 dofs per cell for flow (pressure, temperature) and 5 dofs per cell for reactive transport. At 128 processes, you have 40,000*2/128 = 625 dofs per process for flow and 40,000*5/128 ≈ 1,563 dofs per process for transport, well below the 10,000 dofs per process guideline (see the short arithmetic check after this list). Based on that guideline, I do not anticipate scalability beyond 8 processes for flow and 20 processes for reactive transport. At 32,768 processes, you would need a problem size of roughly 327M dofs. Note that in the 2014 paper you will read that reactive transport can drop well below 10,000 dofs per process and still exhibit good scalability.
  • Ideally, one would run a 3D problem instead of 2D, as the ratio of time spent in parallel communication to computation (surface area to volume) is minimized.
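
For reference, here is a tiny Fortran sketch of the dofs-per-process arithmetic above (a hypothetical back-of-the-envelope check using only the numbers quoted in this message; it is not part of PFLOTRAN or PETSc):

```
! hypothetical check of dofs per process for the convection test problem
program dofs_check
  implicit none
  integer, parameter :: ncells    = 40000   ! grid cells in the test problem
  integer, parameter :: nproc     = 128     ! MPI processes used
  integer, parameter :: ndof_flow = 2       ! pressure, temperature
  integer, parameter :: ndof_rt   = 5       ! reactive transport dofs per cell
  print *, 'flow dofs per process:      ', real(ncells*ndof_flow)/real(nproc)  ! ~625
  print *, 'transport dofs per process: ', real(ncells*ndof_rt)/real(nproc)    ! ~1563
  print *, 'guideline from the 2014 paper: ~10000 dofs per process'
end program dofs_check
```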

 

I hope that this helps explain the less than stellar parallel performance on the small problem.

 

Regards,

Hammond, Glenn E

Feb 14, 2025, 1:47:33 AM
to SC Support Team, Wang, Yumeng, pflotran-dev (pflotran-dev@googlegroups.com)

Filipe,

 

The code is parallelized solely through MPI.

 

Glenn
