Yumeng,

There are major changes in the PETSc Fortran interface after 3.21.X. The PETSc developers are still updating the interface, e.g.,

https://gitlab.com/petsc/petsc/-/merge_requests/7517,

and PFLOTRAN will not be updated until the PETSc devs have completed the task. Keep using 3.21.6.

Glenn

From: Wang, Yumeng <yu....@fz-juelich.de>
Date: Monday, February 10, 2025 at 2:37 AM
To: Hammond, Glenn E <glenn....@pnnl.gov>
Cc: FM-sc <s...@fz-juelich.de>
Subject: Fw: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Glenn,

I hope this message finds you well, following our meeting in Fontainebleau.

I'm reaching out because I've encountered an issue while installing PFLOTRAN v6.0 on the Juelich Supercomputing Centre (JSC) system (https://www.fz-juelich.de/en/ias/jsc/systems/supercomputers/jureca). There appears to be an inconsistency between the current versions of PETSc and PFLOTRAN v6.0, as indicated by the error messages below. Despite efforts with Filipe from JSC, we haven't been able to resolve the issue and would appreciate any insight you might have regarding the cause and potential solutions.

Would you be able to offer any suggestions for resolving this problem? My discussions with Filipe so far are detailed below.

Best regards,

Yumeng

From: SC Support Team <s...@fz-juelich.de>
Sent: February 10, 2025 11:22 AM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Yumeng,
I have made some progress on the compilation, and I believe I see the problem(s). They seem to be in the PFLOTRAN code, and I don't believe PETSc changed enough from 3.21.5 to 3.22.1 to have caused this.
First of all, I see warnings of the type:
f951: Warning: Nonexistent include directory
which are caused by a wrong compilation flag "-I-Wl...". Those flags are not supposed to be together. To solve this, you should remove both -I flags on line 125 of the makefile, to end up with:
ifdef have_hdf5
  MYFLAGS += $(HDF5_INCLUDE) $(HDF5_LIB) ${FC_DEFINE_FLAG}PETSC_HAVE_HDF5
endif
Now, the errors are caused because the PETSc API is used incorrectly. For example, this error:
simulation_aux.F90:202:58:
 202 |  if (aux%subsurf_to_geomechanics /= PETSC_NULL_VECSCATTER) then
   |                              1
Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type
is caused because the name of the PETSc variable is PETSC_NULL_VEC_SCATTER, and not PETSC_NULL_VECSCATTER (note the underscore between VEC and SCATTER). (There are issues with that in other files too.)
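For illustration, a minimal sketch of that rename in simulation_aux.F90, assuming the surrounding code stays exactly as it is (only the fragment shown here changes):
```
! was: aux%subsurf_to_geomechanics = PETSC_NULL_VECSCATTER
aux%subsurf_to_geomechanics = PETSC_NULL_VEC_SCATTER

! was: if (aux%subsurf_to_geomechanics /= PETSC_NULL_VECSCATTER) then
if (aux%subsurf_to_geomechanics /= PETSC_NULL_VEC_SCATTER) then
  ! ... existing body of the if block ...
endif
```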
Then there is another issue, a rank mismatch, for example:
petsc_utility.F90:55:36:
  55 |  call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &
   |                   1
Error: Rank mismatch in argument ‘c’ at (1) (rank-1 and scalar)
is caused because the API (given here: https://petsc.org/release/manualpages/Mat/MatSetValuesBlockedLocal/) cannot be used directly with scalars. It should be called, for example, with:
call MatSetValuesBlockedLocal(A,1,[irow-1],1,[icol-1],ndof_mat,ADD_VALUES, &
(note the [...] in the 3rd and 5th arguments.)
The same happens with all PETSc functions that are called in preconditioner_cpr.F90 with PETSC_NULL_INTEGER and PETSC_NULL_SCALAR, which should be [PETSC_NULL_INTEGER] and [PETSC_NULL_SCALAR] instead.
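For illustration, a sketch of what one of those calls in preconditioner_cpr.F90 might look like after the change, following the suggestion above; the trailing ierr argument stands in for whatever the hidden continuation line actually contains, so treat it as an assumption:
```
! pass rank-1 actual arguments built with array constructors, as suggested above
call MatGetRow(a,i,numcols,[PETSC_NULL_INTEGER],[PETSC_NULL_SCALAR],ierr)
! ... use numcols here ...
call MatRestoreRow(a,i,numcols,[PETSC_NULL_INTEGER],[PETSC_NULL_SCALAR],ierr)
```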
There's also a warning:
slatec_pchip.F90:465:44:
 465 |      IF ( PCHST(DEL1,DEL2) )  42, 41, 45
   |                       1
Warning: Fortran 2018 deleted feature: Arithmetic IF statement at (1)
This form of IF is a very old Fortran construct (an arithmetic IF) and is no longer used; the developers should change it to a more modern one.
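For example, one possible modernization (the temporary variable name below is only illustrative and its declaration is omitted; the labels 41, 42 and 45 are those from the original statement): an arithmetic IF jumps to the first, second, or third label when the expression is negative, zero, or positive, respectively.
```
! evaluate once, then branch on the sign: negative -> 42, zero -> 41, positive -> 45
pchst_val = PCHST(DEL1,DEL2)
if (pchst_val < 0) then
  go to 42
else if (pchst_val == 0) then
  go to 41
else
  go to 45
end if
```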
I'm still fixing other issues that show up to see what else breaks, but before I continue I'd like to ask you: are you a developer of PFLOTRAN, or do you know some of the developers? I think they should be involved in the discussion - maybe there's a reason for that implementation, and I'd be curious to know whether these changes were really made so recently. (It could also be due to a more recent version of GCC that enforces the correctness of the code more strictly.)
Best regards,
Filipe Guimaraes
Juelich Supercomputing Support Team
02/07/2025 11:10 - Wang Yumeng wrote:
Dear Filipe,

I have tried to install PETSc v3.21.5 with EasyBuild in order to install PFLOTRAN.

However, the installation of PETSc v3.21.5 does not work.

Could you help me solve the problem, as we discussed last week?

Best regards,

Yumeng

From: SC Support Team <s...@fz-juelich.de>
Sent: February 7, 2025 5:02 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Yumeng,
I could reproduce the problem. Also, if I try to compile with Stages/2024, it tells me that the PETSc version is too old. That's a tricky issue. I will try to fix the issue and will let you know if I manage to make some progress, but probably only next week (I hope that is not a problem).
Have a nice weekend!
Best regards,
Filipe Guimaraes
Juelich Supercomputing Support Team
07/02/2025 11:10 - Wang Yumeng wrote:
Dear Filipe,

Here are the commands that I use to install PFLOTRAN:

"
module purge
module load Stages/2025
module load GCC ParaStationMPI
module load PETSc
git clone https://bitbucket.org/pflotran/pflotran
cd pflotran/src/pflotran
Now you need to fix a small bug in the file `makefile`. Open it with your favorite editor and replace:
```
ifdef have_hdf5
LIBS += -L${HDF5_LIB} -lhdf5_fortran -lhdf5 -lz
endif
```
with
```
ifdef have_hdf5
  LIBS += -L${HDF5_LIB} -lhdf5_hl_fortran -lhdf5_hl -lhdf5_fortran -lhdf5 -lz
endif
```
Then run the build with
```
make -j20 have_hdf5=1 pflotran
```
"

The error occurs when I use the command "make -j20 have_hdf5=1 pflotran".

The error is
"
simulation_aux.F90:202:58:
 202 |  if (aux%subsurf_to_geomechanics /= PETSC_NULL_VECSCATTER) then
   |                             1
Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type
simulation_aux.F90:70:53:
  70 |  aux%subsurf_to_geomechanics = PETSC_NULL_VECSCATTER
   |                           1
Error: Symbol ‘petsc_null_vecscatter’ at (1) has no IMPLICIT type
make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: simulation_aux.o] Error 1
make: *** Waiting for unfinished jobs....
petsc_utility.F90:55:36:
  55 |  call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &
   |                  1
Error: Rank mismatch in argument ‘c’ at (1) (rank-1 and scalar)
petsc_utility.F90:55:45:
  55 |  call MatSetValuesBlockedLocal(A,1,irow-1,1,icol-1,ndof_mat,ADD_VALUES, &
   |                       1
Error: Rank mismatch in argument ‘e’ at (1) (rank-1 and scalar)
make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: petsc_utility.o] Error 1
preconditioner_cpr.F90:1699:31:
 1699 |   call MatGetRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &
   |                1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1699:50:
 1699 |   call MatGetRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &
   |                         1
Error: Rank mismatch in argument ‘vals’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1704:35:
 1704 |   call MatRestoreRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &
   |                  1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1704:54:
 1704 |   call MatRestoreRow(a,i,numcols,PETSC_NULL_INTEGER,PETSC_NULL_SCALAR, &
   |                           1
Error: Rank mismatch in argument ‘vals’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1545:42:
 1545 |    call MatGetRow(a,firstRow+j,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                     1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1551:46:
 1551 |    call MatRestoreRow(a,firstRow+j,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                       1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1391:40:
 1391 |   call MatGetRow(a,firstRow+1,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                    1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1399:44:
 1399 |   call MatRestoreRow(a,firstRow+1,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                      1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1403:40:
 1403 |   call MatGetRow(a,firstRow+2,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                    1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1412:44:
 1412 |   call MatRestoreRow(a,firstRow+2,numcols,PETSC_NULL_INTEGER,ctx%vals, &
   |                      1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1214:42:
 1214 |   call MatGetRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &
   |                     1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1231:46:
 1231 |   call MatRestoreRow(a,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &
   |                       1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1026:42:
 1026 |   call MatGetRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &
   |                     1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
preconditioner_cpr.F90:1072:46:
 1072 |   call MatRestoreRow(A,first_row+1,num_cols,PETSC_NULL_INTEGER,ctx%vals, &
   |                       1
Error: Rank mismatch in argument ‘cols’ at (1) (rank-1 and scalar)
make: *** [/p/software/default/stages/2025/software/PETSc/3.22.1-gpsfbf-2024a/lib/petsc/conf/rules:169: preconditioner_cpr.o] Error 1
"

Could you help me fix the problem?

Best regards,

Yumeng

From: SC Support Team <s...@fz-juelich.de>
Sent: February 7, 2025 1:28 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Yumeng,
What is the problem you get using v3.22? Is that a known issue, or is it documented somewhere that newer versions would not work? Since that was just a minor version change, I wouldn't expect it to break the compilation. If you let me know the steps, I can also try to reproduce the problem myself and see if I find a solution with that version.
Otherwise, I could suggest a couple of other potential solutions:
- Does it work only with v3.21.5, or also with older versions? On Stages/2024 we have PETSc/3.20.0. Hopefully, in the future, PFLOTRAN will support v3.22 and can then be installed on Stages/2025.
- If that is not an option and you need v3.21.5, I'm afraid you have to install it yourself, as we don't install older versions (and we try to avoid different versions on the same stage). But that should not be too difficult, as you can use the UserInstallations described here:
https://apps.fz-juelich.de/jsc/hps/jureca/software-modules.html#installing-your-own-software-with-easybuild
And you can adapt the easyconfig file from our currently installed one, which is here:
https://github.com/easybuilders/JSC/blob/2025/Golden_Repo/p/PETSc/PETSc-3.22.1-foss-2024a.eb
Hopefully only changing the version would be enough.
Best regards,
Filipe Guimaraes
Juelich Supercomputing Support Team
02/07/2025 11:10 - Wang Yumeng wrote:
Dear JSC staff,

I am Yumeng Wang, from IFN-2, Forschungszentrum Juelich. I am trying to install PFLOTRAN (https://documentation.pflotran.org/user_guide/how_to/installation/installation.html) on the JURECA supercomputer for numerical modeling.

However, I have some problems installing it, probably due to an inconsistency between the available PETSc version and PFLOTRAN. According to https://documentation.pflotran.org/user_guide/how_to/installation/linux.html#linux-install, PETSc v3.21.5 is required for PFLOTRAN, but this older version is not available on Stages/2025. From my attempts, I found that the newer PETSc v3.22 does not satisfy PFLOTRAN's dependency requirement.

Could you help me solve this problem?

Thank you very much,

Yumeng
Dear Glenn and Yumeng,
Thank you for the clarification. I couldn't find detailed release notes, and the list on their website is quite extensive, but it is interesting that they changed the API with a minor version increase.
@Yumeng, tomorrow I will try to install PETSc 3.21.5 or 3.21.6 and try the compilation again. I will keep you posted.
PETSc does not follow strict semantic versioning. You will notice at https://petsc.org/release/changes/ that version 3 goes back to 2008. The API has changed during this period.

Glenn
Yumeng,

The PETSc devs are still in the process of refactoring the Fortran interface. My hope is that we can upgrade PFLOTRAN to 3.23.X by its release in April.

Glenn

From: Wang, Yumeng <yu....@fz-juelich.de>
Date: Tuesday, February 11, 2025 at 7:36 AM
To: j...@jedbrown.org <j...@jedbrown.org>, Hammond, Glenn E <glenn....@pnnl.gov>
Cc: FM-sc <s...@fz-juelich.de>
Subject: Fw: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency
Dear Glenn and Jed,

Thank you very much for the explanations and suggestions.

Filipe and I are still working on installing PFLOTRAN on the supercomputer at Forschungszentrum Juelich. Unfortunately, the technical issue has not been solved on our end yet. Filipe will try alternative approaches and look into the problem tomorrow.

On the other hand, could you let us know when PFLOTRAN will be adapted for the new version of PETSc, e.g., v3.22.X?

Best regards,

Yumeng

From: SC Support Team <s...@fz-juelich.de>
Sent: February 11, 2025 3:42 PM
To: Wang, Yumeng
Subject: Re: [Ticket#1098039] Installing PFLOTRAN software on JURECA - PETSc and PFLOTRAN inconsistency

Dear Yumeng,
Yes, I have tried to install PETSc v3.21.5 today using the EasyBuild UserInstallations - since PFLOTRAN should be adapted for 3.22 soon, according to Glenn, it's not worth installing an older version of PETSc for everyone. After almost 1h, the installation failed on some of the tests it runs at the end. I asked the person responsible for the PETSc installation for help, and they will check it tomorrow.
In the meantime, I'm also trying v3.21.6, to see if the same error happens. I will keep you posted.
Best regards,
Filipe Guimaraes
Juelich Supercomputing Support Team
07/02/2025 11:10 - Wang Yumeng wrote:

Dear Filipe,

As Glenn mentioned yesterday, the issue seems to be related to the inconsistency between the latest versions of PETSc and PFLOTRAN.

Could you confirm whether PETSc v3.21.5 or v3.21.6 has been installed on Stages/2025 to facilitate the installation of PFLOTRAN? Additionally, any insights you could provide on this matter would be greatly appreciated.

Looking forward to your response.
Dear Yumeng,
In the right-hand-side figure, the scale is log-log, so you should also use that to make a better comparison (but from the values, I can see that they don't seem as good). Apart from that, it's not clear to me that the scaling from 4 to 32 is near-linear, as you don't have points in between.

Yumeng,

A couple of comments:

I hope that this helps explain the less-than-stellar parallel performance on the small problem.

Regards,

Filipe,

The code is parallelized solely through MPI.

Glenn