Madelaine,
I cannot replicate the issue with v5.0 or the master branch, compiled as debug or optimized, serial or parallel. I suspect that something is corrupted in the HDF5 installation, but that is speculation. Are you running this on a Linux box or a larger machine?
Glenn
From: pflotr...@googlegroups.com <pflotr...@googlegroups.com>
On Behalf Of Madelaine Griesel
Sent: Tuesday, June 11, 2024 8:57 AM
To: pflotr...@googlegroups.com
Subject: [pflotran-dev: 6259] PETSc Segmentation Violation Error
--
You received this message because you are subscribed to the Google Groups "pflotran-dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email to pflotran-dev...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/pflotran-dev/CABKq66ucg1FpUEfEBHMggDLUjsgJ1joNyW2t%2B%2B-18jtUosLmkw%40mail.gmail.com.
Open MPI should be fine. Do you have another machine on which you can confirm that PFLOTRAN compiles and the input deck fails? That will help rule out the MIT Supercloud installation being an issue.
Glenn
Hello Glenn,
I've continued to look into this issue with the Supercloud team and they gave me the following information:
I was able to get a bit more information out of the stack trace by adding a debug flag to the pflotran build. From that I was able to identify the function the error is happening in:
#12 0x5601b49f2a45 in outputaggregatetofile
at pflotran/v5/pflotran/src/pflotran/output_observation.F90:288
I looked at that output_observation.F90 file on Bitbucket for version 4 (which worked) versus version 5 (https://bitbucket.org/pflotran/pflotran/src/a2104cedea1528a00aa2718572d43a4461019c60/src/pflotran/output_observation.F90?at=maint%2Fv5.0), and it looks like a few lines changed to specify an H5 output, such as in the following example:
Is it possible that the changes to that output file might be causing the PETSc segmentation violation when writing to an H5 file in PFLOTRAN version 5? PFLOTRAN versions 3 and 4 are able to write to an H5 file without any errors on the Supercloud.
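[Editor's note: for readers reproducing the debug build mentioned above, one common way to get file:line frames like the one quoted is to build against a debug PETSc and rebuild PFLOTRAN. This is a sketch under assumptions, not the Supercloud team's actual recipe; the only flags named here (--with-debugging, --download-hdf5) are standard PETSc configure options.]

```shell
# Configure PETSc with debugging enabled so backtraces carry
# file:line information (standard PETSc configure options).
cd $PETSC_DIR
./configure --with-debugging=1 --download-hdf5=yes
make all

# Rebuild PFLOTRAN against the debug PETSc; the PFLOTRAN makefile
# reads PETSC_DIR and PETSC_ARCH from the environment.
cd pflotran/src/pflotran
make clean
make pflotran
```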
Thank you for your time,
Madelaine
Madelaine,
I am fairly confident that it is a bug, but we would need to attempt to replicate the issue (same compiler and hardware architecture). Will you please send the file $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/configure.log to pflotr...@googlegroups.com (to avoid spamming the userbase)? I am hoping it will provide that information.
Thanks,
Glenn
From: pflotr...@googlegroups.com <pflotr...@googlegroups.com>
On Behalf Of Madelaine Griesel
Sent: Tuesday, July 2, 2024 11:01 AM
To: pflotran-dev <pflotr...@googlegroups.com>
Subject: Re: [pflotran-dev: 6265] PETSc Segmentation Violation Error
Madelaine,
I tried configuring similarly to what is reported in your configure.log. The main differences were shared versus static libraries, and the Supercloud installing OpenMPI itself as opposed to my configure script downloading it (same versions). Your input deck runs fine. I apologize, but I am unsure what else to suggest. Can the Supercloud team try configuring with “--download-openmpi=yes” instead of specifying the system installation of OpenMPI, and see if that version succeeds?
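[Editor's note: a sketch of the suggested configure variant. --download-openmpi is a standard PETSc configure option; the other flags shown are illustrative assumptions, not the exact Supercloud configuration.]

```shell
# Let PETSc's configure download and build Open MPI itself
# instead of pointing at the system installation.
cd $PETSC_DIR
./configure --download-openmpi=yes --download-hdf5=yes --with-debugging=1
make all
```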
Glenn