Madelaine,
I cannot replicate the issue with v5.0 or the master branch, whether compiled as debug or optimized, in serial or parallel. I suspect that something is corrupted in the HDF5 installation, but that is speculation. Are you running this on a Linux box or a larger machine?
Glenn
From: pflotr...@googlegroups.com <pflotr...@googlegroups.com>
On Behalf Of Madelaine Griesel
Sent: Tuesday, June 11, 2024 8:57 AM
To: pflotr...@googlegroups.com
Subject: [pflotran-dev: 6259] PETSc Segmentation Violation Error
Open MPI should be fine. Do you have another machine on which you can confirm that PFLOTRAN compiles and the input deck fails? That would help rule out the MIT Supercloud installation as the issue.
Glenn
Hello Glenn,
I've continued to look into this issue with the Supercloud team and they gave me the following information:
I was able to get a bit more information out of the stack trace by adding a debug flag to the PFLOTRAN build. From that I was able to identify the function in which the error occurs:
#12 0x5601b49f2a45 in outputaggregatetofile
at pflotran/v5/pflotran/src/pflotran/output_observation.F90:288
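(One common way to get line-level detail like the trace above is to rebuild against a PETSc configured with debugging enabled; a rough sketch below, with placeholder paths and options, not necessarily how the Supercloud team enabled it:)
  # reconfigure PETSc with debugging, keeping the existing configure options
  cd $PETSC_DIR
  ./configure --with-debugging=1 ...
  make all
  # rebuild PFLOTRAN against the debug PETSc
  cd /path/to/pflotran/src/pflotran
  make pflotran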
I compared that output_observation.F90 file on Bitbucket between version 4 (which worked) and version 5 (https://bitbucket.org/pflotran/pflotran/src/a2104cedea1528a00aa2718572d43a4461019c60/src/pflotran/output_observation.F90?at=maint%2Fv5.0), and it looks like a few lines changed to specify HDF5 output.
Is it possible that the changes to that output file are causing the PETSc segmentation violation when writing to an H5 file in PFLOTRAN version 5? PFLOTRAN versions 3 and 4 are able to write to an H5 file without any errors on the Supercloud.
Thank you for your time,
Madelaine
Madelaine,
I am fairly confident that it is a bug, but we would need to attempt to replicate the issue (same compiler and hardware architecture). Will you please send the file $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/configure.log to pflotr...@googlegroups.com (to avoid spamming the userbase)? I am hoping it will provide that information.
Thanks,
Glenn
From: pflotr...@googlegroups.com <pflotr...@googlegroups.com>
On Behalf Of Madelaine Griesel
Sent: Tuesday, July 2, 2024 11:01 AM
To: pflotran-dev <pflotr...@googlegroups.com>
Subject: Re: [pflotran-dev: 6265] PETSc Segmentation Violation Error
Madelaine,
I tried configuring similarly to what is reported in your configure.log. The main differences were shared versus static libraries, and that the Supercloud installed Open MPI rather than my configure script downloading it (same versions). Your input deck runs fine. I apologize, but I am unsure what to do next. Can the Supercloud team try configuring with "--download-openmpi=yes" instead of pointing at the system installation of Open MPI, and see whether that build succeeds?
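For concreteness, a minimal sketch of the two configure variants (the other options and paths shown are placeholders, not the exact Supercloud configure line):
  # current approach (sketch): point PETSc at the system Open MPI
  ./configure --with-mpi-dir=/path/to/system/openmpi --download-hdf5=yes ...
  # suggested test (sketch): let PETSc download and build Open MPI itself
  ./configure --download-openmpi=yes --download-hdf5=yes ...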
Glenn
Hello Glenn,
I'm revisiting this thread because I discovered that my PETSc segmentation violation error with PFLOTRAN Version 5 is related to the use of PERIODIC_OBSERVATION under the OUTPUT card options. Version 5 fails if I include PERIODIC_OBSERVATION, but when I comment it out, it runs. When I used PFLOTRAN Version 3 instead, the model ran and output the appropriate files without any issue. I've uploaded the related PFLOTRAN input and log files for my attempts with both versions here: pflotran_petsc_error
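For reference, the card in question looks roughly like this (a simplified sketch using the old-style keyword, not my exact deck):
  OUTPUT
    PERIODIC_OBSERVATION TIMESTEP 1
    FORMAT HDF5
  END
Commenting out the PERIODIC_OBSERVATION line is what lets Version 5 run.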
Do you know what may be causing this issue? I would like to use PFLOTRAN Version 5 and the PERIODIC_OBSERVATION card to output an integral flux file.
Thanks for your time,
Madelaine
Madelaine,
Please upload a .tar.gz of the entire input deck to your shared drive; when I try to run pflotran_v5*, files are missing. Note that I am using the PFLOTRAN development version (an extension of v6), and I am hopeful that it will replicate the v5 issue.
Glenn
Madelaine,
Your input deck runs with 8 processes under PFLOTRAN v5, v6, and the development branch on a Mac. Can you verify that the input deck fails with 8 processes (mpirun -n 8 …)?
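Something along these lines, where the input file name is a placeholder:
  mpirun -n 8 ./pflotran -pflotranin pflotran_v5.in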
On which operating system are you running?
Glenn
Hello Glenn,
Thank you for checking that. I'm using the MIT Lincoln Labs Supercloud. They are having system maintenance today but I'll update you with results from running my input deck with 8 processes as soon as I can tomorrow.
Best,
Madelaine
Madelaine,
Running on 8 processes was solely to confirm that we were using the same core count. We have now ruled out core count as an issue (or difference). My installation runs, but MIT's does not.
Have you tried v6?
How might I obtain collaborator access to the MIT machine, or is that out of the question?
Can you try installing PETSc locally following the instructions at https://documentation.pflotran.org/user_guide/how_to/installation/linux.html#linux-install?
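Roughly, those instructions amount to the following (a condensed sketch; take the exact PETSc version and configure options from the linked page):
  git clone https://gitlab.com/petsc/petsc.git
  cd petsc
  git checkout vX.Y.Z        # the tag listed in the PFLOTRAN instructions
  export PETSC_DIR=$PWD
  export PETSC_ARCH=debug-openmpi
  ./configure --with-debugging=1 --download-openmpi=yes --download-hdf5=yes \
    --download-hdf5-fortran-bindings=yes --download-fblaslapack=yes
  make all
  git clone https://bitbucket.org/pflotran/pflotran.git
  cd pflotran/src/pflotran
  make pflotran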
Glenn
From: pflotr...@googlegroups.com <pflotr...@googlegroups.com> on behalf of Madelaine Griesel <madelain...@gmail.com>
Date: Thursday, October 10, 2024 at 1:10 PM
To: pflotran-dev <pflotr...@googlegroups.com>
Subject: Re: [pflotran-dev: 6294] PETSc Segmentation Violation Error
Hi Glenn,
I get the following error on the Supercloud when running my deck with 8 processes for both PFLOTRAN v3 & v5: "mpirun noticed that process rank 13 with PID 163754 on node d-17-10-4 exited on signal 9 (Killed)"