Hi all,
I'm trying to run PFLOTRAN on a cluster of three machines, with 8 MPI ranks per node and SSH for communication between the machines. I've verified that PFLOTRAN runs properly on each of the three machines independently and that the machines can connect to one another via SSH without issues.
I then attempted the following command to utilize all three nodes:
mpirun -n 24 -f /home/geofluids/hostfile $PFLOTRAN_DIR/src/pflotran/pflotran -input_prefix sample_197
The hostfile I used is:
IP of machine1:8
IP of machine2:8
IP of machine3:8
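As a sanity check before the full run (a hypothetical example, assuming the same MPICH-style mpirun and hostfile as above), a trivial cross-node launch should report each node's hostname eight times:

# hypothetical check: confirm MPI can start ranks on all three nodes
mpirun -n 24 -f /home/geofluids/hostfile hostname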
Unfortunately, the PFLOTRAN run returned an initialization error while creating the HDF5 file. Since the complete error message is quite long, I've attached it as a .txt file for reference.
I would greatly appreciate any guidance or suggestions on how to resolve this issue.
Thank you in advance for your help!
Best regards,
Won Woo Yoon
Won Woo,
I am not sure what to recommend. Can you send the file $PETSC_DIR/$PETSC_ARCH/lib/petsc/configure.log to pflotr...@googlegroups.com?
Glenn
Won Woo,
You are running Ubuntu on a cluster, which is outside my expertise. I assume the issue does not occur on a single node. If you can replicate it on a PETSc test problem that employs HDF5, perhaps the PETSc developers can help. I assume you have already searched Google for answers?
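If it helps, one quick check along those lines (the path is an assumption for a PETSc --download-hdf5 build; adjust to wherever your HDF5 actually lives) is to confirm that the HDF5 library was built with parallel support:

# print the build configuration of the HDF5 that PETSc installed
$PETSC_DIR/$PETSC_ARCH/bin/h5pcc -showconfig | grep -i parallel

A line like "Parallel HDF5: yes" should appear in the output if MPI support was compiled in.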
Glenn
Thank you for your kind advice, and sorry for the late response.
I have identified some potential issues with the installed HDF5 version, but I have not yet had the chance to test a fix.
I plan to test it as soon as possible, and I will share an update if I am successful in resolving the issue.
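For reference, the first check I plan to run (binary path taken from the command in my original message) is to confirm that each node resolves the same HDF5 shared library:

# list the HDF5 library the executable links against; run on every node
ldd $PFLOTRAN_DIR/src/pflotran/pflotran | grep -i hdf5

A version mismatch between nodes would be consistent with a failure that appears only in multi-node runs.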
Thank you once again for your valuable suggestions.
Best regards,
Won Woo Yoon
Laurin,
Sorry for the delayed response. Have you compared the permissions on the two directories (and on the hierarchy of directories above the local directory on the NFS-mounted disk)? I suspect this is a permissions issue.
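A quick way to compare the whole hierarchy at once (the path below is a placeholder; substitute your actual run directory) is:

# namei -l prints owner and permissions for every directory component along a path
namei -l /path/to/nfs/run/directory

Run it for both the local and the NFS-mounted locations and compare the owner/mode columns line by line.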
Glenn
Laurin,
Please send the configure.log for PETSc 3.20.0 to pflotr...@googlegroups.com. I want to see which HDF5 version PETSc installs.
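If you want a quick local check in the meantime, and if PETSc downloaded and installed HDF5 for you, the installed version can also be read directly from the headers (path assumed for a --download-hdf5 build):

# HDF5's public header records the installed version
grep "H5_VERS_" $PETSC_DIR/$PETSC_ARCH/include/H5public.h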
The correct syntax is in place for independent I/O, which leads me to believe that the issue is HDF5 1.14.x with ROMIO, the last bullet that ChatGPT reports.
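Separately, one low-cost experiment while you gather the log (HDF5_USE_FILE_LOCKING is a standard HDF5 environment variable; whether it applies here is only a guess on my part) is to rule out file locking on the NFS mount before rerunning the failing case:

# disable HDF5 file locking, which is known to misbehave on some NFS mounts
export HDF5_USE_FILE_LOCKING=FALSE

If the error disappears, that points at the filesystem/MPI-IO layer rather than PFLOTRAN itself.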
Thanks,
Glenn