Not sure what to tell you here. I modified the non-legacy regression suite to attempt to write to an HDF5 file. Can you pull the latest source and see if all regression tests pass?
cd $PFLOTRAN_DIR/src/pflotran
hg pull -u
make pflotran
make test
All 51 of 51 tests should pass.
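
If the suite passes serially but fails in parallel, it may also be worth confirming that the pflotran binary and the launcher come from the same MPI installation. A quick sanity check along these lines (the grep pattern is just illustrative):

which mpicc mpiexec            # both should point into the same MPI install
mpiexec --version              # mpich's Hydra launcher reports its version
ldd ./pflotran | grep -i mpi   # which MPI library the binary actually links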
Glenn
Igal,
Yes, this is clearly not an issue with HDF5 in a single parallel run. After a Google search on the HDF5 error messages sent yesterday (i.e., “open failed on a remote node”), I found a couple of posts suggesting that this is an issue with your MPI-IO and MPI installation. You are using a very recent version of mpich (3.0.4). I suggest installing an older but more stable version, e.g. mpich2-1.4.1; I have had issues with newer versions. Other than that, I am not sure what to suggest.
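
One thing worth ruling out first: ROMIO typically reports “open failed on a remote node” when a remote rank cannot open the file, which often means the run directory is not visible from every node. A quick check along these lines (machines.txt stands in for your hostfile):

df -T .                                    # filesystem type of the run directory (shared?)
mpiexec -f machines.txt -n 4 hostname      # are all nodes reachable?
mpiexec -f machines.txt -n 4 ls -ld $PWD   # is the run directory visible on every node?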
From here on out, we need to move this conversation over to the pflotran-dev mailing list, as there are PETSc developers on that list who may be able to help us out. But please try mpich2-1.4.1 first. If that works, we can revisit the issue with mpich-3.0.4 with the scientists at Argonne.
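
For reference, building mpich2-1.4.1 from source might look roughly like the following; the install prefix is arbitrary, the download URL follows mpich.org's archive layout and may need adjusting, and PETSc/PFLOTRAN would need to be reconfigured and rebuilt against the new install afterwards:

wget http://www.mpich.org/static/downloads/1.4.1/mpich2-1.4.1.tar.gz
tar -xzf mpich2-1.4.1.tar.gz
cd mpich2-1.4.1
./configure --prefix=$HOME/soft/mpich2-1.4.1
make
make install
# then point PETSc's configure at the new install,
# e.g. --with-mpi-dir=$HOME/soft/mpich2-1.4.1, and rebuild pflotran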
Glenn
I tried using the older mpich2-1.4.1 by typing, on the master machine:
/usr/local/bin/mpiexec -n 4 .... machines.txt .. ./pflotran ...
and got the same errors on the HDF5 write.
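
(Spelled out, the command was of this general form; the hostfile and input-deck names here are placeholders, and -f is the Hydra launcher's hostfile flag:

/usr/local/bin/mpiexec -f machines.txt -n 4 ./pflotran -pflotranin my_problem.in

It is worth double-checking that /usr/local/bin/mpiexec is actually the mpich2-1.4.1 launcher, and that pflotran was rebuilt against that install rather than against mpich-3.0.4.)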
Igal