Greetings,
I am attempting to run my first DIRAC calculation and have not yet succeeded. I believe I have successfully installed DIRAC on the cluster I use with
./setup --mpi --fc=mpif90 --cc=mpicc --cxx=mpicxx --python=python3
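For reference, I then built in the generated build directory and launched the first tutorial calculation roughly as follows (pam flags reproduced from memory, so they may not be exact):
cd build && make
./pam --mpi=8 --inp=hf.inp --mol=methanol.xyz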
When I try to run the test calculation, it fails with the following error:
### Start output ###
DIRAC command : mpirun -np 8 /scratch/jearias/jearias/DIRAC_hf_methanol_1474619/dirac.x (PID=1474622)
pam, stdout info: process ended with nonzero stderr stream - check
content of the (master) scratch directory
------------------------------------------------------------------------------
name size (MB) last accessed
------------------------------------------------------------------------------
dirac.x 114.807 01/08/2025 10:17:13 AM
schema_labels.txt 0.017 01/08/2025 10:17:13 AM
MOLECULE.XYZ 0.000 01/08/2025 10:17:13 AM
DIRAC.INP 0.000 01/08/2025 10:17:13 AM
------------------------------------------------------------------------------
Total size of all files : 114.824 MB
Disk info: used available capacity [GB]
1493.767 2230.437 3724.204
creating archive file hf_methanol.tgz
archived working files: ['MOLECULE.XYZ', 'DIRAC.INP']
Could not construct hdf5 checkpoint file
going to delete the scratch directory ... done
exit date : 2025-01-08 10:17:14.077311
elapsed time : 00h00m00s
exit : ABNORMAL (CHECK DIRAC OUTPUT)
### End output ###
It seems that pam.in (around line 1925) looks for the CHECKPOINT.h5 file, does not find it, then tries to construct it with the nohdf5_load_data function from /utils/process_schema.py, and that function in turn cannot find a file called CHECKPOINT.noh5. That is as far as I have been able to trace what is happening so far.
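In other words, the fallback logic I think I am hitting looks roughly like this (my paraphrase of pam.in, not the actual code, and the nohdf5_load_data behaviour is guessed from the error):
# My paraphrase of what pam.in (around line 1925) appears to do -- not the real code.
import os

def build_checkpoint(scratch_dir):
    h5_path = os.path.join(scratch_dir, "CHECKPOINT.h5")
    noh5_path = os.path.join(scratch_dir, "CHECKPOINT.noh5")
    if os.path.exists(h5_path):
        # normal case: dirac.x wrote the HDF5 checkpoint directly
        return h5_path
    if os.path.exists(noh5_path):
        # fallback via nohdf5_load_data in utils/process_schema.py:
        # rebuild the checkpoint from the plain-text dump
        return noh5_path
    # my run ends up here: neither file was produced by dirac.x
    raise FileNotFoundError("no CHECKPOINT.h5 and no CHECKPOINT.noh5 in " + scratch_dir)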
Would anyone know why these HDF5 files are not being found or created? Am I missing something in the input files or in the command-line arguments? (These are taken straight from the tutorial:
https://www.diracprogram.org/doc/release-24/tutorials/getting_started.html)
Cheers,
Juan