Hi!
Thanks a lot!
Now I can do some tests :-)
The first attempt was:
$ pam --mpi=2 --mol=methanol.xyz --inp=hf.inp
What does this error mean?
[...]
DIRAC command : /opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun -np 2 /tmp/rutka/DIRAC_hf_methanol_16698/dirac.x (PID=16701)
command ended with return code: 142
pam, stdout info: process ended with nonzero stderr stream - check
[...]
This is the complete output:
-----------------------------------------------------------------------------------------------------------------
DIRAC python script running:
user : rutka
host : themis ( themis )
ip : 134.60.40.110
date and time : 2012-09-25 16:00:56.557976
input dir : /bwfs/ul/scratch/ws/rutka-test-0
pam command : /opt/bwgrid/chem/dirac/11.0.1/bin/pam
all pam args : ['--scratch=/tmp', '--global-scratch-disk', '--mpi=2', '--mol=methanol.xyz', '--inp=hf.inp']
executable : /opt/bwgrid/chem/dirac/11.0.1/bin/dirac.x
scratch dir : /tmp/rutka/DIRAC_hf_methanol_16913
output file : hf_methanol.out
DIRAC run : parallel (launcher:/opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun)
# of CPUs : 2
local disks : False
rsh/rcp : ssh scp
machine file : None
Creating the scratch directory.
Copying file " dirac.x " to scratch dir.
Copying file " methanol.xyz " to scratch dir as " MOLECULE.XYZ ".
Copying file " hf.inp " to scratch dir as " DIRAC.INP ".
basis set dirs : /bwfs/ul/scratch/ws/rutka-test-0
DIRAC command : /opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun -np 2 /tmp/rutka/DIRAC_hf_methanol_16913/dirac.x (PID=16918)
command ended with return code: 142
pam, stdout info: process ended with nonzero stderr stream - check
**** dirac-executable stderr console output : ****
Master node : --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
Non-existing basis set in HERBAS
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 23111822.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
orterun_backend has exited due to process rank 0 with PID 16920 on
node themis exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by orterun_backend (as reported here).
--------------------------------------------------------------------------
directory: /bwfs/ul/scratch/ws/rutka-test-0
inputs: methanol.xyz & hf.inp
creating archive file hf_methanol.tgz
archived working files: ['MOLECULE.XYZ', 'DIRAC.INP']
content of the (master) scratch directory
rutka@themis:/tmp/rutka/DIRAC_hf_methanol_16913
------------------------------------------------------------------------------
name size (MB) last accessed
------------------------------------------------------------------------------
dirac.x 43.826 09/25/2012 04:00:57 PM
DIRAC.INP 0.000 09/25/2012 04:00:57 PM
MOLECULE.XYZ 0.000 09/25/2012 04:00:57 PM
fort9tvJHI 0.002 09/25/2012 04:00:57 PM
------------------------------------------------------------------------------
Total size of all files : 43.828 MB
Disk info: used available capacity [GB]
0.612 7.617 8.229
deleting the scratch directory
exit date : 2012-09-25 16:00:58.003154
exit code : 142
exit : ABNORMAL (CHECK DIRAC OUTPUT)
--------------------------------------------------------------------------------------------------------------