Installation help wanted for a beginner, please!


Rainer Rutka

Sep 25, 2012, 8:49:05 AM
to dirac...@googlegroups.com
Hello Group!

I'm a complete newcomer to working with Dirac.

My job is to adapt scientific software to our bwGRiD HP system.

see: www.bw-grid.de

Now I need some help getting a Dirac test running on our server.

I can compile the source without any problem, and the built-in tests also
run fine (95% pass).

Here's how I compiled and set up the Dirac source (excerpt):

---- Dirac-Install.sh (excerpt) ----
[...]
VERSION="11.0.1"
SOURCE_DIR="/opt/bwgrid/src/chem/dirac/${VERSION}"
TARGET_DIR="/opt/bwgrid/chem/dirac/${VERSION}"
[...]
# Load required modules:
module load compiler/intel/12.0
module load mpi/openmpi/1.4.3-intel-12.0
module load numlib/mkl/10.3.5
module load devel/cmake/2.8.7
[...]
# Make version
mkdir -vp $TARGET_DIR/install-doc
#
# Configure for parallel compilation using MPI and 32-bit integers
# This version will be built without the 'ENABLE_LARGE_MOLECULES' flag.
# If you want to re-run 'setup', remove the build folder first (rm -r build).
./setup --fc=mpif90 --cc=mpicc --install ${TARGET_DIR} -D BUILDNAME="bwgrid_dirac" 2>&1 | tee $TARGET_DIR/install-doc/setup.out
cd build
make 2>&1 | tee $TARGET_DIR/install-doc/make.out
make install 2>&1 | tee $TARGET_DIR/install-doc/make-install.out
cd ..
#
# Run built-in tests, simulating 4 cores (duration: approx. 40 min.).
# Tests are running fine!
echo -e "--scratch=/tmp\n--global-scratch-disk" > diracrc
./runtest --mpi=4 --all 2>&1 | tee $TARGET_DIR/install-doc/builtin-tests.out
cd ..
[...]

Unfortunately, I can't run a test this way:

--------------------------------------------------------------------------------------
$ echo -e "--scratch=/tmp\n--global-scratch-disk" > diracrc
$ pam-dirac --mol=methanol.xyz --inp=hf.inp

I get these error messages:

Installation directory you inserted is non-existent !
Used the default instead: /opt/bwgrid/chem/dirac/11.0.1/bin/pam-d
The dirac.x executable either does not exist (in default build
directory) or its location is wrongly or not specified. Check also the
corresponding diracrc file.
The dirac.x executable either does not exist (in default build
directory) or its location is wrongly or not specified. Check also the
corresponding diracrc file.

DIRAC python script running:

user : rutka
host : themis ( themis )
ip : 134.60.40.110
date and time : 2012-09-25 14:32:29.282165
input dir : /bwfs/ul/scratch/ws/rutka-test-0
pam command : /opt/bwgrid/chem/dirac/11.0.1/bin/pam-d/pam
all pam args : ['--scratch=/tmp', '--global-scratch-disk',
'--mol=methanol.xyz', '--inp=hf.inp']
executable : None
scratch dir : /tmp/rutka/DIRAC_hf_methanol_5762
output file : hf_methanol.out
DIRAC run : serial

Creating the scratch directory.
Copying file " None " to scratch dir as " dirac.x ".
Copying of files - shutil.copy(src, dest) - failed, error exit !
src full path: None
dest full path: /tmp/rutka/DIRAC_hf_methanol_5762/dirac.x
input dir : /bwfs/ul/scratch/ws/rutka-test-0
Copying of files - shutil.copy(src, dest) - failed, error exit !
src full path: None
dest full path: /tmp/rutka/DIRAC_hf_methanol_5762/dirac.x
input dir : /bwfs/ul/scratch/ws/rutka-test-0
---------------------------------------------------------------------------------------------------

That's true! The directory "/opt/bwgrid/chem/dirac/11.0.1/bin/pam-d"
does not exist.

What's wrong here?

Thanks in advance.

A "newbie"!

:-)

rainer...@uni-konstanz.de






radovan bast

Sep 25, 2012, 9:01:33 AM
to dirac...@googlegroups.com, Rainer Rutka
dear Rainer,

just to clarify first: is it correct that you are able
to run using runtest (which will run all tests)
but you are not able to run using pam(-dirac) directly
(in the hope to run a single job) - is that right?

best greetings,
radovan
--

# Radovan Bast

# Laboratoire de Chimie et Physique Quantiques
# CNRS/ Université Paul Sabatier
# Toulouse, France
# http://dirac.ups-tlse.fr/bast/

Rainer Rutka

Sep 25, 2012, 9:05:17 AM
to dirac...@googlegroups.com
On 25.09.2012 15:01, radovan bast wrote:
> dear Rainer,
> just to clarify first: is it correct that you are able
> to run using runtest (which will run all tests)
Yes, that's true!
=============

> but you are not able to run using pam(-dirac) directly
> (in the hope to run a single job) - is that right?
Y E S !
Because the directory

"/opt/bwgrid/chem/dirac/11.0.1/bin/pam-d"

was/is not created in the 'make install' process.

see:

[rutka@n010103 bin]$ pwd
/opt/bwgrid/chem/dirac/11.0.1/bin
[rutka@n010103 bin]$ ls -l
insgesamt 80
-rwxr-xr-x 1 rutka soft 77330 25. Sep 09:58 pam-dirac
[rutka@n010103 bin]$


rainer...@uni-konstanz.de

radovan bast

Sep 25, 2012, 9:09:53 AM
to dirac...@googlegroups.com, Rainer Rutka
thanks - i see. i don't think you did anything wrong.
it's possibly a bug on "our" side. i will look into this ...
best wishes,
radovan



Rainer Rutka

Sep 25, 2012, 9:10:58 AM
to dirac...@googlegroups.com
On 25.09.2012 15:09, radovan bast wrote:
> thanks - i see. i don't think you did anything wrong.
> it's possibly a bug on "our" side. i will look into this ...
> best wishes,
> radovan
THANKS A LOT! :-)


radovan bast

Sep 25, 2012, 9:25:40 AM
to dirac...@googlegroups.com, Rainer Rutka
ok, the problem is inside pam-dirac
(which is a copy of the pam script).

it assumes that it is always called "pam", but we had to rename
it for "make install" because otherwise it would conflict
with the unix/linux "pam".

we will fix that for the next release/patch
but a workaround for you is to do the following:

$ cd /opt/bwgrid/chem/dirac/11.0.1/bin
$ mv pam-dirac pam
$ ln -s ../share/dirac/dirac.x

now it should work if you use "pam" instead of the renamed "pam-dirac".
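
for example, a single job should now run like this (just a sketch, using the same input files and input directory as in your first attempt):

$ cd /bwfs/ul/scratch/ws/rutka-test-0
$ pam --mol=methanol.xyz --inp=hf.inp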

i am sorry for the trouble. this error went undetected
because not many people use "make install".

thanks for reporting the error & good luck,
radovan



Rainer Rutka

Sep 25, 2012, 10:03:35 AM
to dirac...@googlegroups.com
Hi!
Thanks a lot!

Now i can do some tests :-)

First attempt was:

$ pam --mpi=2 --mol=methanol.xyz --inp=hf.inp

What is this error?:

[...]
DIRAC command : /opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun -np
2 /tmp/rutka/DIRAC_hf_methanol_16698/dirac.x (PID=16701)
command ended with return code: 142
pam, stdout info: process ended with nonzero stderr stream - check
[...]

This is the complete output:
-----------------------------------------------------------------------------------------------------------------
DIRAC python script running:

user : rutka
host : themis ( themis )
ip : 134.60.40.110
date and time : 2012-09-25 16:00:56.557976
input dir : /bwfs/ul/scratch/ws/rutka-test-0
pam command : /opt/bwgrid/chem/dirac/11.0.1/bin/pam
all pam args : ['--scratch=/tmp', '--global-scratch-disk',
'--mpi=2', '--mol=methanol.xyz', '--inp=hf.inp']
executable : /opt/bwgrid/chem/dirac/11.0.1/bin/dirac.x
scratch dir : /tmp/rutka/DIRAC_hf_methanol_16913
output file : hf_methanol.out
DIRAC run : parallel (
launcher:/opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun)
# of CPUs : 2
local disks : False
rsh/rcp : ssh scp
machine file : None

Creating the scratch directory.
Copying file " dirac.x " to scratch dir.
Copying file " methanol.xyz " to scratch dir as " MOLECULE.XYZ ".
Copying file " hf.inp " to scratch dir as " DIRAC.INP ".
basis set dirs : /bwfs/ul/scratch/ws/rutka-test-0

DIRAC command : /opt/bwgrid/mpi/openmpi/1.4.3-intel-12.0/bin/mpirun
-np 2 /tmp/rutka/DIRAC_hf_methanol_16913/dirac.x (PID=16918)
command ended with return code: 142
pam, stdout info: process ended with nonzero stderr stream - check

**** dirac-executable stderr console output : ****

Master node : --- SEVERE ERROR, PROGRAM WILL BE ABORTED ---
Non-existing basis set in HERBAS
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 23111822.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
orterun_backend has exited due to process rank 0 with PID 16920 on
node themis exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by orterun_backend (as reported here).
--------------------------------------------------------------------------

directory: /bwfs/ul/scratch/ws/rutka-test-0
inputs: methanol.xyz & hf.inp
creating archive file hf_methanol.tgz
archived working files: ['MOLECULE.XYZ', 'DIRAC.INP']

content of the (master) scratch directory
rutka@themis:/tmp/rutka/DIRAC_hf_methanol_16913
------------------------------------------------------------------------------
name size (MB) last accessed
------------------------------------------------------------------------------
dirac.x 43.826 09/25/2012 04:00:57 PM
DIRAC.INP 0.000 09/25/2012 04:00:57 PM
MOLECULE.XYZ 0.000 09/25/2012 04:00:57 PM
fort9tvJHI 0.002 09/25/2012 04:00:57 PM
------------------------------------------------------------------------------
Total size of all files : 43.828 MB
Disk info: used available capacity [GB]
0.612 7.617 8.229

deleting the scratch directory
exit date : 2012-09-25 16:00:58.003154
exit code : 142
exit : ABNORMAL (CHECK DIRAC OUTPUT)
--------------------------------------------------------------------------------------------------------------

radovan bast

Sep 25, 2012, 10:08:20 AM
to dirac...@googlegroups.com, Rainer Rutka
On Tue, 25 Sep 2012 16:03:35 +0200, Rainer Rutka <raine...@googlemail.com> wrote:

> Hi!
> Thanks a lot!
>
> Now i can do some tests :-)
>
> First attempt was:
>
> $ pam --mpi=2 --mol=methanol.xyz --inp=hf.inp
>
> What is this error?:

the two interesting lines are:

> basis set dirs : /bwfs/ul/scratch/ws/rutka-test-0

and

> Non-existing basis set in HERBAS

it means that DIRAC does not find the basis set or does not know
where the basis sets are.

one way to tell DIRAC where they are is to put the path in ~/.diracrc

add a line:
--basis=/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis:/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis_dalton

this should do it.
an alternative is to provide it as a flag to pam (too much typing).

yet another alternative is to export the paths via the BASDIR environment variable.
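
for example (just a sketch - i assume here that the same colon-separated path list and the same --basis syntax work on the pam command line and via the environment variable as they do in .diracrc):

# one-off, as a flag to pam
$ pam --basis=/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis:/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis_dalton --mpi=2 --mol=methanol.xyz --inp=hf.inp

# or once per shell/job script, via the environment
$ export BASDIR=/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis:/opt/bwgrid/chem/dirac/11.0.1/share/dirac/basis_dalton
$ pam --mpi=2 --mol=methanol.xyz --inp=hf.inp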

good luck!
radovan

Rainer Rutka

Sep 25, 2012, 10:16:58 AM
to dirac...@googlegroups.com

Hi Radovan!

Without any doubt the best support ever!

Everything is working like a charm now.

No errors !!!

You're the greatest!

:-)

Your SW will be included in the bwGRiD ASAP and will be featured on our
www.bw-grid.de homepage, too.

AGAIN: GREAT WORK!

:-)

radovan bast

Sep 25, 2012, 10:19:00 AM
to dirac...@googlegroups.com, Rainer Rutka
thanks Rainer!
we will iron out these issues for the next release/patch.
the basis set problem especially is a common one.
we will have a more robust setup and better error messages
in the future.
good luck with the grid benchmarks,
radovan

Rainer Rutka

Oct 9, 2012, 8:36:32 AM
to dirac...@googlegroups.com
Hi Radovan and list users!

We finally got Dirac 11 installed and running on our bwGRiD system
here at the University of Constance in the south of Germany.

Thanks again to Radovan for his fast and capable help.

It's common practice to publish some hints on our bwGRiD homepage,
so we did this for Dirac11, too.

You'll find the module file, the complete steps for building Dirac11
with OpenMPI, and the modules we used.
In addition, an example PBS script for submitting a job with 'qsub'
(using the standard examples) is published, and finally you'll find some
benchmarks. A rough sketch of what such a job script can look like is shown below.
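
(Only a minimal sketch: the resource requests are placeholders, the compiler/MPI/MKL module names are the ones from our build script, and the DIRAC module name itself is just an assumption - take the real values from the published module file.)

#!/bin/bash
#PBS -N dirac_hf_methanol
#PBS -l nodes=1:ppn=4
#PBS -l walltime=01:00:00
#PBS -j oe

# load the same compiler/MPI/MKL modules that were used for the build
module load compiler/intel/12.0
module load mpi/openmpi/1.4.3-intel-12.0
module load numlib/mkl/10.3.5
# the DIRAC module (name assumed here) should put 'pam' and dirac.x on the PATH
module load chem/dirac/11.0.1

# run in the directory the job was submitted from
cd $PBS_O_WORKDIR

# --mpi should match the number of requested cores (ppn)
pam --mpi=4 --mol=methanol.xyz --inp=hf.inp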

Maybe somebody will find some useful hints there.
If you want to link to our page, you're welcome. We did it, too :-)

Here it is:

http://www.bw-grid.de/bwgrid-benutzer/software-anpassungen/chemie/dirac/

Bye!

All the best from Lake Constance.

--------------------------------------
Rainer Rutka
Rechenzentrum [V511]
Universitaet Konstanz
78457 Konstanz
---
Contact:
Mail: rainer...@uni-konstanz.de
Phone: +49 (0)7531 88 5413


radovan bast

Oct 9, 2012, 8:45:26 AM
to dirac...@googlegroups.com, Rainer Rutka
dear Rainer,

fantastic! thank you very much for
the work and the publicity. of course we will
link your page, too.

one comment on the benchmark timings at the bottom of
the page: the example is a small run taken from the test set
and is designed to test the xyz input reader, so it is not
very representative of a realistic run; i am therefore
not surprised that it does not scale all that well.
if you are interested in a more representative benchmark
then we could provide other input files. there you would also
see nicer scaling w.r.t. the number of processors.

good luck and best regards,
radovan

Rainer Rutka

Oct 9, 2012, 8:47:20 AM
to dirac...@googlegroups.com
On 09.10.2012 14:45, radovan bast wrote:
> dear Rainer,
>
> fantastic! thank you very much for
> the work and the publicity. of course we will
> link your page, too.
>
> one comment on the benchmark timings at the bottom of
> the page: the example is a small run taken from the test set
> and is designed to test the xyz input reader, so it is not
> very representative of a realistic run; i am therefore
> not surprised that it does not scale all that well.
> if you are interested in a more representative benchmark
> then we could provide other input files. there you would also
> see nicer scaling w.r.t. the number of processors.
>
> good luck and best regards,
> radovan
>
>
Of course!
Send them and I'll run the benchmarks and publish the results.
I'll include them in our PBS script, too.
YES!!!

