Error with running FF perturbation theory


Giorgio Visentin

May 9, 2023, 11:58:10 AM
to dirac...@googlegroups.com
Dear sirs,
I am running the FF perturbation theory module using the input and mol files available from the tutorial on finite-field CC calculations of dipole moment and polarizability.
The calculation immediately crashes without producing any output, instead returning the following traceback:

Traceback (most recent call last):
  File "/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam", line 2377, in <module>
    sys.exit(main())
  File "/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam", line 2122, in main
    return_code = dirac_run.perform()
  File "/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam", line 2096, in perform
    self.pam_variables.process_pam_arguments()
  File "/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam", line 1001, in process_pam_arguments
    self.dirac_mpi_command = self.mpirun + ' -np ' + str(self.nr_proc) + ' '
TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
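For readers hitting the same traceback: the failing line in pam concatenates self.mpirun, which is None when no MPI launcher was detected, with a string. A minimal sketch of the failure mode (variable names mirror the traceback; the values are illustrative, not taken from pam):

```python
# Reproduce the crash in pam's process_pam_arguments:
# mpirun stays None when pam cannot determine the MPI launcher,
# so concatenating it with a string raises TypeError.
mpirun = None     # what pam ended up with
nr_proc = 72      # matches --ntasks in the submit script below
try:
    dirac_mpi_command = mpirun + ' -np ' + str(nr_proc) + ' '
except TypeError as exc:
    msg = str(exc)
print(msg)  # unsupported operand type(s) for +: 'NoneType' and 'str'
```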

Is there a way to fix this issue?

Thank you for your time.

Kind regards,
Giorgio

Trond SAUE

May 9, 2023, 12:00:07 PM
to dirac...@googlegroups.com

Dear Giorgio,

Can you send the complete output file?

All the best,

   Trond

--
Trond Saue
Laboratoire de Chimie et Physique Quantiques
UMR 5626 CNRS --- Université Toulouse III-Paul Sabatier
118 route de Narbonne, F-31062 Toulouse, France

Phone : +33/561556361 Fax: +33/561556065
Mail : trond...@irsamc.ups-tlse.fr
Web : https://dirac.ups-tlse.fr/saue
DIRAC : http://www.diracprogram.org/
ESQC : http://www.esqc.org/
ERC Advanced Grant: HAMP-vQED
Book: Principles and Practices of Molecular Properties: Theory, Modeling, and Simulations

Giorgio Visentin

May 9, 2023, 12:03:56 PM
to dirac...@googlegroups.com
Dear Prof. Saue,
the problem is that the calculation returns no output file at all, only the error output I showed you.

Kind regards,

Giorgio

--
You received this message because you are subscribed to the Google Groups "dirac-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dirac-users...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/dirac-users/c4f63f90-6de6-6035-739c-439946796003%40irsamc.ups-tlse.fr.

Trond SAUE

May 9, 2023, 12:19:01 PM
to dirac...@googlegroups.com
On 5/9/23 18:03, Giorgio Visentin wrote:
the problem is the calculation returns no output file, except for the error file I showed you.

Well, then send the inputs and the command you gave for the calculation.

Giorgio Visentin

May 9, 2023, 12:22:13 PM
to dirac...@googlegroups.com
Dear Prof. Saue,
this is the input file:

**DIRAC
.TITLE
 BeH,ffpt RelCC
.WAVE F
.4INDEX
**HAMILTONIAN
.X2C
.OPERATOR
 ZDIPLEN
 COMFACTOR
 -0.0010
**INTEGRALS
*READINP
.UNCONTRACT
**WAVE FUNCTIONS
.SCF
.RELCCSD
*SCF
.CLOSED SHELL
4
.OPEN SHELL
1
1/2
.EVCCNV
1.0D-9  5.0D-7
# reads starting (unperturbed) MOs, DFPCMO
.MAXITR
8
**MOLTRA
# all 5 electrons,all virtuals
.ACTIVE
all
**RELCC
.ENERGY
.PRINT
1
# all electrons active
.NELEC
3 2
*CCENER
.MAXIT # maximum number of iterations for (0,0) sector
60
.NTOL
10
*END OF
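As context for the input above: in the finite-field approach the field strength set via COMFACTOR enters the Hamiltonian, and the dipole moment is then recovered numerically from total energies computed at opposite field strengths. A sketch of the central-difference formula with hypothetical energies (not DIRAC output):

```python
# Finite-field dipole moment: E(F) = E(0) - mu*F + ..., so
# mu_z is approximated by a central difference over runs at +F and -F.
F = 0.0010                                  # field strength (a.u.), as in COMFACTOR
E_plus, E_minus = -15.123456, -15.121456    # hypothetical CCSD energies at +F, -F
mu_z = -(E_plus - E_minus) / (2 * F)        # z-component of the dipole (a.u.)
print(f"{mu_z:.4f}")  # 1.0000
```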

This is the mol file:

INTGRL
 BeH
 STO-2G smallest basis
C   2    2  X  Y
        4.    1
Be    0.0000000000        0.0000000000        0.0000000000              
LARGE BASIS STO-2G
        1.    1
H     0.0000000000        0.0000000000        1.7325000297              
LARGE BASIS STO-2G
FINISH

And this is the script:
#!/bin/bash
#SBATCH --job-name=DIRACTEST
#SBATCH --nodes=1
#SBATCH --mem=185000
#SBATCH --ntasks=72
#SBATCH --cpus-per-task=1
#SBATCH --partition=s_standard
#SBATCH --time=7-05:00:00
#SBATCH --output=diracloop%j.out
#SBATCH --error=diracloop%j.err

#SBATCH --mem                   80gb
#SBATCH --mail-type             ALL
#SBATCH --mail-user             Giorgio....@skoltech.ru

#----------------------------------------------------------------------------------------
  echo -e "---- The Job is executed at $(date)\n\t by $USER on $SLURM_SUBMIT_HOST ----"
  printf " %-30s --> %s\n" "Job ID"  $SLURM_JOB_ID
  echo -e "-------------------------------------------------------------------------------\n"
  start=`date +%s`
#-------------------------- Memory (#SBATCH --mem=XXXgb 'it should be in GB)
  max_mem=$( awk -v a="$SLURM_MEM_PER_NODE" -v b="$SLURM_NTASKS"  'BEGIN{printf("%.2f",a/b/1000)}'  )  #in GB
  mem_cpu=$( awk -v a="$max_mem"  'BEGIN{printf("%.2f",a*0.96)}'  )  #in GB. 96% of max_mem per thread

#-----------------------------inputs
input=Test.inp
molecule=Test.mol
#----------------------------- run Dirac
# specify which field strengths to run
#for distance in 3.0
#do
## run relcc
/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --mol=$molecule --inp=$input --ag=$max_mem --gb=$mem_cpu
#done
  end=`date +%s`
  runtime=$((end-start))

  echo -e "\n Time execution (sec): $runtime\n\n"
  echo "---- COMPLETED! The Job has finished at $(date) ----"

#
#
#
exit 0
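As a side note, the two awk lines in the script reduce to simple arithmetic; a quick check of the values the script passes to pam, assuming the 185000 MB request and 72 tasks from the directives above:

```python
# Sanity check of the awk arithmetic in the submit script:
# per-task memory in GB, then 96% of that value per thread.
slurm_mem_per_node = 185000   # MB, from "#SBATCH --mem=185000"
slurm_ntasks = 72             # from "#SBATCH --ntasks=72"
max_mem = float(f"{slurm_mem_per_node / slurm_ntasks / 1000:.2f}")  # GB
mem_cpu = float(f"{max_mem * 0.96:.2f}")                            # GB
print(max_mem, mem_cpu)  # 2.57 2.47
```

Note that only about 2.6 GB per task survives this division, well below the 80 GB requested in the second, conflicting --mem directive.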
 Best regards,

Giorgio


Ilias Miroslav, doc. RNDr., PhD.

May 9, 2023, 12:25:17 PM
to dirac...@googlegroups.com
Hello,

insert

/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --help

and a check that the $molecule and $input files are actually read, into your sbatch script before the line:

/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --mol=$molecule --inp=$input --ag=$max_mem --gb=$mem_cpu
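Such a pre-check could look like this sketch (file names taken from the submit script; check_inputs is a hypothetical helper, not part of pam):

```python
import os

def check_inputs(*names):
    """Sketch of the suggested check: confirm each input file
    exists and is readable before invoking pam."""
    return {n: os.path.isfile(n) and os.access(n, os.R_OK) for n in names}

# Names taken from the submit script above
print(check_inputs('Test.inp', 'Test.mol'))
```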

Best, 

Miro



From: dirac...@googlegroups.com <dirac...@googlegroups.com> on behalf of Giorgio Visentin <grgvi...@gmail.com>
Sent: Tuesday, May 9, 2023 18:22
To: dirac...@googlegroups.com <dirac...@googlegroups.com>
Subject: Re: [dirac-users] Error with running FF perturbation theory
 

Giorgio Visentin

May 9, 2023, 12:29:38 PM
to dirac...@googlegroups.com
Dear Miro,
I replaced $input and $molecule with the explicit file names. However, I still get the same error message and no output.
Best,
Giorgio

Ilias Miroslav, doc. RNDr., PhD.

May 9, 2023, 12:32:04 PM
to dirac...@googlegroups.com
Well, based on

#SBATCH--output=diracloop%j.out
#SBATCH--error=diracloop%j.err

you should get these stdout and stderr files in your home directory, or in some other directory.

Put more control commands into your sbatch script, like pwd.

Miro




Trond SAUE

May 9, 2023, 12:37:50 PM
to dirac...@googlegroups.com

Hi,

a useful keyword is .INPTEST

https://www.diracprogram.org/doc/release-23/manual/dirac.html#inptest

which simply goes through the input without running any calculation.

This is what I did, using the attached inputs:

pam --inp=xx --mol=xx

The test passes, so it appears that something is wrong in the setup of your script.

See what happens if you use your script to run this example

https://www.diracprogram.org/doc/release-23/tutorials/getting_started.html

All the best,

  Trond

xx.mol
xx.inp
xx_xx.out

Giorgio Visentin

May 9, 2023, 12:46:11 PM
to dirac...@googlegroups.com
Dear all,
thanks for your help. Still, the calculation does not run even when I launch it directly from the terminal rather than through the script. The issue seems to have started when I exported the path to pam; even after removing it from my PATH, the calculation reports the same error.
Is there a reason for this strange behavior?

Best wishes,

Giorgio


Trond SAUE

May 9, 2023, 1:27:39 PM
to dirac...@googlegroups.com

So this calculation

https://www.diracprogram.org/doc/release-23/tutorials/getting_started.html

also does not work with your pam.

What is the output of

which pam

All the best,

   Trond

On 5/9/23 18:45, Giorgio Visentin wrote:
Still, the calculation does not run even if I run it without the script, but straight from the terminal.
--

Giorgio Visentin

May 10, 2023, 2:50:17 AM
to dirac...@googlegroups.com
Dear Prof. Saue,
the output of which pam is correct:

~/DIRAC-23.0-Source/build_intel_2020_mpi/pam
Best wishes,

Giorgio


Trond SAUE

May 10, 2023, 3:13:54 AM
to dirac...@googlegroups.com

...and you confirm that this calculation does not work?

https://www.diracprogram.org/doc/release-23/tutorials/getting_started.html

Giorgio Visentin

May 10, 2023, 3:19:01 AM
to dirac...@googlegroups.com
Yes, I confirm.



Johann Pototschnig

May 10, 2023, 4:27:14 AM
to dirac-users
The error message indicates that there is a problem setting up the mpirun command.

Can you check your MPI-distribution? (in command line: mpirun --version)

In the build directory (the same as pam) there should be a file: build_info.h
Can you provide it?

You could try:
---
echo "/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --mol=$molecule --inp=$input --ag=$max_mem --gb=$mem_cpu"
/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --mol=$molecule --inp=$input --ag=$max_mem --gb=$mem_cpu --mpi=1
---
or
---
export DIRAC_MPI_COMMAND="mpirun -np 8"
/home/xi58teb/DIRAC-23.0-Source/build_intel_2020_mpi/pam --mol=$molecule --inp=$input --ag=$max_mem --gb=$mem_cpu
---
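For what it's worth, the reason the DIRAC_MPI_COMMAND override can help is visible in the traceback: the launcher string ended up as None. A hedged sketch, not pam's actual code, of the precedence such an override would give:

```python
import os

def build_mpi_command(mpirun, nr_proc):
    """Hypothetical defensive version of the line that crashed:
    prefer an explicit DIRAC_MPI_COMMAND, otherwise fall back to the
    detected launcher, and fail with a clear message instead of a
    TypeError."""
    override = os.environ.get('DIRAC_MPI_COMMAND')
    if override:
        return override + ' '
    if mpirun is None:
        raise RuntimeError('No MPI launcher detected; '
                           'set DIRAC_MPI_COMMAND, e.g. "mpirun -np 8"')
    return mpirun + ' -np ' + str(nr_proc) + ' '

# With the override set, a None launcher no longer crashes:
os.environ['DIRAC_MPI_COMMAND'] = 'mpirun -np 8'
print(build_mpi_command(None, 72))  # mpirun -np 8
```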

best wishes,
Johann

Giorgio Visentin

May 10, 2023, 4:38:53 AM
to dirac...@googlegroups.com
Dear Prof. Pototschnig,
the content of the file you mentioned is reported below:

#ifndef BUILD_INFO_H_INCLUDED
#define BUILD_INFO_H_INCLUDED

#define USER_NAME                "xi58teb"
#define HOST_NAME                "login01"
#define SYSTEM                   "Linux-3.10.0-957.1.3.el7.x86_64"
#define CMAKE_VERSION            "3.20.2"
#define CMAKE_GENERATOR          "Unix Makefiles"
#define CMAKE_BUILD_TYPE         "release"
#define CONFIGURATION_TIME "2023-05-09 16:52:30.920280"

#define FORTRAN_COMPILER_ID "Intel"
#define FORTRAN_COMPILER         "/cluster/intel/compilers_and_libraries_2020.2.254/linux/mpi/intel64/bin/mpiifort"
#define FORTRAN_COMPILER_VERSION "19.1"
#define FORTRAN_COMPILER_FLAGS   " -w -assume byterecl -g -traceback -DVAR_IFORT  -qopenmp -i8"

#define C_COMPILER_ID            "Intel"
#define C_COMPILER               "/cluster/intel/compilers_and_libraries_2020.2.254/linux/mpi/intel64/bin/mpiicc"
#define C_COMPILER_VERSION "19.1"
#define C_COMPILER_FLAGS         " -g -wd981 -wd279 -wd383 -wd1572 -wd177  -qopenmp"

#define CXX_COMPILER_ID          "Intel"
#define CXX_COMPILER             "/cluster/intel/compilers_and_libraries_2020.2.254/linux/mpi/intel64/bin/mpiicpc"
#define CXX_COMPILER_VERSION     "19.1.2"
#define CXX_COMPILER_FLAGS " -Wno-unknown-pragmas  -qopenmp"

#define STATIC_LINKING           "False"
#define ENABLE_64BIT_INTEGERS    "True"
#define ENABLE_MPI               "True"
#define MPI_LAUNCHER             "/cluster/intel/compilers_and_libraries_2020.2.254/linux/mpi/intel64/bin/mpiexec"

#define MATH_LIBS                "-Wl,--start-group;/cluster/intel/parallel_studio_xe_2020.2.108/compilers_and_libraries_2020/linux/mkl/lib/intel64/libmkl_lapack95_ilp64.a;/cluster/intel/parallel_studio_xe_2020$
#define ENABLE_BUILTIN_BLAS "OFF"
#define ENABLE_BUILTIN_LAPACK    "OFF"
#define EXPLICIT_LIBS            "unknown"

#define ENABLE_EXATENSOR         "OFF"
#define EXATENSOR_REPO           "unknown"
#define EXATENSOR_CONFIG         "unknown"
#define EXATENSOR_HASH           "unknown"

#define DEFINITIONS              "HAVE_MKL_BLAS;HAVE_MKL_LAPACK;HAVE_MPI;HAVE_OPENMP;VAR_MPI;VAR_MPI2;USE_MPI_MOD_F90;SYS_LINUX;PRG_DIRAC;INT_STAR8;INSTALL_WRKMEM=64000000;HAS_PCMSOLVER;BUILD_GEN1INT;HAS_PELIB;$

#endif /* BUILD_INFO_H_INCLUDED */
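The MPI_LAUNCHER entry above records the launcher the build expects; a small sketch that parses it out of the header text (the relevant line is inlined here), e.g. to compare against what the launcher resolves to on the compute nodes:

```python
import re

# Excerpt of the MPI_LAUNCHER line from build_info.h above
header = ('#define MPI_LAUNCHER             '
          '"/cluster/intel/compilers_and_libraries_2020.2.254'
          '/linux/mpi/intel64/bin/mpiexec"')
match = re.search(r'#define\s+MPI_LAUNCHER\s+"([^"]+)"', header)
launcher = match.group(1)
print(launcher)
```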

Best regards,

Giorgio
