./setup --mpi --int64 (Dirac 17, it doesn't work in 64 bits)


Andy Danian Zapata Escobar

Aug 3, 2018, 8:43:57 PM
to dirac-users

Hi

I tried to install DIRAC17 with 64-bit integers. I have OpenMPI 3.0.0 built for 64-bit integers (ompi_info -a | grep 'Fort integer Size' reports 8). I used the commands at http://diracprogram.org/doc/release-12/installation/int64/mpi.html to compile OpenMPI.
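In essence, that build goes along these lines (a sketch only: the install prefix below is just where my OpenMPI lives, and the exact configure flags should be taken from the linked page):

$ ./configure --prefix=/usr/local/openmpi-3.0.0 CC=gcc CXX=g++ FC=gfortran F77=gfortran FCFLAGS="-fdefault-integer-8" FFLAGS="-fdefault-integer-8"
$ make all install
$ ompi_info -a | grep -i 'Fort integer size'    # should report 8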

Then I ran the following command in DIRAC's main directory:

./setup --mpi --int64

The output:
___________________________________________________________________________________________________________________________________
FC=mpif90 CC=mpicc CXX=mpicxx cmake -DEXTRA_FCFLAGS="''" -DEXTRA_CFLAGS="''" -DEXTRA_CXXFLAGS="''" -DPREPROCESSOR_DEFINITIONS="''" -DPYTHON_INTERPRETER="''" -DENABLE_BLAS=auto -DENABLE_LAPACK=auto -DMKL_FLAG=off -DMATH_LIB_SEARCH_ORDER="MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE" -DBLAS_LANG=Fortran -DLAPACK_LANG=Fortran -DENABLE_MPI=True -DENABLE_CODE_COVERAGE=False -DENABLE_STATIC_LINKING=False -DENABLE_PROFILING=False -DENABLE_RUNTIMECHECK=False -DENABLE_64BIT_INTEGERS=True -DEXPLICIT_LIBS="off" -DENABLE_PCMSOLVER=ON -DPCMSOLVER_ROOT='' -DCMAKE_BUILD_TYPE=release -G "Unix Makefiles" /home/yuly/Programs/DIRAC-17.0-Source

-- The Fortran compiler identification is GNU 5.4.0
-- The C compiler identification is GNU 5.4.0
-- The CXX compiler identification is GNU 5.4.0
-- Check for working Fortran compiler: /usr/local/openmpi-3.0.0/bin/mpif90
-- Check for working Fortran compiler: /usr/local/openmpi-3.0.0/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /usr/local/openmpi-3.0.0/bin/mpif90 supports Fortran 90
-- Checking whether /usr/local/openmpi-3.0.0/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /usr/local/openmpi-3.0.0/bin/mpicc
-- Check for working C compiler: /usr/local/openmpi-3.0.0/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/local/openmpi-3.0.0/bin/mpicxx
-- Check for working CXX compiler: /usr/local/openmpi-3.0.0/bin/mpicxx -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found PythonInterp: /usr/bin/python (found version "2.7.12")
-- Searching for BLAS using search order MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE
-- Searching for LAPACK using search order MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE
-- MATH_LIB_SEARCH_ORDER set to MKL;ESSL;OPENBLAS;ATLAS;ACML;SYSTEM_NATIVE
-- Found MPI_C: /usr/local/openmpi-3.0.0/lib/libmpi.so 
-- Found MPI_CXX: /usr/local/openmpi-3.0.0/lib/libmpi.so 
-- Found MPI_Fortran: /usr/local/openmpi-3.0.0/lib/libmpi_usempif08.so;/usr/local/openmpi-3.0.0/lib/libmpi_usempi_ignore_tkr.so;/usr/local/openmpi-3.0.0/lib/libmpi_mpifh.so;/usr/local/openmpi-3.0.0/lib/libmpi.so 
-- Enable profiling: False
-- Enable run-time checking: False
-- Performing Test MPI_COMPILER_MATCHES
-- Performing Test MPI_COMPILER_MATCHES - Success
-- mpi.mod matches current compiler, setting -DUSE_MPI_MOD_F90
-- Performing Test MPI_ITYPE_MATCHES
-- Performing Test MPI_ITYPE_MATCHES - Success
-- Set CDash default timeout for single test set to 1500 seconds. Overwritten by test's TIMEOUT property label and, ultimatively, by pam timeout setting.
-- Test's 'basis_input_scripted' timeout set to 3600 seconds. Overwritten by pam timeout setting.
-- PCMSolver not found. The pre-packaged version will be built.
-- Polarizable Continuum Model via PCMSolver ENABLED
-- Gen1Int module: ON
-- PElib module: ON
-- Found Git: /usr/bin/git (found version "2.7.4")
-- The XCFun submodule ENABLED
-- Davidson-type +Q corrections for (MR)CISD: ON
-- ESR property module: OFF
-- Stieltjes external module ENABLED
-- Interest library: OFF
-- KRCC module: OFF
-- Enable compilation of standalone relccsd.x: OFF
-- OpenRSP library: OFF
-- LAO properties without connection matrices: OFF
-- Spinfree MCSCF module: OFF
-- Atomic oo-order spin-orbit correction module: OFF
-- srDFT module: OFF
-- Specialized tutorial tests DISABLED
-- Unit control tests DISABLED
-- User name: yuly
-- Host name: Bader
-- Operating system: Linux-4.15.0-29-generic
-- CMake version: 3.5.1
-- CMake generator: Unix Makefiles
-- CMake build type: release
-- Configuration time: 2018-08-04 00:30:41.804991
-- Python version: 2.7.1
-- Fortran compiler: /usr/local/openmpi-3.0.0/bin/mpif90
-- Fortran compiler version: GNU 5.4.0
-- Fortran compiler flags:  -g -fcray-pointer -fbacktrace -fno-range-check -DVAR_GFORTRAN -DVAR_MFDS  -fdefault-integer-8
-- C compiler: /usr/local/openmpi-3.0.0/bin/mpicc
-- C compiler version: GNU 5.4.0
-- C compiler flags:  -g
-- CXX compiler: /usr/local/openmpi-3.0.0/bin/mpicxx
-- CXX compiler version: GNU 5.4.0
-- CXX compiler flags:  -g -Wall -Wno-unknown-pragmas -Wno-sign-compare -Woverloaded-virtual -Wwrite-strings -Wno-unused
-- Static linking: False
-- 64-bit integers: True
-- MPI parallelization: True
-- MPI launcher: /usr/local/openmpi-3.0.0/bin/mpiexec
-- Compile definitions: USE_BUILTIN_BLAS;USE_BUILTIN_LAPACK;HAVE_MPI;VAR_MPI;VAR_MPI2;USE_MPI_MOD_F90;SYS_LINUX;PRG_DIRAC;INT_STAR8;INSTALL_WRKMEM=64000000;HAS_PCMSOLVER;BUILD_GEN1INT;HAS_PELIB;MOD_QCORR;HAS_STIELTJES
-- Could NOT find Sphinx (missing:  SPHINX_EXECUTABLE)
-- Adding target release
-- Set CDash default timeout for single test set to 1500 seconds. Overwritten by test's TIMEOUT property label and, ultimatively, by pam timeout setting.
-- Test's 'basis_input_scripted' timeout set to 3600 seconds. Overwritten by pam timeout setting.
-- Configuring done
-- Generating done
-- Build files have been written to: /home/yuly/Programs/DIRAC-17.0-Source/build

   configure step is done
   now you need to compile the sources:
   $ cd build
   $ make
________________________________________________________________________________________

The compilation finishes fine (100%). However, the resulting DIRAC still seems to be 32-bit instead of 64-bit.

What is my mistake?

Thanks, have a nice day

Ilias Miroslav, doc. RNDr., PhD.

Aug 4, 2018, 5:55:12 AM
to dirac-users

Hi,


In the compiled dirac.x output, you should see printouts like:


** interface to 64-bit integer MPI enabled **

64-bit integers          | True
MPI parallelization      | True
...


Do you see them?
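If the output is long, a quick way to check is to grep for those lines (assuming the output file is called, say, test.out):

$ grep -E 'interface to 64-bit integer MPI|64-bit integers' test.out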


M.





From: dirac...@googlegroups.com <dirac...@googlegroups.com> on behalf of Andy Danian Zapata Escobar <daniane...@gmail.com>
Sent: Saturday, August 4, 2018, 2:43
To: dirac-users
Subject: [dirac-users] ./setup --mpi --int64 (Dirac 17, it doesn't work in 64 bits)



Ilias Miroslav, doc. RNDr., PhD.

Aug 4, 2018, 12:44:05 PM
to dirac-users

Hi,


64-bit integers          | True
MPI parallelization      | True


So you now have a 64-bit-integer parallel DIRAC. You can do large-scale calculations with the code.




--
doc. RNDr. Miroslav Iliaš, PhD.

Miroslav Iliaš, PhD.
Department of Chemistry
Faculty of Natural Sciences
Matej Bel University
Tajovského 40
97401 Banska Bystrica
Slovakia
tel: +421 48 446 7351
email :  Mirosla...@umb.sk



From: dirac...@googlegroups.com <dirac...@googlegroups.com> on behalf of Andy Danian Zapata Escobar <daniane...@gmail.com>
Sent: Saturday, August 4, 2018, 18:10
To: dirac-users
Subject: Re: [dirac-users] ./setup --mpi --int64 (Dirac 17, it doesn't work in 64 bits)

Andy Danian Zapata Escobar

Aug 4, 2018, 1:41:55 PM
to dirac-users
Hello, dear doc. Miroslav Iliaš,

Thanks for your answer. When I execute dirac.x (./dirac.x), I see the following configuration information:

__________________________________________________________________________________________
  ** interface to 64-bit integer MPI enabled **

DIRAC serial starts by allocating 64000000 words (    488.28 MB -  0.477 GB)
 of memory    out of the allowed maximum of 2147483648 words (  16384.00 MB - 16.000 GB)
__________________________________________________________________________________________

Configuration and build information
-----------------------------------

Operating system         | Linux-4.15.0-29-generic
CMake version            | 3.5.1
CMake generator          | Unix Makefiles
CMake build type         | release
Configuration time       | 2018-08-04 00:30:41.804991
Python version           | 2.7.1
Fortran compiler         | /usr/local/openmpi-3.0.0/bin/mpif90
Fortran compiler version | 5.4.0
Fortran compiler flags   |  -g -fcray-pointer -fbacktrace -fno-range-check -DVAR_GFORTRAN -DVAR_MFDS  -fdefault-integer-8
C compiler               | /usr/local/openmpi-3.0.0/bin/mpicc
C compiler version       | 5.4.0
C compiler flags         |  -g
C++ compiler             | /usr/local/openmpi-3.0.0/bin/mpicxx
C++ compiler version     | 5.4.0
C++ compiler flags       |  -g -Wall -Wno-unknown-pragmas -Wno-sign-compare -Woverloaded-virtual -Wwrite-strings -Wno-unused
Static linking           | False

64-bit integers          | True
MPI parallelization      | True
MPI launcher             | /usr/local/openmpi-3.0.0/bin/mpiexec
Math libraries           | unknown
Builtin BLAS library     | OFF
Builtin LAPACK library   | OFF
Explicit libraries       | unknown
Compile definitions      | USE_BUILTIN_BLAS;USE_BUILTIN_LAPACK;HAVE_MPI;VAR_MPI;VAR_MPI2;USE_MPI_MOD_F90;SYS_LINUX;PRG_DIRAC;INT_STAR8;INSTALL_WRKMEM=64000000;HAS_PCMSOLVER;BUILD_GEN1INT;HAS_PELIB;MOD_QCORR;HAS_STIELTJES
___________________________________________________________________________________________________________

If I can, I will send you the dirac.x.

I see that this is fine; my problem is that I need to use over 16 GB of memory in my calculations. I apologize for not telling you that in the previous email.

When I request over 16 GB, I get the following (./pam --noarch --scratch=/home/Programs/DIRAC-17.0-Source/scratch --ag=40 --mb=20000 --inp=MP2.inp --mol=H2O.mol):

________________________________________________________________________
  ** interface to 64-bit integer MPI enabled **

DIRAC master    (Bader) starts by allocating     2621000000 r*8 words (  19.528 GB) of memory
DIRAC nodes 1 to  11 starts by allocating          2621000000 r*8 words (  19.528 GB) of memory
DIRAC master    (Bader) to allocate at most      5368000000 r*8 words (  39.995 GB) of memory
DIRAC nodes 1 to  11 to allocate at most           5368000000 r*8 words (  39.995 GB) of memory

Note: maximum allocatable memory for master+nodes can be set by -aw (MW)/-ag (GB) flags in pam

 Not enough memory available when attempting to allocate           19997 Mb in subroutine __allocator_eh_MOD_allocator_errorhandler+
________________________________________________________________________


I await your answer. Thanks for helping me, and have a nice day, doc. Miroslav Iliaš.


On Saturday, August 4, 2018 at 6:55:12 (UTC-3), Miroslav ILIAŠ wrote:

Ilias Miroslav, doc. RNDr., PhD.

Aug 4, 2018, 2:10:31 PM
to dirac-users


Well, so memory is the issue. You demanded 12 × 20 GB of memory ... too much?


You must know the maximum memory your node can offer (for example, we have --mem=120GB per node, exclusive), and how much memory ONE MPI process consumes (for example, let the maximum be 30 GB); then you set the number of MPI processes accordingly (in this example, 120/30 = 4).
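To make that bookkeeping concrete for your failing run (the process count and pam flags are taken from your message; I do not know how much RAM your machine actually has):

   --mb=20000  ->  20000 MB = 2621000000 r*8 words, about 19.5 GB allocated per MPI process at startup
   --ag=40     ->  each MPI process may later grow to about 40 GB
   master + 11 nodes = 12 processes  ->  12 x 19.5 GB, about 234 GB requested up front

That total must fit into the physical memory of the machine, so either reduce the number of MPI processes or lower the per-process memory values.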


The recommended command is "pam --gb=... --ag=...", and the gb value should be slightly lower than the ag value.
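For instance, something like this (a sketch only; the scratch path is a placeholder and the memory values are illustrative, not a recommendation for your machine):

$ ./pam --noarch --scratch=/path/to/scratch --gb=9 --ag=10 --inp=MP2.inp --mol=H2O.mol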


(See also http://diracprogram.org/doc/master/tutorials/cc_memory_count/count_cc_memory.html)



From: dirac...@googlegroups.com <dirac...@googlegroups.com> on behalf of Andy Danian Zapata Escobar <daniane...@gmail.com>
Sent: Saturday, August 4, 2018, 19:41
To: dirac-users
Subject: Re: [dirac-users] ./setup --mpi --int64 (Dirac 17, it doesn't work in 64 bits)

Andy Danian Zapata Escobar

Aug 4, 2018, 5:33:16 PM
to dirac-users
Hi, dear doc. Miroslav Iliaš,

Thanks for your clarification and your time. My calculations now run fine, and I understand my mistake.

Have a nice day