Dirac14 parallel build


renu bala

Nov 3, 2015, 9:19:54 AM
to dirac-users

   Dear Dirac experts,

   I installed DIRAC14.1 in parallel mode using the following setup commands:

   $ ./setup --fc=mpif90 --cc=mpicc --cxx=mpic++ -D ENABLE_PCMSOLVER=OFF --int64
   $ cd build
   $ make

   After that I ran:

   $ make test


   A few tests failed; ctest reported:

   The following tests FAILED:


    7:krci_energy
    8:molecular_mean_field_hamiltonian_schemes
    10:mp2_properties
    11:reladc_sip
    12:reladc_sipeigv
    13:reladc_dip
    15:cc_energy_and_mp2_dipole
    22:efg
    23:fde_static-vemb_dipole
    24:fde_response_autodiff
    26:fscc
    27:fscc_highspin
    30:lucita_large
    31:mcscf_energy
    33:response_nmr_spin-spin
    38:krci_energy_q_corrections
    45:bss_energy
    46:cc_linear
    56:ecp
    59:krci_properties_perm_dipmom
    61:mcscf_properties
    64:mp2_natural_orbitals
    78:lucita_short
    83:krci_properties_omega_tdm
    88:scf_levelshift
    92:lucita_q_corrections


   I am also attaching the files that were generated in the folder
   /home/renu/DIRAC14_Parallel/DIRAC-14.1-Source/build/Testing/Temporary


  

    Thanks and regards,

    Renu

CTestCostData.txt
LastTest.log
LastTestsFailed.log

Radovan Bast

Nov 4, 2015, 4:35:43 AM
to dirac-users
dear Renu,

this is because you are testing an MPI binary sequentially ("make test" does not
know that it should launch the tests through MPI):

please set DIRAC_MPI_COMMAND

best wishes,
  radovan


Radovan Bast

Nov 4, 2015, 5:04:03 AM
to dirac-users
just adding a clarification:

an MPI binary does not need to be tested in parallel but needs
to be launched via mpirun/mpiexec.

a "sequential" MPI run can be achieved by setting:
$ export DIRAC_MPI_COMMAND="mpirun -np 1"
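Putting the two replies together, a minimal sketch of the fix (assuming an MPI build in ./build and mpirun on the PATH; the variable name DIRAC_MPI_COMMAND is the one given above) would be:

```shell
# Tell DIRAC's test runner how to launch the MPI binary;
# "mpirun -np 1" launches it through MPI on a single process.
export DIRAC_MPI_COMMAND="mpirun -np 1"

# Re-run the test suite from the build directory.
cd build
make test
```

With a sufficiently recent CMake, running `ctest --rerun-failed` in the build directory re-runs only the previously failing tests instead of the whole suite.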