Hello,
I am having trouble compiling gromacs 4.6.1 to work with plumed-2.0-beta and MPI. Is there a compilation switch somewhere to compile PLUMED 2.0 with MPI, beyond simply having the MPI libraries available at compile time?
I can compile successfully and run in serial, but with MPI I always get the following error:
+++ Internal PLUMED error
+++ file Communicator.cpp, line 96
+++ message: you are trying to use an MPI function, but PLUMED has been compiled without MPI support
I compile plumed like this:
module purge
module load intel/13.1.1 openmpi/intel/1.6.4
./configure.sh
make
## Note: LAPACK is not available on this cluster, so before running make I edit configurations/linux.icc and replace:
DYNAMIC_LIBS=-lstdc++ -llapack -lblas
with:
DYNAMIC_LIBS=-lstdc++ -mkl
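In case it matters: the only MPI-related knob I could think of was pointing the build at the MPI compiler wrappers before configuring. This is just a guess on my part -- I have not verified that configure.sh in 2.0-beta actually honors CC/CXX:

```shell
# Guess: export the MPI wrappers so the PLUMED build uses them
# (assumes configure.sh respects CC/CXX, which I have NOT verified)
export CC=mpicc
export CXX=mpicxx
# ./configure.sh && make   # then rebuild from a clean source tree
```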
Then I source sourceme.sh and patch the gromacs 4.6.1 source with "plumed patch -p".
Then I compile gromacs in parallel like this:
module purge
module load gcc/4.6.1
module load cmake/2.8.6
module load intel/13.1.1
module load openmpi/intel/1.6.4
export FFTW_LOCATION=/project/p/pomes/cneale/GPC/exe/intel/fftw-3.1.2_centos6computeA/exec
cmake ../source__B/ \
-DCMAKE_PREFIX_PATH=$FFTW_LOCATION \
-DCMAKE_INSTALL_PREFIX=$(pwd) \
-DGMX_X11=OFF \
-DCMAKE_CXX_COMPILER=mpicxx \
-DCMAKE_C_COMPILER=mpicc \
-DGMX_MPI=ON \
-DGMX_PREFER_STATIC_LIBS=ON
make mdrun
make install-mdrun
And that installation goes fine.
But when I try to run, mdrun fails immediately. I launch it like this:
module load intel/13.1.1
module load openmpi/intel/1.6.4
mpirun -np 4 /project/p/pomes/cneale/GPC/exe/intel/gromacs-4.6.1_plumed2.0/exec2/bin/mdrun_mpi -deffnm test -dlb yes -npme -1 -cpt 60 -maxh 0.1 -px test.pull.pos.xvg -pf test.pull.force.xvg -xvg none -plumed plumed.dat
and again I get the PLUMED error message quoted at the top.
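For what it's worth, here is the kind of check I can run to see whether a binary was actually linked against an MPI library -- shown on /bin/ls as a neutral stand-in, since the mdrun_mpi path above is specific to my cluster:

```shell
# Print any MPI shared libraries a binary is dynamically linked against.
# /bin/ls is a stand-in; on the cluster I substitute the mdrun_mpi path.
bin=/bin/ls
ldd "$bin" | grep -i 'libmpi' || echo "no MPI library linked into $bin"
```

On my mdrun_mpi this does show libmpi, so the GROMACS side of the link looks fine; the problem seems confined to how PLUMED itself was built.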
I also tried compiling gromacs with thread-MPI (i.e., without real MPI), and again it worked with mdrun -nt 1 but failed with mdrun -nt 2, giving the same error about PLUMED not being compiled with MPI.
Finally, if I run these patched builds without the -plumed flag to mdrun, they work just fine.
Thank you for your assistance.