Segmentation fault when using Gromacs with ntmpi > 1 and OPES


alx.th...@gmail.com

Aug 2, 2024, 9:49:21 AM
to PLUMED users
Hi,
I'm getting segfaults, with no further details in the output, when I try to run a simulation with thread-MPI gromacs, like this:

gmx mdrun ... -plumed plumed.dat -ntmpi 4 -ntomp 4

If I use only one thread-MPI rank, everything runs fine, as expected:

gmx mdrun ... -plumed plumed.dat -ntmpi 1 -ntomp 16

The contents of the plumed.dat file are quite simple: just a 1D OPES_METAD_EXPLORE run acting on a distance, with no additional walls or other calculations.
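For reference, a stripped-down version of it looks roughly like this (the atom indices and OPES parameters below are placeholders, not the exact values I'm using):

d: DISTANCE ATOMS=1,100
opes: OPES_METAD_EXPLORE ARG=d PACE=500 BARRIER=40
PRINT ARG=d,opes.* FILE=COLVAR STRIDE=500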

I haven't found any restrictions in the manual on using PLUMED with thread-MPI gromacs. Is there something I am doing wrong, or is it generally not supported?

Best,
Alex

Michele Invernizzi

Aug 5, 2024, 7:52:10 AM
to plumed...@googlegroups.com
Hi Alex,

It is supported and it should work. You can try the following to debug:
  1. Are the regtests working correctly? https://github.com/plumed/plumed2/tree/master/regtest (see the commands sketched after this list)
  2. Do you still have the issue if you use OPES_METAD or METAD instead of OPES_METAD_EXPLORE? (you can get segfaults with gromacs if the bias is too aggressive)
  3. Can you provide a minimal setup so we can reproduce the issue ourselves? We would need all the input files, together with the PLUMED and GROMACS versions used.
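For point 1, a typical way to run the regression tests from a plumed2 source checkout is along these lines (the directory names are assumptions about your setup; adjust paths to your installation):

cd plumed2/regtest
make              # runs the full regtest suite
cd plumed2/regtest/opes
make              # runs only the OPES module tests, if present in your version

For point 2, it should be enough to swap the action name in plumed.dat, keeping the same ARG, PACE and BARRIER, e.g. replace OPES_METAD_EXPLORE with OPES_METAD.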
Best,
Michele
