Hi,
I'm getting segfaults, with no further detail in the output, when I try to run a simulation with thread-MPI GROMACS, like this:
gmx mdrun ... -plumed plumed.dat -ntmpi 4 -ntomp 4
If I use only a single thread-MPI rank, everything runs fine, as expected:
gmx mdrun ... -plumed plumed.dat -ntmpi 1 -ntomp 16
The plumed.dat file is quite simple: just a 1D OPES_METAD_EXPLORE bias acting on a single distance, with no walls or additional calculations.
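For reference, the input looks roughly like the following (the atom indices and bias parameters here are placeholders, not my exact values):

```
# Collective variable: a single distance (atom indices are placeholders)
d: DISTANCE ATOMS=1,2

# 1D OPES explore bias on that distance (PACE/BARRIER values are placeholders)
opes: OPES_METAD_EXPLORE ARG=d PACE=500 BARRIER=40

# Monitor the CV and the bias
PRINT ARG=d,opes.bias FILE=COLVAR STRIDE=500
```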
I haven't found any restriction on using PLUMED with thread-MPI GROMACS in the manual. Am I doing something wrong, or is this combination generally not supported?
Best,
Alex