Hi everyone,
I am new to gmx_MMPBSA. I installed it with conda env create --file env.yml and had no problems during installation. When I run gmx_MMPBSA_test -t 3, I get:
[INFO ] Cloning gmx_MMPBSA repository in /home/maria/Documents/gmxMMPBSA/gmx_MMPBSA_test
Cloning into '/home/maria/Documents/gmxMMPBSA/gmx_MMPBSA_test'...
remote: Enumerating objects: 23593, done.
remote: Counting objects: 100% (232/232), done.
remote: Compressing objects: 100% (93/93), done.
remote: Total 23593 (delta 163), reused 161 (delta 132), pack-reused 23361 (from 2)
Receiving objects: 100% (23593/23593), 337.56 MiB | 38.05 MiB/s, done.
Resolving deltas: 100% (16027/16027), done.
Updating files: 100% (699/699), done.
[INFO ] Cloning gmx_MMPBSA repository...Done.
[INFO ] Example STATE
--------------------------------------------------------------------------------
[INFO ] Protein-Ligand (Single trajectory approximation) RUNNING
[ERROR ] Protein-Ligand (Single trajectory approximation) [ 1/ 1] ERROR
Please, check the test log
(/home/maria/Documents/gmxMMPBSA/gmx_MMPBSA_test/examples/Protein_ligand/ST/3.log)
Checking the 3.log file, I see:
[ERROR ] MMPBSA_Error
/home/maria/anaconda3/envs/gmxMMPBSA/bin.AVX2_256/gmx trjconv failed when querying com_traj.xtc
I've also noticed that if I simply run
gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ct com_traj.xtc -ci index.ndx -cg 3 4 -cp topol.top -nogui
(inside the respective test folder), it works, but without MPI support. If I instead run
mpirun -np 2 gmx_MMPBSA -O -i mmpbsa.in -cs com.tpr -ct com_traj.xtc -ci index.ndx -cg 3 4 -cp topol.top -o FINAL_RESULTS_MMPBSA.dat -eo FINAL_RESULTS_MMPBSA.csv
it fails, which leads me to think the problem is with MPI support. In that case, I get something like this (for the Protein-DNA test):
Could not import PyQt5/PyQt6. gmx_MMPBSA_ana will be disabled until PyQt5/PyQt6 is installed
Check the gmx_MMPBSA.log file to report the problem.
[ERROR ] Unable to start gmx_MMPBSA_ana...
[INFO ] Finalized...
[ERROR ] Unable to start gmx_MMPBSA_ana...
[INFO ] Finalized...
--------------------------------------------------------------------------
prterun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:
Process name: [prterun-schrodinger-20459 @ 1,0]
Exit code: 1
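In case it helps with diagnosis: my understanding from the docs is that gmx_MMPBSA relies on mpi4py for MPI runs (that dependency is my reading of the install instructions, not something the error message states), so I used this small Python check to confirm the module is at least visible in my conda environment:

```python
import importlib.util

def have_module(name):
    """Return True if a module can be located without importing it."""
    return importlib.util.find_spec(name) is not None

# If this prints False, the mpirun failure would be explained by a
# missing mpi4py in the gmxMMPBSA environment.
print("mpi4py available:", have_module("mpi4py"))
```

If it prints False, the mpirun failure would at least be explained by a missing mpi4py; in my case it does not seem to be that simple.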
I couldn't find anything about this issue on the website or here, so I thought I would ask :) Since I want to run calculations with MPI enabled, this error is blocking me.
Any ideas?
Thank you!