Bug #2953

[ intermolecular_interactions ] with [ distance_restraints ] / [ bonds ] not working

Added by Eiso AB 2 months ago. Updated about 2 months ago.

Status: Closed
Priority: High
Assignee:
Category: mdrun
Target version:
Affected version - extra info:
Affected version:
Difficulty: uncategorized

Description

see also https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2019-May/125277.html

I'm trying to implement NMR (NOE) and H-bond restraints between a protein, a ligand, and 2 SOL molecules,
using [ distance_restraints ] (or [ bonds ]).

[ intermolecular_interactions ]

[ distance_restraints ]
; ai aj ftype label rtype low up0 up1 weight ; & source
1607 5241 1 18 1 0.5680 0.5680 0.6384 1 ; # 120 leu hd1# 411 lig h09 5.68 # 4.74 0.94
1608 5241 1 18 1 0.5680 0.5680 0.6384 1 ; #
1609 5241 1 18 1 0.5680 0.5680 0.6384 1 ; #
...etc

[ bonds ]
5261 5569 10 0.2800 0.3200 10.4333 1 ; # 411 lig o19 501 sol ow 3.00
etc...

With both [ bonds ] and [ distance_restraints ] the mdrun starts, but none of the restraints seems to be active - the waters just fly away, etc.

Also, for [ distance_restraints ] no distance-restraint energy is printed in the log file - unlike when they are intramolecular and defined within the appropriate molecule section.

Not sure if it is related, but when I run gmx_disre I get the following error:

%> gmx disre -s md_0_10.tpr -f md_0_10.xtc

-------------------------------------------------------
Program: gmx disre, version 2019-beta1
Source file: src/gromacs/listed-forces/disre.cpp (line 174)
Function: init_disres(FILE*, const gmx_mtop_t*, t_inputrec*, const t_commrec*, const gmx_multisim_t*, t_fcdata*, t_state*, gmx_bool)::__lambda0

Assertion failed:
Condition: type_max - type_min + 1 == dd->nres
All distance restraint parameter entries in the topology should be consecutive

and

%> gmx nmr -s md_0_10.tpr -f md_0_10.edr -viol viol.xvg

yields:

-------------------------------------------------------
Program: gmx nmr, version 2019-beta1
Source file: src/gromacs/commandline/filenm.cpp (line 88)
Function: opt2fn(const char*, int, const t_filenm*)::__lambda1

Assertion failed:
Condition: false
opt2fn should be called with a valid option

If needed I can produce a toy example, but I will have to make one
up since I can't share the files I'm working on - let me know.

Eiso

topol.top (276 Bytes) topol.top Berk Hess, 05/14/2019 04:23 PM
conf.gro (80.9 KB) conf.gro Berk Hess, 05/14/2019 04:23 PM
grompp.mdp (7.83 KB) grompp.mdp Berk Hess, 05/14/2019 04:23 PM
test-1.log (22.2 KB) test-1.log mdrun log file for run that works ok. Eiso AB, 05/15/2019 09:02 AM
test-2.log (24.5 KB) test-2.log mdrun log file for broken run Eiso AB, 05/15/2019 09:02 AM

Associated revisions

Revision 3cbf2eb5 (diff)
Added by Berk Hess about 2 months ago

Fix missing intermolecular interactions with DD

When running with domain decomposition, all intermolecular interactions
(when present) were ignored.

Fixes #2953

Change-Id: I98783f4175b40fdfa6ad035323e0897e8caaee5c

History

#1 Updated by Berk Hess 2 months ago

  • Status changed from New to Accepted
  • Assignee set to Berk Hess

I tried a toy example with 400 water molecules and a distance restraint between molecules 1 and 2. That runs and gives distance restraint energies.
But gmx disre says:
Source file: src/gromacs/gmxana/gmx_disre.cpp (line 187)

Fatal error:
tpr inconsistency. ndr = 0, label = 18

Could you provide a toy example which fails in mdrun?

#2 Updated by Eiso AB 2 months ago

Can you send me the toy example? Then I can work from that.

#3 Updated by Berk Hess 2 months ago

I uploaded a fix for gmx disre, which I forgot to update after allowing free label value choices (a long time ago).

#4 Updated by Berk Hess 2 months ago

I attached a water system with one restraint between the first two molecules. This seems to work fine: I get energies and the distance quickly gets within the bounds.
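A minimal topology for such a system might look like the sketch below (reconstructed from the thread, not the actual attached topol.top; the force-field includes are assumptions, and the restraint line follows the example given later in this thread):

```
; Sketch: 400 SPC waters with one intermolecular distance restraint
; between the oxygens of the first two molecules (atoms 1 and 4)
#include "oplsaa.ff/forcefield.itp"  ; assumed force field, illustrative only
#include "oplsaa.ff/spc.itp"

[ system ]
Water with one intermolecular restraint

[ molecules ]
SOL   400

[ intermolecular_interactions ]
[ distance_restraints ]
;  ai  aj  ftype  label  rtype  low   up0   up1  weight
    1   4    1      0      1    0.5   0.5   0.6    1
```

Note that [ intermolecular_interactions ] must come after [ molecules ] at the end of the .top file, and the atom indices are global.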

#5 Updated by Eiso AB 2 months ago

Berk Hess wrote:

I tried a toy example with 400 water molecules and a distance restraint between molecules 1 and 2. That runs and gives distance restraint energies.
But gmx disre says:
Source file: src/gromacs/gmxana/gmx_disre.cpp (line 187)

Fatal error:
tpr inconsistency. ndr = 0, label = 18

This specific error disappears when label = 0, as in:

[ distance_restraints ]
; ai aj ftype label rtype low up0 up1 weight ; & source
1 4 1 0 1 0.5 0.5 0.6 1

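For reference, the "consecutive labels" condition behind the earlier disre.cpp assertion can be sketched as follows (a simplified illustration in Python; `check_labels` is a hypothetical helper, not the actual GROMACS code):

```python
def check_labels(labels):
    """Mimic the consistency check: the distinct restraint labels in a
    topology must form a single gap-free run, i.e.
    label_max - label_min + 1 == number of distinct restraints."""
    unique = sorted(set(labels))
    return unique[-1] - unique[0] + 1 == len(unique)

# A single restraint labelled 18 is a gap-free run of length 1, so mdrun's
# assertion passes; the old gmx disre additionally assumed labels start at 0,
# which is why relabelling to 0 silenced its "ndr = 0, label = 18" error.
assert check_labels([18])
assert check_labels([0, 1, 2])
assert not check_labels([0, 2, 5])  # gaps would trip the assertion
```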

#6 Updated by Eiso AB 2 months ago

OK, some progress... I guess it's something with domain decomposition.

Your toy system works fine on one machine (desktop + GPU) and fails on the one I was using before (2x12-core E5-2690).

The mdrun log files are quite different: one mentions the distance restraints at the point where the other starts giving info about DD.
Log files are attached.

#7 Updated by Berk Hess 2 months ago

Yes, changing the label fixes it, but so should my fix on gerrit.gromacs.org.

If the issue is with DD, I should be able to reproduce and fix it.

#8 Updated by Eiso AB 2 months ago

Good,
Pretty sure the label issue is separate (i.e. when the labels start from 0, the distance restraints are still ignored).

Some issues to check:
  • are multiple [ distance_restraints ] sections allowed, e.g. intra-molecular in a [ moleculetype ] section and inter-molecular after an [ intermolecular_interactions ] section?
  • maybe related: gmx nmr also fails on the system where the distance restraints are working properly.
gmx nmr -s test.tpr -f test.edr -viol viol.xvg

-------------------------------------------------------
Program:     gmx nmr, version 2019-beta1
Source file: src/gromacs/fileio/enxio.cpp (line 340)

Fatal error:
Energy header magic number mismatch, this is not a GROMACS edr file
If you want to use the correct frames before the corrupted frame and avoid
this fatal error set the env.var. GMX_ENX_NO_FATAL

WARNING: Energy header magic number mismatch, this is not a GROMACS edr file

Setting export GMX_ENX_NO_FATAL=1 gives:

WARNING: Energy header magic number mismatch, this is not a GROMACS edr file

-------------------------------------------------------
Program:     gmx nmr, version 2019-beta1
Source file: src/gromacs/fileio/enxio.cpp (line 836)

File input/output error:
Cannot write energy file header; maybe you are out of disk space?


(I'm not out of disk space)

#9 Updated by Berk Hess 2 months ago

  • Status changed from Accepted to Fix uploaded
  • Priority changed from Normal to High
  • Target version set to 2019.3

Yes, the label issue is separate. The fix I uploaded was for that.

Now I found the DD issue: all intermolecular interactions are ignored with DD. So this issue is more general. I uploaded a fix.
You can run with -ntmpi 1 for the moment (or apply the fix).
Note that the 2018 release is not affected by this issue.

#10 Updated by Eiso AB 2 months ago

I applied the patches, but even with the fix I still need to set -ntmpi 1
(with nstdisreout=0).

You can run with -ntmpi 1 for the moment (or apply the fix).

  gmx mdrun -v -deffnm test -c test.pdb

Back Off! I just backed up test.log to ./#test.log.5#
Reading file test.tpr, VERSION 2019-beta1 (single precision)

NOTE: Parallelization is limited by the small number of atoms,
      only starting 12 thread-MPI ranks.
      You can use the -nt and/or -ntmpi option to optimize the number of threads.

-------------------------------------------------------
Program:     gmx mdrun, version 2019-beta1
Source file: src/gromacs/listed-forces/disre.cpp (line 163)
MPI rank:    4 (out of 12)

Fatal error:
With MPI parallelization distance-restraint pair output is not supported. Use
nstdisreout=0 or use OpenMP parallelization on a single node.

#11 Updated by Berk Hess 2 months ago

Yes, that is a restriction. So as the message says, you have to set nstdisreout=0 or use -ntmpi 1.
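In mdp terms, the first workaround is a one-line change (a sketch based on the message above; it disables per-pair distance-restraint output, which is what the MPI restriction applies to):

```
; grompp.mdp: turn off per-pair distance-restraint output,
; which is not supported with MPI parallelization
nstdisreout = 0
```

The alternative is to keep nstdisreout and run on a single thread-MPI rank with gmx mdrun -ntmpi 1, parallelizing with OpenMP only.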

#12 Updated by Eiso AB 2 months ago

For me, on the system that previously failed, with the fixes applied, setting nstdisreout=0 or 100 doesn't matter.
It only runs OK with -ntmpi 1.

Without -ntmpi 1 I get the following (slightly different from above):

NOTE: Parallelization is limited by the small number of atoms,
      only starting 12 thread-MPI ranks.
      You can use the -nt and/or -ntmpi option to optimize the number of threads.

Changing nstlist from 10 to 20, rlist from 0.969 to 1.075

-------------------------------------------------------
Program:     gmx mdrun, version 2019-beta1
Source file: src/gromacs/domdec/domdec.cpp (line 2403)
MPI rank:    0 (out of 12)

Fatal error:
There is no domain decomposition for 12 ranks that is compatible with the
given box and a minimum cell size of 1.87908 nm
Change the number of ranks or mdrun option -rdd or -dds
Look in the log file for details on the domain decomposition

#13 Updated by Berk Hess 2 months ago

Your system is too small to run efficiently over so many MPI ranks and/or cores.
You can run with -ntmpi 1 to parallelize over all cores with OpenMP only. Still it could be faster to use fewer cores.

#14 Updated by Berk Hess about 2 months ago

  • Status changed from Fix uploaded to Resolved

#15 Updated by Mark Abraham about 2 months ago

  • Status changed from Resolved to Closed
