Bug #1296

gromacs 4.6.2 regressiontests hang on darwin10/11/12

Added by Jack Howarth over 6 years ago. Updated almost 6 years ago.

Status:
Closed
Priority:
Normal
Assignee:
Category:
mdrun
Target version:
Affected version - extra info:
Affected version:
Difficulty:
uncategorized

Description

The new gromacs 4.6.2, when built with either the llvm-gcc compilers of Xcode 4.2 on darwin10 or the clang compilers of Xcode 4.6.3 on darwin11/12, hangs in the test suite run...

[100%] Built target gmxtests
make -f CMakeFiles/check.dir/build.make CMakeFiles/check.dir/depend
cd /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build && /sw/bin/cmake -E cmake_depends "Unix Makefiles" /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2 /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2 /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build/CMakeFiles/check.dir/DependInfo.cmake --color=
make -f CMakeFiles/check.dir/build.make CMakeFiles/check.dir/build
/sw/bin/ctest --output-on-failure
Test project /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build
Start 1: regressiontests/simple

This problem doesn't exist with gromacs 4.6.0 or 4.6.1. Building gromacs 4.6.2 with the FSF gcc 4.8.1 compilers shows the same issue.

History

#1 Updated by Jack Howarth over 6 years ago

I am not sure how to debug this within the cmake build...

$ cd /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build
$ /sw/bin/ctest --output-on-failure --extra-verbose
    UpdateCTestConfiguration from :/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build/DartConfiguration.tcl
    Parse Config file:/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build/DartConfiguration.tcl
    UpdateCTestConfiguration from :/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build/DartConfiguration.tcl
    Parse Config file:/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build/DartConfiguration.tcl
    Test project /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build
    Constructing a list of tests
    Done constructing a list of tests
    Checking test dependency graph...
    Checking test dependency graph end
    test 1
    Start 1: regressiontests/simple

1: Test command: /usr/bin/perl "/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/regressiontests-4.6.2/gmxtest.pl" "simple" "-crosscompile" "-noverbose" "-nosuffix"
1: Test timeout computed to be: 1500
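One way to get more insight than ctest provides is to run the regression-test driver directly, outside the cmake build, using the test command ctest reports. A sketch based on the command above (swapping -noverbose for -verbose is an assumption about gmxtest.pl's options):

```shell
# Invoke the hanging test directly rather than through ctest
cd /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/build
/usr/bin/perl \
    "/sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2/regressiontests-4.6.2/gmxtest.pl" \
    simple -crosscompile -nosuffix -verbose
```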

#2 Updated by Mark Abraham over 6 years ago

Hmm. AFAIK none of the devs builds with the Xcode toolchains. We do test gcc and clang on Mac, though.

I'd guess this is some kind of race condition provoked by the refactoring we did to threadMPI between 4.6.1 and 4.6.2. Some stuff did get broken. It would be of interest to know whether cmake -DGMX_THREAD_MPI=off has the same issue with your toolchain - I expect not. Either way, I expect we will release 4.6.3 in the next day or so, and I hope that the fixes there will also fix your symptoms.
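A minimal sketch of the configuration Mark suggests, assuming a fresh out-of-source build directory (paths illustrative; GMX_THREAD_MPI is the CMake switch named above):

```shell
# Build without thread-MPI to test whether the hang is in the threadMPI refactoring
mkdir build-no-tmpi && cd build-no-tmpi
cmake /sw/src/fink.build/gromacs-4.6.2-r/gromacs-4.6.2 -DGMX_THREAD_MPI=off
make -j4 && /sw/bin/ctest --output-on-failure
```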

#3 Updated by Mark Abraham over 6 years ago

  • Category changed from testing to mdrun
  • Status changed from New to Feedback wanted

Turns out this was fixed shortly after 4.6.2. Please let us know how you go with 4.6.3.

#4 Updated by Jack Howarth over 6 years ago

There doesn't appear to be a regressiontests-4.6.3.tar.gz posted at http://gerrit.gromacs.org/download. Am I supposed to test gromacs 4.6.3 against regressiontests-4.6.2.tar.gz?

#5 Updated by Mark Abraham over 6 years ago

Yes, the tests have not changed.

#6 Updated by Jack Howarth over 6 years ago

While the stock gromacs build of 4.6.3 passes the regression tests from 4.6.2 on x86_64-apple-darwin12, you appear to have broken the linkage order required for the ___emutls_get_address symbols to be resolved from /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (aka libgcc_ext_10.5)....

$ /sw/bin/mpiexec -np 8 -wdir /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/regressiontests-4.6.3/simple/angles1 mdrun_mpi -notunepme -table ../table -tablep ../tablep
    dyld: lazy symbol binding failed: Symbol not found: ___emutls_get_address
    Referenced from: /sw/lib/gcc4.8/lib/libgomp.1.dylib
    Expected in: /usr/lib/libSystem.B.dylib

The functional linkage for mdrun_mpi from a gromacs 4.6.1 build is....

$ otool -L mdrun_mpi
    mdrun_mpi:
    /sw/lib/openmpi/libmpi.1.dylib (compatibility version 2.0.0, current version 2.7.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 169.3.0)
    /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate (compatibility version 1.0.0, current version 4.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/kernel/libgmxpreprocess_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/mdlib/libmd_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/gmxlib/libgmx_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/lib/libfftw3f.3.dylib (compatibility version 7.0.0, current version 7.2.0)
    /sw/lib/gcc4.8/lib/libgomp.1.dylib (compatibility version 2.0.0, current version 2.0.0)
    /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)

The broken mdrun_mpi linkage from 4.6.3 is....

$ otool -L mdrun_mpi
    mdrun_mpi:
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/kernel/libgmxpreprocess_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/mdlib/libmd_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/gmxlib/libgmx_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/lib/libfftw3f.3.dylib (compatibility version 7.0.0, current version 7.2.0)
    /sw/lib/openmpi/libmpi.1.dylib (compatibility version 2.0.0, current version 2.7.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 169.3.0)
    /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate (compatibility version 1.0.0, current version 4.0.0)
    /sw/lib/gcc4.8/lib/libgomp.1.dylib (compatibility version 2.0.0, current version 2.0.0)
    /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)
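One way to check where ___emutls_get_address is actually exported is to list the external symbols of the candidate libraries; a sketch using nm (paths as above; output omitted, since it is machine-specific):

```shell
# The symbol should be exported by libgcc_s, not by the system libSystem
nm -g /sw/lib/gcc4.8/lib/libgcc_s.1.dylib | grep emutls_get_address
nm -g /usr/lib/libSystem.B.dylib | grep emutls_get_address || echo "not exported by libSystem"
```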

#7 Updated by Jack Howarth over 6 years ago

Is there a reason for the wildly different linkage order for mdrun_mpi? I can see moving one or two linkages, but the new linkage seems to reorder almost all of the libraries used.

#8 Updated by Mark Abraham over 6 years ago

Jack Howarth wrote:

Is there a reason for the wildly different linkage order for mdrun_mpi? I can see moving one or two linkages, but the new linkage seems to reorder almost all of the libraries used.

Discussion in #1067

#9 Updated by Mark Abraham over 6 years ago

Jack Howarth wrote:

While the stock gromacs build of 4.6.3 passes the regression tests from 4.6.2 on x86_64-apple-darwin12, you appear to have broken the linkage order required for the ___emutls_get_address symbols to be resolved from /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (aka libgcc_ext_10.5)....

$ /sw/bin/mpiexec -np 8 -wdir /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/regressiontests-4.6.3/simple/angles1 mdrun_mpi -notunepme -table ../table -tablep ../tablep
    dyld: lazy symbol binding failed: Symbol not found: ___emutls_get_address
    Referenced from: /sw/lib/gcc4.8/lib/libgomp.1.dylib
    Expected in: /usr/lib/libSystem.B.dylib

Hmm, something is happening here that I don't yet understand. The relative order of libgomp and libSystem.B has not changed.

The functional linkage for mdrun_mpi from a gromacs 4.6.1 build is....

$ otool -L mdrun_mpi
    mdrun_mpi:
    /sw/lib/openmpi/libmpi.1.dylib (compatibility version 2.0.0, current version 2.7.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 169.3.0)
    /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate (compatibility version 1.0.0, current version 4.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/kernel/libgmxpreprocess_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/mdlib/libmd_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.1-3/gromacs-4.6.1/build/src/gmxlib/libgmx_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/lib/libfftw3f.3.dylib (compatibility version 7.0.0, current version 7.2.0)
    /sw/lib/gcc4.8/lib/libgomp.1.dylib (compatibility version 2.0.0, current version 2.0.0)
    /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)

The broken mdrun_mpi linkage from 4.6.3 is....

$ otool -L mdrun_mpi
    mdrun_mpi:
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/kernel/libgmxpreprocess_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/mdlib/libmd_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/src/fink.build/gromacs-mpi-4.6.3-1/gromacs-4.6.3/build/src/gmxlib/libgmx_mpi.8.dylib (compatibility version 8.0.0, current version 8.0.0)
    /sw/lib/libfftw3f.3.dylib (compatibility version 7.0.0, current version 7.2.0)
    /sw/lib/openmpi/libmpi.1.dylib (compatibility version 2.0.0, current version 2.7.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 169.3.0)
    /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate (compatibility version 1.0.0, current version 4.0.0)
    /sw/lib/gcc4.8/lib/libgomp.1.dylib (compatibility version 2.0.0, current version 2.0.0)
    /sw/lib/gcc4.8/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)

It looks like a mess, but we've actually just re-ordered a chunk of 4 libraries and a chunk of 3 libraries. :)

I can see no reason why this error should have come up. It doesn't on our mac build slave (various compilers, but never gcc with OpenMP). Googling finds http://stackoverflow.com/questions/7885246/what-is-the-emutls-get-address-symbol which suggests a now-fixed bug in gcc, but you're using gcc 4.8. Our *nix gcc-4.8 build slave is happy.

Maybe your libfftw was compiled with an old gcc? You could try hacking out the lines with ACCELERATE_FRAMEWORK in the top-level CMakeLists.txt. Otherwise, I'm out of my league with Mac linking.

#10 Updated by Roland Schulz over 6 years ago

The Jenkins clang 3.2 build also hangs in the kernel regression tests. It is fine with -nt 2, so we didn't notice it with the automatic tests. With -nt 8 it hangs about 1 in 30 runs. The coverage build (gcc) hangs about 1 in 3 runs. I noticed it when triggering the build for #1300. A test which hangs is e.g. ElecCoul_VdwBham_GeomW4P1, but it doesn't seem to be the only one.
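An intermittent hang like this is often easiest to catch by re-running the binary under a timeout in a loop; a sketch, with the mdrun invocation left schematic (timeout is GNU coreutils, typically installed as gtimeout on a Mac):

```shell
# Repeat until a run exceeds the timeout, i.e. has probably hung
for i in $(seq 1 200); do
    if ! timeout 300 mdrun -nt 8 -notunepme; then
        echo "hang or failure at iteration $i"
        break
    fi
done
```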

#11 Updated by Rossen Apostolov almost 6 years ago

  • Assignee changed from Mark Abraham to Roland Schulz
  • Target version changed from 4.6.3 to 5.0

Jack, do the tests still fail with 4.6.5?

#12 Updated by Roland Schulz almost 6 years ago

  • Status changed from Feedback wanted to Accepted
  • Assignee changed from Roland Schulz to Mark Abraham

I just tried it again, running ~/workspace/Gromacs_Coverage_Mac_master/bin/gmx mdrun on jenkins-mac for nb_kernel_ElecCoul_VdwBham_GeomW4P1. Before, I reported that it would hang 1 in 3 times; this time the first hang came only after ~150 runs.

Thread 6 (process 16776):
#0  0x00007fff8534bbf2 in __psynch_mutexwait ()
#1  0x00007fff87de01a1 in pthread_mutex_lock ()
#2  0x0000000106ba4558 in tMPI_Thread_mutex_init_once (mtx=0x108481f20) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/pthreads.c:418
#3  0x0000000106ba4739 in tMPI_Thread_mutex_lock (mtx=0x108481f20) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/pthreads.c:472
#4  0x0000000106bc54af in gmx_nonbonded_setup (fr=0x7fd0bd024200, bGenericKernelOnly=0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/nonbonded/nonbonded.c:111
#5  0x0000000107e271c9 in init_forcerec (fp=0x0, oenv=0x7fd0bac0ced0, fr=0x7fd0bd024200, fcd=0x7fd0bad04bc0, ir=0x7fd0bb82fe00, mtop=0x7fd0bae03860, cr=0x7fd0bae037e0, box=0x109106b50, tabfn=0x7fd0bae03260 "table.xvg", tabafn=0x7fd0bae032a0 "tabletf.xvg", tabpfn=0x7fd0bae032e0 "tablep.xvg", tabbfn=0x7fd0bae03320 "table.xvg", nbpu_opt=0x106654200 "auto", bNoSolvOpt=0, print_force=-1) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/forcerec.c:2960
#6  0x000000010663c5c4 in mdrunner (hw_opt=0x109106d40, fplog=0x0, cr=0x7fd0bae037e0, nfile=33, fnm=0x7fd0bb82f600, oenv=0x7fd0bac0ced0, bVerbose=0, bCompact=1, nstglobalcomm=-1, ddxyz=0x109106dac, dd_node_order=1, rdd=0, rconstr=0, dddlb_opt=0x106654200 "auto", dlb_scale=0.800000012, ddcsx=0x0, ddcsy=0x0, ddcsz=0x0, nbpu_opt=0x106654200 "auto", nstlist_cmdline=0, nsteps_cmdline=-2, nstepout=100, resetstep=-1, nmultisim=0, repl_ex_nst=0, repl_ex_nex=0, repl_ex_seed=-1, pforce=-1, cpt_period=15, max_hours=-1, deviceOptions=0x106641fb4 "", Flags=1055744) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/runner.c:1572
#7  0x0000000106636ec5 in mdrunner_start_fn (arg=0x7fd0bac0c800) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/runner.c:172
#8  0x0000000106bb1c49 in tMPI_Thread_starter (arg=0x7fd0bb00da48) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/tmpi_init.c:397
#9  0x0000000106ba4041 in tMPI_Thread_starter ()
#10 0x00007fff87de18bf in _pthread_start ()
#11 0x00007fff87de4b75 in thread_start ()

Thread 2 (process 16776):
#0  0x00007fff8534bbca in __psynch_cvwait ()
#1  0x00007fff87de5274 in _pthread_cond_wait ()
#2  0x0000000106ba4edf in tMPI_Thread_cond_wait (cond=0x7fd0bac0fa28, mtx=0x7fd0bac0f9e0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/pthreads.c:709
#3  0x0000000106baded5 in tMPI_Comm_split (comm=0x7fd0bac0f910, color=-1, key=1, newcomm=0x108efa7f0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/comm.c:650
#4  0x0000000107dd4500 in make_load_communicator (dd=0x7fd0bc301c40, dim_ind=0, loc=0x108efa840) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5637
#5  0x0000000107dd4e49 in make_load_communicators (dd=0x7fd0bc301c40) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5746
#6  0x0000000107dd5b8f in setup_dd_grid (fplog=0x0, dd=0x7fd0bc301c40) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5932
#7  0x000000010663d24f in mdrunner (hw_opt=0x108efad40, fplog=0x0, cr=0x7fd0bae02a60, nfile=33, fnm=0x7fd0bb82e800, oenv=0x7fd0bac0ced0, bVerbose=0, bCompact=1, nstglobalcomm=-1, ddxyz=0x108efadac, dd_node_order=1, rdd=0, rconstr=0, dddlb_opt=0x106654200 "auto", dlb_scale=0.800000012, ddcsx=0x0, ddcsy=0x0, ddcsz=0x0, nbpu_opt=0x106654200 "auto", nstlist_cmdline=0, nsteps_cmdline=-2, nstepout=100, resetstep=-1, nmultisim=0, repl_ex_nst=0, repl_ex_nex=0, repl_ex_seed=-1, pforce=-1, cpt_period=15, max_hours=-1, deviceOptions=0x106641fb4 "", Flags=1055744) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/runner.c:1736
#8  0x0000000106636ec5 in mdrunner_start_fn (arg=0x7fd0bac0c800) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/runner.c:172
#9  0x0000000106bb1c49 in tMPI_Thread_starter (arg=0x7fd0bb00d3a8) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/tmpi_init.c:397
#10 0x0000000106ba4041 in tMPI_Thread_starter ()
#11 0x00007fff87de18bf in _pthread_start ()
#12 0x00007fff87de4b75 in thread_start ()

Thread 1 (process 16776):
#0  0x00007fff8534bbca in __psynch_cvwait ()
#1  0x00007fff87de5274 in _pthread_cond_wait ()
#2  0x0000000106ba4edf in tMPI_Thread_cond_wait (cond=0x7fd0bac0fa70, mtx=0x7fd0bac0f9e0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/pthreads.c:709
#3  0x0000000106bade09 in tMPI_Comm_split (comm=0x7fd0bac0f910, color=0, key=0, newcomm=0x7fff661d7cf0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/gmxlib/thread_mpi/comm.c:627
#4  0x0000000107dd4500 in make_load_communicator (dd=0x7fd0bae053e0, dim_ind=0, loc=0x7fff661d7d40) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5637
#5  0x0000000107dd4e49 in make_load_communicators (dd=0x7fd0bae053e0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5746
#6  0x0000000107dd5b8f in setup_dd_grid (fplog=0x7fff7220dee0, dd=0x7fd0bae053e0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/mdlib/domdec.c:5932
#7  0x000000010663d24f in mdrunner (hw_opt=0x7fff661da800, fplog=0x7fff7220dee0, cr=0x7fd0bad017d0, nfile=33, fnm=0x7fff661d9c30, oenv=0x7fd0bac0ced0, bVerbose=0, bCompact=1, nstglobalcomm=-1, ddxyz=0x7fff661da920, dd_node_order=1, rdd=0, rconstr=0, dddlb_opt=0x106654200 "auto", dlb_scale=0.800000012, ddcsx=0x0, ddcsy=0x0, ddcsz=0x0, nbpu_opt=0x106654200 "auto", nstlist_cmdline=0, nsteps_cmdline=-2, nstepout=100, resetstep=-1, nmultisim=0, repl_ex_nst=0, repl_ex_nex=0, repl_ex_seed=-1, pforce=-1, cpt_period=15, max_hours=-1, deviceOptions=0x106641fb4 "", Flags=1055744) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/runner.c:1736
#8  0x000000010663fc21 in gmx_mdrun (argc=1, argv=0x7fff661daba0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/mdrun/mdrun.cpp:752
#9  0x000000010673c7d9 in run (this=0x7fd0bac01ee0, argc=3, argv=0x7fff661daba0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/commandline/cmdlinemodulemanager.cpp:928
#10 0x000000010673f531 in gmx::CommandLineModuleManager::run (this=0x7fff661dab20, argc=3, argv=0x7fff661daba0) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/gromacs/commandline/cmdlinemodulemanager.cpp:1308
#11 0x00000001065dd8f7 in main (argc=4, argv=0x7fff661dab98) at /Volumes/workspace/Gromacs_Coverage_Mac_master/src/programs/gmx/gmx.cpp:58

All other threads have the same trace as 1. The value of the mutex on thread 6 is:

(gdb) print mutex_init
$2 = {
  __sig = 1297437784, 
  __opaque = "\000\000\000\000` ", '\0' <repeats 11 times>, "9\000\000\0009\000\000\000\000\000\000\000\000\000\000x\016#\b\001\000\000\000|\016#\b\001\000\000\000\000\000\000\000\000\000\000" 
}

Does anyone know how to get the owner from that? It seems that thread 6 is the problem: mutex_init shouldn't be locked, so it shouldn't be hanging there. mutex_init is only locked in tMPI_Thread_mutex_init_once, and that function always unlocks it before exiting. Any ideas?

#13 Updated by Gerrit Code Review Bot almost 6 years ago

Gerrit received a related patchset '1' for Issue #1296.
Uploader: Roland Schulz ()
Change-Id: If5b771c858f23057d28729fd650820ee4f3451e9
Gerrit URL: https://gerrit.gromacs.org/2969

#14 Updated by Roland Schulz almost 6 years ago

  • Status changed from Accepted to In Progress
  • Assignee changed from Mark Abraham to Roland Schulz
  • Target version changed from 5.0 to 4.6.6

#15 Updated by Roland Schulz almost 6 years ago

  • Status changed from In Progress to Resolved

#16 Updated by Rossen Apostolov almost 6 years ago

  • Status changed from Resolved to Closed

Great!
