Bug #572

CMake + MPI does not work on PowerPC

Added by Justin Lemkul about 9 years ago. Updated about 9 years ago.

Status: Closed
Priority: Normal
Assignee: Erik Lindahl
Category: mdrun
Target version:
Affected version - extra info:
Affected version:
Difficulty: uncategorized

Description

I reported several issues on the gmx-users list yesterday (http://lists.gromacs.org/pipermail/gmx-users/2010-September/054352.html), so I'll break things down into several issues. The first is that on Mac OS X 10.3 on PowerPC, compiling Gromacs with CMake and OpenMPI-1.2.3 does not actually produce an MPI-enabled mdrun; instead, it runs a serial job on each of the requested nodes.

The toolchain is gcc-3.3 with OpenMPI-1.2.3. The native MPI implementation (mpich-1.2.5) does not work; the build fails because mpi.h apparently cannot be found. I think this is a problem with our cluster rather than with Gromacs, although the native MPI worked in previous versions (4.0.x and prior).

Here's how I'm building mdrun:

alias cmake='/home/rdiv1001/compilers/cmake-2.8.2-osx/bin/cmake'

cmake ../gromacs-4.5.1 -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a
-DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/
-DCMAKE_INSTALL_PREFIX=/home/rdiv1001/gromacs-4.5_cmake-osx
-DGMX_BINARY_SUFFIX=_4.5_cmake_mpi -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF
-DGMX_X11=OFF -DGMX_MPI=ON -DMPI_COMPILER=/home/rdiv1001/compilers/openmpi-1.2.3-osx/bin/mpicxx -DMPI_INCLUDE_PATH=/home/rdiv1001/compilers/openmpi-1.2.3-osx/include

make mdrun

make install-mdrun

The resulting mdrun binary reports the following when I attempt to run the command over two nodes:

Log file opened on Mon Sep 27 21:36:00 2010
Host: n235 pid: 6857 nodeid: 0 nnodes: 1
The Gromacs distribution was built TMP_TIME by
[CMAKE] (TMP_MACHINE)

Aside from the junk being printed, nnodes = 1 seems to indicate that MPI is not being invoked properly.
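(As a sanity check independent of Gromacs, a minimal MPI program can tell whether the launcher and the toolchain agree; this is a sketch that assumes the OpenMPI mpicc/mpirun above are first in PATH:)

cat > hello.c <<'EOF'
#include <stdio.h>
#include <mpi.h>
int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size); /* should print "of 2" under -np 2 */
    MPI_Finalize();
    return 0;
}
EOF
mpicc hello.c -o hello
mpirun -np 2 ./hello

If hello reports "of 2" while mdrun still logs nnodes: 1, the fault is in how mdrun was built rather than in the launcher.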

I should note that this issue may be compiler-specific, rather than hardware-specific, as my initial report suggested. We have an additional partition on the cluster running Yellowdog Linux and more modern compilers (gcc-4.2.2 and OpenMPI-1.4.2). Installing using these compilers creates a properly-functioning mdrun_mpi executable.

I cannot solve the problem on the OSX partition by upgrading compilers. Long story, but I'm stuck with the compilers I've listed above. I am working with the system admins to see if there's anything that can be done, but there are some proprietary agreements that prevent them from installing newer open-source compilers (and I've had no luck at all with the IBM compilers).

History

#1 Updated by Roland Schulz about 9 years ago

I forgot to ask you this on the mailing list. Do you get any output regarding MPI from cmake? What are the MPI-related values in CMakeCache.txt?

#2 Updated by Justin Lemkul about 9 years ago

(In reply to comment #1)

I forgot to ask you this on the mailing list. Do you get any output regarding
MPI from cmake? What are the MPI-related values in CMakeCache.txt?

I'm going to start over with the native MPI implementation (mpich-1.2.5) as a new starting point. Basically, I'd like to replicate what I've always been able to do with --enable-mpi in previous versions. So here goes:

cmake ../gromacs-4.5.1 -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a -DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/ -DCMAKE_INSTALL_PREFIX=/home/rdiv1001/gromacs-4.5_cmake-osx -DGMX_BINARY_SUFFIX=_4.5_cmake_mpi -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF -DGMX_X11=OFF -DGMX_MPI=ON

Pertinent information taken from CMakeCache.txt:

CMAKE_CXX_COMPILER:FILEPATH=/usr/bin/c++
CMAKE_C_COMPILER:FILEPATH=/usr/bin/gcc
GMX_MPI:BOOL=ON
//Enable MPI_IN_PLACE for MPIs that have it defined
GMX_MPI_IN_PLACE:BOOL=ON
//Executable for running MPI programs.
MPIEXEC:FILEPATH=/nfs/auts/compilers/mpich-1.2.5/bin/mpirun
//Maximum number of processors available to run MPI applications.
MPIEXEC_MAX_NUMPROCS:STRING=2
//Flag used by MPI to specify the number of processes for MPIEXEC;
MPIEXEC_NUMPROC_FLAG:STRING=-np
//These flags will come after all flags given to MPIEXEC.
MPIEXEC_POSTFLAGS:STRING=
// run by MPIEXEC.
MPIEXEC_PREFLAGS:STRING=
//MPI compiler. Used only to detect MPI compilation flags.
MPI_COMPILER:FILEPATH=/nfs/auts/compilers/mpich-1.2.5/bin/mpicxx
//MPI compilation flags
MPI_COMPILE_FLAGS:STRING=-DUSE_STDARG -D_SMP_ -DUSE_INLINE -DEARLY_SEND_COMPLETION -DLAZY_MEM_UNREGISTER -D_REENTRANT -DVIADEV_RPUT_SUPPORT -DMT_BIG_ENDIAN -DVAPI -DUSE_STDARG -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_UNISTD_H=1 -DHAVE_STDARG_H=1 -DUSE_STDARG=1 -DMALLOC_RET_VOID=1
//Extra MPI libraries to link against
MPI_EXTRA_LIBRARY:STRING=/nfs/compilers/mpich-1.2.5/lib/libpmpich.a;/nfs/compilers/mpich-1.2.5/lib/libmpich.a;/nfs/compilers/mpich-1.2.5/lib/libpmpich.a;/nfs/compilers/mpich-1.2.5/lib/libmpich.a;/usr/mellanox/lib/libmtl_common.dylib;/usr/mellanox/lib/libvapi.dylib;/usr/mellanox/lib/libmpga.dylib;/usr/lib/libdl.dylib
//MPI include path
MPI_INCLUDE_PATH:STRING=/usr/mellanox/include;/usr/mellanox/wrap
//MPI library to link against
MPI_LIBRARY:FILEPATH=/nfs/compilers/mpich-1.2.5/lib/libpmpich++.a
//MPI linking flags
MPI_LINK_FLAGS:STRING=
...
//Details about finding MPI
FIND_PACKAGE_MESSAGE_DETAILS_MPI:INTERNAL=[/nfs/compilers/mpich-1.2.5/lib/libpmpich++.a][/usr/mellanox/include;/usr/mellanox/wrap]
...
//ADVANCED property for variable: MPIEXEC
MPIEXEC-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPIEXEC_MAX_NUMPROCS
MPIEXEC_MAX_NUMPROCS-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPIEXEC_NUMPROC_FLAG
MPIEXEC_NUMPROC_FLAG-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPIEXEC_POSTFLAGS
MPIEXEC_POSTFLAGS-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPIEXEC_PREFLAGS
MPIEXEC_PREFLAGS-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPI_COMPILER
MPI_COMPILER-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPI_COMPILE_FLAGS
MPI_COMPILE_FLAGS-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPI_EXTRA_LIBRARY
MPI_EXTRA_LIBRARY-ADVANCED:INTERNAL=1
//ADVANCED property for variable: MPI_INCLUDE_PATH
MPI_INCLUDE_PATH-ADVANCED:INTERNAL=1
//Result of TRY_COMPILE
MPI_IN_PLACE_COMPILE_OK:INTERNAL=FALSE
//Scratch variable for MPI detection
MPI_LIB:INTERNAL=MPI_LIB-NOTFOUND

Unfortunately, "make mdrun" fails with the following errors. Using the native MPI implementation has worked in the past, so I don't know what's wrong. Here's what I get from "make mdrun":

Scanning dependencies of target gmx
[ 0%] Building C object src/gmxlib/CMakeFiles/gmx.dir/replace.c.o
In file included from /home/rdiv1001/install/gromacs-4.5.1/include/typedefs.h:61,
from /home/rdiv1001/install/gromacs-4.5.1/include/macros.h:39,
from /home/rdiv1001/install/gromacs-4.5.1/src/gmxlib/replace.c:43:
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:39:17: mpi.h: No such file or directory
In file included from /home/rdiv1001/install/gromacs-4.5.1/include/typedefs.h:61,
from /home/rdiv1001/install/gromacs-4.5.1/include/macros.h:39,
from /home/rdiv1001/install/gromacs-4.5.1/src/gmxlib/replace.c:43:
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:132: error: parse error before "MPI_Comm"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:132: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:145: error: parse error before "req_pme"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:145: warning: type defaults to `int' in declaration of `req_pme'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:145: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:217: error: parse error before '}' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:217: warning: type defaults to `int' in declaration of `gmx_domdec_t'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:217: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:224: error: parse error before "MPI_Group"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:224: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:225: warning: type defaults to `int' in declaration of `mpi_comm_masters'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:225: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:229: error: parse error before '}' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:229: warning: type defaults to `int' in declaration of `gmx_multisim_t'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:229: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:236: error: parse error before "MPI_Comm"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:236: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:238: error: parse error before "comm_inter"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:238: warning: type defaults to `int' in declaration of `comm_inter'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:238: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:240: warning: type defaults to `int' in declaration of `gmx_nodecomm_t'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:240: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:257: error: parse error before "MPI_Comm"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:257: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:258: warning: type defaults to `int' in declaration of `mpi_comm_mygroup'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:258: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:264: error: parse error before "nc"
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:264: warning: type defaults to `int' in declaration of `nc'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:264: error: conflicting types for `nc'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:149: error: previous declaration of `nc'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:264: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:267: error: parse error before '*' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:267: warning: type defaults to `int' in declaration of `dd'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:267: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:275: error: parse error before '*' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:275: warning: type defaults to `int' in declaration of `ms'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:275: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:280: error: parse error before '}' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:280: warning: type defaults to `int' in declaration of `t_commrec'
/home/rdiv1001/install/gromacs-4.5.1/include/types/commrec.h:280: warning: data definition has no type or storage class
In file included from /home/rdiv1001/install/gromacs-4.5.1/include/typedefs.h:63,
from /home/rdiv1001/install/gromacs-4.5.1/include/macros.h:39,
from /home/rdiv1001/install/gromacs-4.5.1/src/gmxlib/replace.c:43:
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:65: error: parse error before "MPI_Comm"
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:65: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:66: warning: type defaults to `int' in declaration of `t_disresdata'
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:66: warning: data definition has no type or storage class
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:112: error: parse error before "t_disresdata"
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:112: warning: no semicolon at end of struct or union
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:115: error: parse error before '}' token
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:115: warning: type defaults to `int' in declaration of `t_fcdata'
/home/rdiv1001/install/gromacs-4.5.1/include/types/fcdata.h:115: warning: data definition has no type or storage class
In file included from /home/rdiv1001/install/gromacs-4.5.1/include/typedefs.h:66,
from /home/rdiv1001/install/gromacs-4.5.1/include/macros.h:39,
from /home/rdiv1001/install/gromacs-4.5.1/src/gmxlib/replace.c:43:
/home/rdiv1001/install/gromacs-4.5.1/include/types/ifunc.h:52: error: parse error before "t_fcdata"
make[3]: *** [src/gmxlib/CMakeFiles/gmx.dir/replace.c.o] Error 1
make[2]: *** [src/gmxlib/CMakeFiles/gmx.dir/all] Error 2
make[1]: *** [src/kernel/CMakeFiles/mdrun.dir/rule] Error 2
make: *** [mdrun] Error 2

It was because of these errors that I tried OpenMPI-1.2.3, but that workaround doesn't seem to work either, so I think it best to diagnose the issue from here.

#3 Updated by Roland Schulz about 9 years ago

The problem is that cmake/FindMPI.cmake doesn't detect the include path correctly, and thus mpi.h isn't found.

CMake uses "mpicc -show" to find the include path instead of using mpicc itself during the compile. What is the output of "mpicc -show" for you? And where is your mpi.h?
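(For reference, a rough by-hand approximation of what FindMPI does; the directories are the ones already shown in MPI_INCLUDE_PATH above, so this only illustrates the failure mode:)

mpicc -show
# prints the wrapper's underlying command line, e.g. gcc -I<dir> ... -l<libs>
# FindMPI harvests the -I directories from that line; if none of them
# actually contains mpi.h, every GROMACS source file that includes it fails:
ls /usr/mellanox/include/mpi.h /usr/mellanox/wrap/mpi.h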

#4 Updated by Roland Schulz about 9 years ago

Rossen, Teemu,

What is the latest idea on how to fix CMake on Cray? If we allow the wrapper to be used for that issue, it would fix this one too.

#5 Updated by Justin Lemkul about 9 years ago

(In reply to comment #3)

The problem is that cmake/FindMPI.cmake doesn't detect the include path
correctly, and thus mpi.h isn't found.

CMake uses "mpicc -show" to find the include path instead of using mpicc itself
during the compile. What is the output of "mpicc -show" for you? And where is
your mpi.h?

$ mpicc -show
gcc -I/usr/mellanox/include -I/usr/mellanox/wrap -L/usr/mellanox/lib -L/nfs/compilers/mpich-1.2.5/lib -lpmpich -lmpich -lpmpich -lmpich -lmtl_common -lvapi -lmpga -ldl

I'll have to find out from our admins where mpi.h is located. Weirdly, I can't find one. But I know the native MPI implementation used to work with all prior versions of Gromacs, so there should be a way to work with it. I will report back when I know.

#6 Updated by Teemu Murtola about 9 years ago

(In reply to comment #4)

What is the latest idea on how to fix CMake on Cray? If we allow the wrapper
to be used for that issue, it would fix this one too.

I would still advocate for the possibility of using the wrapper when the user has provided it, e.g., through CMAKE_C_COMPILER. That would be by far the most robust way, and as said, should probably also fix the problems here. It would probably be possible to just add a try_compile() for a simple MPI program to the beginning of FindMPI.cmake and some logic to skip everything else if that succeeds, without changes in other parts.
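(The proposed check, sketched as the equivalent shell test; the file name is illustrative:)

# If the user-supplied compiler is already an MPI wrapper, this trivial
# compile succeeds and FindMPI's flag detection could be skipped entirely.
cat > mpitest.c <<'EOF'
#include <mpi.h>
int main(int argc, char **argv) { MPI_Init(&argc, &argv); return MPI_Finalize(); }
EOF
mpicc mpitest.c -o mpitest && echo "wrapper handles MPI on its own"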

#7 Updated by Justin Lemkul about 9 years ago

Thus far, my system admins have not responded to any of my inquiries, so I'm still flying solo. What I did manage to find were alternate, system-wide installations of mpich-1.2.5 that should function properly. I located mpi.h (location given in the command below), but the CMake build still fails.

cmake ../gromacs-4.5.1 -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a -DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/ -DCMAKE_INSTALL_PREFIX=/home/rdiv1001/gromacs-4.5.1_cmake-osx-test -DGMX_BINARY_SUFFIX=_4.5.1_cmake_mpi -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF -DGMX_X11=OFF -DGMX_MPI=ON -DMPI_INCLUDE_PATH=/nfs/compilers/mpich-1.2.5-gcc3.4.6/include/ -DMPI_COMPILER=/nfs/compilers/mpich-1.2.5-gcc3.4.6/bin/mpicc -DMPI_LIBRARY=/nfs/compilers/mpich-1.2.5-gcc3.4.6/lib/libmpich.a

make mdrun
...

Linking C executable mdrun_4.5.1_cmake_mpi
ld: Undefined symbols:
_EVAPI_get_hca_hndl
_EVAPI_release_hca_hndl
_EVAPI_set_async_event_handler
_VAPI_alloc_pd
_VAPI_create_cq
_VAPI_create_qp
_VAPI_dealloc_pd
_VAPI_deregister_mr
_VAPI_destroy_cq
_VAPI_destroy_qp
_VAPI_event_record_sym
_VAPI_event_syndrome_sym
_VAPI_modify_qp
_VAPI_open_hca
_VAPI_query_hca_port_prop
_VAPI_query_qp
_VAPI_strerror
_VAPI_register_mr
_VAPI_poll_cq
_EVAPI_post_inline_sr
_VAPI_post_rr
_VAPI_post_sr
_PMPI_Allreduce
_PMPI_Bcast
_PMPI_Sendrecv
make[3]: *** [src/kernel/mdrun_4.5.1_cmake_mpi] Error 1
make[2]: *** [src/kernel/CMakeFiles/mdrun.dir/all] Error 2
make[1]: *** [src/kernel/CMakeFiles/mdrun.dir/rule] Error 2
make: *** [mdrun] Error 2

As far as I know, those symbols should be defined in libmpich.a, correct? So why won't mdrun link against this library properly? It links against my static FFTW libraries just fine.

#8 Updated by Roland Schulz about 9 years ago

It is not enough to link against libmpich.a. According to your "mpicc -show" output, you need to link against -lpmpich -lmpich -lpmpich -lmpich -lmtl_common -lvapi -lmpga -ldl.
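(In other words, the manual link line corresponding to that "mpicc -show" output would be roughly the following; the object files are elided:)

gcc <object files...> \
    -L/usr/mellanox/lib -L/nfs/compilers/mpich-1.2.5/lib \
    -lpmpich -lmpich -lpmpich -lmpich -lmtl_common -lvapi -lmpga -ldl

The undefined _VAPI_*/_EVAPI_* symbols come from libmpich.a's Mellanox VAPI (InfiniBand) dependency, visible in the -DVAPI compile flags earlier, and the _PMPI_* symbols live in libpmpich.a; that is why all of these libraries must appear on the link line.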

#9 Updated by Justin Lemkul about 9 years ago

(In reply to comment #8)

It is not enough to link against libmpich.a. According to your "mpicc -show"
output, you need to link against -lpmpich -lmpich -lpmpich -lmpich -lmtl_common
-lvapi -lmpga -ldl.

Can you tell me how to make CMake do that? With autoconf it's simple: just "--enable-mpi" and everything works. The problem there, of course, is bug #574.

Sorry for the flurry of requests that are only peripherally related to Gromacs, but the sysadmins here are ignoring me. In the absence of more documentation (and perhaps wikifying some of these issues and their resolutions, once they've actually been resolved), it feels almost hopeless for a completely new user to make sense of something that suddenly doesn't work.

#10 Updated by Roland Schulz about 9 years ago

The libraries seem to be detected correctly by CMake. From the CMakeCache.txt you posted:
MPI_EXTRA_LIBRARY:STRING=/nfs/compilers/mpich-1.2.5/lib/libpmpich.a;/nfs/compilers/mpich-1.2.5/lib/libmpich.a;/nfs/compilers/mpich-1.2.5/lib/libpmpich.a;/nfs/compilers/mpich-1.2.5/lib/libmpich.a;/usr/mellanox/lib/libmtl_common.dylib;/usr/mellanox/lib/libvapi.dylib;/usr/mellanox/lib/libmpga.dylib;/usr/lib/libdl.dylib

#11 Updated by Justin Lemkul about 9 years ago

OK, I've got this sorted out. I don't know whether the cause will be considered a CMake/Gromacs bug or just the weird way our cluster appears to be set up. In any case, the CMake script detected MPI_COMPILER as /nfs/auts/compilers/mpich-1.2.5/bin/mpicxx, but then set MPI_LIBRARY and MPI_INCLUDE_PATH to unrelated locations that do not belong to this compiler. If I issue:

cmake ../gromacs-4.5.1 -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a -DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/ -DCMAKE_INSTALL_PREFIX=/home/rdiv1001/gromacs-4.5.1_cmake-osx-test -DGMX_BINARY_SUFFIX=_4.5.1_cmake_mpi -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF -DGMX_X11=OFF -DGMX_MPI=ON -DMPI_INCLUDE_PATH=/nfs/auts/compilers/mpich-1.2.5/include -DMPI_LIBRARY=/nfs/auts/compilers/mpich-1.2.5/lib/libpmpich++.a

make mdrun

make install-mdrun

I get a functioning, MPI-aware mdrun binary. Hooray!

I'll leave it to the developers to render a verdict, but it seems to me that the mismatch between compiler, libraries, and header locations could cause quite a lot of headaches for some people.
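(A quick way to spot this kind of mismatch before building is to check that the detected cache entries all point into the same MPI tree:)

grep -E 'MPI_(COMPILER|LIBRARY|INCLUDE_PATH)' CMakeCache.txt
# all three should point into the same MPI installation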

#12 Updated by Roland Schulz about 9 years ago

The problem seems to be that the "mpicc -show" output is wrong, so this is a bug in the cluster setup and not in GROMACS. Having said that, I agree with Teemu that if we allow the mpicc wrapper to be used, there wouldn't be any problem. I have therefore changed this to "enhancement".

#13 Updated by Rossen Apostolov about 9 years ago

Hi Justin,

I've committed a patch from Roland to the CMake build that uses the mpicc wrapper. Can you check whether it solves your problem?

#14 Updated by Justin Lemkul about 9 years ago

(In reply to comment #13)

Hi Justin,

I've committed a patch from Roland to the CMake build that uses the mpicc wrapper. Can you
check whether it solves your problem?

No, it's still not working. The errors are the same as before. My commands:

cmake ../gromacs -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a -DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/ -DCMAKE_INSTALL_PREFIX=/home/rdiv1001/gromacs_4.5.2_git_cmake_mpich-osx -DGMX_BINARY_SUFFIX=_4.5.2_git_mpi -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF -DGMX_MPI=ON

make mdrun

I'm also still getting identical behavior to the very first post: specifying a functional OpenMPI implementation still results in a non-MPI mdrun.

What's more, -DGMX_BINARY_SUFFIX is now completely ignored. With OpenMPI (see the original bug report), mdrun builds, but is called "mdrun_mpi" instead of "mdrun_4.5.2_git_mpi". A separate issue, I suppose, but worth mentioning.

#15 Updated by Roland Schulz about 9 years ago

You have to set the C compiler to mpicc to use this option.
Thus, add -DCMAKE_C_COMPILER=mpicc.
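(So a full configure line would look something like this, with the paths from the earlier comments; adjust to your installation:)

cmake ../gromacs -DCMAKE_C_COMPILER=mpicc \
    -DFFTW3F_LIBRARIES=/home/rdiv1001/fftw-3.0.1-osx/lib/libfftw3f.a \
    -DFFTW3F_INCLUDE_DIR=/home/rdiv1001/fftw-3.0.1-osx/include/ \
    -DGMX_MPI=ON -DGMX_THREADS=OFF -DBUILD_SHARED_LIBS=OFF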

#16 Updated by Justin Lemkul about 9 years ago

(In reply to comment #15)

You have to set the C compiler to mpicc to use this option.
Thus, add -DCMAKE_C_COMPILER=mpicc.

Got it; it works now, and I am getting correct MPI behavior. The bug can be closed. Thanks for everyone's efforts!

#17 Updated by Rossen Apostolov about 9 years ago

Great, closing it then!
