Since I also have access to an RH 6 machine, I decided to try
installing there. I built against the RH-included Python, 2.6.6, using
gcc 4.7.0 and openmpi-1.6.0. I get some additional messages about
InfiniBand that didn't show up on RH 5. Here's the output from
Lisandro's suggested command line (I only have Open MPI, not MPICH, to
use):
$ mpirun -np 5 python test/runtests.py --verbose --no-threads --include cco_obj_inter
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
CMA: unable to open /dev/infiniband/rdma_cm
[4...@host-rh6.engin.umich.edu] Python 2.6 (/usr/bin/python)
[4...@host-rh6.engin.umich.edu] MPI 2.1 (Open MPI 1.6.0)
[4...@host-rh6.engin.umich.edu] mpi4py 1.3 (build/lib.linux-x86_64-2.6/mpi4py)
[2...@host-rh6.engin.umich.edu] Python 2.6 (/usr/bin/python)
[2...@host-rh6.engin.umich.edu] MPI 2.1 (Open MPI 1.6.0)
[2...@host-rh6.engin.umich.edu] mpi4py 1.3 (build/lib.linux-x86_64-2.6/mpi4py)
[1...@host-rh6.engin.umich.edu] Python 2.6 (/usr/bin/python)
[1...@host-rh6.engin.umich.edu] MPI 2.1 (Open MPI 1.6.0)
[1...@host-rh6.engin.umich.edu] mpi4py 1.3 (build/lib.linux-x86_64-2.6/mpi4py)
[0...@host-rh6.engin.umich.edu] Python 2.6 (/usr/bin/python)
[0...@host-rh6.engin.umich.edu] MPI 2.1 (Open MPI 1.6.0)
[0...@host-rh6.engin.umich.edu] mpi4py 1.3 (build/lib.linux-x86_64-2.6/mpi4py)
[3...@host-rh6.engin.umich.edu] Python 2.6 (/usr/bin/python)
[3...@host-rh6.engin.umich.edu] MPI 2.1 (Open MPI 1.6.0)
[3...@host-rh6.engin.umich.edu] mpi4py 1.3 (build/lib.linux-x86_64-2.6/mpi4py)
testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... ERROR
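As an aside, the librdmacm/CMA warnings at the top of that output usually mean Open MPI's openib BTL is probing for InfiniBand hardware that isn't present on the node. If that's what is happening here (an assumption on my part; I haven't verified it on this machine), they can normally be silenced by excluding that BTL at launch:

```
$ mpirun --mca btl ^openib -np 5 python test/runtests.py --verbose --no-threads --include cco_obj_inter
```

The `^` prefix tells Open MPI's MCA framework to use every available BTL except the listed ones. This only suppresses the warnings; I would not expect it to change the testAllgather failure unless the openib transport is somehow involved.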
system information:
$ python
Python 2.6.6 (r266:84292, Sep 12 2011, 14:03:14)
[GCC 4.4.5 20110214 (Red Hat 4.4.5-6)] on linux2
$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/home/software/rhel6/gcc/4.7.0/libexec/gcc/x86_64-unknown-linux-gnu/4.7.0/lto-wrapper
Target: x86_64-unknown-linux-gnu
Configured with: ../gcc-4.7.0/configure --prefix=/home/software/rhel6/gcc/4.7.0 --with-mpfr=/home/software/rhel6/gcc/mpfr-3.1.0/ --with-mpc=/home/software/rhel6/gcc/mpc-0.9/ --with-gmp=/home/software/rhel6/gcc/gmp-5.0.5/ --disable-multilib
Thread model: posix
gcc version 4.7.0 (GCC)
Again, there seems to be something 'magical' about -np 5, as it passes
all tests with any other number of procs.
I'll also note that it does not always error on the task with rank 4;
it sometimes errors on rank 3 (I think I have the terminology right).
testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... testAllgather (test_cco_obj_inter.TestCCOObjInter) ... ERROR
ERROR
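In case it helps narrow this down, here is a minimal standalone sketch of the pattern I understand TestCCOObjInter.testAllgather to exercise: split COMM_WORLD into two groups, bridge them with an inter-communicator, and do an object (pickle-based) allgather across it. The group split and leader choice are my guesses, not necessarily exactly what the test suite does, and I haven't run this myself:

```python
# intercomm_allgather.py -- hypothetical file name; minimal sketch of an
# inter-communicator object allgather, roughly what the failing test does.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Put roughly half the ranks in each group (with -np 5 the split is 2/3).
color = 0 if rank < size // 2 else 1
local = comm.Split(color, key=rank)

# Bridge the two groups: leaders are world rank 0 and world rank size//2.
remote_leader = size // 2 if color == 0 else 0
inter = local.Create_intercomm(0, comm, remote_leader, tag=0)

# Object allgather across the inter-communicator: each rank contributes
# its world rank and receives the contributions of the *remote* group.
data = inter.allgather(rank)
print(rank, data)

inter.Free()
local.Free()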
-- bennet
> hg clone https://code.google.com/p/mpi4py/