Hello,
I have tried to build and install mpi4py (version 1.1.0) on a Rocks
4.3 cluster frontend, and when I run the demo it does not work
well (see below), although I could not say it does not work at all. I
have tried different versions of mpi.cfg, since Open MPI and MPICH are
both installed, with no better result. The build itself looks fine. I have
no idea where to go next, so any suggestion is welcome!
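For reference, the mpi.cfg variants I tried looked roughly like the sketch below (the /opt/openmpi prefix is just where Open MPI sits on my Rocks frontend; treat the paths as examples, not as the exact file I used):

```ini
# mpi.cfg sketch for building mpi4py against Open MPI
# (section name and keys follow the samples shipped with mpi4py;
#  mpi_dir is a guess for a default Rocks install)
[openmpi]
mpi_dir = /opt/openmpi
mpicc   = %(mpi_dir)s/bin/mpicc
mpicxx  = %(mpi_dir)s/bin/mpicxx
```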
Marc
$ mpirun -mca btl self,tcp -np 5 --nolocal --hostfile hosts.txt python2.4 demo/helloworld.py
Hello, World! I am process 1 of 5 on compute-0-1.local.
Hello, World! I am process 0 of 5 on compute-0-0.local.
Hello, World! I am process 3 of 5 on compute-0-4.local.
[compute-0-1.local:04708] *** An error occurred in MPI_Errhandler_free
[compute-0-0.local:08909] *** An error occurred in MPI_Errhandler_free
[compute-0-1.local:04708] *** on communicator MPI_COMM_WORLD
[compute-0-1.local:04708] *** MPI_ERR_ARG: invalid argument of some other kind
[compute-0-1.local:04708] *** MPI_ERRORS_ARE_FATAL (goodbye)
[compute-0-4.local:31197] *** An error occurred in MPI_Errhandler_free
[compute-0-4.local:31197] *** on communicator MPI_COMM_WORLD
[compute-0-4.local:31197] *** MPI_ERR_ARG: invalid argument of some other kind
[compute-0-4.local:31197] *** MPI_ERRORS_ARE_FATAL (goodbye)
[compute-0-0.local:08909] *** on communicator MPI_COMM_WORLD
[compute-0-0.local:08909] *** MPI_ERR_ARG: invalid argument of some other kind
[compute-0-0.local:08909] *** MPI_ERRORS_ARE_FATAL (goodbye)
[merlan.im2np.fr:32504] [0,0,0]-[0,1,2] mca_oob_tcp_msg_recv: readv failed with errno=104
[merlan.im2np.fr:32504] [0,0,0]-[0,1,4] mca_oob_tcp_msg_recv: readv failed with errno=104
1 additional process aborted (not shown)