Error: Unable to get the current working directory


olga

Oct 1, 2012, 9:15:02 AM
to deap-...@googlegroups.com
Hi,
Has anybody seen this error when using DTM?
More precisely, it occurs when calling deap.dtm.start(main).
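For reference, the script is essentially the minimal DTM pattern, roughly like this stripped-down sketch (myFunc and its inputs stand in for my real code):

    from deap import dtm

    def myFunc(x):
        # placeholder for my actual workload
        return x * x

    def main():
        results = dtm.map(myFunc, range(100))
        print(results)

    dtm.start(main)  # this is the call that triggers the error

launched with something like mpirun -n 4 python myScript.py.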
Thanks

François-Michel De Rainville

Oct 1, 2012, 9:59:55 AM
to deap-...@googlegroups.com
Could you post the complete error log please?


Regards,
François-Michel De Rainville

olga

Oct 2, 2012, 7:03:13 AM
to deap-...@googlegroups.com
Well, it's just 1 line:
[ORD03143:08783] Error: Unable to get the current working directory
where ORD03143 is my machine's name and 08783 is the PID.
The program does not stop, though; it keeps running.
And when I call main() directly, without dtm.start, the error does not appear.

Marc-André Gardner

Oct 3, 2012, 2:50:15 AM
to deap-...@googlegroups.com
Hi,

As far as I know, this issue has more to do with MPI than with DTM. The reason it does not occur when you call main() directly is simply that dtm.start is responsible for the MPI initialization (so if you do not call it, MPI is not used at all).

Now, the MPI implementation you use may differ, but generally this is related to the MPI initialization on all nodes. By default, most MPI flavors try to use the current working directory (the one from which you launched mpirun/mpiexec) on every node. If that directory is not available on some node, they fall back to $HOME. If that directory is not available either (or if the $HOME environment variable is not defined), and you did not provide an explicit working directory, then it fails with exactly the error you are seeing.
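A quick way to confirm this is a tiny mpi4py script that prints the working directory on every rank (just a diagnostic sketch, independent of DTM):

    from mpi4py import MPI
    import os

    rank = MPI.COMM_WORLD.Get_rank()
    try:
        print("rank %d: cwd is %s" % (rank, os.getcwd()))
    except OSError:
        # os.getcwd() raises OSError when the directory is gone/unreachable
        print("rank %d: unable to get the cwd" % rank)

Run it with mpirun across the same nodes as your DTM job; any rank that fails points to the node where the directory is missing.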

With Open MPI, the working directory may be set using the -wdir option (see http://www.open-mpi.org/doc/v1.4/man1/mpiexec.1.php and http://www.open-mpi.org/faq/?category=running). Other implementations such as MPICH work in much the same way, but you will have to look up the exact option name.
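For example (the path and process count here are placeholders; the directory must exist on every node):

    mpirun -n 4 -wdir /tmp/my_workdir python myScript.py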

If you have more issues with this, I suggest posting to the mpi4py mailing list (https://groups.google.com/forum/?fromgroups=#!forum/mpi4py). DTM is built on mpi4py, and you may get better support there for pure MPI problems.

I hope this helps; do not hesitate to ask if you have any other questions.

Marc-André Gardner

olga

Oct 4, 2012, 11:27:33 AM
to deap-...@googlegroups.com
Hi Marc-André,
Thanks a lot for your very detailed answer.
It is certainly time for me to learn more about MPI :)

