Say I run a batch job:
bjssub -i -n 2 -s 10000 /bin/bash
I get the two nodes, and the environment is set to NODES=n0000,n0001
All this is fine. However, when I try to xrx anything, it always takes the
NODES value as the set of nodes to run on, and never checks whether a node
list was given on the command line. E.g.:
[danny@dgk3 ~]$ xrx -p date
n0000: Wed Dec 17 23:35:00 UTC 2008
n0001: Wed Dec 17 22:33:41 UTC 2008
[danny@dgk3 ~]$ xrx n0001 date
Error: bad command name: no file "n0001" in $PATH
Should the behaviour not be to use the command line first, and only fall
back to the NODES environment variable when no nodes are given? Consider
allocating nodes to run jobs on. In my case each node has several cores,
so I want to run several different jobs. If I am allocated a single node,
that is not a problem (assuming the xrx command can be put in the
background). But if I ask for several nodes, I may still want control over
where each job is submitted. In that case, simply running "xrx command"
sends the command to both nodes, starting two instances when I may only
want one.
In my bproc clusters I parse the NODES variable in order to decide
where to run my jobs using bpsh.
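For what it's worth, the parsing I do is roughly the following sketch. It
assumes NODES is comma-separated as in the example above (n0000,n0001), and
that bpsh takes the numeric node id; "my_job" is just a placeholder command:

```shell
#!/bin/sh
# Sketch: dispatch one job per allocated node by parsing $NODES.
# NODES would normally be set by bjssub; hard-coded here for illustration.
NODES="n0000,n0001"

for node in $(echo "$NODES" | tr ',' ' '); do
    id=${node#n}          # strip the "n" prefix: n0001 -> 0001
    echo "would run: bpsh $id my_job &"
done
```

With something like this you can send a different job to each node (or
several jobs to the same node) instead of letting xrx fan the same command
out to every node in NODES.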
Daniel