The basic class that orchestrates these jobs uses Queue.Queue() to feed
the output of the first job into the queue for the next job.
Each Thread receives a dynamically generated shell script from some
classes I wrote and then runs the script using
subprocess.call(["shell_script_file.sh"])
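For readers unfamiliar with the pattern, here is a minimal sketch of one stage feeding the next through a queue. The names are hypothetical, not from the posted code, and it uses Python 3's queue module (the equivalent of Queue.Queue in the Python 2.6 the thread is running):

```python
import queue
import threading

def stage_one(out_q):
    # First job: produce items (placeholder names standing in for
    # generated script paths) and feed them to the next stage's queue.
    for i in range(3):
        out_q.put("script_%d.sh" % i)
    out_q.put(None)  # sentinel: no more work

def stage_two(in_q, results):
    # Second job: consume items until the sentinel arrives.
    while True:
        item = in_q.get()
        if item is None:
            break
        results.append(item)

q = queue.Queue()
results = []
producer = threading.Thread(target=stage_one, args=(q,))
consumer = threading.Thread(target=stage_two, args=(q, results))
producer.start()
consumer.start()
producer.join()
consumer.join()
```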
I tested the code on a Mac laptop and also on Ubuntu. Curiously, on a
32-bit Core Duo Mac running OS X Snow Leopard, the code always runs fine.
However, on my Ubuntu box I get the sporadic errors detailed below.
I tried replacing the
subprocess.call() with
subprocess.Popen(["shellscriptfile.sh"], stdout=open("unique_process_id.log", "w"), stderr=open("unique_error_log.log", "w"))
But I get the same "OSError: [Errno 26] Text file busy" error
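As an aside, opening the per-process log handles in a with-block closes them deterministically instead of leaving the file objects dangling for the garbage collector. A sketch, with echo standing in for the generated script:

```python
import subprocess

# Open the per-process logs explicitly so they are closed as soon as
# the child exits, not whenever the file objects happen to be collected.
with open("unique_process_id.log", "w") as out, \
     open("unique_error_log.log", "w") as err:
    ret = subprocess.Popen(["echo", "hello"], stdout=out, stderr=err).wait()
```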
Every time I run the same job queue, a different part of the job fails.
Unfortunately I don't see anybody else reporting this OSError. Any help
in troubleshooting my "newbie" thread code will be greatly
appreciated.
Thanks
hari
The orchestrator class is at:
https://github.com/harijay/auriga/blob/master/process_multi.py
A sample thread subclass is at:
https://github.com/harijay/auriga/blob/master/scatomtzrunthread.py
Detailed error:
Exception in thread Thread-1:
Traceback (most recent call last):
  File "/usr/lib/python2.6/threading.py", line 532, in __bootstrap_inner
    self.run()
  File "/home/hari/Dropbox/auriga/scatomtzrunthread.py", line 28, in run
    stat = subprocess.call([file])
  File "/usr/lib/python2.6/subprocess.py", line 480, in call
    return Popen(*popenargs, **kwargs).wait()
  File "/usr/lib/python2.6/subprocess.py", line 633, in __init__
    errread, errwrite)
  File "/usr/lib/python2.6/subprocess.py", line 1139, in _execute_child
    raise child_exception
OSError: [Errno 26] Text file busy
You say dynamically generated. Any chance you are (re)using the same
filename each time? Is it possible that two uses of that filename
could occur at the same time? That is, is it possible that while one
process is running from the script file, another process is trying to
re-write the script file? If so, you may need dynamically generated
and unique filenames.
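The unique-filename idea can be implemented with the tempfile module, which guarantees a distinct path per script even across threads. A sketch; write_script is a hypothetical helper, not from the posted code:

```python
import os
import stat
import tempfile

def write_script(body):
    # NamedTemporaryFile(delete=False) hands back a unique path on each
    # call, so two threads can never collide on the same filename.
    f = tempfile.NamedTemporaryFile(mode="w", suffix=".sh", delete=False)
    f.write("#!/bin/sh\n")
    f.write(body)
    f.close()  # close before executing, or exec can hit ETXTBSY
    os.chmod(f.name, os.stat(f.name).st_mode | stat.S_IXUSR)
    return f.name

path_a = write_script("echo a\n")
path_b = write_script("echo a\n")
```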
Most often I see references to binary executable files for the error
message, but I've also seen references to script files, e.g.
http://www.cyberciti.biz/faq/binbash-bad-interpreter-text-file-busy/
Searching 'errno 26', the third Google result suggests that you are
trying to write to a file (especially an executable or shared library?)
that is already in use. Perhaps just trying to read it while it is
locked is enough to trigger the error?
--
Terry Jan Reedy
The only overlap is that dir1 and dir2 will each have a
script1.sh. But those are still mutually exclusive paths.
Hari
[...]
> But I get the same "OSError: [Errno 26] Text file busy" error
>
> Every time I run the same job queue, a different part of the job fails.
>
> Unfortunately I don't see anybody else reporting this OSError. Any help
> in troubleshooting my "newbie" thread code will be greatly appreciated.
Try catching the OSError exception and inspecting the filename attribute.
Something like this might help:
# Untested
import sys

def report_error(type, value, traceback):
    if type is OSError:
        print value.filename
    sys.__excepthook__(type, value, traceback)

sys.excepthook = report_error
--
Steven
> Each Thread receives a dynamically generated shell script from some
> classes I wrote and then runs the script using
>
> subprocess.call(["shell_script_file.sh"])
> But I get the same "OSError: [Errno 26] Text file busy" error
"Text file busy" aka ETXTBSY occurs if you try to execute a file which is
open for write by any process.
Be sure to explicitly close() the script before attempting to execute it.
Don't rely upon the destructor closing it, as that may be deferred.
Also, if you spawn any child processes, ensure that they don't inherit the
descriptor being used to write to the script. Ideally, you should set the
close-on-exec flag on the descriptor as soon as the file is opened. Using
close_fds=True in subprocess.Popen() will solve the issue for processes
spawned that way, but you also need to consider subprocesses which may be
spawned "behind the scenes" by library functions.
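Put together, the write-close-then-execute discipline looks something like this. A sketch with a stand-in script body, using Python 3 (close_fds has been the default there since 3.2, but passing it explicitly documents the intent):

```python
import os
import subprocess
import tempfile

fd, path = tempfile.mkstemp(suffix=".sh")
try:
    # Write the script through the raw descriptor...
    os.write(fd, b"#!/bin/sh\necho done\n")
finally:
    # ...and close it explicitly before executing: running a file that
    # any process still holds open for write raises ETXTBSY.
    os.close(fd)

os.chmod(path, 0o755)
# close_fds=True stops the child inheriting unrelated descriptors,
# including writable handles to other threads' script files.
out = subprocess.check_output([path], close_fds=True)
```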