First off, I'm assuming that the stdout+stderr of these commands is of
reasonable size rather than hundreds of megabytes.
What you want is a finite pool of threads (or processes) that execute
the tasks. multiprocessing.pool.Pool will do it. So will
concurrent.futures, which is what I'd personally use, simply because
I'm more familiar with it.
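A minimal sketch of the pool idea with concurrent.futures (the task
here is a placeholder; yours would run a command instead):

```python
from concurrent.futures import ThreadPoolExecutor

def task(n):
    # Stand-in for real work; each call runs in one of the pool's threads.
    return n * n

# max_workers caps how many tasks run at once.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(task, range(8)))
```

pool.map hands out the inputs to the worker threads and collects the
return values in input order, so results comes back as
[0, 1, 4, 9, 16, 25, 36, 49].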
In either case your task should wrap a call to subprocess.
subprocess.run is your easiest answer if you've got Python 3.5 or
later; the task would call it with stdout=subprocess.PIPE and
stderr=subprocess.PIPE, get the CompletedProcess back, and then store
the .stdout and .stderr results. For older Python, create a
subprocess.Popen (again with stdout=subprocess.PIPE and
stderr=subprocess.PIPE) and call its communicate() method.
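Putting the two pieces together, something like this (the echo
commands are just placeholders for whatever you're actually running):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

COMMANDS = [
    ["echo", "one"],
    ["echo", "two"],
    ["echo", "three"],
]

def run_command(cmd):
    # Capture both streams; blocks this worker thread until the command exits.
    proc = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    return proc.stdout, proc.stderr

with ThreadPoolExecutor(max_workers=4) as pool:
    # Each command runs in its own worker thread, at most 4 at a time.
    results = list(pool.map(run_command, COMMANDS))
```

On pre-3.5 Python you'd replace the subprocess.run call with
subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
and return its .communicate() result instead.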
There are probably a dozen other ways. That one there, that's your
easiest.
--
Rob Gaddi, Highland Technology --
www.highlandtechnology.com
Email address domain is currently out of order. See above to fix.