Well, if you want to stick with the multiprocessing module and just endure the possible segfaults and the occasional unprocessed file, you could print a warning and then call os._exit(1) to terminate the worker process immediately.
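Here is a minimal sketch of that warn-and-exit pattern using plain multiprocessing.Process; the `process_file` function and its failure condition are made up for illustration, so swap in your real per-file work:

```python
import multiprocessing
import os

def process_file(path):
    try:
        # real per-file work would go here; "bad" simulates a failure
        if path == "bad":
            raise ValueError(f"cannot handle {path!r}")
        print("processed", path)
    except Exception as exc:
        print(f"WARNING: skipping {path}: {exc}")
        os._exit(1)  # hard-terminate this worker: no cleanup, no traceback

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=process_file, args=(p,))
             for p in ("foo", "bad", "baz")]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    for p in procs:
        print(p.name, "exitcode:", p.exitcode)
```

The parent can inspect each worker's exitcode afterwards to see which files were skipped.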
Using subprocess, you could have your script take arguments, such as a -file flag for the file to process. Run without that flag, it starts your batch operation; the subprocesses then invoke the same executable again with a specific -file option. Here is a pretty crude example:
#!/usr/bin/env python
import subprocess
import sys

def process(f):
    print("process:", f)

if __name__ == "__main__":
    # Child mode: a -file flag means "process this one file and quit".
    if '-file' in sys.argv:
        process(sys.argv[-1])
        sys.exit()

    # Batch mode: re-invoke this same script once per file.
    for name in ('foo', 'bar', 'baz'):
        cmd = [sys.argv[0], '-file', name]
        # cmd = [sys.executable, sys.argv[0], '-file', name]
        subprocess.Popen(cmd)
If your script is already executable, you won't need the commented-out line that runs it explicitly through the Python interpreter.
You would want to do this in a bounded way, though, e.g. with a thread pool or a threads-plus-queue setup that chews through your list of files instead of launching everything at once.
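One way to bound the fan-out is a small thread pool where each thread blocks on one subprocess at a time. In a real script the command would be `[sys.argv[0], '-file', path]` as above; the inline `-c` program below is a stand-in so the sketch is self-contained:

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

def run_one(path):
    # Stand-in for [sys.argv[0], '-file', path]: a tiny inline
    # program that just echoes the file it was given.
    cmd = [sys.executable, "-c",
           "import sys; print('process:', sys.argv[1])", path]
    return subprocess.call(cmd)  # blocks until the child exits

files = ["foo", "bar", "baz", "qux", "quux"]
with ThreadPoolExecutor(max_workers=2) as pool:  # at most 2 children at once
    exit_codes = list(pool.map(run_one, files))
print(exit_codes)
```

Because run_one blocks on subprocess.call, max_workers caps how many child processes are alive at any moment, no matter how long the file list is.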
Normally the multiprocessing module would work for what you are doing, but it doesn't seem to play nice with Maya's initialized environment when the process forks.