I found the cause: under Mac OS X and Unix, there is a single shell script, "run", that starts the Spark master, worker, and executor. Under Windows, there is a cascade: "run.cmd" calls "run2.cmd", which calls java. So when the ExecutorRunner (which runs in the worker process) tries to kill the executor process via
process.destroy(), it actually only kills the "run.cmd" process, and the "run2.cmd" process (i.e. the java process running the executor) stays alive.
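The underlying issue is that Java's Process.destroy() only signals the immediate child process, never its descendants. Here is a minimal sketch (my own illustration, not Spark code) that reproduces the effect on a Unix-like system: a shell plays the role of "run.cmd", a background "sleep" plays the role of the executor JVM, and after destroy() the grandchild is still alive.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class OrphanDemo {
    public static void main(String[] args) throws Exception {
        // The shell stands in for run.cmd; its background "sleep"
        // stands in for the executor JVM started by run2.cmd.
        // It prints the child's PID, then waits on it.
        Process shell = new ProcessBuilder(
                "sh", "-c", "sleep 30 & echo $!; wait").start();
        BufferedReader r = new BufferedReader(
                new InputStreamReader(shell.getInputStream()));
        String childPid = r.readLine(); // PID of the grandchild

        shell.destroy();  // signals only the shell, not its child
        shell.waitFor();

        // kill -0 succeeds (exit code 0) if the process still exists.
        int rc = new ProcessBuilder("sh", "-c", "kill -0 " + childPid)
                .start().waitFor();
        System.out.println("grandchild alive: " + (rc == 0));

        // Clean up the orphaned grandchild.
        new ProcessBuilder("sh", "-c", "kill " + childPid)
                .start().waitFor();
    }
}
```

On Linux this prints "grandchild alive: true", which matches what I observe on Windows: killing the outer script leaves the inner java process running.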
I would like to report this as an issue, but the GitHub issues page linked from
http://spark-project.org/docs/latest/contributing-to-spark.html does not exist (GitHub returns a 404 error).
Kind regards
Christoph