V8 (and by extension node.js) is not thread-safe. That is, you cannot call v8::*
functions from any thread other than the main thread. (There is a v8::Locker class
but it's probably not what you want; it's essentially a big
serializing mutex.)
You mention that your logic-processing callback doesn't always run in
a timely manner. Have you profiled where your application is spending
its time, using the --prof switch or the perf tool?
Take a look at deps/v8/tools. linux-tick-processor is for processing
the v8.log file that --prof generates. ll_prof.py is a script that
knows how to combine the output of --prof with perf.data files, for
when you need more fine-grained profiling data. (--prof is a sampling
profiler with a granularity of ~1 ms.)
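For example (app.js is just a placeholder for your entry point, and the tick processor path assumes you are running from a node source checkout):

```
$ node --prof app.js
# ... exercise the application, then stop it; a v8.log file appears ...
$ deps/v8/tools/linux-tick-processor v8.log
```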
As to having multiple threads, there are two possible approaches:
* Something like threads-a-gogo, which spins up worker threads that run
JS code in a restricted environment.
* Multiple processes that communicate with each other over UNIX pipes.
Support for that is baked into the child_process module.
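For the second approach, a minimal sketch of the parent/worker split might look like this (the file names and the message format are made up for illustration):

```js
// parent.js -- forks a worker and talks to it over the built-in IPC channel
var fork = require('child_process').fork;

var worker = fork(__dirname + '/worker.js');

worker.on('message', function(reply) {
  console.log('worker replied:', reply);
});

worker.send({ cmd: 'sum', payload: [1, 2, 3] });
```

```js
// worker.js -- does the heavy lifting off the main process, then reports back
process.on('message', function(msg) {
  var sum = msg.payload.reduce(function(a, b) { return a + b; }, 0);
  process.send({ cmd: 'done', result: sum });
});
```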
One caveat is that process.send() serializes your data to JSON and
deserializes it again in the other process, which can be slow for
large messages. If that's an issue, you can set up a channel manually
and transmit only binary data (i.e. Buffers).
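One way to set up such a channel (a sketch only; message framing is left out, and worker.js is again a made-up name) is to spawn the child with plain stdio pipes and push Buffers through them:

```js
// parent.js -- raw binary channel over the child's stdin/stdout
var spawn = require('child_process').spawn;

var child = spawn(process.execPath, [__dirname + '/worker.js'], {
  stdio: ['pipe', 'pipe', 'inherit']
});

child.stdout.on('data', function(chunk) {
  // chunk is a Buffer; no JSON round-trip took place
  console.log('received', chunk.length, 'bytes');
});

child.stdin.write(new Buffer([0xde, 0xad, 0xbe, 0xef]));
```

```js
// worker.js -- echoes the raw bytes straight back to the parent
process.stdin.on('data', function(chunk) {
  process.stdout.write(chunk);
});
process.stdin.resume();
```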