ParallelTable is indeed not multi-threaded (if that is what you are asking about). What you can do for multithreaded training is to use the threads package: https://github.com/torch/threads-ffi/tree/master/benchmark
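A minimal sketch of how the threads package is typically used (the thread count, job count, and the `pool` name are illustrative; the `Threads`/`addjob`/`synchronize` calls follow the torch/threads API):

```lua
local threads = require 'threads'

-- create a pool of 4 worker threads; the init function runs once
-- in each worker, so heavyweight requires happen only at startup
local pool = threads.Threads(
   4,
   function()
      require 'nn'
   end
)

for i = 1, 8 do
   pool:addjob(
      -- first function runs in a worker thread
      function()
         return __threadid
      end,
      -- second function runs back in the main thread,
      -- receiving the job's return value
      function(id)
         print(string.format('job %d ran on thread %d', i, id))
      end
   )
end

pool:synchronize()  -- block until all queued jobs have finished
pool:terminate()
```

Each `addjob` would carry one mini-batch's forward/backward pass in a real training loop; the main-thread callback is where you would accumulate gradients or losses safely.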