Inference workflows


mmwo...@gmail.com

Oct 3, 2022, 2:01:25 PM
to SpiNNaker Users Group
hi,

I'd like to try porting an inference workflow we have to SpiNNaker. It involves (a) massively parallel simulations and (b) some deep neural network training (PyTorch-based). Having read the SpiNNaker 2 paper, I suspect the individual simulations would each fit on a single chip (they don't benefit from multicore parallelism on regular CPUs), and the deep net could be ported with the snn_toolbox. If those naive assumptions hold, a few questions remain:

(1) For the massively parallel simulations, my intuition on a GPU would be to write the simulations in a batched/vectorized form in e.g. TensorFlow.  Would doing this and then using snn_toolbox be workable or terrible? 
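To make (1) concrete, here is a toy of what I mean by batched/vectorized; the dynamics, sizes and names are placeholders, not our actual model:

    import tensorflow as tf

    # Each row of `state` is one independent simulation, so a single
    # vectorized update advances all of them at once.
    n_sims, n_vars = 4096, 8
    state = tf.random.normal((n_sims, n_vars))
    coupling = tf.random.normal((n_vars, n_vars)) * 0.01

    @tf.function
    def step(state, dt=0.1):
        # Placeholder dynamics: linear coupling plus a nonlinearity.
        return state + dt * (tf.tanh(tf.matmul(state, coupling)) - state)

    for _ in range(1000):
        state = step(state)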

(2) It seems like snn_toolbox doesn't handle training, but if I implemented the gradients myself, could I still run an iterative optimizer?
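For (2), this is roughly the kind of loop I have in mind, with a hand-written gradient (a plain NumPy toy just to show the iteration, not tied to SpiNNaker or snn_toolbox):

    import numpy as np

    def loss(params):
        return np.sum((params - 3.0) ** 2)

    def grad(params):
        # Hand-derived gradient of the toy loss above; in our case it would
        # come from the simulations / network instead.
        return 2.0 * (params - 3.0)

    params = np.zeros(10)
    lr = 0.1
    for _ in range(200):
        params -= lr * grad(params)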

thanks in advance,
Marmaduke

Edward Jones

Oct 10, 2022, 1:40:28 PM
to SpiNNaker Users Group
Hi Marmaduke,

The SNNtoolbox is a rather high-level tool that converts trained feed-forward ANNs to SNNs layer by layer. It sounds like you're looking to map tensors and the operations on them onto SpiNNaker more generically, and that doesn't currently exist.

If you could express your models as a PyNN model running on SpiNNaker, you may be able to take advantage of SpiNNaker's parallelism, but it depends how tightly coupled that inference needs to be with the PyTorch side (e.g. if you alternate between (a) and (b), latency could be an issue).

Looking at things more generically, SpiNNaker does have support for mapping computational graphs onto the machine (see the SpiNNaker Graph Front End), which could be another place to start investigating how to leverage SpiNNaker's parallelism for your task, though this would likely require more development work. Can you give a little more detail about what you are trying to run?
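For reference, a minimal PyNN script on SpiNNaker looks roughly like this (a sketch using the sPyNNaker front end; the populations, weights and parameters here are arbitrary examples):

    import pyNN.spiNNaker as sim

    sim.setup(timestep=1.0)

    # Two small populations: Poisson input driving LIF neurons one-to-one.
    stim = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0), label="stim")
    pop = sim.Population(100, sim.IF_curr_exp(), label="pop")
    sim.Projection(stim, pop, sim.OneToOneConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

    pop.record("spikes")
    sim.run(1000.0)  # simulate 1000 ms
    spikes = pop.get_data("spikes")
    sim.end()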

Ed