Speed Up Brian2 Simulations


plat...@gmail.com

Aug 1, 2017, 12:28:52 PM
to Brian Development
Hi,
Any suggestions for (dramatically) improving brian2 simulations running time? (without standalone code or GPU programming)

I am simulating a lot (> 1000) independent stochastic LIF neurons (considered as trials) for a long period of time (> 10 sec). They fire like Poisson neurons (very sparsely, ~3Hz), that's why I need long simulations to record enough spikes.

Thanks,
Jonathan

Marcel Stimberg

Aug 17, 2017, 12:50:07 PM
to Brian Development, plat...@gmail.com

Hi Jonathan,


> Any suggestions for (dramatically) improving brian2 simulations running time? (without standalone code or GPU programming)
>
> I am simulating a lot (> 1000) independent stochastic LIF neurons (considered as trials) for a long period of time (> 10 sec). They fire like Poisson neurons (very sparsely, ~3Hz), that's why I need long simulations to record enough spikes.

How slow is your simulation? Is your model straightforward or do you use any non-standard model elements (e.g. a `NetworkOperation`)? Do you already use the C++ standalone mode (I assume when you say "without standalone code or GPU programming" you mean that you don't want to *write* such code yourself)? Did you try using multiple cores with OpenMP (http://brian2.readthedocs.io/en/stable/user/computation.html#multi-threading-with-openmp)? It might also be useful to print the output of `profiling_summary()` (http://brian2.readthedocs.io/en/stable/reference/brian2.core.network.profiling_summary.html) to figure out where the time is spent.
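For reference, both the code generation target and (in standalone mode) the OpenMP thread count are set via preferences, and profiling is enabled through `run()`. A minimal sketch (the model equations and the thread count are just example values, not your model):

```python
from brian2 import *

# Use compiled Cython code generation in runtime mode
prefs.codegen.target = 'cython'

# In C++ standalone mode, multiple cores can be used via OpenMP:
# set_device('cpp_standalone')
# prefs.devices.cpp_standalone.openmp_threads = 4  # example value

# Example: 1000 independent stochastic LIF neurons (one neuron per trial)
group = NeuronGroup(1000,
                    'dv/dt = -v/(10*ms) + 0.5*xi*(10*ms)**-0.5 : 1',
                    threshold='v > 1', reset='v = 0', method='euler')
mon = SpikeMonitor(group)

run(10*second, profile=True)  # collect per-object timings
print(profiling_summary())    # show where the simulation time is spent
```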

Best,
  Marcel

Racheltx

Oct 5, 2017, 11:47:07 AM
to Brian Development
Jonathan, thanks for the question; I have a similar problem.

I am running networks of 5000 neurons with run times between 10,000 and 30,000 ms, in a loop over increasing c values. So for one simulation, I need to run the loop 30-40 times at least.

I was able to simulate fast two weeks ago. I was even able to run 4 of these simulations at once, in different Spyder windows.

Then suddenly my code stopped running; while Python was running I was not even able to use the web browser or Word. I was getting the Spyder update warnings and updated, thinking that would solve the problem. It did not for a while, then it mostly did, but my simulations are taking longer than before: a 1000 ms run is taking around 58 ms for one loop. I was not measuring while it was fast, but it was much faster. I need to run at least 10,000 ms, and the Python kernel dies several times in the middle of simulations.

So I am trying to understand what causes the problem: is it Anaconda, Spyder, Python, or Brian? Any ideas?

Thanks

Marcel Stimberg

Oct 5, 2017, 12:27:58 PM
to brian-de...@googlegroups.com

Hi,

my reply is similar to the one for Jonathan's question: it's impossible to say how to improve the simulation speed without further details...
One possibility for a dramatic slowdown that affects everything running on the same machine is that you ran out of memory; this can also kill the Python process. A typical candidate is recording variables at every time step, in particular synaptic variables, since in most simulations there are many more synapses than neurons.

As a rough calculation, you can assume that recordings take 8 bytes per time step per variable per neuron/synapse. For example, if you use the standard time step of 0.1 ms and record a single variable (e.g. the membrane potential) from 5000 neurons for 30 seconds, you'll need about 11 GB of RAM just for this recording.
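That rough estimate is easy to reproduce with a few lines of plain Python (no Brian needed): 8 bytes per recorded value, times the number of time steps, times the number of neurons and variables.

```python
# Estimate memory used by a StateMonitor-style recording:
# 8 bytes per time step per variable per neuron
dt = 0.1e-3          # default time step: 0.1 ms, in seconds
duration = 30.0      # simulated time in seconds
n_neurons = 5000     # number of recorded neurons
n_variables = 1      # e.g. just the membrane potential

steps = duration / dt
total_bytes = 8 * steps * n_neurons * n_variables
print(f"{total_bytes / 2**30:.1f} GiB")  # about 11 GiB, matching the estimate above
```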

Best,
  Marcel

Racheltx

Oct 17, 2017, 11:00:07 AM
to Brian Development

Hi Marcel, I did not know one simulation would use that much memory. I was running around 30 loops for 11 different T values, which makes around 300 simulations at once, so the memory use is huge. I will split them up. Thanks
