Memory error


Nov 10, 2020, 2:48:42 AM
to ProjectMesa
I use the BatchRunner and DataCollector to collect a lot of data, and it may reach 1 GB if the number of runs is increased. I got a memory error. Any suggestions?


Tom Pike

Nov 16, 2020, 3:14:09 PM
to ProjectMesa
You need to save the data and clear it. Depending on how big each model run is, you can run a whole model and then save the data at the end of each run (since you are generating the data yourself, pickle works well for saving the data collector objects, or you can save them as a .csv with pandas). You should then clear out the data collector to finish the model run; otherwise I'm pretty sure the batch runner will continue to store the data for each run (effectively the batch runner will then return an empty dataframe). Or, if each run is itself too large, you can follow the same process every certain number of steps and then clear out the data collector.
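As a concrete illustration of the save-then-clear pattern described above, here is a minimal sketch. The `model_vars` dict stands in for the storage a Mesa DataCollector keeps internally in `.model_vars`; the `end_of_run` helper and the file names are made up for the example.

```python
import pickle
import pandas as pd

def end_of_run(model_vars, run_id):
    """Save collected data to disk, then clear it so memory is freed.

    model_vars: dict of {variable name: list of per-step values},
    the same shape a Mesa DataCollector keeps in .model_vars.
    """
    # Option 1: pickle the raw collected object
    with open(f"run_{run_id}.pkl", "wb") as f:
        pickle.dump(model_vars, f)

    # Option 2: save as a .csv via pandas
    pd.DataFrame(model_vars).to_csv(f"run_{run_id}.csv", index=False)

    # Clear the collector's storage so the batch runner does not
    # keep accumulating every run's data in memory
    model_vars.clear()

# Example: two collected variables over three steps
collected = {"wealth": [10, 12, 15], "agents": [50, 50, 49]}
end_of_run(collected, run_id=0)
print(collected)  # -> {} : the data is on disk, the memory is freed
```

In a real model you would call something like this at the end of each run (or every N steps, as suggested above), passing in the data collector's own storage.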

Hope this helps.

Nov 19, 2020, 7:03:42 AM
to ProjectMesa
Thanks a lot. Shall I change the …, or is there another alternative?


Tom Pike

Nov 19, 2020, 7:52:54 AM
to ProjectMesa
I would think the easiest alternative is, at the end of your model run, to save the data collector and then clear it.

So at the end of a model run something like:

data = datacollector.get_model_vars_dataframe()
datacollector.model_vars = {}

The same would work for the agent records too.

The specifics will matter here, along with what you want to do with the data, but hopefully this gives an example.
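Once each run's dataframe has been saved off to disk, the per-run files can be recombined later for analysis. A sketch, assuming runs were saved as numbered .csv files (the `runA_` file prefix and the column names are illustrative, not Mesa's):

```python
import glob
import pandas as pd

# Write two small per-run files to illustrate (stand-ins for real runs)
pd.DataFrame({"step": [0, 1], "wealth": [10, 12]}).to_csv("runA_0.csv", index=False)
pd.DataFrame({"step": [0, 1], "wealth": [9, 14]}).to_csv("runA_1.csv", index=False)

# Recombine: read each saved run back in and tag it with its source file
frames = []
for path in sorted(glob.glob("runA_*.csv")):
    df = pd.read_csv(path)
    df["run"] = path  # keep track of which run each row came from
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)
print(combined.shape)  # two runs of two steps each -> 4 rows
```

Because only one file is loaded at a time before concatenation, peak memory stays far below holding every run's data in a single in-memory collector.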


nada gh

Nov 19, 2020, 9:41:27 AM
to Tom Pike, ProjectMesa
If I use batch_run.run_all() it will run all the iterations at once!
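One workaround, if clearing inside the model is not enough, is to break the iterations into chunks yourself and save after each chunk instead of calling `run_all()` once over everything. This is a generic sketch: `run_one_iteration` is a hypothetical stand-in for whatever launches a single model run and returns its collected dataframe; with Mesa's BatchRunner you would instead construct a runner with a smaller `iterations` count per chunk.

```python
import pandas as pd

def run_one_iteration(i):
    """Hypothetical stand-in for one model run; returns its collected data."""
    return pd.DataFrame({"iteration": [i], "result": [i * 2]})

total_iterations = 10
chunk_size = 4

for start in range(0, total_iterations, chunk_size):
    stop = min(start + chunk_size, total_iterations)
    chunk = [run_one_iteration(i) for i in range(start, stop)]
    # Save this chunk to disk, then let it be garbage-collected
    pd.concat(chunk, ignore_index=True).to_csv(f"chunk_{start}.csv", index=False)

# Only one chunk's worth of data is ever held in memory at a time
```

The chunk files can then be concatenated back together for analysis, as in the earlier message, at the cost of a little bookkeeping.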
