I'm using Sage scripts to run tasks in batch. They look like:
def dostuff(X):
    result = [X]  # plus irrelevant calculations
    return result

print dostuff([1,2,3,4,n])
This has worked fine for all recent data sets, with input array sizes of 1.0-2.4M records. However, Sage crashes with a segmentation fault when larger data sets (3.5M-10.0M records) are used. Am I using Sage in a terribly wrong way, have I reached some hard limit (so the crash is to be expected), or does this look like a bug? Thanks for your help.
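In case the crash turns out to be memory-related, one workaround I could try is feeding the list to the function in slices instead of all at once. A minimal sketch (the `chunk_size` value and the `dostuff_chunked` helper are just guesses, not part of my actual script):

```python
def dostuff(X):
    result = [X]  # plus irrelevant calculations
    return result

def dostuff_chunked(X, chunk_size=500000):
    # Process the input in fixed-size slices to keep peak memory lower;
    # collects the per-chunk results into one list.
    results = []
    for i in range(0, len(X), chunk_size):
        results.extend(dostuff(X[i:i + chunk_size]))
    return results
```

Whether this actually avoids the segfault depends on where the memory is going, of course.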
Cheers,
Jeroen
Ubuntu 12.04 LTS x64 on Intel Xeon E3 with 24GB RAM
Sage Version 6.1, Release Date: 2014-01-30 (GIT)