Segmentation fault when using a very large input array

Jeroen

Feb 4, 2014, 12:07:58 PM2/4/14
to sage-s...@googlegroups.com
Hi,

I'm using Sage scripts to run tasks in batch. They look like:

def dostuff(X):
    result = [X]  # plus irrelevant calculations
    return result

print dostuff([1,2,3,4,n])

This worked fine for all recent data sets with input arrays of 1.0 - 2.4M records. However, Sage crashes with a segmentation fault when larger data sets (3.5M - 10.0M records) are used. Is the way I'm using Sage terribly wrong, or have I reached some hard limit that makes a crash expected? Or does this seem to be a bug? Thanks for your help.


Cheers,

Jeroen

Ubuntu 12.04 LTS x64 on Intel Xeon E3 with 24GB RAM
Sage Version 6.1, Release Date: 2014-01-30 (GIT)

Volker Braun

Feb 5, 2014, 10:46:21 AM2/5/14
to sage-s...@googlegroups.com
You need to post working code that demonstrates your problem, along with the error message, if you want to get help. There is no built-in limitation on list length, but you might be exceeding your computer's RAM. Check "ulimit -a".
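(Not from the thread, just a sketch of the check suggested above: on a POSIX system the same limits "ulimit -a" reports can be read from Python's standard `resource` module, and `sys.getsizeof` gives a rough idea of what a multi-million-element list costs in memory. The element count and cost model here are illustrative assumptions, not the poster's actual data.)

```python
import resource
import sys

# The soft/hard address-space and stack limits, as "ulimit -a" would show them.
# RLIM_INFINITY means "unlimited".
soft_as, hard_as = resource.getrlimit(resource.RLIMIT_AS)
soft_stack, hard_stack = resource.getrlimit(resource.RLIMIT_STACK)
print("address space (soft):",
      "unlimited" if soft_as == resource.RLIM_INFINITY else soft_as)
print("stack size (soft):",
      "unlimited" if soft_stack == resource.RLIM_INFINITY else soft_stack)

# Very rough lower bound for a 10-million-element list of small ints:
# the list's pointer array plus one int object per element.
n = 10000000
approx_bytes = sys.getsizeof([None] * n) + n * sys.getsizeof(0)
print("roughly %.2f GB for %d ints" % (approx_bytes / 1e9, n))
```

If the estimate approaches the soft address-space limit (or physical RAM), an out-of-memory failure rather than a Sage bug is the likely explanation.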