[NiBabel] Loading large 4D NIfTI file -> MemoryError

Kai Schlamp

May 14, 2013, 8:27:20 AM
to nipy...@googlegroups.com
Hello everyone,

I am trying to load a large 4D NIfTI file (an fMRI EPI time series from the Human Connectome Project, about 1 GB) with NiBabel.

import nibabel as nib
img = nib.load("fmri.nii.gz")
data = img.get_data()

The program crashes in "img.get_data()" with a "MemoryError" (my machine has 6 GB of RAM).

Is there a way to fetch only part of the data from the image, for example:

img.get_data(3)  # hypothetical, for the 3rd time point

Any suggestions? Or do I really have to split the 4D volume into multiple 3D volumes (in my case 1200 time points, i.e. 1200 separate 3D images)?

Best regards,
Kai
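
NiBabel's array proxy can do something close to this. A minimal sketch, assuming NiBabel 2.1 or newer (which added proxy slicing; it was not available in the releases current at the time of this thread):

import nibabel as nib

img = nib.load("fmri.nii.gz")

# Slicing the array proxy reads just one 3D volume from disk
# rather than loading the full 4D time series into memory.
vol = img.dataobj[..., 3]   # 3D volume at time index 3 (0-based)
print(vol.shape)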

Yaroslav Halchenko

May 14, 2013, 8:40:29 AM
to nipy...@googlegroups.com
Before the gurus wake up - FWIW, if it were an uncompressed NIfTI
(fmri.nii), it would (IIRC) automatically be memory-mapped from the
file, so there should be no such huge memory penalty right away.
--
Yaroslav O. Halchenko, Ph.D.
http://neuro.debian.net http://www.pymvpa.org http://www.fail2ban.org
Senior Research Associate, Psychological and Brain Sciences Dept.
Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419
WWW: http://www.linkedin.com/in/yarik
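
To make this concrete: decompress the file once and load the resulting .nii. A rough sketch, assuming the on-disk data carry no scl_slope/scl_inter scaling (scaled data would still be expanded into a full in-memory array on access):

import gzip
import shutil

import nibabel as nib

# Decompress once; NiBabel memory-maps uncompressed .nii files by default.
with gzip.open("fmri.nii.gz", "rb") as f_in, open("fmri.nii", "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

img = nib.load("fmri.nii")
data = img.get_data()  # a numpy memmap; voxels are paged in from disk on access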

Kai Schlamp

May 14, 2013, 8:50:52 AM
to nipy...@googlegroups.com
Thanks a lot! After loading the extracted (uncompressed) NIfTI file with NiBabel, everything works perfectly (and lightning fast).

bbf

Sep 8, 2014, 12:00:21 PM
to nipy...@googlegroups.com
Hi - I just ran into the same problem, and was wondering whether a more thorough solution had appeared in the last year or so. I notice that the HCP file decompresses to a .nii file that is just a tiny bit bigger than 4 GB, which makes me suspicious that something is not completely 64-bit clean, either in nibabel or in 64-bit Anaconda Python (my system has 32 GB of RAM, so it should not have a problem with this). Oh, and this is on a RHEL6 cluster.

Blaise
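
One thing worth ruling out before suspecting a 32-bit offset bug: the in-memory array can be several times larger than the ~4 GB file on disk, because get_data() applies the NIfTI scaling and may promote the voxels to float64. A quick diagnostic sketch (the shape and dtype in the comments are only examples):

import numpy as np
import nibabel as nib

img = nib.load("fmri.nii.gz")
print(img.shape)             # e.g. (91, 109, 91, 1200)
print(img.get_data_dtype())  # on-disk dtype, e.g. int16 or float32

# Worst case: voxels promoted to float64 once scaling is applied.
n_bytes = np.prod(img.shape) * np.dtype(np.float64).itemsize
print("approx. in-memory size: %.1f GB" % (n_bytes / 1e9))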