errors running large data files

Sarah W Bottjer

Aug 15, 2025, 6:05:54 PM
to Open Ephys

We are using a miniamp-64 to collect 64 channels of data, saved as 1-hour files (~16 GB each) for analysis.

When we try to run a MATLAB script to merge 4 of these large data files on a computer with 64 GB of RAM, the program gives an "Out of memory" error. When we run the same files on a 128 GB machine, the program crashes. We are able to merge two 1-hour files without error on the 128 GB machine (but not on the 64 GB one).

If we run a file made by merging two recordings through Kilosort4 on the 128 GB computer, the program works with the default batch size of 60000, but VERY slowly. If we increase the batch size, we get an "Out of memory" error.
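For reference, we launch KS4 roughly like this (a simplified sketch; the data file and probe file names below are placeholders for our actual setup):

    from kilosort import run_kilosort
    from kilosort.io import load_probe

    settings = {
        "n_chan_bin": 64,     # 64 channels in the binary file
        "batch_size": 60000,  # KS4 default; raising this is what triggers the OOM
    }
    probe = load_probe("our_probe.prb")  # placeholder probe/channel-map file
    run_kilosort(settings=settings, probe=probe, filename="merged.dat")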

How can we avoid these errors and make KS4 run faster?

Josh Siegle

Aug 19, 2025, 6:29:30 PM
to Sarah W Bottjer, Open Ephys
Hi Sarah,

I would recommend using SpikeInterface to concatenate these files virtually.

You can create a SpikeInterface Recording object that includes multiple "segments," each of which points to a different underlying data file. You can then ask it to perform spike sorting across all of the segments without ever actually merging the files on disk.
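Something along these lines should work (a sketch, not a tested recipe; the folder names are placeholders, and I'm assuming the recordings were saved by the Open Ephys GUI and that you have a recent SpikeInterface plus Kilosort4 installed):

    import spikeinterface.full as si

    # Lazily open each 1-hour recording; nothing is read into RAM yet.
    folders = ["rec1", "rec2", "rec3", "rec4"]  # placeholder session folders
    recordings = [si.read_openephys(f) for f in folders]

    # Combine them into one Recording with one segment per file;
    # no merged copy is ever written to disk.
    recording = si.append_recordings(recordings)

    # If a sorter needs a single continuous segment, use
    # si.concatenate_recordings(recordings) instead; it is still a virtual view.

    # Run Kilosort4 across everything; sorter parameters such as
    # batch_size can be passed as keyword arguments.
    sorting = si.run_sorter("kilosort4", recording, folder="ks4_output",
                            batch_size=60000)

Because the Recording streams data in batches, memory use stays flat no matter how many files you chain together.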

This tutorial explains how to create a Recording with multiple segments: 


Hope that helps!

Josh

