writing hdf5 files with Neo


Luke Yuri Prince

Jun 1, 2015, 10:04:58 AM6/1/15
to neurale...@googlegroups.com
Hi,

I'm new to using Neo so this may be a silly question.

I'm trying to create an HDF5 file that is a compilation of recordings from Signal and Spike2. I have converted these data to AnalogSignalArrays and I'm trying to save these objects as an .h5 file. However, the resulting file won't open and produces a NameError. The file won't open in hdfview either.

Sample Code

import numpy as np
import stfio
from neo.io import Spike2IO, NeoHdf5IO
from neo.core import Block, Segment, AnalogSignalArray
from quantities import kHz, pA

# Signal (CFS) data
signal_data = stfio.read('path_to_file', ftype='cfs')
signal_data = np.array(signal_data[0])

blk = Block()

seg = Segment(name='pre-drug', index=0)
a = AnalogSignalArray(signal_data * pA, sampling_rate=25 * kHz)
seg.analogsignalarrays.append(a)
blk.segments.append(seg)

# Spike2 data: concatenate the first T seconds of each recording
spike2_data = np.array([])
for filenum in ['000', '001', '002']:
    r = Spike2IO('path_to_file' + filenum + '.smr')
    seg = r.read_segment()
    data = np.squeeze(np.array(seg.analogsignals))
    T = 600       # seconds to keep
    fs = 10000    # sampling frequency in Hz
    gain = 200    # 200 pA/V
    idx = T * fs
    data = data[:idx] * gain
    spike2_data = np.append(spike2_data, data, axis=0)

a = AnalogSignalArray(spike2_data * pA, sampling_rate=10 * kHz)
seg = Segment(name='irregular stimulus', index=2)
seg.analogsignalarrays.append(a)
blk.segments.append(seg)

writer = NeoHdf5IO(filename='data_compilation.h5')
writer.write_block(blk)


If you have any advice on how to solve this, I'd really appreciate it.


Best Wishes,


Luke


Samuel Garcia

Jun 1, 2015, 10:47:38 AM6/1/15
to neurale...@googlegroups.com
Hi,
I do not understand the loop in the middle, but I guess you know what you're doing.
No errors when writing?
Could you send the error message you get when reading?


Samuel
--
You received this message because you are subscribed to the Google Groups "Neural Ensemble" group.
To unsubscribe from this group and stop receiving emails from it, send an email to neuralensembl...@googlegroups.com.
To post to this group, send email to neurale...@googlegroups.com.
Visit this group at http://groups.google.com/group/neuralensemble.
For more options, visit https://groups.google.com/d/optout.

Luke Yuri Prince

Jun 1, 2015, 10:55:51 AM6/1/15
to neurale...@googlegroups.com
/home/luke/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/neo/io/hdf5io.pyc in __init__(self, filename, **kwargs)
    288         self.name_indices = {}
    289         if filename:
--> 290             self.connect(filename=filename)
    291 
    292     def _read_entity(self, path="/", cascade=True, lazy=False):

/home/luke/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/neo/io/hdf5io.pyc in connect(self, filename)
    326                 self.connected = True
    327             except:
--> 328                 raise NameError("Incorrect file path, couldn't find or create a file.")
    329             self.objects_by_ref = {}
    330             self.name_indices = {}

NameError: Incorrect file path, couldn't find or create a file. 

I'm absolutely sure the filepath is correct though.

Luke Yuri Prince

Jun 1, 2015, 11:01:38 AM6/1/15
to neurale...@googlegroups.com
No errors when writing either, sorry.


On Monday, 1 June 2015 15:47:38 UTC+1, sgarcia wrote:

Robert Pröpper

Jun 1, 2015, 2:13:39 PM6/1/15
to neurale...@googlegroups.com
Hi,

how big is the resulting file, does it have a realistic size (or rather something like 0 bytes)? Could you simplify your code, use generated dummy data etc. to generate a minimal test case that produces the error which other people can test easily?

Best,
Robert

Michael Schmuker

Jun 2, 2015, 4:22:52 AM6/2/15
to neurale...@googlegroups.com
Hi,

May I suggest removing lines 327 and 328 in hdf5io.py? Catching an exception, not handling it, and raising another one that carries less information makes no sense.

Debugging will be much easier if the real exception is thrown instead of the useless one generated in the except: block.

Either handle that exception, or don’t catch it.
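To illustrate with a toy example (the connect_* names are made up, and this uses modern Python, where open() raises FileNotFoundError):

```python
def connect_bad(path):
    try:
        f = open(path)
        f.close()
    except Exception:
        # Bare except: the real, informative exception (missing pytables,
        # a permission error, a corrupt file, ...) is swallowed and
        # replaced by a misleading one -- exactly what hdf5io does here.
        raise NameError("Incorrect file path, couldn't find or create a file.")

def connect_good(path):
    # No try/except: the original exception propagates with its own
    # message and traceback.
    f = open(path)
    f.close()

try:
    connect_bad('/no/such/file')
except NameError as e:
    print(e)   # the real cause is gone

try:
    connect_good('/no/such/file')
except FileNotFoundError as e:
    print(e)   # errno, strerror and the offending path all survive
```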

Just my 2 cents…

Best,

Michael

--
Michael Schmuker
University of Sussex
School of Engineering and Informatics
Falmer, Brighton BN1 9QJ

Tel: +44 (0) 1273 876565

http://biomachinelearning.net

Luke Yuri Prince

Jun 2, 2015, 4:56:45 AM6/2/15
to neurale...@googlegroups.com
Sure. This example produces the same error when trying to open it, and the file also can't be opened in HDFView:

import numpy as np
from neo.io import NeoHdf5IO
from neo.core import Block, Segment, AnalogSignalArray
from quantities import kHz, pA

x = np.random.randn(10000, 3)
a = AnalogSignalArray(x * pA, sampling_rate=10 * kHz)

blk = Block()
seg = Segment(name='example')
blk.segments.append(seg)
seg.analogsignalarrays.append(a)

w = NeoHdf5IO(filename='example.h5')
w.write_block(blk)

# Restart kernel, then
from neo.io import NeoHdf5IO
r = NeoHdf5IO(filename='example.h5')

Luke Yuri Prince

Jun 2, 2015, 5:42:29 AM6/2/15
to neurale...@googlegroups.com
Sorry, forgot to reply to the other part. The file does have a reasonable size: 145.8 MB (this is actually nearly three times larger than the sum of the individual Spike2 and Signal files that my script puts together).

On Monday, 1 June 2015 19:13:39 UTC+1, Robert Pröpper wrote:

Luke Yuri Prince

Jun 2, 2015, 6:41:29 AM6/2/15
to neurale...@googlegroups.com
OK. So I went back to check whether I had all of the dependencies, and realised I hadn't installed PyTables. No exception was raised about this before, though. After installing it and checking all of the other dependencies, which all appear to be present, I still have the same problem.

Luke Yuri Prince

Jun 2, 2015, 6:52:16 AM6/2/15
to neurale...@googlegroups.com
Problem solved. The file needed to be explicitly closed after writing so that tables can open it again later. Could you update the doc examples and the error messages to show that this needs to be done?
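For posterity, the same symptom can be reproduced with nothing but the standard library, since many container formats only write their index when the file is closed (nothing Neo-specific here):

```python
import io
import zipfile

buf = io.BytesIO()
zf = zipfile.ZipFile(buf, 'w')
zf.writestr('a.txt', 'hello')

# Before close(), the end-of-central-directory record has not been
# written, so readers do not recognise the archive -- the same way an
# unclosed NeoHdf5IO file cannot be reopened.
print(zipfile.is_zipfile(io.BytesIO(buf.getvalue())))  # False

zf.close()  # writes the index; the data is now a readable archive
print(zipfile.is_zipfile(io.BytesIO(buf.getvalue())))  # True
```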

Best Wishes,

Luke

MCzerwinski

Jul 1, 2015, 8:24:54 AM7/1/15
to neurale...@googlegroups.com
Hi all
I am also starting with Neo and Elephant. I'm trying to move over to them, so I am rewriting my code to work with Neo.
The first thing I'm trying is to use a Neo object in place of a numpy array plus a dictionary for metadata.

I wrote it into a Segment, and I want to save it, for example as an .h5 file. And, obviously, I get a different error.

The code looks like this:
import neo
import mcdIO

S = mcdIO.loadMyDataNEO(filename,fs, metadata) #my loading function creates a healthy neo.Segment
saveName = 'test' # 'test.h5'  #I tried both
iom = neo.NeoHdf5IO(saveName)
iom.save(S)
#iom.write_segment(S) #tried this also
iom.close()

It cannot save correctly. With cascade=False it gives no errors, but it saves only the metadata (and loads only the metadata later).

The segment is created like this:
S = neo.core.Segment()
for key in metadata_dictionary:
    S.annotations[key] = metadata_dictionary[key]
S.analogsignals.append(my_neo_analogsignalN)

Error message:

<ipython-input-81-6b79b9c487fc> in <module>()
----> 1 iom.write_segment(S2, cascade = True)

/home/mczerwinski/pythonhack/neo/io/hdf5io.pyc in _write_entity(self, obj, where, cascade, lazy)
    288         Wrapper for base io "writer" functions.
    289         """
--> 290         self.save(obj, where, cascade, lazy)
    291
    292     #-------------------------------------------

/home/mczerwinski/pythonhack/neo/io/hdf5io.pyc in save(self, obj, where, cascade, lazy)
    491                                 pass
    492                     if child_node is None:
--> 493                         child_node = self.save(child, where=ch._v_pathname)
    494
    495                     if len(child._single_parent_containers) > 1:

/home/mczerwinski/pythonhack/neo/io/hdf5io.pyc in save(self, obj, where, cascade, lazy)
    457                 node._f_setAttr(par_cont, '')
    458         # we checked already obj is compliant, loop over all safely
--> 459         for attr in obj._all_attrs:
    460             if hasattr(obj, attr[0]):  # save an attribute if exists
    461                 assign_attribute(getattr(obj, attr[0]), attr[0], path, node)
AttributeError: 'numpy.ndarray' object has no attribute '_all_attrs'


I tried the last loop manually, and it works:
obj = S
for container in getattr(S, '_child_containers', []):
    for child in getattr(S, container):
        print child._all_attrs

prints a lot of tuples like:
(('signal', <class 'quantities.quantity.Quantity'>, 1), ('sampling_rate', <class 'quantities.quantity.Quantity'>, 0), ('t_start', <class 'quantities.quantity.Quantity'>, 0), ('channel_index', <type 'int'>), ('name', <type 'str'>), ('description', <type 'str'>), ('file_origin', <type 'str'>))

so I have a hard time figuring out why it thinks there is a numpy array where I am almost sure there is a Neo object that has _all_attrs.
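In case it helps someone else, here is the same loop rewritten as a check that reports any child lacking _all_attrs (just a sketch; in my traceback the bad child was a numpy.ndarray):

```python
def find_bad_children(segment):
    # Walk the same child containers that hdf5io's save() loops over and
    # report any child that is not a Neo object (i.e. has no _all_attrs).
    bad = []
    for container in getattr(segment, '_child_containers', []):
        for child in getattr(segment, container, []):
            if not hasattr(child, '_all_attrs'):
                bad.append((container, type(child).__name__))
    return bad
```

If this returns something like [('analogsignals', 'ndarray')], a bare array was appended to the Segment somewhere instead of an AnalogSignal.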


Best wishes,

Michał