HDF5ExtError: Problems creating the EArray


coolco...@gmail.com

Mar 11, 2019, 11:53:04 AM
to Acoular-users
Hi,

I am using Acoular to test different microphone array geometries.
Basically, I changed some code in "Example 2" from the Acoular website: I kept only the moving source from the example and imported new array geometries from ".xml" files.
Last week I was still able to test some new arrays with the modified code.
But today I keep getting the error "HDF5ExtError: Problems creating the EArray". Please see the whole error message below:

---------------------------------------------------------------------------
HDF5ExtError                              Traceback (most recent call last)
<ipython-input-5-d8e9af4c25c5> in <module>()
     21 figure(1,(8,7))
     22 i = 1
---> 23 for res in cacht.result(1):
     24     res0 = res[0].reshape(g.shape)
     25     map2 += res0 # average

~\AppData\Local\Continuum\anaconda3\lib\site-packages\acoular\tprocess.py in result(self, num)
    603             ac = self.h5f.create_earray(self.h5f.root, name, \
    604                                        tables.atom.Float32Atom(), \
--> 605                                        (0, self.numchannels))
    606             ac.set_attr('sample_freq', self.sample_freq)
    607             for data in self.source.result(num):

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\file.py in create_earray(self, where, name, atom, shape, title, filters, expectedrows, chunkshape, byteorder, createparents, obj, track_times)
   1391                        filters=filters, expectedrows=expectedrows,
   1392                        chunkshape=chunkshape, byteorder=byteorder,
-> 1393                        track_times=track_times)
   1394
   1395         if obj is not None:

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\earray.py in __init__(self, parentnode, name, atom, shape, title, filters, expectedrows, chunkshape, byteorder, _log, track_times)
    160         super(EArray, self).__init__(parentnode, name, atom, shape, title,
    161                                      filters, chunkshape, byteorder, _log,
--> 162                                      track_times)
    163
    164     # Public and private methods

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\carray.py in __init__(self, parentnode, name, atom, shape, title, filters, chunkshape, byteorder, _log, track_times)
    220         # The `Array` class is not abstract enough! :(
    221         super(Array, self).__init__(parentnode, name, new, filters,
--> 222                                     byteorder, _log, track_times)
    223
    224     def _g_create(self):

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\leaf.py in __init__(self, parentnode, name, new, filters, byteorder, _log, track_times)
    288
    289
--> 290         super(Leaf, self).__init__(parentnode, name, _log)
    291
    292     def __len__(self):

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\node.py in __init__(self, parentnode, name, _log)
    264             # Create or open the node and get its object ID.
    265             if new:
--> 266                 self._v_objectid = self._g_create()
    267             else:
    268                 self._v_objectid = self._g_open()

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\earray.py in _g_create(self)
    182
    183         # Finish the common part of the creation process
--> 184         return self._g_create_common(self._v_expectedrows)
    185
    186     def _check_shape_append(self, nparr):

~\AppData\Local\Continuum\anaconda3\lib\site-packages\tables\carray.py in _g_create_common(self, expectedrows)
    250             # needed for setting attributes in some descendants later
    251             # on
--> 252             self._v_objectid = self._create_carray(self._v_new_title)
    253         except:  # XXX
    254             # Problems creating the Array on disk. Close node and re-raise.

tables\hdf5extension.pyx in tables.hdf5extension.Array._create_carray()

HDF5ExtError: Problems creating the EArray.



Some basic information:
pytables version 3.4.3
hdf5     version 1.10.1
h5py     version 2.7.1
openssl  version 1.0.2r
acoular  version 19.02
Windows 7
Python 3.6.4 (Anaconda3 5.1.0, 64-bit)

Troubleshooting tried:
1. Some suspect that there is not enough storage space, especially since it worked fine last week and this problem only started today. However, there is enough space on my hard drives.
2. Another possible reason, according to an earlier post, is "PyTables does not support HDF5 1.10.0 yet" (https://groups.google.com/forum/#!search/HDF5ExtError$20creating$20array|sort:date/pytables-users/Xu8EFCiOFYc/3XhveKPVAwAJ). But Anaconda won't allow me to downgrade hdf5 back to 1.8.17.
3. I tested the very same code on another laptop, which I had barely worked with before, and it ran well.
4. I uninstalled and reinstalled Anaconda and Acoular, but the error remains.
5. Now I get the same error when running the original "Example 2" from the website. Running "Example 1" succeeds.

Has anyone seen the same error? Any ideas?
Many thanks in advance!



Acoular-users

Mar 11, 2019, 12:06:41 PM
to Acoular-users
The problem is with the cache file. Cache files may cause problems in subsequent runs if they were not properly closed, e.g. during a crash.
Either don't use cache files, or simply delete the cache file(s).
These files can be found in the ./cache folder.
Hope this helps. Please report back if the problem persists.

coolco...@gmail.com

Mar 11, 2019, 12:51:14 PM
to Acoular-users
That sounds like a solution.
I looked for the ./cache folder but couldn't locate it. Could you please describe more exactly where the cache files can be found?
Thanks in advance!

Acoular-users

Mar 11, 2019, 1:44:14 PM
to Acoular-users
It sounds like you are working under some Windows OS, where this folder is somewhat hidden in the personal temp path.

You can get the full path from the cache_dir variable in acoular:

> from acoular import cache_dir
> print(cache_dir)
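Building on that, here is a minimal sketch of clearing the cache programmatically. The clear_cache helper is illustrative, not part of Acoular, and the '*.h5' pattern is an assumption about how the cache files are named:

```python
import glob
import os

def clear_cache(cache_dir):
    """Delete all HDF5 files in cache_dir and return the removed paths."""
    removed = []
    # assumption: Acoular's cache files carry a '.h5' extension
    for path in glob.glob(os.path.join(cache_dir, '*.h5')):
        os.remove(path)
        removed.append(path)
    return removed

# With Acoular you would pass its cache directory:
# from acoular import cache_dir
# clear_cache(cache_dir)
```

The deleted files are simply recreated on the next run, so nothing is lost besides the cached intermediate results.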


coolco...@gmail.com

Mar 12, 2019, 3:48:33 AM
to Acoular-users
With the help of that code I found the cache folder and deleted all cache files. Now it works well again.
Thank you very much!