Bug?: Can't get data from dataset


Jesse Hopkins

Nov 18, 2016, 12:32:36 PM
to h5py
Hello h5py mailing list,

I'm trying to read data from the attached HDF5 file (for those interested, it's generated by an X-ray detector called an Eiger 1M, made by Dectris; Google will get you the rest of the way). It should be pretty straightforward. I did a little noodling around, found the right keys, and managed to get a dataset of the size I expected. The problem is, I get an error when I try to retrieve data from it.

Here's the code I used to try to read the attached file:

import h5py


fname = 'align2_013_97_data_000002.h5'
my_file = h5py.File(fname, 'r')  # open read-only
dataset = my_file['entry']['data']['data']

print dataset

my_data = dataset[0]  # this is the line that raises the IOError below

my_file.close()


Here's the output when I ran it in ipython:

In [1]: run load_test.py
Summary of the h5py configuration
---------------------------------

h5py    2.6.0
HDF5    1.8.13
Python  2.7.11 | 64-bit | (default, Jun 11 2016, 03:41:56) 
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)]
sys.platform    darwin
sys.maxsize     9223372036854775807
numpy   1.10.4

<HDF5 dataset "data": shape (1, 1065, 1030), type "<u4">
---------------------------------------------------------------------------
IOError                                   Traceback (most recent call last)
/Users/jbh246/Desktop/fabio/load_test.py in <module>()
     64 print dataset
     65 
---> 66 my_data = dataset[0]
     67 
     68 my_file.close()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2687)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2645)()

/Users/jbh246/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/h5py/_hl/dataset.pyc in __getitem__(self, args)
    480         mspace = h5s.create_simple(mshape)
    481         fspace = selection.id
--> 482         self.id.read(mspace, fspace, arr, mtype, dxpl=self._dxpl)
    483 
    484         # Patch up the output for NumPy

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2687)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_objects.c:2645)()

h5py/h5d.pyx in h5py.h5d.DatasetID.read (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/h5d.c:3231)()

h5py/_proxy.pyx in h5py._proxy.dset_rw (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_proxy.c:1860)()

h5py/_proxy.pyx in h5py._proxy.H5PY_H5Dread (/Users/travis/build/MacPython/h5py-wheels/h5py/h5py/_proxy.c:1508)()

IOError: Can't read data (Can't open directory)


Is this a bug? Is it a problem with the data file? Am I just missing something really fundamental in my implementation?


Thanks!

- Jesse
align2_013_97_data_000002.h5

Osborn, Raymond

Nov 18, 2016, 1:07:51 PM
to h5...@googlegroups.com
It is structured like a NeXus file, and I can open it with the nexusformat module (see http://nexpy.github.io/nexpy/), but I get the same error message when accessing the data array itself. I believe that, for performance reasons, Dectris wanted to store large arrays in multiple files, similar to the HDF5 virtual data set (VDS), but without waiting for VDS to be officially released. They are not using the standard external link mechanism, so they may be using low-level routines to split the data, which h5py can't handle. I'll contact a couple of people who might know how they are doing this.
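For reference, this is roughly how I opened it (a sketch; nxload is nexusformat's usual entry point):

from nexusformat.nexus import nxload

root = nxload('align2_013_97_data_000002.h5')
print(root.tree)                 # the NeXus structure is browsable
frame = root.entry.data.data[0]  # but reading the array fails the same way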

Ray


-- 
Ray Osborn, Senior Scientist
Materials Science Division
Argonne National Laboratory
Argonne, IL 60439, USA
Phone: +1 (630) 252-9011
Email: ROs...@anl.gov


Jérôme Kieffer

Nov 18, 2016, 3:47:07 PM
to h5...@googlegroups.com
On Fri, 18 Nov 2016 09:32:36 -0800 (PST)
Jesse Hopkins <jesse.b...@gmail.com> wrote:

> I'm trying to read data from the attached hdf5 file (for those interested,
> it's generated by an X-ray detector called an Eiger 1M, made by Dectris.

You need to have the LZ4 or the LZ4+bitshuffle filter plugin (depending on the
firmware used on the Eiger detector) installed on your computer
to be able to read such data.

Those plugins are available at:
https://github.com/silx-kit/hdf5plugin

(or via the links on that page)
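Once a plugin package is installed, the original script should work unchanged. A minimal sketch (assuming hdf5plugin only needs to be imported before the data is read, so it can add its filters to HDF5_PLUGIN_PATH):

import hdf5plugin  # import before reading: it appends the plugin directory to HDF5_PLUGIN_PATH
import h5py

with h5py.File('align2_013_97_data_000002.h5', 'r') as f:
    frame = f['entry/data/data'][0]  # now decompressed through the LZ4/bitshuffle filter
    print(frame.shape)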

Cheers,

Jerome

Jesse Hopkins

Nov 18, 2016, 3:51:12 PM
to h5...@googlegroups.com
That did the trick, thanks!



Jesse Hopkins

Nov 28, 2016, 4:02:24 PM
to h5...@googlegroups.com
Hey, I'd like to reopen this discussion. I'm now trying to do the same thing on Linux (Scientific Linux 6) and getting essentially the same message. The trick with hdf5plugin didn't work (as expected, since the pip documentation says it's only for Windows and Mac).

I did make sure the lz4 library is installed, but I'm not sure how to install the bitshuffle library.

Any help is appreciated.

Thanks!

- Jesse

Error message is below:

  File "/home/jhopkins/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/fabio/eigerimage.py", line 150, in read
    self.data = self.dataset[0][self.currentframe, :, :]
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_objects.c:2687)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_objects.c:2645)
  File "/home/jhopkins/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 482, in __getitem__

    self.id.read(mspace, fspace, arr, mtype, dxpl=self._dxpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_objects.c:2687)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_objects.c:2645)
  File "h5py/h5d.pyx", line 181, in h5py.h5d.DatasetID.read (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/h5d.c:3231)
  File "h5py/_proxy.pyx", line 130, in h5py._proxy.dset_rw (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_proxy.c:1860)
  File "h5py/_proxy.pyx", line 84, in h5py._proxy.H5PY_H5Dread (/home/vagrant/pisi/tmp/h5py-2.6.0-2/work/h5py-2.6.0/h5py/_proxy.c:1508)
IOError: Can't read data (Can't open directory)

Jérôme Kieffer

Nov 29, 2016, 1:42:15 AM
to h5...@googlegroups.com
On Mon, 28 Nov 2016 16:02:22 -0500
Jesse Hopkins <jesse.b...@gmail.com> wrote:

> Hey, I'd like to reopen this discussion. I'm now trying to do the same
> thing on linux (Scientific Linux 6), and getting essentially the same
> message. The trick with hdf5plugin didn't work (as expected, since
> the pip documentation says that's only for windows and mac).
>
> I did make sure the lz4 library is installed, not sure how to install
> the bitshuffle library.

There is a repository on GitHub named "bitshuffle"; it is a Python
project and builds straight away on Linux (which explains why we did
not package hdf5plugin for Linux).

After the build, copy the resulting .so to the HDF5 plugin directory.
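Roughly like this (a sketch based on the bitshuffle README; the setup.py flags and plugin directory may differ on your system):

git clone https://github.com/kiyo-masui/bitshuffle.git
cd bitshuffle
python setup.py install --h5plugin --h5plugin-dir=/usr/local/hdf5/lib/plugin
export HDF5_PLUGIN_PATH=/usr/local/hdf5/lib/plugin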

Cheers,

Jerome

Jesse Hopkins

Nov 29, 2016, 1:47:35 PM
to h5...@googlegroups.com
So I got that built and set the HDF5_PLUGIN_PATH variable, and now I'm getting a new set of errors: something about the filter not being registered. Thoughts?

As always, thanks for the help.


  File "/home/jhopkins/raw/SASFileIO.py", line 258, in loadFabio
    fabio_img = fabio.open(filename)
  File "/home/jhopkins/miniconda2/lib/python2.7/site-packages/fabio/openimage.py", line 154, in openimage
    obj = obj.read(obj.filename, frame)
  File "/home/jhopkins/miniconda2/lib/python2.7/site-packages/fabio/eigerimage.py", line 150, in read

    self.data = self.dataset[0][self.currentframe, :, :]
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2696)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2654)
  File "/home/jhopkins/miniconda2/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 482, in __getitem__

    self.id.read(mspace, fspace, arr, mtype, dxpl=self._dxpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2696)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2654)
  File "h5py/h5d.pyx", line 181, in h5py.h5d.DatasetID.read (/home/ilan/minonda/conda-bld/work/h5py/h5d.c:3240)
  File "h5py/_proxy.pyx", line 130, in h5py._proxy.dset_rw (/home/ilan/minonda/conda-bld/work/h5py/_proxy.c:1869)
  File "h5py/_proxy.pyx", line 84, in h5py._proxy.H5PY_H5Dread (/home/ilan/minonda/conda-bld/work/h5py/_proxy.c:1517)
IOError: Can't read data (Required filter 'hdf5 lz4 filter; see http://www.hdfgroup.org/services/contributions.html' is not registered)




Jérôme Kieffer

Nov 29, 2016, 3:30:57 PM
to h5...@googlegroups.com
On Tue, 29 Nov 2016 13:47:34 -0500
Jesse Hopkins <jesse.b...@gmail.com> wrote:

> So I got that built and set the HDF5_PLUGIN_PATH variable, and now I'm
> getting a new set of errors. Something about the filter not being
> registered. Thoughts?

Apparently it is the "lz4" plugin (filter code 32004) that is needed, and you
installed the "bitshuffle (+lz4)" plugin, whose code is 32008.
They are all described here:
https://support.hdfgroup.org/services/filters.html

This probably means the plugin (filter) mechanism within HDF5 is
working, but the right plugin is not found.
Dectris changed the firmware of their detectors this year from LZ4
(i.e. 32004) to bitshuffle (i.e. 32008), so you need to have both
plugins installed.
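If in doubt, you can ask h5py which filter a given dataset requires, via the low-level API (a minimal sketch):

import h5py

with h5py.File('align2_013_97_data_000002.h5', 'r') as f:
    plist = f['entry/data/data'].id.get_create_plist()
    for i in range(plist.get_nfilters()):
        # each entry is (filter code, flags, cd_values, name);
        # 32004 = lz4, 32008 = bitshuffle (+lz4)
        print(plist.get_filter(i))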

For Debian 8 systems, the silx team provides packages (based on the work
of Eugen Wintersberger at DESY):
http://www.silx.org/pub/debian/binary/hdf5-plugin-lz4_0.2.0-1_amd64.deb
http://www.silx.org/pub/debian/binary/hdf5-plugin-bitshuffle_0.2.4~bpo8-1_amd64.deb

If you use another Linux system running on an x86_64 computer, you can unpack
the libraries from those Debian packages.

The NeXus folks also have a source repository:
https://github.com/nexusformat/HDF5-External-Filter-Plugins/tree/master/LZ4

HTH,

Jerome

cedric.dece...@gmail.com

Nov 30, 2016, 4:27:23 AM
to h5py
Hello,

I'm also trying to read data from an HDF5 file on Linux and have the same kind of issue.
I can access the dataset's shape and dtype, but I can't convert the dataset into a NumPy array.

Here are my settings:
h5py    2.6.0
HDF5    1.8.17
Python  2.7.12 |Anaconda 4.0.0 (64-bit)| (default, Jul  2 2016, 17:42:40) 
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)]
sys.platform    linux2
sys.maxsize     9223372036854775807
numpy   1.11.2

The function I use is shown below (imports added for completeness; NODATA is a module-level fill value whose actual value is not shown here):

import h5py
import numpy as np

NODATA = -9999.0  # assumed fill value; not shown in the original post

# @brief: Load data band from HDF5 file
# @param filename      (string)     filename
# @param group         (string)     path of the dataset within the HDF5 file
# @param dataset       (string)     name of the dataset
def loadBand(filename, group, dataset):
    with h5py.File(filename, "r") as f:
        d = f[group + dataset]
        size = d.shape
        if len(size) == 1:
            size = size + (1,)
        # apply scaling factor + offset
        if dataset != 'SM':
            scale = float(d.attrs['SCALE'])
            offset = float(d.attrs['OFFSET'])
            no_data = float(d.attrs['NO_DATA'])
            d = np.array(d, dtype='float32')  # first actual read of the data
            nodata = (d == no_data)
            d[nodata] = NODATA
        else:
            scale = int(d.attrs['SCALE'])
            offset = int(d.attrs['OFFSET'])
            no_data = int(d.attrs['NO_DATA'])
            d = np.array(d)  # first actual read of the data -- crashes here
            nodata = (d == no_data)
        d[~nodata] = (d[~nodata] - offset) / scale
        d = np.reshape(d, (size[0] * size[1]))
    return d

It crashes at the d = np.array(d) step.
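A minimal reproduction, stripped of the scaling logic (file and dataset names are placeholders):

import h5py

with h5py.File('input.h5', 'r') as f:
    d = f['/group/SM']
    print(d.shape, d.dtype)  # metadata reads fine
    arr = d[...]             # converting to an array raises the IOError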

Logs are the following:
Traceback (most recent call last):
  File "/home/decesarec/workspaces/wf_p_geo3.1_test/bin/stepA.py", line 422, in <module>
    [sm, b2, b3, vza, sza, saa, vaa, size] = stepACommonIO.loadS1TOC(globalParameters)
  File "/home/decesarec/workspaces/wf_p_geo3.1_test/bin/stepACommonIO.py", line 65, in loadS1TOC
    sm  = loadBand(inputFilename, globalParameters['smGroup'], 'SM')
  File "/home/decesarec/workspaces/wf_p_geo3.1_test/bin/stepACommonIO.py", line 45, in loadBand
    d = np.array(d)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2696)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2654)
  File "/home/decesarec/anaconda2/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 678, in __array__
    self.read_direct(arr)
  File "/home/decesarec/anaconda2/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 641, in read_direct
    self.id.read(mspace, fspace, dest, dxpl=self._dxpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2696)
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/work/h5py/_objects.c:2654)
  File "h5py/h5d.pyx", line 181, in h5py.h5d.DatasetID.read (/home/ilan/minonda/conda-bld/work/h5py/h5d.c:3240)
  File "h5py/_proxy.pyx", line 130, in h5py._proxy.dset_rw (/home/ilan/minonda/conda-bld/work/h5py/_proxy.c:1869)
  File "h5py/_proxy.pyx", line 84, in h5py._proxy.H5PY_H5Dread (/home/ilan/minonda/conda-bld/work/h5py/_proxy.c:1517)
IOError: Can't read data (Can't open directory)

Unlike Jesse, I don't see any reference to an issue with the LZ4 library in the logs. I installed it anyway using conda, but it didn't change anything.
Could someone please help me?

Many thanks,

Cedric

Jerome Kieffer

Nov 30, 2016, 5:18:08 AM
to h5...@googlegroups.com
Hi,

Your IOError refers to the fact that the directory containing the plugins/filters does not exist.
Once the directory is created, you should instead get the error message about the missing filter.
We believe this is a bug at the HDF5 library level.
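Until then, you can work around it by making sure the directory exists before h5py is imported; something like this (the plugin path is hypothetical):

import os

plugin_dir = '/usr/local/hdf5/lib/plugin'  # hypothetical location; use your own
if not os.path.isdir(plugin_dir):
    os.makedirs(plugin_dir)                # the directory just has to exist
os.environ['HDF5_PLUGIN_PATH'] = plugin_dir
import h5py                                # import h5py only after the variable is set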

--
Jérôme Kieffer
tel +33 476 882 445

Cedric DE CESARE

Nov 30, 2016, 9:55:38 AM
to h5...@googlegroups.com
Hi Jerome,

Thanks for your input.
I finally solved (or rather bypassed) my issue by falling back to previous versions of h5py, netCDF4, pyhdf, HDF5, and so on.

Hopefully the bug introduced in the recent HDF5 library will be fixed soon.

Cheers,

Cedric

Jesse Hopkins

Nov 30, 2016, 10:18:41 AM
to h5...@googlegroups.com
Thanks for all the help with this. So I downloaded that .deb file, extracted it, and added liblz4module.so, liblz4module.so.0, and liblz4module.so.0.2.0 (the first two are symbolic links that point to the last) to my HDF5_PLUGIN_PATH, but I'm still getting the same error.

I suspect this means I need to recompile the libraries from scratch on my computer, but that's easier said than done (at least for me). I got the source from the NeXus repository, but I'm not sure how to build it. It's not clear to me what --with-hdf5 and --with-lz4lib should point to. I've got both libraries installed on my machine (outputs of rpm -ql below), but whether I pointed the flags at bin or at lib, where the files are installed, configure failed with the same error:
configure: error: cannot find HDF5 header files!

Any thoughts?

Thanks!

- Jesse

Full ./configure output:
[jhopkins@hopkins LZ4]$ ./configure --with-hdf5=/usr/lib64 --with-lz4lib=/usr/lib64
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
/home/jhopkins/Downloads/HDF5-External-Filter-Plugins/LZ4/bin/missing: Unknown `--is-lightweight' option
Try `/home/jhopkins/Downloads/HDF5-External-Filter-Plugins/LZ4/bin/missing --help' for more information
configure: WARNING: 'missing' script is too old or missing
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether to enable maintainer-specific portions of Makefiles... no
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for ranlib... ranlib
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking how to print strings... printf
checking for a sed that does not truncate output... /bin/sed
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for fgrep... /bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1966080
checking whether the shell understands some XSI constructs... yes
checking whether the shell understands "+="... yes
checking how to convert x86_64-unknown-linux-gnu file names to x86_64-unknown-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for ar... ar
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... (cached) ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for mt... no
checking if : is a manifest tool... no
checking how to run the C preprocessor... gcc -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for dlfcn.h... yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... no
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... yes
checking for config x86_64-unknown-linux-gnu... no
checking for config x86_64-unknown-linux-gnu... no
checking for config unknown-linux-gnu... no
checking for config unknown-linux-gnu... no
checking for config x86_64-linux-gnu... no
checking for config x86_64-linux-gnu... no
checking for config x86_64-unknown... no
checking for config linux-gnu... no
checking for config linux-gnu... no
checking for config unknown... no
checking for config x86_64... no
checking whether make sets $(MAKE)... (cached) yes
checking lz4.h usability... no
checking lz4.h presence... no
checking for lz4.h... no
checking for LZ4_compress... no
checking hdf5.h usability... no
checking hdf5.h presence... no
checking for hdf5.h... no
configure: error: cannot find HDF5 header files!


Locations of the hdf5 and lz4 installs:

[jhopkins@hopkins LZ4]$ rpm -ql hdf5
/usr/bin/gif2h5
/usr/bin/h52gif
/usr/bin/h5copy
/usr/bin/h5debug
/usr/bin/h5diff
/usr/bin/h5dump
/usr/bin/h5import
/usr/bin/h5jam
/usr/bin/h5ls
/usr/bin/h5mkgrp
/usr/bin/h5perf_serial
/usr/bin/h5repack
/usr/bin/h5repart
/usr/bin/h5stat
/usr/bin/h5unjam
/usr/lib64/libhdf5.so.6
/usr/lib64/libhdf5.so.6.0.4
/usr/lib64/libhdf5_cpp.so.6
/usr/lib64/libhdf5_cpp.so.6.0.4
/usr/lib64/libhdf5_fortran.so.6
/usr/lib64/libhdf5_fortran.so.6.0.4
/usr/lib64/libhdf5_hl.so.6
/usr/lib64/libhdf5_hl.so.6.0.4
/usr/lib64/libhdf5_hl_cpp.so.6
/usr/lib64/libhdf5_hl_cpp.so.6.0.4
/usr/lib64/libhdf5hl_fortran.so.6
/usr/lib64/libhdf5hl_fortran.so.6.0.4
/usr/share/doc/hdf5-1.8.5.patch1
/usr/share/doc/hdf5-1.8.5.patch1/COPYING
/usr/share/doc/hdf5-1.8.5.patch1/HISTORY-1_0-1_8_0_rc3.txt
/usr/share/doc/hdf5-1.8.5.patch1/HISTORY-1_8.txt
/usr/share/doc/hdf5-1.8.5.patch1/MANIFEST
/usr/share/doc/hdf5-1.8.5.patch1/README.txt
/usr/share/doc/hdf5-1.8.5.patch1/RELEASE.txt

[jhopkins@hopkins LZ4]$ rpm -ql lz4
/usr/bin/lz4
/usr/bin/lz4c
/usr/bin/lz4cat
/usr/bin/unlz4
/usr/lib64/liblz4.so.1
/usr/lib64/liblz4.so.1.7.1
/usr/share/doc/lz4-r131
/usr/share/doc/lz4-r131/COPYING
/usr/share/doc/lz4-r131/NEWS
/usr/share/man/man1/lz4.1.gz
/usr/share/man/man1/lz4c.1.gz
/usr/share/man/man1/lz4cat.1.gz
/usr/share/man/man1/unlz4.1.gz
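For what it's worth, the rpm listings above contain shared libraries only and no header files, which is exactly what configure is complaining about. On Red Hat-style systems the headers ship in separate -devel packages, and --with-hdf5 expects an installation prefix containing include/ and lib/, not the library directory itself. A sketch of what should let configure succeed (package names assumed):

sudo yum install hdf5-devel lz4-devel
./configure --with-hdf5=/usr --with-lz4lib=/usr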


V. Armando Solé

Nov 30, 2016, 11:30:15 AM
to h5py
Hi!

In the meantime, we have produced manylinux versions of the plugins and created an hdf5plugin version that works under Linux. You can give them a try by installing the wheel:

http://ftp.esrf.fr/pub/bliss/hdf5plugin-1.3.0-py2.py3-none-any.whl

The module is harmless because it only appends to HDF5_PLUGIN_PATH. However, in case of failure, please make sure there is no interference with your previous steps, and unset your HDF5_PLUGIN_PATH before trying it.

Good luck!

Armando



Jesse Hopkins

Nov 30, 2016, 1:48:22 PM
to h5...@googlegroups.com
Armando,

That did the trick. I tested it on Scientific Linux 6 and Ubuntu 16.04, and I was able to use the plugin package in exactly the same way as on Mac or Windows; it let me open and read the files!

Thanks for the help. Do you expect to release that version to PyPI soon, or should I keep the wheel saved in case I need to install it again?

All the best.

- Jesse

V. Armando Solé

Nov 30, 2016, 2:38:40 PM
to h5py
Glad that helped you!

Concerning PyPI, let's say you have made the final test :-)

I had tested it under Debian 7 and 8, but I needed additional feedback.

Many thanks for the additional testing.

I have just uploaded the wheel to PyPI.

Please keep in mind that, in order to make the bitshuffle plugin independent of the installed HDF5 library, bitshuffle compression is disabled (the plugin still decompresses).
