ANN: python-blosc 1.10.1


Francesc Alted

Dec 24, 2020, 7:43:02 AM
to Blosc
==============================
Announcing python-blosc 1.10.1
==============================

What is new?
============

This is a maintenance release, mainly to improve support for Python wheels.
We added a pyproject.toml file to fix issues when building the package from
source for a Python version that does not have a pre-built wheel.

We are generating wheels for Intel architectures (32- and 64-bit) and all
major operating systems (Windows, Linux, macOS). In addition to the Python
extensions, we are distributing the C-Blosc library binaries in the wheels
too.  This way, people who want to use the C-Blosc library directly can
install these wheels to get the necessary development files.  For details,
see:

For more info, you can have a look at the release notes in:


More docs and examples are available in the documentation site:



What is it?
===========

Blosc (http://www.blosc.org) is a high performance compressor optimized
for binary data.  It has been designed to transmit data to the processor
cache faster than the traditional, non-compressed, direct memory fetch
approach via a plain memcpy() call.  Blosc works well for compressing
numerical arrays that contain data with relatively low entropy, like
sparse data, time series, grids with regular-spaced values, etc.

python-blosc (http://python-blosc.blosc.org/) is the Python wrapper for
the Blosc compression library, with added functions (`compress_ptr()`
and `pack_array()`) for efficiently compressing NumPy arrays, minimizing
the number of memory copies during the process.  python-blosc can be
used to compress in-memory data buffers for transmission to other
machines, for persistence, or simply as a compressed cache.

There is also a handy tool built on top of python-blosc called Bloscpack
(https://github.com/Blosc/bloscpack). It features a command-line
interface for compressing large binary data files on disk.
It also comes with a Python API that has built-in support for
serializing and deserializing NumPy arrays, both on disk and in memory, at
speeds that are competitive with the regular Pickle/cPickle machinery.


Sources repository
==================

The sources and documentation are managed through GitHub services at:




----

  **Enjoy data!**

--
Francesc Alted