question about least_significant_digit


Andreas Hilboll

May 24, 2013, 5:13:57 AM
to netcdf4...@googlegroups.com
Hi,

I have a curious question about least_significant_digit:

I found that when writing data to a variable with
least_significant_digit != None, the data is truncated using the
_quantize function in netCDF4_utils.py. On the other hand, for reading
variables, I cannot find any such scaling.
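
If I read it correctly, the truncation amounts to rounding onto a
power-of-two grid fine enough to keep the requested number of decimal
digits. A minimal sketch of what I think _quantize does (not the exact
library code, the names are mine):

import numpy as np

def quantize_sketch(data, least_significant_digit):
    # round to the nearest power-of-two step that still resolves
    # 10**-least_significant_digit (my reading of _quantize in
    # netCDF4_utils.py)
    bits = np.ceil(np.log2(10.0 ** least_significant_digit))
    scale = 2.0 ** bits
    return np.around(scale * data) / scale

# e.g. least_significant_digit=2 keeps values to roughly 0.01
print(quantize_sketch(np.array([3.14159, 2.71828]), 2))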

Does this mean that for reading, the underlying libnetcdf4 handles the
necessary conversion, but for writing, this is not implemented in
libnetcdf4 but has to be done by the user? I checked the Unidata
website, but couldn't find any useful information about the
least_significant_digit ncattr. Is this standardized?

Cheers, Andreas.

Andreas Hilboll

May 24, 2013, 9:33:37 AM
to Jeff Whitaker, netcdf4...@googlegroups.com
On 24.05.2013 13:39, Jeff Whitaker wrote:
>> Andreas Hilboll <li...@hilboll.de>
>> May 24, 2013 3:13 AM
>> ------------------------------------------------------------------------
> Andreas: When you set that attribute, the data is truncated so that
> the zlib compression available in the netCDF library is more efficient.
> If you set zlib=True when writing the variable, the data is
> automatically compressed in the file and decompressed when you read it
> back out.
>
> Does that answer your question?

Hi Jeff, thanks for taking the time to answer :)

Actually, my question was a bit more low-level, and maybe this is the
wrong list to ask it on. But I took a deeper look at the _quantize
function, and now I think I understand what's happening.
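
For the archives, here is a minimal sketch of how I now understand it
(file name, dimension size, and the value 3 are just examples):

import numpy as np
from netCDF4 import Dataset

# writing: least_significant_digit rounds the data before it is stored,
# so the zlib filter can compress it much better
nc = Dataset("example.nc", "w")
nc.createDimension("x", 100)
var = nc.createVariable("data", "f4", ("x",),
                        zlib=True, least_significant_digit=3)
var[:] = np.random.random(100)
nc.close()

# reading: nothing special is needed -- decompression is transparent,
# and the stored values are simply the rounded ones
nc = Dataset("example.nc")
print(nc.variables["data"][:5])
nc.close()

So the attribute only affects what gets written; on read there is
nothing left to undo.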

Cheers, Andreas.

-- Andreas.