Error compressing some data with zlib


Jonas Severinsson

Mar 2, 2024, 2:54:42 PM
to brython
I have attached a JSON file. In Python 3.9.2 I can do this:

import json, zlib

with open("envelope.json", "r") as f:
    envelope = json.load(f)

json_data = json.dumps(envelope).encode("utf-8")
print("bytes length:\t\t{}".format(len(json_data)))

compressed = zlib.compress(json_data)
print("compressed length:\t{}".format(len(compressed)))


The output is:

bytes length: 470958
compressed length: 57558
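
For what it's worth, the round trip also checks out here (a quick sanity check on my data, nothing special):

decompressed = zlib.decompress(compressed)
assert decompressed == json_data                # exact bytes back
assert json.loads(decompressed) == envelope     # and the same object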


In Brython 3.12.1 under Vivaldi 6.6.3271.45 I can similarly do:

import json, zlib

json_data = json.dumps({}).encode("utf-8")
print("bytes length:\t\t{}".format(len(json_data)))

compressed = zlib.compress(json_data)
print("compressed length:\t{}".format(len(compressed)))

And the output is:

bytes length: 2
compressed length: 10
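
Those 10 bytes match the zlib layout from RFC 1950: a 2-byte header, the deflate data for "{}", and a 4-byte Adler-32 trailer. A quick check (the exact deflate bytes may differ between CPython and Brython, but the framing is the same):

print(compressed[:2].hex())                  # '789c' at the default level
assert zlib.decompress(compressed) == b"{}"  # tiny input round-trips too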


So that works. However, I cannot do this (where envelope is again the object loaded from the JSON file):

json_data = json.dumps(envelope).encode("utf-8")
print("bytes length:\t\t{}".format(len(json_data)))

compressed = zlib.compress(json_data)
print("compressed length:\t{}".format(len(compressed)))

The first print goes through:

bytes length: 470958

So that's exactly as in regular Python. The zlib.compress line, however, runs for about a minute and then fails with this error message:

    compressed = zlib.compress(json_data)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "VFS.zlib.py", line 672, in compress
    payload=compressor.compress(data)+compressor.flush()
                       ^^^^^^^^^^^^^^
  File "VFS.zlib.py", line 717, in compress
    length_code,*extra_length=length_to_code(length)
TypeError: 'NoneType' object is not iterable
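
Reading the traceback, it looks like length_to_code() returns None for some match length, and the starred unpacking then fails. I don't know Brython's internals, but here is a hypothetical sketch of how a lookup like that can fall through; DEFLATE match lengths run from 3 to 258 and map to length codes 257 to 285, so the maximum length 258 is an easy edge case to miss:

def length_to_code(length):
    # hypothetical lookup, NOT Brython's actual VFS.zlib.py code
    if 3 <= length < 258:   # suppose the upper bound is off by one
        return 257, 0       # placeholder (code, extra bits)
    # length == 258 falls through every branch -> implicit None

length_code, *extra_length = length_to_code(258)
# TypeError, just like the one above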


Am I doing something wrong? The same code works in the regular interpreter.
Attachment: envelope.json

Pierre Quentel

Mar 8, 2024, 10:10:27 AM
to brython
Thanks for reporting this, Jonas. I have created an issue in the tracker; it will be easier to follow there.

Pierre Quentel

Mar 10, 2024, 6:27:27 PM
to brython
The bug was fixed in this commit.
- Pierre