
how to send a json of yield list


meInvent bbird
Oct 13, 2016, 10:14:58 PM
After googling, I found several solutions.

The first method I tried gives a memory error:
sock.send(json.dumps(StreamArray()))
Traceback (most recent call last):
  File "pusher.py", line 43, in <module>
    sock.send(json.dumps(StreamArray()))
  File "C:\Python27\lib\json\__init__.py", line 243, in dumps
    return _default_encoder.encode(obj)
  File "C:\Python27\lib\json\encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Python27\lib\json\encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
MemoryError
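For reference, the MemoryError comes from json.dumps() assembling the entire JSON string in memory before anything is sent: StreamArray here describes 1999**3 triples, far too many for one string. The pure-Python JSONEncoder().iterencode() path (taken when _one_shot is not set) respects a wrapper's __iter__ and __len__ and yields the text chunk by chunk instead. A minimal sketch, with a small gen() standing in for getcombinations():

```python
import json

def gen():
    # Small stand-in for getcombinations(); the real generator
    # yields 1999**3 triples, far too many for one string.
    for i in range(3):
        yield [i, i]

class StreamArray(list):
    """Wraps a generator so the pure-Python JSON encoder streams it."""
    def __init__(self, it):
        self.it = it

    def __iter__(self):
        return iter(self.it)

    def __len__(self):
        # Non-zero, so the encoder does not short-circuit to "[]".
        return 1

# iterencode() yields the JSON text piece by piece instead of
# building the whole string at once the way json.dumps() does.
chunks = json.JSONEncoder().iterencode(StreamArray(gen()))
result = "".join(chunks)
```

Each chunk could be written to the wire as it is produced; joining them here is only to show the final JSON text.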

If I use this solution, I get another error:

combobject = getcombinations()
sock.send(json.dumps(combobject, cls=PythonObjectEncoder))

C:\Users\martlee2\Downloads>python pusher.py tcp://*:8080
Traceback (most recent call last):
  File "pusher.py", line 42, in <module>
    sock.send(json.dumps(combobject, cls=PythonObjectEncoder))
  File "C:\Python27\lib\json\__init__.py", line 250, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "C:\Python27\lib\json\encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Python27\lib\json\encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "pusher.py", line 13, in default
    return {'_python_object': pickle.dumps(obj)}
  File "C:\Python27\lib\pickle.py", line 1374, in dumps
    Pickler(file, protocol).dump(obj)
  File "C:\Python27\lib\pickle.py", line 224, in dump
    self.save(obj)
  File "C:\Python27\lib\pickle.py", line 306, in save
    rv = reduce(self.proto)
  File "C:\Python27\lib\copy_reg.py", line 70, in _reduce_ex
    raise TypeError, "can't pickle %s objects" % base.__name__
TypeError: can't pickle generator objects
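This TypeError is reproducible in isolation: PythonObjectEncoder's default() falls back to pickle.dumps for anything that is not a plain JSON type, and a generator carries live interpreter frame state that pickle cannot capture. A minimal reproduction:

```python
import pickle

def gen():
    yield 1

# Generators hold a live stack frame, which pickle refuses to
# serialize, so pickling one always raises TypeError.
try:
    pickle.dumps(gen())
    failed = False
except TypeError as exc:
    failed = True
    message = str(exc)
```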



#python pusher.py tcp://*:8080
import sys
import time
import zmq
import json
from json import dumps, loads, JSONEncoder, JSONDecoder
import pickle

class PythonObjectEncoder(JSONEncoder):
    def default(self, obj):
        if isinstance(obj, (list, dict, str, unicode, int, float, bool, type(None))):
            return JSONEncoder.default(self, obj)
        return {'_python_object': pickle.dumps(obj)}

def as_python_object(dct):
    if '_python_object' in dct:
        return pickle.loads(str(dct['_python_object']))
    return dct

context = zmq.Context()
sock = context.socket(zmq.PUSH)
sock.bind(sys.argv[1])

def getcombinations():
    for ii in range(1, 2000):
        for jj in range(1, 2000):
            for kk in range(1, 2000):
                yield [ii, jj, kk]

class StreamArray(list):
    def __iter__(self):
        return getcombinations()

    # according to the comment below
    def __len__(self):
        return 1

while True:
    time.sleep(1)
    #sock.send(sys.argv[1] + ':' + time.ctime())
    combobject = getcombinations()
    sock.send(json.dumps(combobject, cls=PythonObjectEncoder))
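An alternative worth sketching (an assumption about intent, not part of the original post): rather than serialising the whole generator output as one value, send each combination as its own small JSON message and loop sock.send(msg) over them on the pusher side. The helper name combination_messages is hypothetical:

```python
import json

def getcombinations(n):
    # Same shape as the pusher's generator, parameterised small here.
    for ii in range(1, n):
        for jj in range(1, n):
            yield [ii, jj]

def combination_messages(combos):
    # One small JSON message per combination: nothing ever has to
    # hold the whole 1999**3-item sequence in memory at once.
    for combo in combos:
        yield json.dumps(combo)

messages = list(combination_messages(getcombinations(3)))
```

On the puller side each message decodes independently with json.loads, which also matches the existing puller loop.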



puller.py

#python puller.py tcp://localhost:8080
import sys
import zmq
import json

context = zmq.Context()
sock = context.socket(zmq.PULL)

for arg in sys.argv[1:]:
    sock.connect(arg)

while True:
    message = sock.recv()
    combinations = json.loads(message)
    print(str(combinations))

dieter
Oct 14, 2016, 2:52:03 AM
meInvent bbird <jobma...@gmail.com> writes:

> after google a several solutions,
> First method i searched has memory error
> sock.send(json.dumps(StreamArray()))
> Traceback (most recent call last):
> File "pusher.py", line 43, in <module>
> sock.send(json.dumps(StreamArray()))
> ...
> MemoryError

"MemoryError" means that the operating system could not provide
as much memory as required.

This is a bit surprising as you seem to try to dump a newly
build object. But, maybe, your objects starts huge right from the
beginning?


> if use this solution, got another error
>
> combobject = getcombinations()
> sock.send(json.dumps(combobject, cls=PythonObjectEncoder))
>
> Traceback (most recent call last):
> ...
> sock.send(json.dumps(combobject, cls=PythonObjectEncoder))
> ...
> TypeError: can't pickle generator objects

Apply "list" to your generator before dumping, i.e.
sock.send(json.dumps(list(combobject), ...))


Note in addition that you can get a file-like object from a socket
and then use "json.dump" (rather than "json.dumps"). This might (!)
handle huge objects better.
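A sketch of that suggestion, with io.StringIO standing in for the socket's file object: json.dump writes the encoder's output to the file object chunk by chunk, so the full JSON string is never assembled in one piece. (On a plain TCP socket you could pass sock.makefile('w'); zmq sockets expose no such method, so with zmq you would have to buffer or chunk by hand.)

```python
import io
import json

# io.StringIO stands in for a writable file object such as the
# one sock.makefile('w') returns on a plain TCP socket.
buf = io.StringIO()
json.dump([[i, i] for i in range(3)], buf)
written = buf.getvalue()
```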

meInvent bbird
Oct 14, 2016, 5:11:35 AM
Succeeded in running it.

Does the yield return the whole list of 2000 * 2000 * 2000 items?

As far as I know, yield returns [1,1,1] etc. one by one as each is generated.

If it returns 2000*2000*2000 items, why?

I had to add a queue to receive this yield in order to succeed,

but I do not understand the situation when
using a queue and yield at the same time.

Will the speed be the same as the situation with no yield?

while True:
    #time.sleep(1)
    if q.qsize() > 0:
        print("here1")
        item = q.get()
        print("here2")
        #item = getcombinations()
        sock.send(json.dumps(item))
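The qsize() > 0 check above busy-waits and can race with the producer; q.get() already blocks until an item arrives, so no polling is needed. One way to combine a generator with a bounded queue (a sketch using Python 3 module names; in the real pusher, sock.send would replace the sent.append):

```python
import json
import queue       # named Queue in Python 2
import threading

def getcombinations(n):
    for ii in range(1, n):
        for jj in range(1, n):
            yield [ii, jj]

q = queue.Queue(maxsize=100)   # bounded: producer blocks instead of running ahead
SENTINEL = None                # marks the end of the stream

def producer():
    for combo in getcombinations(3):
        q.put(combo)           # blocks while the queue is full
    q.put(SENTINEL)

threading.Thread(target=producer).start()

sent = []
while True:
    item = q.get()             # blocks until an item arrives; no qsize() polling
    if item is SENTINEL:
        break
    sent.append(json.dumps(item))
```

The bounded maxsize keeps memory flat: the generator only runs as fast as items are consumed.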

meInvent bbird
Oct 14, 2016, 5:26:42 AM
When not using a queue, it is faster now:

while True:
    for ii in getcombinations():
        item = ii
        print(item)
        sock.send(json.dumps(ii))