Instead of a list, though, I'd like to use a generator, as I'm very memory constrained. When I try something like this:
engine.execute(MyTable.__table__.insert(),(d for d in my_dict_generator()))
...I get "AttributeError: 'list' object has no attribute 'keys'".
Is it not possible to use a generator in lieu of a list here?
Thanks,
Andy
Mike Bayer
Jun 16, 2016, 1:03:04 PM
Not directly. You should break your generator into batches:
import itertools

with engine.begin() as conn:
    while True:
        chunk = list(itertools.islice(generator, 10000))
        if not chunk:
            break
        conn.execute(table.insert(), chunk)
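The batching part of this is independent of SQLAlchemy; here's a minimal stdlib-only sketch of the same idea (the helper name `batched` and the batch size are my own, not from the thread):

```python
import itertools

def batched(iterable, size):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(itertools.islice(it, size))
        if not chunk:
            return
        yield chunk

# In the pattern above, each chunk would go to conn.execute(table.insert(), chunk);
# here we just show the batching behavior on a plain generator of dicts.
rows = ({"x": i} for i in range(7))
print([len(c) for c in batched(rows, 3)])  # → [3, 3, 1]
```

Only one batch is materialized in memory at a time, which keeps the memory ceiling at the batch size rather than the full data set.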
Mike,
Thanks very much. I thought that might be the case, but I was hoping to avoid even chunking and get pure streaming. I'll go with that approach.
FYI, I think I was thrown off by stream_results; I was thinking it might be possible to use a generator to stream a huge insert, but I had misunderstood stream_results, which is about having the driver stream large SELECT results instead of buffering the entire result set.
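That SELECT-side streaming idea can be illustrated at the DB-API level with the stdlib sqlite3 driver: iterating a cursor pulls rows incrementally rather than materializing everything with fetchall(). This is only a sketch of the concept, not SQLAlchemy's stream_results option itself, which controls analogous driver-side behavior for backends that support it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t (x) VALUES (?)", [(i,) for i in range(100000)])

# Iterating the cursor consumes rows incrementally from the driver,
# instead of buffering the whole result set as fetchall() would.
total = 0
for (x,) in conn.execute("SELECT x FROM t"):
    total += x
print(total)  # → 4999950000
```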