I have a function that uses pymysql to connect to a database, run a query, and save the results to a CSV file. The files are often very large (>3 GB).

What I do now is:
    cnx = connection.cursor()
    cnx.execute(query)
    file = '{0}.csv'.format(filename.split('/')[-1:][0])
    with open(file, 'w') as csv_out:
        for row in cnx:
            csv_out.write('{0}\r\n'.format(','.join(str(i) for i in row)))
Unfortunately, this solution sometimes suffers from connection timeouts, and in the end it is not very fast either.

Could you please help me rewrite the function so it works reliably? I doubt that fetchall() will help me with this much data. Should I instead work with an SSCursor and its fetchall_unbuffered() method?

Thanks in advance!
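To make the question concrete, here is the kind of rewrite I have in mind. This is only a sketch: the function names dump_query_to_csv and write_rows are mine, and I am assuming that pymysql's SSCursor streams rows from the server instead of buffering the whole result set in memory.

```python
import csv


def dump_query_to_csv(connection, query, out_path):
    """Run `query` on an open pymysql connection and stream the
    result rows into a CSV file at `out_path`.

    Assumption: SSCursor fetches rows lazily from the server, so
    the full >3 GB result set never sits in memory at once.
    """
    import pymysql.cursors  # imported here; assumes pymysql is installed

    with connection.cursor(pymysql.cursors.SSCursor) as cursor:
        cursor.execute(query)
        write_rows(cursor, out_path)


def write_rows(rows, out_path):
    """Write an iterable of row tuples to `out_path` as CSV.

    `rows` can be a cursor or any iterable, which keeps this part
    testable without a database. csv.writer also handles quoting
    and escaping, unlike a hand-rolled ','.join(...).
    """
    # newline='' lets the csv module control line endings ('\r\n').
    with open(out_path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerows(rows)
```

Would something like this also avoid the timeouts, or do I additionally need to reconnect or tune the server-side wait_timeout?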