Big files


Сергей Богословский

Sep 6, 2017, 5:10:12 AM
to PyMySQL Users
I have a function which uses PyMySQL to connect to a database, query data, and save it to CSV files. The files are often really big (>3 GB).

What I do now is:
    cnx = connection.cursor()
    cnx.execute(query)
    # output file is named after the last component of the input path
    file = '{0}.csv'.format(filename.split('/')[-1])

    with open(file, 'w') as csv_out:
        for row in cnx:
            csv_out.write('{0}\r\n'.format(','.join(str(i) for i in row)))

Unfortunately, this solution sometimes suffers from connection timeouts, and in the end it is not very fast either.

Could you please help me rewrite my function so that it works properly? I am not sure the fetchall() method will help me with that much data. Maybe I should somehow work with SSCursor and the fetchall_unbuffered() method?

Thanks in advance! 

INADA Naoki

Sep 6, 2017, 8:16:37 AM
to pymysq...@googlegroups.com
Maybe just replacing

> cnx = connection.cursor()

with

cnx = connection.cursor(pymysql.cursors.SSCursor)

will help you.
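A minimal sketch of what the full loop could look like with SSCursor; the connection parameters, query, and output filename below are placeholders, and using the standard csv module for output is an extra suggestion rather than something from the original code:

    import csv
    import pymysql

    # Placeholder connection parameters -- replace with your own.
    connection = pymysql.connect(host='localhost', user='user',
                                 password='password', database='mydb')

    query = 'SELECT * FROM my_table'  # placeholder query
    outfile = 'my_table.csv'          # placeholder output path

    # SSCursor streams rows from the server instead of buffering the whole
    # result set in memory, so a multi-gigabyte result does not exhaust RAM.
    with connection.cursor(pymysql.cursors.SSCursor) as cursor:
        cursor.execute(query)
        with open(outfile, 'w', newline='') as csv_out:
            writer = csv.writer(csv_out)   # quotes fields containing commas/newlines
            for row in cursor:             # iterate row by row, no fetchall()
                writer.writerow(row)

Using csv.writer instead of ','.join(...) is optional, but it correctly handles field values that themselves contain commas or newlines, which the plain join does not.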

Regards,

INADA Naoki <songof...@gmail.com>