Django's best way to upload data into a postgresql database


Guillermo Yáñez Feliú

Mar 17, 2019, 8:54:57 PM3/17/19
to Django users

Hello,


I’m working on a project that consists of converting a local PostgreSQL database (which uses SQLAlchemy as the ORM) into a web application, in which I upload Excel sheets, read them, do some light cleaning, and then upload selected data into a PostgreSQL database using Django’s ORM. The idea is to keep the data on a server instead of on every user’s machine.

Everything is OK, but data loading is taking too long, I think because I am using pandas DataFrames to structure, read, and save the data. In the local version of the library I used lists, and it was much faster.


I don’t know if it’s related to SQLAlchemy, Django, lists, or DataFrames. Any suggestions on how to read spreadsheet data and upload it into a PostgreSQL database using Django?
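(For anyone landing here later: a common cause of slow ORM loads is calling save() once per row, which issues one INSERT per object. Django's bulk_create can insert many rows per query. A minimal sketch, where the model name `Sample` and the variable `cleaned_rows` are purely illustrative, not from the original post; the chunking helper itself is plain Python:)

```python
from itertools import islice

def chunked(rows, size):
    """Yield successive lists of at most `size` items from any iterable."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical Django usage (names are illustrative, not from the post):
#   for batch in chunked(cleaned_rows, 500):
#       Sample.objects.bulk_create(
#           [Sample(**row) for row in batch],  # one INSERT per batch
#           batch_size=500,
#       )

# Standalone demonstration of the helper:
batches = list(chunked(range(7), 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```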


Thanks a lot.

Ryan Nowakowski

Mar 17, 2019, 11:36:14 PM3/17/19
to django...@googlegroups.com, Guillermo Yáñez Feliú
Sounds like a pretty standard optimization scenario. I'd recommend:

1. Find the bottleneck. If you suspect it's pandas DataFrames vs. lists, run a timing analysis using each.
2. Remove the bottleneck.