Web2py: Create a CSV file to download for an executesql result


Adam Scarlat

Mar 28, 2015, 12:14:44 PM3/28/15
to web...@googlegroups.com
Hi, 

I use an external MySQL database in my application and use executesql(str(query), as_dict=True) to run queries on it.
I want to provide a download link for a CSV file of the results of different queries in my application. If I were working locally I would have used the following MySQL command:

SELECT * FROM SOME_TABLE INTO OUTFILE '/tmp/test1.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'

In my application a user can select certain criteria, a query is built in the controller according to those criteria, and it is executed with the executesql() method,
which returns a dictionary-like object.

My question is: is there a way to make a download link in the view to download a csv file of results returned by the executesql() method?

Thank you!
Adam

Brian M

Mar 28, 2015, 9:07:01 PM3/28/15
to web...@googlegroups.com
Here's an old post with the solution that I use for this


Yep, I've got a processing app that spits out all sorts of CSV files based on data gathered from multiple sources.

Here's a little helper function I use:

def csv_export(records, column_names, fields, mode='dal'):
    """Export a DAL result set, list of dicts, or list of lists to a CSV stream for returning to the user.

    Arguments:
    records = the data to be returned
    column_names (list) = the column names/headings for the first row in the CSV file.
                    Example: ['First Name', 'Last Name', 'Email']
    fields (list) = the names of the fields (as they appear in records) in the order they
                    should be in the CSV. Example: ['f_name', 'l_name', 'email']
                    or ['table_a.f_name', 'table_a.l_name', 'table_b.email'].
                    If mode = 'list' and your records are already in the correct order, fields may be None;
                    otherwise use index positions such as [1, 3, 0] if your lists are in a different order.
    mode (string) = what type of data is in records? 'dal' (default), 'dict' or 'list'
                    'dal' if records came from a regular DAL query (default)
                    'dict' if records are a list of dicts (for example from db.executesql() with as_dict=True)
                    'list' if records are a list of lists/tuples (for example from db.executesql() with as_dict=False)
    """
    # create a fake (in-memory) file object
    import cStringIO
    file = cStringIO.StringIO()
    # set up the csv writer
    import csv
    csv_file = csv.writer(file)
    # write the first row with the specified column headings/names
    csv_file.writerow(column_names)
    # which mode - dal/dict or list?
    if mode.lower() == 'dal' or mode.lower() == 'dict':
        for record in records:
            csv_file.writerow([record[field] for field in fields])
    elif mode.lower() == 'list':
        if fields is None:
            csv_file.writerows(records)
        else:
            for record in records:
                csv_file.writerow([record[field] for field in fields])
    return file





Then in a controller you can have something like

    from datetime import date  # date.today() below needs this import

    csv_stream = csv_export(processed_dataset, column_names, fields, mode='dict')
    response.headers['Content-Type'] = 'application/vnd.ms-excel'
    response.headers['Content-Disposition'] = 'attachment; filename=data_for_%s.csv' % date.today()
    return csv_stream.getvalue()



which will cause the browser to download the CSV file with your chosen filename.


You could also turn around and save the file to the filesystem if you wanted.
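
To tie this back to the original question, here is a rough sketch of a complete download action built on the helper above; the action name, query string, and column/field names are just placeholders for whatever your criteria-driven query produces:

    from datetime import date

    def download_report():
        # placeholder query - in practice this is built from the user's selected criteria
        query = "SELECT f_name, l_name, email FROM some_table"
        records = db.executesql(query, as_dict=True)
        # reuse the helper in 'dict' mode since executesql returned a list of dicts
        csv_stream = csv_export(records,
                                column_names=['First Name', 'Last Name', 'Email'],
                                fields=['f_name', 'l_name', 'email'],
                                mode='dict')
        response.headers['Content-Type'] = 'text/csv'
        response.headers['Content-Disposition'] = 'attachment; filename=report_%s.csv' % date.today()
        return csv_stream.getvalue()

A plain link in the view, e.g. A('Download CSV', _href=URL('default', 'download_report')), is then enough to trigger the download.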

Hope this helps!
Brian

Anthony

Mar 29, 2015, 11:08:17 AM3/29/15
to web...@googlegroups.com
If you know the column names and types ahead of time, you can do something like this:

db.define_table('mydummytable',
                Field('field1'),
                ...,
                migrate=False)

rows = db.executesql(sql_statement, fields=db.mydummytable)

The purpose of the dummy table is just for use by executesql to parse the results into a DAL Rows object (be sure to set migrate=False, as you don't want to actually create that table in the database). When you pass a table object (or a list of Field objects) as the "fields" argument to executesql, you get back a DAL Rows object, which can then be exported to CSV via:

rows.export_to_csv_file(file_object, ...)

or:

csv = rows.as_csv()
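
Putting those pieces together, a minimal sketch of a download action using this dummy-table approach could look like the following; the table definition, field names, and query are only illustrative, and the define_table call would normally live in a model file:

    db.define_table('mydummytable',
                    Field('f_name'),
                    Field('l_name'),
                    Field('email'),
                    migrate=False)  # never actually created in the database

    def download_csv():
        # placeholder for the query built from the user's criteria
        sql_statement = "SELECT f_name, l_name, email FROM some_table"
        rows = db.executesql(sql_statement, fields=db.mydummytable)
        response.headers['Content-Type'] = 'text/csv'
        response.headers['Content-Disposition'] = 'attachment; filename=results.csv'
        return rows.as_csv()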

Anthony