Hey there. Non-gov't lurker on the list for a while. Gray, the autoapi thing is pretty nifty. Have you guys checked out some of the trickery that we (DataMade) did for Plenario and Illinois Sunshine? Both of those are Flask apps, and there's one function in particular that we use in both places to automatically figure out whether a query submitted by a user via a query string is a valid query given the structure and characteristics of the table in the database. The code in question can be found here. This leverages the introspection stuff that you get for free in SQLAlchemy.
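For anyone who hasn't played with it, the introspection side of this is only a few lines. Here's a rough sketch (the in-memory SQLite database and the `people` table are made up for illustration; it's not our actual code):

```python
from sqlalchemy import MetaData, Table, create_engine, text

# Hypothetical in-memory database standing in for a real one
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text(
        "CREATE TABLE people (first_name TEXT, last_name TEXT, "
        "date DATE, count INTEGER)"
    ))

# Reflection: SQLAlchemy reads the column names and types straight
# out of the database, so you never hand-maintain a schema
metadata = MetaData()
people = Table("people", metadata, autoload_with=engine)

column_types = {c.name: type(c.type).__name__ for c in people.columns}
print(column_types)
```

Once you have the column types in hand, deciding whether a user-supplied filter makes sense for a given field is just a lookup.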
So, for example, given a table that has string fields "first_name" and "last_name", a date field unoriginally called "date", and an integer field called "count", a user could give you a query string that looked like:
http://whatever.gov/api/table_name/?first_name__ilike=%bill%&last_name__ilike=clinton%&date__ge=2001-01-01&count__le=10
which would get translated by that function into a SQL statement that looked something like:
SELECT * FROM table_name WHERE first_name ILIKE '%bill%' AND last_name ILIKE 'clinton%' AND date >= '2001-01-01' AND count <= 10
It would also return an error to the user if they did something silly like "first_name__ge=50", which doesn't really make sense given that "first_name" is a string field.
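In case it helps, the basic shape of that translation can be sketched like so. This is just an illustration against a made-up table mirroring the example above, not the code linked earlier, and the operator table is deliberately tiny:

```python
from sqlalchemy import (
    Column, Date, Integer, MetaData, String, Table, select,
)

# Hypothetical table matching the example fields above
metadata = MetaData()
table = Table(
    "table_name", metadata,
    Column("first_name", String),
    Column("last_name", String),
    Column("date", Date),
    Column("count", Integer),
)

# Map query-string operator suffixes onto SQLAlchemy column operators,
# along with the column types each one makes sense for
OPERATORS = {
    "ilike": (lambda c, v: c.ilike(v), (String,)),
    "ge": (lambda c, v: c >= v, (Date, Integer)),
    "le": (lambda c, v: c <= v, (Date, Integer)),
    "eq": (lambda c, v: c == v, (String, Date, Integer)),
}

def build_query(table, params):
    """Turn {'first_name__ilike': '%bill%'} into a SELECT, raising
    ValueError when an operator is invalid for a column's type."""
    stmt = select(table)
    for key, value in params.items():
        field, _, op = key.partition("__")
        op = op or "eq"  # bare field name means equality
        if field not in table.c:
            raise ValueError("no such column: %s" % field)
        if op not in OPERATORS:
            raise ValueError("unknown operator: %s" % op)
        func, allowed_types = OPERATORS[op]
        column = table.c[field]
        if not isinstance(column.type, allowed_types):
            raise ValueError("%r makes no sense for %r" % (op, field))
        stmt = stmt.where(func(column, value))
    return stmt

print(build_query(table, {"first_name__ilike": "%bill%", "count__le": 10}))
```

Because the clauses are built with SQLAlchemy's own column operators, the values all end up as bound parameters rather than interpolated strings, which is what keeps the whole thing sane security-wise.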
To get a feel for this from a user's point of view, we've documented it here for Illinois Sunshine. To bring this full circle (and make it slightly more relevant to the original question ;P), there's also some code in there that will serialize responses as CSV (as a bonus, it also puts the resulting CSV(s) into a zipfile). Python makes this pretty simple, which might be the reason there are no real pre-packaged things that do this for you; it's built in.
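The CSV-into-a-zipfile part really is all standard library. A minimal sketch, with made-up rows standing in for query results:

```python
import csv
import io
import zipfile

# Hypothetical rows standing in for actual query results
rows = [
    {"first_name": "Bill", "last_name": "Clinton", "count": 7},
    {"first_name": "Billie", "last_name": "Holiday", "count": 3},
]

# Write the CSV into an in-memory buffer...
csv_buf = io.StringIO()
writer = csv.DictWriter(
    csv_buf, fieldnames=["first_name", "last_name", "count"]
)
writer.writeheader()
writer.writerows(rows)

# ...then drop it into a zipfile, also in memory, no disk involved
zip_buf = io.BytesIO()
with zipfile.ZipFile(zip_buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("results.csv", csv_buf.getvalue())

# zip_buf.getvalue() is the bytes you'd hand back in the response
print(len(zip_buf.getvalue()), "bytes")
```

In a Flask app you'd just return those bytes with the appropriate `Content-Type` and `Content-Disposition` headers.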
Anyways, hopefully this isn't too far off topic and someone finds it valuable. I'd love to hear what you guys are thinking with the autoapi tool and whether or not this kind of approach is useful to you. Let me know if you have questions about how it works.
Eric