Reporting API: Ad hoc reports


Jean-Baptiste Blanchet

Sep 12, 2013, 12:27:13 PM
to google-doubleclick-...@googlegroups.com
Hi,

We're currently integrating DFA reporting into our product for a customer, and we have to run a lot of small, quick ad hoc reports. There doesn't appear to be a way to do it simply, so we have to use 4 steps:
- Insert the report;
- Run the report;
- Download the report;
- Delete the report.
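
The four steps above can be sketched as one helper that chains them and cleans up even when a step fails. The `insert`/`run`/`download`/`delete` callables below are hypothetical stand-ins for the real API calls, and the stubs are in-memory fakes for illustration only:

```python
def run_adhoc_report(insert, run, download, delete, definition):
    """Chain the four-step workaround: insert, run, download, delete."""
    report_id = insert(definition)            # 1. Insert the report
    try:
        file_id = run(report_id)              # 2. Run the report
        return download(report_id, file_id)   # 3. Download the report
    finally:
        delete(report_id)                     # 4. Delete the report, even on failure

# In-memory stubs standing in for the real service, for illustration only.
store = {}

def insert(definition):
    report_id = len(store) + 1
    store[report_id] = definition
    return report_id

def run(report_id):
    return "file-%d" % report_id              # pretend the run produced a file

def download(report_id, file_id):
    return "csv data for %s (%s)" % (store[report_id]["name"], file_id)

def delete(report_id):
    del store[report_id]

print(run_adhoc_report(insert, run, download, delete, {"name": "clicks by day"}))
# prints "csv data for clicks by day (file-1)"
```

The `try`/`finally` matters here: without it, a failed run or download leaves an orphaned report definition behind on the server.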

I noticed that I can run ad hoc reports from the interface. I was wondering if you have any plans to enable ad hoc reporting in one or two operations, for example by adding a body to the report.run command.

Also, do you have any plans to support XML or JSON reports? It would make parsing the data a lot easier.

Joseph DiLallo (DFA API Team)

Sep 12, 2013, 12:49:42 PM
to google-doubleclick-...@googlegroups.com
Hey Jean-Baptiste,

We've been kicking around the idea of adding a cleaner way to do ad hoc reports. We haven't committed to implementing this yet, but it seems likely that we will at some point. Other report formats have been discussed and largely dismissed. I don't think it's likely that we'll add them.

Why is it that other report formats would ease parsing? Is it just the header on the CSV files, or is there some other reason other formats would help?

Cheers,
- Joseph DiLallo, the DFA API Team

Jean-Baptiste Blanchet

Sep 12, 2013, 2:12:45 PM
to google-doubleclick-...@googlegroups.com
Thanks for the quick response (again).

Good news for the ad hoc reports. Let us know if you need some beta testers :)

A few points in favor of JSON:
- More structured (or at least it can be, depending on your implementation).
- More standard (most APIs nowadays return JSON, including the Google APIs we use), plus your own requests and responses are already in JSON.
- A lot easier to read. For example, the pseudo-code for reading a CSV report is:

- skip to the line starting with "Report Fields"
- skip one more line (the headers)
- do while the line doesn't start with "Grand Total"
   -> split the line
   -> get/parse the data
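
Made concrete, the pseudo-code above could look like this in Python. The sample report below is made up, since the exact preamble lines vary by report:

```python
import csv

def parse_dfa_report(file_string):
    """Read a DFA CSV report: skip the preamble down to "Report Fields",
    take the next line as headers, then collect rows until "Grand Total"."""
    lines = iter(file_string.splitlines())
    for line in lines:                              # skip to "Report Fields"
        if line.startswith("Report Fields"):
            break
    headers = next(csv.reader([next(lines)]))       # the header line
    rows = []
    for line in lines:
        if line.startswith("Grand Total"):          # stop at the totals row
            break
        rows.append(dict(zip(headers, next(csv.reader([line])))))
    return rows

# Made-up sample data in the same overall shape as a DFA CSV report.
sample = """Report Name,Example
Date Range,Sep 1 2013 - Sep 2 2013
Report Fields
Date,Impressions,Clicks
2013-09-01,1000,10
2013-09-02,2000,20
Grand Total,3000,30
"""
rows = parse_dfa_report(sample)
# rows[0] is {"Date": "2013-09-01", "Impressions": "1000", "Clicks": "10"}
```

Using `csv.reader` per line (rather than `str.split(",")`) keeps quoted fields containing commas intact.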

This is far from rocket science, but it's still a lot more complicated and potentially buggier than it would be in, say, JavaScript (with jQuery, though other libraries are available), where reading the same file in JSON would be:
- $.parseJSON(fileString);

In .NET (which is what we're using), you have multiple options, for example using Json.NET to deserialize it into a DTO, a dynamic object, or even a dictionary.

This is not a priority for us at all; the code to read the CSV files is already done. But it would be a nice feature in the future, and I don't think it would be complicated. You could do it à la Google Analytics, where the data is simply a 2-dimensional array of strings.
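
For instance, a Google-Analytics-style JSON report might be consumed like this. The payload shape below (column headers plus rows of strings) is my guess at what such a response could look like, not an actual DFA API format:

```python
import json

# Hypothetical payload shaped like the Google Analytics Core Reporting
# API response: column headers plus rows as a 2-dimensional array of strings.
payload = """{
  "columnHeaders": [{"name": "Date"}, {"name": "Impressions"}, {"name": "Clicks"}],
  "rows": [["2013-09-01", "1000", "10"], ["2013-09-02", "2000", "20"]]
}"""

report = json.loads(payload)
headers = [h["name"] for h in report["columnHeaders"]]
records = [dict(zip(headers, row)) for row in report["rows"]]
# records[0] is {"Date": "2013-09-01", "Impressions": "1000", "Clicks": "10"}
```

One `json.loads` call plus two comprehensions replaces the whole skip-and-split loop needed for the CSV format.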

Thanks,

JB