Thanks for the quick response (again).
Good news for the ad hoc reports. Let us know if you need some beta testers :)
A few points for json:
- More structured (or at least it can be, depending on your implementation).
- More standard (most APIs nowadays return json, including the Google APIs, at least the ones we use), plus your own requests and responses are already in json.
- A lot easier to read. For example, the pseudo-code for reading a csv report is:
- skip to the line starting with "Report Fields"
- skip one more line (the headers)
- do while the line doesn't start with "Grand Total"
-> split the line
-> get/parse the data
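To illustrate, the steps above could be sketched roughly like this (assuming the report is already loaded as a string; the "Report Fields" and "Grand Total" markers and the comma delimiter are taken from the pseudo-code, the rest of the layout is an assumption):

```typescript
// Rough sketch of the CSV-parsing steps above; the exact report
// layout here is an illustrative assumption, not the real format.
function parseCsvReport(report: string): string[][] {
  const lines = report.split("\n");
  // Skip to the line starting with "Report Fields"
  let i = lines.findIndex((l) => l.startsWith("Report Fields"));
  if (i < 0) throw new Error('"Report Fields" marker not found');
  i += 2; // skip the marker line and the header line
  const rows: string[][] = [];
  // Do while the line doesn't start with "Grand Total"
  while (i < lines.length && !lines[i].startsWith("Grand Total")) {
    rows.push(lines[i].split(",")); // split the line, keep the data
    i++;
  }
  return rows;
}
```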
This is far from rocket science, but it's still a lot more complicated and potentially buggy than it could be in, let's say, javascript (with jQuery, but other libraries are available), where reading the same report in json would be:
- $.parseJSON(fileString);
In .Net (which is what we are using), you have multiple options, for example using Json.NET to parse it into a DTO, a dynamic object, or even a dictionary.
This is not a priority for us at all; the code to read the csv files is already done. But it would be a nice feature in the future, and I don't think it would be complicated. You could do it à la Google Analytics, where the data is simply a 2-dimensional array of strings.
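For what it's worth, here's a hedged sketch of the Analytics-style shape I mean; the field names (columnHeaders, rows, grandTotal) are just illustrative assumptions, not a spec:

```typescript
// Illustrative guess at an Analytics-style JSON report: column
// headers plus the data as a 2-dimensional array of strings.
interface JsonReport {
  columnHeaders: string[];
  rows: string[][]; // every cell a string, à la Google Analytics
  grandTotal: string[];
}

const example: JsonReport = {
  columnHeaders: ["Campaign", "Clicks"],
  rows: [
    ["Spring Sale", "120"],
    ["Brand", "305"],
  ],
  grandTotal: ["", "425"],
};

// Consuming such a response is then a one-liner compared to the
// skip-and-split CSV steps (round-trip shown for demonstration):
const parsed: JsonReport = JSON.parse(JSON.stringify(example));
```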
Thanks,
JB