Hi Jim,
Thank you for the reply.
The export function certainly wraps columns with embedded commas in double quotes, and the file opens correctly in Excel, but parsing the CSV with a programming language like Python or Ruby gets tricky, because I need to write regular expressions to match the outer double quotes.
For example,
id, message, comment
1, "Parsing this line is pretty simple", "because this line does not contain any comma"
2, "But when lines like this, when commas are embedded in column", "it becomes a little complicated, because I need additional code for parsing the line"
3, "Also, this line is a "nightmare", because containing not only commas, but also "double quotes".", "some record has "doubled-double-quotes" like """HI GUYS""""
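To show what I mean, here is a minimal sketch of parsing the first two rows with Python's standard csv module (the space after each delimiter means I need skipinitialspace=True for the quoted fields to be recognized). Rows like id 3, with unescaped inner quotes, are not valid CSV and still break this kind of parsing:

```python
import csv
import io

# The first two rows of the example export above (row 3 is omitted
# because its unescaped inner quotes are not valid CSV).
data = '''id, message, comment
1, "Parsing this line is pretty simple", "because this line does not contain any comma"
2, "But when lines like this, when commas are embedded in column", "it becomes a little complicated, because I need additional code for parsing the line"
'''

# skipinitialspace=True strips the space after each comma so the
# double-quoted fields are parsed as single columns.
rows = list(csv.reader(io.StringIO(data), skipinitialspace=True))

print(rows[0])     # header: ['id', 'message', 'comment']
print(rows[2][1])  # embedded commas kept inside one field
```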
Well, this problem comes down to developer skill, but I think BigQuery is a tool for managing and filtering big data as easily as possible (with enormous speed), and it would be really nice if you could support exporting to TSV.
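For comparison, here is a rough sketch of what parsing a hypothetical TSV export could look like, assuming field values themselves contain no tabs or newlines; no quoting rules or regular expressions would be needed at all:

```python
# A hypothetical TSV line: one split call per line, no quoting rules.
line = "2\tcommas, are, fine, here\tand so are \"double quotes\""
fields = line.split("\t")

print(fields)  # three fields, commas and quotes untouched
```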
Regards,
On Friday, April 20, 2012 at 7:06:22 AM UTC+9, Jim Caputo wrote:
Hi Hayato Tomoda,
I tested an export from BigQuery with embedded commas in a column. We wrap these columns in double quotes, which should be sufficient for opening in Excel. Can you explain exactly what's happening, or pass along a sample?
Thanks
Jim