Download data from BigQuery


srke...@hotels.com

Sep 10, 2015, 9:03:04 AM
to AdWords API Forum
Hello there,

I am dumping the bid changes made by scripts into BigQuery. Is there a way to download large amounts of data (>200 MB) from BigQuery into a CSV using the Python API for BigQuery? Below is the sample snippet I am using. I am not getting all 100k rows in the CSV; I am only getting around 20k rows. Any help would be appreciated.



FLOW = flow_from_clientsecrets('client_secrets.json',
                               scope='https://www.googleapis.com/auth/bigquery')

def main():
  storage = Storage('bigquery_credentials.dat')
  credentials = storage.get()
  if credentials is None or credentials.invalid:
    # Run oauth2 flow with default arguments.
    credentials = tools.run_flow(FLOW, storage, tools.argparser.parse_args([]))
  http = httplib2.Http()
  http = credentials.authorize(http)
  bigquery_service = build('bigquery', 'v2', http=http)
  try:
    query_request = bigquery_service.jobs()
    #table = input()
    query_data = {"query": "SELECT * FROM [Sample.Sample] order by update_date limit 100000"}
    query_response = query_request.query(projectId=PROJECT_NUMBER,
                                         body=query_data).execute()
    #textfile = open('text.txt','w')
    print('Query Results:')
    for row in query_response['rows']:
      result_row = []
      for field in row['f']:
        result_row.append(field['v'])
      sys.stdout = open('test.txt', 'a')
      output = str(result_row).replace("[", "").replace("]", "").replace("'", "").encode(encoding='utf_8', errors='strict')
      print(str(output).replace("b'", "").replace("'", ""))

Thanks
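[Editor's note: the truncation described above is typically because jobs().query() returns at most one page of results (capped by maxResults and the response-size limit); the remaining rows must be fetched with jobs().getQueryResults() using the returned pageToken. A minimal sketch of that pagination loop, assuming `service` is the authorized client from build('bigquery', 'v2', http=http) as in the snippet above; `fetch_all_rows` and `rows_to_csv` are illustrative names, not part of the original code:]

```python
import csv

def fetch_all_rows(service, project_id, query):
    """Yield every result row as a list of values, following pageToken
    across pages until the full result set has been consumed.

    Sketch only: a production version should also poll until the
    response reports jobComplete=True before reading rows.
    """
    jobs = service.jobs()
    # First request: starts the query and returns the first page.
    resp = jobs.query(projectId=project_id, body={'query': query}).execute()
    job_ref = resp['jobReference']
    while True:
        for row in resp.get('rows', []):
            # Each row is {'f': [{'v': value}, ...]} in the v2 REST format.
            yield [field['v'] for field in row['f']]
        token = resp.get('pageToken')
        if not token:
            break
        # Subsequent pages come from getQueryResults, keyed by the job id.
        resp = jobs.getQueryResults(projectId=project_id,
                                    jobId=job_ref['jobId'],
                                    pageToken=token).execute()

def rows_to_csv(rows, path):
    """Write the yielded rows to a CSV file with proper quoting."""
    with open(path, 'w', newline='') as f:
        csv.writer(f).writerows(rows)
```

Writing through the csv module also avoids the manual str()/replace() cleanup in the original loop.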

Anthony Madrigal

Sep 10, 2015, 10:04:10 AM
to AdWords API Forum
Hello,

Unfortunately, it seems this question is better suited for the BigQuery API Support Team. Please navigate to this link and follow the instructions to get support from their team.

Regards,
Anthony
AdWords API Team