BatchJobService / CompressionLevel / Upload Speed (PHP-API)?


DS

May 9, 2017, 3:10:41 PM
to AdWords API Forum
Hello,

I'm utilizing the BatchJobService to update AdGroups and I'm trying to figure out how to speed up my uploads.

I can upload incrementally in batch operation sizes of 10,000 at a time, which, on average, takes about 7-8 minutes per upload. Across 43 campaigns and ~750,000 ad groups, the total would take 3-4 hours. I'd love to figure out how to get this down to less than an hour if possible.

I've tried playing around with the PHP client library's compressionLevel setting to see if it improves upload performance, but I'm not sure it's working. I've tried sniffing the traffic between my servers and Google during the upload, but it's encrypted, so I can't tell whether I'm actually submitting a gzipped request. I've also added 'gzip' to my user agent, but it's still not improving upload speed. I would have thought gzip compression would help here, since the payload is text/XML.

I've also tried varying my batch operation upload sizes between 1,000 and 10,000 to see whether that improves upload performance.

Here's the SoapSettingsBuilder / AdWordsSessionBuilder code I'm using - 
    $soapSettings = (new SoapSettingsBuilder())
        ->fromFile()
        ->withCompressionLevel(9)
        ->build();

    $session = (new AdWordsSessionBuilder())
        ->withClientCustomerId(REDACTED)
        ->withSoapSettings($soapSettings)
        ->withDeveloperToken(REDACTED)
        ->withUserAgent('My App (gzip)')
        ->withOAuth2Credential($oAuth2Credential)
        ->build();

Maybe I'm missing something, but are there any other ideas for improving performance when getting the data to Google?

Thanks.

Sreelakshmi Sasidharan (AdWords API Team)

May 9, 2017, 4:56:35 PM
to AdWords API Forum
Hi DS, 

The best practices for working with the batch job service are listed in this document. Having fewer, larger jobs should perform better than many smaller jobs, so increasing the size of your jobs should help. From your description, it appears that your batch jobs contain only one type of operation (AdGroup updates); if that is not the case, ordering the operations by operation type will also boost performance. Additionally, grouping together operations that target the same ad group or campaign reduces the total number of ad groups or campaigns targeted in the request, improving overall performance.

I see that you have set the highest possible value for compression. Please note that a higher compression level can also degrade performance, since it costs more CPU time for diminishing size savings.
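As a quick illustration of that compression trade-off (a standalone Python sketch, not AdWords library code), compressing a repetitive XML payload, similar in shape to a batch of ad group operations, at gzip levels 1, 6, and 9 shows that the size savings beyond the default level are usually marginal:

```python
import gzip

# Build a repetitive XML payload, loosely shaped like a batch of
# ad group operations (illustrative only; not real AdWords XML).
xml = (
    "<operations>"
    + "".join(
        f"<operation><adGroupId>{i}</adGroupId><status>ENABLED</status></operation>"
        for i in range(10_000)
    )
    + "</operations>"
).encode("utf-8")

# Compare gzip output sizes at low, default, and maximum levels.
for level in (1, 6, 9):
    compressed = gzip.compress(xml, compresslevel=level)
    print(f"level {level}: {len(compressed):>8} bytes "
          f"({len(compressed) / len(xml):.1%} of original)")
```

On text/XML like this, even level 1 shrinks the payload dramatically, which is why the extra CPU spent at level 9 rarely pays off in upload time.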

Is 7-8 minutes per batch job the time taken to complete the job? For larger batch jobs, that does not seem too high. If the batch jobs are getting stuck, being cancelled, taking too long to complete, etc., please share the batch job IDs of those requests and we can take a closer look to see if there are any issues.

Thanks,
Sreelakshmi, AdWords API Team

DS

May 9, 2017, 6:08:25 PM
to AdWords API Forum
Hey Sreelakshmi,

I'm not talking about processing the jobs.   The processing time is quite quick.  I'm referring to uploading the jobs to Google.

I'm trying to figure out how to upload my data to Google faster. Once it is uploaded, it processes quite quickly. But I'm unsure whether compression is actually working. Am I making the compression call correctly above, or is gzip compression enabled by default for SOAP PUT requests?

Also, are there any other ways to get the data into Google faster?

Sreelakshmi Sasidharan (AdWords API Team)

May 10, 2017, 12:47:10 PM
to AdWords API Forum
Hi DS, 

An API request whose User-Agent header contains the string "gzip" and whose Accept-Encoding header is set to gzip will have compression enabled. Based on your earlier description, compression should already be enabled. You can check your SOAP request headers to verify these values; if you haven't enabled logging yet, please check this document for instructions.
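As a standalone sketch (Python, not the PHP client library), the rule described above amounts to a header check like this:

```python
def gzip_enabled(headers: dict) -> bool:
    # Per the rule above: the User-Agent must contain the string "gzip"
    # AND Accept-Encoding must include gzip. (Illustrative sketch only;
    # a real HTTP stack matches header names case-insensitively.)
    return ("gzip" in headers.get("User-Agent", "")
            and "gzip" in headers.get("Accept-Encoding", ""))

print(gzip_enabled({"User-Agent": "My App (gzip)",
                    "Accept-Encoding": "gzip"}))  # True
print(gzip_enabled({"User-Agent": "My App",
                    "Accept-Encoding": "gzip"}))  # False
```

So a user agent of 'My App (gzip)' alone is not enough; the Accept-Encoding header must also be present, which is what enabling the library's compression setting and checking the request logs would confirm.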

The other best practices shared in my previous response would improve the overall performance of batch jobs. Since batch jobs are asynchronous and the overall execution time is quite fast in your case, I am curious whether there is any particular reason you are trying to reduce this time. If you have sample batch job IDs that are taking long to upload, we can take a look at them.