Hello,
I'm a data engineer at Madup, Inc., and I have a question about the GoogleAdsService API.
I'm trying to download an 'ad_group_criterion' report using the Google Ads Python client library (v7).
Due to the volume of the data, the download takes a couple of hours, and it always fails with a "Page token has expired" error.
Here's the response I got:
--------------------------------------------------------------------------------
Method: /google.ads.googleads.v7.services.GoogleAdsService/Search
Host: googleads.googleapis.com
Headers: {
"developer-token": "REDACTED",
"login-customer-id": "my-login-customer-id",
"x-goog-api-client": "gl-python/3.8.12 grpc/1.44.0 gax/1.31.5",
"x-goog-request-params": "customer_id=my-customer-id"
}
Request: customer_id: "my-customer-id"
"query: "
SELECT
customer.id,
campaign.id,
ad_group.id, ad_group_criterion.ad_group, ad_group_criterion.age_range.type, ad_group_criterion.approval_status, ... (and many other fields)
FROM ad_group_criterion
ORDER BY
customer.id ""
page_token: "CO_Fp9XXrcrruQEQoLH0ARj3xY-r8i8iFmtQbHRhRnNEZlg4cGlRZHc3ZmhxTHcqAlY3MAA42ab4gw1A____________AQ"
page_size: 2000
Response
-------
Headers: {
""google.ads.googleads.v7.errors.googleadsfailure-bin": "
\u001d
\u0002\b\b\u0012\u0017Page token has expired.\u0012\u0016p-VkuL-ZESMoHCcQK7fVQw","
""grpc-status-details-bin": "\b\u0003\u0012%Request contains an invalid argument.\u001a~
Ctype.googleapis.com/google.ads.googleads.v7.errors.GoogleAdsFailure\u00127\u001d
\u0002\b\b\u0012\u0017Page token has expired.\u0012\u0016p-VkuL-ZESMoHCcQK7fVQw","
"request-id": "p-VkuL-ZESMoHCcQK7fVQw"
}
Fault: errors {
  error_code {
    request_error: EXPIRED_PAGE_TOKEN
  }
  message: "Page token has expired."
}
request_id: "p-VkuL-ZESMoHCcQK7fVQw"
--------------------------------------------------------------------------------
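For context, here's a simplified sketch of the code that produces this request. The field list is abbreviated, and CUSTOMER_ID, QUERY, and process_row are placeholders, not my real names or values:
--------------------------------------------------------------------------------
from google.ads.googleads.client import GoogleAdsClient

# Placeholder values; the real script loads these from our config.
CUSTOMER_ID = "my-customer-id"

# Field list abbreviated here; the real query selects many more
# ad_group_criterion fields.
QUERY = """
    SELECT
      customer.id,
      campaign.id,
      ad_group.id,
      ad_group_criterion.ad_group
    FROM ad_group_criterion
    ORDER BY customer.id
"""


def process_row(row):
    # Placeholder for our real transform/load step.
    print(row.customer.id, row.campaign.id, row.ad_group.id)


client = GoogleAdsClient.load_from_storage()  # reads google-ads.yaml
ga_service = client.get_service("GoogleAdsService")

request = client.get_type("SearchGoogleAdsRequest")
request.customer_id = CUSTOMER_ID
request.query = QUERY
request.page_size = 2000

# The returned pager transparently re-requests the next page
# (using next_page_token) while we iterate over the rows.
for row in ga_service.search(request=request):
    process_row(row)
--------------------------------------------------------------------------------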
I read the other discussions, so I tried downloading with page_size set to 10000, but it still fails.
One weird thing is that the 'Page token has expired' error occurs exactly 4 hours after the first request is sent.
According to the discussion above, a page_token is valid for 2 hours, but that doesn't seem to apply here, because I process each page in 7-10 seconds and immediately request the next one. (Furthermore, the error shows up 4 hours after the first request, not 2.)
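To double-check that per-page timing, this is roughly how I measured it. It's a sketch that reuses the placeholders above and assumes the pager exposes a pages iterator, as the generated GAPIC pagers do:
--------------------------------------------------------------------------------
import time

request = client.get_type("SearchGoogleAdsRequest")
request.customer_id = CUSTOMER_ID
request.query = QUERY
request.page_size = 2000

pager = ga_service.search(request=request)

# pager.pages yields one SearchGoogleAdsResponse at a time, so the gap
# between iterations covers fetching and processing a single page.
previous = time.monotonic()
for page_number, page in enumerate(pager.pages, start=1):
    for row in page.results:
        process_row(row)
    now = time.monotonic()
    print(f"page {page_number}: {now - previous:.1f}s")
    previous = now
--------------------------------------------------------------------------------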
I also tried using 'search_stream' instead of 'search' as a test on my local machine, but I got an RPC error while pulling the data; I think that was due to an unstable network, so I'm going to test it on our cloud infrastructure instead. If it works well there, I'd like to switch to search_stream, but either way I'm curious why I'm getting this page token error.
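For reference, the search_stream variant I tested locally looks roughly like this (same placeholders as above):
--------------------------------------------------------------------------------
# search_stream keeps a single streaming RPC open and returns results
# in batches, so there is no page_token to expire, but the connection
# has to stay healthy for the whole download.
stream_request = client.get_type("SearchGoogleAdsStreamRequest")
stream_request.customer_id = CUSTOMER_ID
stream_request.query = QUERY

stream = ga_service.search_stream(stream_request)
for batch in stream:
    for row in batch.results:
        process_row(row)
--------------------------------------------------------------------------------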
Thanks!