Timeout Errors


Spanishgringo

Aug 17, 2009, 12:40:55 PM
to google-analytics-api - GA Data Export API
I keep getting timeout errors. Any ideas (other than reducing the max-results parameter)? The API is supposed to support up to 10,000 records per query.

Stack trace:
Network error trying to retrieve feed: http method GET against URL https://www.google.com/analytics/feeds/data?start-index=1&max-results=10000&ids=ga%3A9265685&dimensions=ga%3AlandingPagePath&metrics=ga%3Aentrances%2Cga%3Abounces&sort=-ga%3Aentrances&filters=ga%3Amedium%3D%3Dorganic&start-date=2009-05-01&end-date=2009-05-31 timed out.
java.io.IOException: http method GET against URL https://www.google.com/analytics/feeds/data?start-index=1&max-results=10000&ids=ga%3A9265685&dimensions=ga%3AlandingPagePath&metrics=ga%3Aentrances%2Cga%3Abounces&sort=-ga%3Aentrances&filters=ga%3Amedium%3D%3Dorganic&start-date=2009-05-01&end-date=2009-05-31 timed out.
    at com.google.appengine.api.urlfetch.URLFetchServiceImpl.handleApplicationException(URLFetchServiceImpl.java:56)
    at com.google.appengine.api.urlfetch.URLFetchServiceImpl.fetch(URLFetchServiceImpl.java:30)
    at com.google.apphosting.utils.security.urlfetch.URLFetchServiceStreamHandler$Connection.fetchResponse(URLFetchServiceStreamHandler.java:389)
    at com.google.apphosting.utils.security.urlfetch.URLFetchServiceStreamHandler$Connection.getInputStream(URLFetchServiceStreamHandler.java:289)
    at com.google.apphosting.utils.security.urlfetch.URLFetchServiceStreamHandler$Connection.getResponseCode(URLFetchServiceStreamHandler.java:131)
    at com.google.gdata.client.http.HttpGDataRequest.checkResponse(HttpGDataRequest.java:535)
    at com.google.gdata.client.http.HttpGDataRequest.execute(HttpGDataRequest.java:515)
    at com.google.gdata.client.http.GoogleGDataRequest.execute(GoogleGDataRequest.java:515)
    at com.google.gdata.client.Service.getFeed(Service.java:1053)
    at com.google.gdata.client.Service.getFeed(Service.java:916)
    at com.google.gdata.client.GoogleService.getFeed(GoogleService.java:631)
    at com.google.gdata.client.Service.getFeed(Service.java:935)
    at gaexport.GAExportServlet.getPageResults(GAExportServlet.java:128)
    at gaexport.GAExportServlet.doPost(GAExportServlet.java:101)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:713)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:806)
    at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1093)
    at com.google.apphosting.utils.servlet.TransactionCleanupFilter.doFilter(TransactionCleanupFilter.java:43)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
    at com.google.appengine.tools.development.StaticFileFilter.doFilter(StaticFileFilter.java:124)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1084)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:360)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
    at com.google.apphosting.utils.jetty.DevAppEngineWebAppContext.handle(DevAppEngineWebAppContext.java:54)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
    at com.google.appengine.tools.development.JettyContainerService$ApiProxyHandler.handle(JettyContainerService.java:313)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
    at org.mortbay.jetty.Server.handle(Server.java:313)
    at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:506)
    at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:844)
    at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:644)
    at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211)
    at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:381)
    at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:396)
    at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)

Spanishgringo

Aug 18, 2009, 6:43:19 AM
to google-analytics-api - GA Data Export API
I think I've found the cause now, and it seems to live in App Engine rather than GData. The App Engine framework routes outgoing requests through URLFetch(), which is capped at 5 seconds before throwing an error. Google Analytics does not have enough time to respond to most queries within that 5-second cap, and there does not appear to be any way to adjust the timeout.

The App Engine team should work more closely with the GData team to fix these kinds of problems. If you are going to keep the cap at 5 seconds, give us some sort of async notification service that tells us when the data is ready to pull.

Nick

Aug 28, 2009, 2:22:06 PM
to google-analytics-api - GA Data Export API
Hi There,

Yes... you're absolutely right, both about the problem and the solution.

App Engine has a "request timeout": your servlet has 30 seconds to return a response, after which a DeadlineExceededException is thrown: http://code.google.com/appengine/docs/java/runtime.html#The_Request_Timer

Also, App Engine's URL Fetch library only supports responses up to 1 MB in size. With the GA API providing 10k rows of data across 7 dimensions and 10 metrics, you can easily blow past this limit:
http://code.google.com/appengine/docs/java/urlfetch/overview.html#Quotas_and_Limits

So in the short term, if you want to use App Engine to handle large Google Analytics API requests, you will have to speed up the responses from the API by shortening your date range and removing filters. To reduce the amount of data being transmitted back to App Engine, instead of one 10,000-row call you can make ten 1,000-row calls; a rough sketch of that paging approach follows.
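A rough, untested sketch of that paging loop using the GData Java client for Analytics (the AnalyticsService, DataQuery, and DataFeed classes are the standard ones from the client library; the class name, page size, and the way pages are collected are just placeholders):

import java.util.ArrayList;
import java.util.List;

import com.google.gdata.client.analytics.AnalyticsService;
import com.google.gdata.client.analytics.DataQuery;
import com.google.gdata.data.analytics.DataFeed;

public class PagedFetch {
    private static final int PAGE_SIZE = 1000; // keep each response small and fast

    // Walk the report in PAGE_SIZE chunks instead of one 10,000-row request.
    public static List<DataFeed> fetchAll(AnalyticsService service, DataQuery query)
            throws Exception {
        List<DataFeed> pages = new ArrayList<DataFeed>();
        int startIndex = 1;
        query.setMaxResults(PAGE_SIZE);
        while (true) {
            query.setStartIndex(startIndex);
            DataFeed page = service.getFeed(query.getUrl(), DataFeed.class);
            pages.add(page);
            // getTotalResults() reports how many rows the whole report contains.
            if (startIndex + PAGE_SIZE > page.getTotalResults()) {
                break;
            }
            startIndex += PAGE_SIZE;
        }
        return pages;
    }
}

Each response then stays well under the 1 MB cap, and each individual request has a better chance of finishing inside the fetch deadline.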

Good Luck!
-Nick

Spanishgringo

Sep 1, 2009, 7:26:45 AM
to google-analytics-api - GA Data Export API
The problem is that I am getting this error at times for requests of
100 rows because the limit is 5 seconds for URLFetch, not 30 seconds.
http://code.google.com/appengine/docs/java/urlfetch/overview.html

30 seconds is the time limit for the AppEngine servlet to respond.

The error I am getting is due to the AppEngine URLFetch service, which
is limited to 5 seconds. That is why certain requests for 100 records
fail as well.

The 1MB limit is for using the Datastore (per entity), not for handling a Response. The Request and Response size for my servlets can be up to 10MB.
http://code.google.com/appengine/docs/java/urlfetch/overview.html

To me, there are two pieces to the solution:
1) Up the URLFetch time limit to 30 seconds (and perhaps bump up the servlet response time to allow for the longer URLFetch).
2) The Analytics Java API team creates a new API that lets me issue a "generate report" call which returns 200 OK along with a token I can use to retrieve the data once it is ready. Doing this would also allow me to batch the reports in GA. When a report is finished, there could either be a callback or I could just use some sort of polling to see if it is ready.

Michael


On Aug 28, 8:22 pm, Nick wrote:
> Hi There,
>
> Yes...you're absolutely right. Both in the problem and the solution.
>
> App Engine has a "request timeout" in which your servlet has 30
> seconds to return a response after which an DeadlineExceededException
> is thrown :http://code.google.com/appengine/docs/java/runtime.html#The_Request_T...
>
> Also App Engine's URL Fetch library only supports response 1 MB in
> size. With the GA API providing 10k rows of data across 7 dimensions
> and 10 metrics, you can easily blow past this limit:http://code.google.com/appengine/docs/java/urlfetch/overview.html#Quo...

Nick

Sep 1, 2009, 2:18:28 PM
to google-analytics-api - GA Data Export API
Hi Michael, ok...

#2 is definitely a feature request.

#1 The Python client lib v1.2.4 supports setting the urlFetch timeout
to 10s: http://code.google.com/appengine/docs/python/urlfetch/fetchfunction.html
so we're going to have to wait until Java v1.2.2 catches up.

The servlet can return up to 10MB, but the urlFetch service is capped at requests/responses of 1MB of data. Currently, a GA API request can easily return results greater than 1MB (like requesting 10k entries). You can configure the urlFetch service either to throw an error or to truncate the response.
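If you do want the truncating behavior, a minimal sketch against the low-level urlfetch API might look like the following. The classes and the allowTruncate() builder method are from the com.google.appengine.api.urlfetch package as I remember it, so double-check them against the javadoc, and the feed URL is only a placeholder:

import java.io.IOException;
import java.net.URL;

import com.google.appengine.api.urlfetch.FetchOptions;
import com.google.appengine.api.urlfetch.HTTPMethod;
import com.google.appengine.api.urlfetch.HTTPRequest;
import com.google.appengine.api.urlfetch.HTTPResponse;
import com.google.appengine.api.urlfetch.URLFetchServiceFactory;

public class TruncatingFetch {
    // Fetch a possibly >1MB feed and let urlfetch truncate it instead of throwing.
    public static byte[] fetchTruncated(String feedUrl) throws IOException {
        FetchOptions options = FetchOptions.Builder.allowTruncate();
        HTTPRequest request = new HTTPRequest(new URL(feedUrl), HTTPMethod.GET, options);
        HTTPResponse response = URLFetchServiceFactory.getURLFetchService().fetch(request);
        return response.getContent(); // anything past 1MB is silently dropped
    }
}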

-Nick


On Sep 1, 4:26 am, Spanishgringo wrote:
> The problem is that I am getting this error at times for requests of
> 100 rows because the limit is 5 seconds for URLFetch, not 30 seconds. http://code.google.com/appengine/docs/java/urlfetch/overview.html
>
> 30 seconds is the time limit for the AppEngine servlet to respond.
>
> The error I am getting is due to the AppEngine URLFetch service, which
> is limited to 5 seconds.  That is why certain requests for 100 records
> fail as well.
>
> The 1MB limit is for using the DataStore (per entity), not handling a
> Response.  The Request and Response size for my servlets can be up to
> 10MB. http://code.google.com/appengine/docs/java/urlfetch/overview.html

Spanishgringo

Sep 3, 2009, 9:08:44 AM
to google-analytics-api - GA Data Export API
Where can I make that feature request? I suppose I would do it here:
http://code.google.com/p/gdata-java-client/issues/entry

Correct?

Nick

Sep 3, 2009, 1:44:46 PM
to google-analytics-api - GA Data Export API
Yes, we're trying to move to a more transparent feature request
system. Please add it.

Nick

Sep 3, 2009, 5:58:16 PM
to google-analytics-api - GA Data Export API
App Engine version 1.2.5 has just been released:
http://googleappengine.blogspot.com/2009/09/app-engine-sdk-125-released-for-python.html

You can now increase the URLFetch timeout to 10 seconds:
http://code.google.com/appengine/docs/java/urlfetch/overview.html#Requests
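For the Java GData client, a minimal sketch of asking for the longer deadline is below; the same two setters appear in the code posted later in this thread. Whether these values actually reach urlfetch when running on App Engine is an assumption here, and the application name string is just a placeholder:

import com.google.gdata.client.analytics.AnalyticsService;

public class ServiceSetup {
    public static AnalyticsService newService() {
        // "example-company-gaExport-1.0" is a placeholder application name.
        AnalyticsService service = new AnalyticsService("example-company-gaExport-1.0");
        service.setConnectTimeout(10000); // 10 seconds, in milliseconds
        service.setReadTimeout(10000);    // 10 seconds, in milliseconds
        return service;
    }
}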

-Nick

Spanishgringo

Sep 4, 2009, 3:20:28 AM
to google-analytics-api - GA Data Export API
Fantastic! I can't wait to try it out now. Thanks.

On Sep 3, 11:58 pm, Nick wrote:
> App Engine version 1.2.5 has just been released: http://googleappengine.blogspot.com/2009/09/app-engine-sdk-125-releas...
>
> You now can increase the URLFetch timeout to 10 seconds: http://code.google.com/appengine/docs/java/urlfetch/overview.html#Req...

Spanishgringo

Sep 4, 2009, 6:51:39 AM
to google-analytics-api - GA Data Export API
No luck. The ten-second timeout is still happening on almost any new report, even when max results is 1000 records. I think the problem is more in the calculation time for Analytics than in the time to receive the XML response.

I think option 2 will be the only solution for a functional Java App Engine app that uses Analytics.

I will gladly test anything that you guys want to try to develop.

Michael

Nick

Sep 4, 2009, 3:40:34 PM
to google-analytics-api - GA Data Export API
Can you share the query you are using?

Our API usually returns in < 10s for a 1000-row query.

-Nick

Spanishgringo

Sep 7, 2009, 4:05:03 AM
to google-analytics-api - GA Data Export API
Here is the code. Even if I drop it down to 100 results, by the 8th or 9th query it errors out again with a timeout.


as = new AnalyticsService("gaExportAPI_acctSample_v1.0");
as.setConnectTimeout(10000);
as.setReadTimeout(10000);
System.out.println("Version: " + as.getVersion());
String baseUrl = "https://www.google.com/analytics/feeds/data";
DataQuery query;

//------------------------------------------------------
// Client Login Authentication
//------------------------------------------------------
try {
    as.setUserCredentials(userStr, pass);
} catch (AuthenticationException e) {
    System.err.println("Error : " + e.getMessage());
    return;
}

try {
    query = new DataQuery(new URL(baseUrl));
} catch (MalformedURLException e) {
    System.err.println("Malformed URL: " + baseUrl);
    return;
}

// TODO parse passed URL to use Query tool
query.setIds("ga:" + profID);
query.setDimensions("ga:landingPagePath");
query.setMetrics("ga:entrances,ga:bounces");
query.setSort("-ga:entrances");
query.setFilters("ga:medium==organic");
query.setStartDate("2009-05-01");
query.setEndDate("2009-05-21");
/* set inside getPageResults
query.setMaxResults(maxNum);
query.setStartIndex(curPage);
*/

// Send our request to the Analytics API and wait for the results to come back
try {
    ArrayList<DataFeed> feedList = getPageResults(new ArrayList<DataFeed>(), query, curPage, maxNum);
    /* Object dataArray[] = feedList.toArray();
    for (int i = 0; i < dataArray.length; i++) {
        outputFeedData((DataFeed) dataArray[i]);
    }
    */
} catch (Exception e) {
    System.err.println("Error trying to retrieve feed: " + e.getMessage());
    e.printStackTrace();
}

//end of doGet
}

public ArrayList<DataFeed> getPageResults(ArrayList<DataFeed> theList,
        DataQuery query, int curPage, int maxNum) throws Exception {
    DataFeed feed;
    query.setMaxResults(maxNum);
    query.setStartIndex((curPage - 1) * maxNum + 1);
    URL url = query.getUrl();
    //URL url = new URL(reqURL);
    System.out.println("URL: " + url.toString());

    try {
        feed = as.getFeed(url, DataFeed.class);
        theList.add(feed);
        System.out.println("Feed created # " + theList.size());
        System.out.println("Total Results " + feed.getTotalResults());
        if (curPage <= Math.floor(feed.getTotalResults() / maxNum + 1)) {
            getPageResults(theList, query, curPage + 1, maxNum);
        }
    } catch (IOException e) {
        System.err.println("Network error trying to retrieve feed: " + e.getMessage());
        e.printStackTrace();
        return null;
    } catch (ServiceException e) {
        System.err.println("Analytics API responded with an error message: " + e.getMessage());
        return null;
    }

    return theList;
}

On Sep 4, 9:40 pm, Nick wrote:
> Can you share the query you are using?
>
> our API usually returns in < 10s for a 1000 row query.
>
> -Nick

Spanishgringo

Sep 15, 2009, 5:46:26 AM
to google-analytics-api - GA Data Export API
Ping

Nick

Sep 15, 2009, 1:16:37 PM
to google-analytics-api - GA Data Export API
Thanks for following up. I want to see this work!

I put a rough example here that fetches 1k rows; can you see if it also times out? http://analytics-api-sample.appspot.com/

-Nick

Spanishgringo

Sep 16, 2009, 3:53:38 AM
to google-analytics-api - GA Data Export API
It works fine. Can you update the query to try the exact query that I
have in my app?

// TODO parse passed URL to use Query tool
query.setIds("ga:" + profID);
query.setDimensions("ga:landingPagePath");
query.setMetrics("ga:entrances,ga:bounces");
query.setSort("-ga:entrances");
query.setFilters("ga:medium==organic");
query.setStartDate("2009-05-01");
query.setEndDate("2009-05-21");
/* set inside getPageResults
query.setMaxResults(maxNum);
query.setStartIndex(curPage);
*/


Nick

Sep 16, 2009, 2:12:25 PM
to google-analytics-api - GA Data Export API
Alright, it's up there, try again.

I'm pretty sure your issue lies in your auth routine. App Engine isn't stateful, so on every request your code actually goes to the authentication service and gets a new token. To reduce hammering on the service, a delay is imposed after too many requests are made for the same token. That is where your timeout is coming from.

Instead, you should store the token in a local variable or the App Engine datastore and reuse it on subsequent requests.

Here's the Client Login method you would need to set the token:
http://code.google.com/apis/gdata/clientlogin.html#RecallAuthToken

If you do get this working, let me know. I want to see what you do
with it.
-Nick

Spanishgringo

Sep 17, 2009, 6:47:37 AM
to google-analytics-api - GA Data Export API
No luck... same timeout error even on the first 1000 results.

Here is the code that I updated:

try {
    as.setUserCredentials(userStr, pass);
    UserToken auth_token = (UserToken) as.getAuthTokenFactory().getAuthToken(); // NEW CODE
    String token = auth_token.getValue(); // token is '12345abcde' // NEW CODE
    as.setUserToken(token); // NEW CODE
} catch (AuthenticationException e) {
    System.err.println("Error : " + e.getMessage());
    return;
}


Nick

Sep 17, 2009, 7:02:03 AM
to google-analytics-api - GA Data Export API
It's not clear: does my new example not work for you, or does it work while your code still doesn't?

Nick

Sep 17, 2009, 3:01:05 PM
to google-analytics-api - GA Data Export API
If my example works, then the issue lies in your code.

The pattern you need follows (a rough sketch is below):

1. See if a token exists in the datastore.
1.a. If no: go get a token with new credentials and put it in the datastore.
1.b. If yes: put the token in the service object.

2. Make requests to the API.
3. If there is a 401 error on the account feed, the token is invalid (ClientLogin tokens only last about 14 days); go back to step 1.a.
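A rough sketch of that pattern, using a static field as the cache (the "local variable" option; a real app would more likely use the datastore or memcache, and the class and method names here are made up for illustration):

import com.google.gdata.client.GoogleAuthTokenFactory.UserToken;
import com.google.gdata.client.analytics.AnalyticsService;
import com.google.gdata.util.AuthenticationException;

public class TokenCache {
    // Survives across requests for as long as this App Engine instance stays warm.
    private static String cachedToken;

    public static void authorize(AnalyticsService service, String user, String pass)
            throws AuthenticationException {
        if (cachedToken == null) {
            // Step 1.a: no token yet, so hit ClientLogin once and remember the result.
            service.setUserCredentials(user, pass);
            UserToken token = (UserToken) service.getAuthTokenFactory().getAuthToken();
            cachedToken = token.getValue();
        } else {
            // Step 1.b: reuse the stored token; no ClientLogin round trip at all.
            service.setUserToken(cachedToken);
        }
    }

    // Step 3: call this after a 401, then call authorize() again for a fresh token.
    public static void invalidate() {
        cachedToken = null;
    }
}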

Hope that helps.
-Nick

Spanishgringo

Sep 18, 2009, 10:51:18 AM
to google-analytics-api - GA Data Export API
Your example works, and sometimes mine works for 1000 records. It is very iffy.

I am still getting the errors, and I checked that the token is properly saved.

Plus, I am sure it is not the captcha issue, because my code works perfectly in a local Tomcat environment and I do not have any error-handling code for captcha responses.

I have uploaded the app to App Engine and the performance is a bit better, but the main problem is the recursive nature of requesting the pages, since I want so many records. I think I have hit a dead end unless you can guide me to another solution.

Can you try your solution, but have it pull sets of 1000 records until it pulls all of the records available for the query? For example, if the query has 25000 total records, can you try your solution to keep paging through to pull all 25000 records in increments of 1000?

Can you e-mail me your address, and I will send you my App Engine URL so you can take a look?