There was a temporary error - RGA package - R


Claudia Guirao

Nov 12, 2014, 9:23:51 AM
to google-analytics...@googlegroups.com
Dear reader, 



I'm extracting data with the rga package for R and it usually works perfectly, but today I am receiving a lot of "There was a temporary error. Please try again later." and "There was an internal error" responses.

Here is an example of the request details:

  ids= "XXXXXX"
  metrics = "ga:productListViews, ga:productListClicks, ga:productListCTR"
  dimensions ="ga:productSku, ga:productName, ga:productListName, ga:productListPosition"
  filters= ga:productListName==XXXXXX
  sort = "-ga:productListViews"
  EEproductliststats <- GAgetData(p_ids=ids, p_start.date=start.date, p_end.date=end.date, p_metrics=metrics, p_dimensions=dimensions,
                                 p_filters=filters, p_sort=sort)
  
  EEproductliststats


This is not happening with other queries that are not related to Enhanced Ecommerce.

I'm trying to find out whether the API is currently working properly with Enhanced Ecommerce or whether it is temporarily overloaded.

Could you look into this issue? If you need further information, please let me know.

Kind regards

Claudia Guirao 

Kushan Shah

Nov 14, 2014, 1:39:04 AM
to google-analytics...@googlegroups.com
Hi Claudia,

Does the error persist when you query a different profile?

Claudia Guirao

Nov 14, 2014, 2:12:26 AM
to google-analytics...@googlegroups.com
Hi Kushan, 

This is not happening with other queries in the same profile, nor in the other profiles I've tried.

I suspect it is something related to Enhanced Ecommerce.

Thank you 

Kushan Shah

Nov 14, 2014, 2:41:02 AM
to google-analytics...@googlegroups.com
Hi Claudia,

I am looking at an older thread about a similar problem. Given the set of dimensions in your query, it seems to me that your query might be taking too long to compute on GA's backend, resulting in a timeout and hence the error.

Eager to hear what others think of this.




--
Best,
Kushan Shah || Web Analyst || +91-886-661-5292 || Skype: shahkushan1 || What's going on @Tatvic - eCommerce Recommender Series - Where to place your Recommendations?


Claudia Guirao

Nov 14, 2014, 7:43:13 AM
to google-analytics...@googlegroups.com
Hi Kushan, 

It seems you were right: I reduced the number of dimensions and... it seems to work!

Unfortunately, this is not always a solution for me. Is there any way to avoid the timeout with "big" queries?

Thank you in advance. 


Kushan Shah

Nov 17, 2014, 1:43:55 AM
to google-analytics...@googlegroups.com
Hi Claudia,

I'm afraid I don't have an exact answer for avoiding timeouts on these types of queries, but I will take a stab at it. IMO, a query tends to time out when:

-> The dimension you're querying has a large number of unique elements, e.g. Product SKU in your case. This effect gets compounded whenever you add another dimension to your query. So essentially, if you have 1,000 unique SKUs and 20 unique Product List Names, you are looking at up to 1,000 * 20 = 20,000 records in your response.
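As a back-of-the-envelope illustration (hypothetical numbers; the actual row count depends on how many combinations really have data):

  # Rough upper bound on rows: the product of each dimension's unique value count
  unique_skus  <- 1000
  unique_lists <- 20
  unique_skus * unique_lists   # 20000 potential rows, before adding further dimensions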

An alternative way to get the same data would require some data processing in R plus rga's query-partitioning mechanism. Steps here:
  1. Remove the metric ga:productListCTR from your query, since it can be calculated once the List Views and List Clicks are available.
  2. Add the date dimension to the query so that each record has a date column. This will be useful for aggregating in a later step.
  3. Also drop the sort argument, since sorting can be done later in R. Because we are partitioning the queries (step 5), sorting is best deferred to the last step.
  4. So your query would contain:
       metrics    = "ga:productListViews,ga:productListClicks"
       dimensions = "ga:date,ga:productSku,ga:productName,ga:productListName,ga:productListPosition"
       filters    = "ga:productListName==XXXXXX"
  5. Set walk=TRUE when firing the query via ga$getData(). This partitions the whole query into multiple (day-by-day) queries. It will take longer to execute, and the extra requests to the API may also consume more quota.
  6. Aggregate over the date column by summing up the List Views and List Clicks for each SKU over the date range.
  7. Calculate the CTR for each SKU as the ratio of List Clicks to List Views.
  8. Sort on List Views in descending order.
Please note that I haven't tried the above approach, but in theory it should give you the results that you expect.
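For completeness, here is a rough, untested sketch of steps 4-8 in R. It assumes the stock rga workflow (rga.open() creating a ga object, ga$getData() with the walk and batch arguments, and returned columns named without the "ga:" prefix) and reuses the ids, start.date and end.date variables from your snippet; adjust to your own setup.

  library(rga)
  rga.open(instance = "ga")            # authorise and create the ga query object

  # Steps 4-5: partitioned (day-by-day) query without the CTR metric and without sort
  raw <- ga$getData(ids        = ids,
                    start.date = start.date,
                    end.date   = end.date,
                    metrics    = "ga:productListViews,ga:productListClicks",
                    dimensions = "ga:date,ga:productSku,ga:productName,ga:productListName,ga:productListPosition",
                    filters    = "ga:productListName==XXXXXX",
                    batch      = TRUE,   # page through responses with many rows
                    walk       = TRUE)   # partition the date range into daily queries

  # Step 6: sum List Views and List Clicks per SKU over the whole date range
  agg <- aggregate(cbind(productListViews, productListClicks)
                   ~ productSku + productName + productListName + productListPosition,
                   data = raw, FUN = sum)

  # Step 7: recompute the CTR from the aggregated totals
  agg$productListCTR <- agg$productListClicks / agg$productListViews

  # Step 8: sort on List Views, descending
  agg <- agg[order(-agg$productListViews), ]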

Hope that helps,
