Thanks Abraham, I wasn't aware of this upcoming change. It doesn't seem like Twitter is mandating sequential searches or tokens: you should be able to make an initial search, figure out the max_id, bundle your subsequent n requests, then use since_id at the end to pick up anything you missed. So it's probably only slightly less concurrent?
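Something like this sketch, say. The `search()` here is a local stand-in for the real search endpoint (the real one caps `count` at 100 and ids are sparse, so this assumes ids are spread evenly enough to slice into blocks):

```python
from concurrent.futures import ThreadPoolExecutor

# Simulated, roughly time-ordered status ids standing in for real search results.
ALL_IDS = list(range(1000, 2000))

def search(max_id=None, since_id=None, count=100):
    """Hypothetical stand-in for the search call: ids in (since_id, max_id], newest first."""
    ids = [i for i in ALL_IDS
           if (max_id is None or i <= max_id)
           and (since_id is None or i > since_id)]
    return sorted(ids, reverse=True)[:count]

# 1. An initial search fixes the upper bound (max_id) of the window to comb.
first_page = search(count=100)
max_id = first_page[0]

# 2. Partition the id space below max_id into n blocks and fetch them
#    concurrently, one (since_id, max_id] window per worker.
n = 4
low = min(ALL_IDS) - 1
step = (max_id - low) // n
bounds = [(low + i * step, low + (i + 1) * step) for i in range(n)]
bounds[-1] = (bounds[-1][0], max_id)

with ThreadPoolExecutor(max_workers=n) as pool:
    blocks = list(pool.map(
        lambda b: search(since_id=b[0], max_id=b[1], count=step + 1), bounds))

collected = sorted({i for block in blocks for i in block})

# 3. A final search with since_id=max_id would pick up anything posted meanwhile.
```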
I'll check it out. Cheers, Ryan
On Monday, May 7, 2012 2:23:52 PM UTC-7, Abraham Williams wrote:
Twitter is actually moving away from classic pagination and toward cursor- and max/min-based pagination.
All methods currently supporting cursoring or usage of since_id and max_id will remove support for "classic" pagination through the "page" parameters.
I think this is a pretty severe limitation. If I want to retrieve a large collection, paginating with tokens takes a very long time. Partial responses help a little, but the impact is insignificant. So writing an app with real-time interaction with Google+ content is virtually impossible if I want to comb through lots of content.
If this were like database cursors, I could fire several queries at once, each looking through a different block of results. I respectfully recommend you take a look at dynamic pagination, similar to what Twitter allows (https://dev.twitter.com/docs/api/1/get/search).
Other than that, I like the API a lot!
On Thursday, April 5, 2012 2:54:17 PM UTC-7, Jenny Murphy wrote:
You must paginate using the tokens. There is no way to skip pages or to access data concurrently. Think of them like database cursors.
If performance is an issue, you may want to consider using partial responses. They will decrease the payload size considerably and make things a lot faster.
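For anyone following along, the token loop looks roughly like this. The `pageToken`/`nextPageToken` names match the API, but `fetch()` and the local data are simulated stand-ins, not real calls:

```python
# Fake dataset standing in for a collection of activities.
DATA = [{"id": str(i), "title": "post %d" % i} for i in range(25)]
PAGE_SIZE = 10

def fetch(page_token=None, fields="nextPageToken,items(id)"):
    """Hypothetical stand-in for a list call supporting partial responses."""
    start = int(page_token or 0)
    items = DATA[start:start + PAGE_SIZE]
    # Partial response: return only the requested fields (here just "id"),
    # which is what shrinks the payload with the real fields parameter.
    resp = {"items": [{"id": it["id"]} for it in items]}
    if start + PAGE_SIZE < len(DATA):
        resp["nextPageToken"] = str(start + PAGE_SIZE)
    return resp

# The loop is inherently sequential: each request needs the previous token,
# which is why pages cannot be fetched concurrently or skipped.
collected, token = [], None
while True:
    page = fetch(page_token=token)
    collected.extend(item["id"] for item in page["items"])
    token = page.get("nextPageToken")
    if token is None:
        break
```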