Generating unique sequential ids is a hard problem in this kind of distributed system.
Your issue is most likely caused by eventual consistency, and you will not be able to get strong consistency without crippling your datastore throughput.
If you really want sequential ids, I recommend using a sequence in Cloud SQL to generate them when you need one. This won't be perfect (some numbers may get skipped), but it won't create duplicates.
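To illustrate the trade-off (gaps possible, duplicates impossible): a sequence hands each caller a fresh number, so two orders can never share an id, but a request that reserves a number and then fails leaves a hole. A minimal self-contained Java sketch, using a plain counter to stand in for the Cloud SQL sequence (the `allocate` helper and the simulated failure are illustrative, not real GAE or Cloud SQL API):

```java
import java.util.*;
import java.util.concurrent.atomic.AtomicLong;

public class Main {
    // A monotonic counter stands in for the database sequence: every caller
    // gets a fresh number, so ids can never collide, but a caller that
    // reserves a number and then dies leaves a gap in the used ids.
    static List<Long> allocate(int requests) {
        AtomicLong counter = new AtomicLong(0);
        List<Long> used = new ArrayList<>();
        for (int i = 0; i < requests; i++) {
            long id = counter.incrementAndGet();
            if (id == 3) continue; // simulate a request that fails after reserving id 3
            used.add(id);
        }
        return used;
    }

    public static void main(String[] args) {
        // ids stay unique and ordered, but 3 is skipped
        System.out.println(allocate(5)); // [1, 2, 4, 5]
    }
}
```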
On Jan 28, 2016, at 7:22 AM, Louise Elmose Hedegaard <louise...@gmail.com> wrote:

Hi guys,

Thank you for your answers. Reading them, I do not think I have explained my problem properly.

I have a table "order" where I save different information about an order. The unique id of the order comes from an external system, and I want to save this unique id for the order. When an order is updated in the external system, I want to update the GAE datastore accordingly. For this reason I first check whether the order that was updated in the external system already exists in my datastore. If it already exists I update it; if it does not, I create the order in the datastore.

My problem is that I use this:

datastore.prepare(query).asSingleEntity();

which fails, as there is more than one entity with the unique id. I do not understand how several entities can be created with the same unique id, when my logic says:
My guess, based on your question, is that you haven't yet wrapped your head around eventual consistency, and as a result your data model, and subsequently your data, are going to get broken. Fixing them after go-live will be hard, so you need to get a lot closer to it now.
Once you've got the basic concepts down, it should become clearer how to model this, and why this is happening.
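For the duplicate-order problem specifically, the usual way to model this is to make the external order id part of the entity's Key (e.g. `KeyFactory.createKey("Order", externalId)`) and do the get-or-create inside a transaction: a get by Key is strongly consistent, unlike the query feeding `asSingleEntity()`, which can miss an entity that was just written. As a rough, self-contained analogy (plain Java, not the GAE API; `upsertOrder` and `ORDER-42` are illustrative names), `computeIfAbsent` on a concurrent map plays the role of the keyed transaction, so concurrent upserts of the same external id still yield exactly one stored order:

```java
import java.util.concurrent.*;

public class Main {
    // The map key plays the role of the datastore Key built from the external id.
    static final ConcurrentHashMap<String, String> store = new ConcurrentHashMap<>();

    static void upsertOrder(String externalId, String payload) {
        // Atomic get-or-create keyed by the external id; no query needed,
        // so no window in which two requests can both "not find" the order.
        store.computeIfAbsent(externalId, k -> payload);
        // an update path would mutate the existing value here
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int i = 0; i < 100; i++) {
            pool.submit(() -> upsertOrder("ORDER-42", "payload"));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println(store.size()); // 1 — a single entity for ORDER-42
    }
}
```

The naive check-then-create in the quoted question has no such atomicity: two near-simultaneous updates can both run the query, both see no result, and both create the order, which is exactly what makes `asSingleEntity()` fail later.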