You are on the right track, but there are a couple of tricks to it. I do similar things all the time, often with millions of records/tasks. Since you pointed at Java documentation, I assume you're using Java.
The simplest way to do what you want is to perform a keys-only query for your users and batch-add the tasks. Guava's collection transformations help a lot here. Run a query that gives you an Iterable<Key<User>>, then transform that into an Iterable<OneUserDeferredTask>, and pass that to a method like this:
import java.util.List;

import com.google.appengine.api.taskqueue.DeferredTask;
import com.google.appengine.api.taskqueue.QueueConstants;
import com.google.appengine.api.taskqueue.TaskOptions;
import com.google.common.base.Function;
import com.google.common.collect.Iterables;

/** Allows any number of tasks; automatically partitions as necessary */
public void add(Iterable<? extends DeferredTask> payloads) {
    // Lazily wrap each deferred task in TaskOptions
    Iterable<TaskOptions> opts = Iterables.transform(payloads, new Function<DeferredTask, TaskOptions>() {
        @Override
        public TaskOptions apply(DeferredTask task) {
            return TaskOptions.Builder.withPayload(task);
        }
    });

    // The queue caps the number of tasks per add() call, so batch accordingly
    Iterable<List<TaskOptions>> partitioned = Iterables.partition(opts, QueueConstants.maxTasksPerAdd());
    for (List<TaskOptions> piece : partitioned) {
        queue().add(piece);  // queue() here is a helper that returns your Queue instance
    }
}
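To make the transform-then-partition flow concrete, here is a runnable, self-contained sketch using only java.util (no Guava or App Engine SDK). `MAX_TASKS_PER_ADD` is a stand-in for `QueueConstants.maxTasksPerAdd()`, and the strings stand in for keys from a keys-only query; the class and helper names are mine, not part of any API:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchEnqueueSketch {
    // Stand-in for QueueConstants.maxTasksPerAdd(); the real value comes from the SDK.
    static final int MAX_TASKS_PER_ADD = 100;

    // Splits payloads into batches no larger than maxPerBatch, like Iterables.partition().
    static <T> List<List<T>> partition(List<T> payloads, int maxPerBatch) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < payloads.size(); i += maxPerBatch) {
            batches.add(payloads.subList(i, Math.min(i + maxPerBatch, payloads.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Pretend these are user keys returned by a keys-only query.
        List<String> userKeys = new ArrayList<>();
        for (int i = 0; i < 250; i++) userKeys.add("user-" + i);

        // 250 keys -> batches of 100, 100, 50; each batch would be one add() call.
        List<List<String>> batches = partition(userKeys, MAX_TASKS_PER_ADD);
        System.out.println(batches.size() + " batches");  // prints "3 batches"
    }
}
```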
Guava's partition() makes life easy for you here. This naive approach will easily stand up to enqueueing tens of thousands of tasks. However, there's one limit you may run into somewhere before 100k: any single datastore query times out after 60 seconds, even if you're running from a cron job (which otherwise gives you a 10-minute deadline). So as your user base grows you may start to exceed this limit.
The easiest solution is to simply run this in a loop and checkpoint the query every 10k rows or so. Get a cursor, and rerun the query from that cursor. This will keep each individual query under the 60s deadline and you will easily be able to process 100k+ records in 10m.
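The checkpointing loop looks roughly like this. To keep it runnable without the SDK, this sketch uses an integer offset as a stand-in for a real datastore Cursor; with the real low-level API you would fetch with asQueryResultList(...), save results.getCursor(), and re-issue the query with FetchOptions startCursor(cursor). All names here are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class CheckpointSketch {
    static final int PAGE_SIZE = 10_000;  // checkpoint roughly every 10k rows

    // Simulates one bounded query: fetch up to PAGE_SIZE rows starting at `cursor`.
    static List<Integer> fetchPage(List<Integer> allRows, int cursor) {
        int end = Math.min(cursor + PAGE_SIZE, allRows.size());
        return allRows.subList(cursor, end);
    }

    public static void main(String[] args) {
        List<Integer> allRows = new ArrayList<>();
        for (int i = 0; i < 25_000; i++) allRows.add(i);

        int cursor = 0;  // with the real API this would be a datastore Cursor
        int pages = 0;
        while (true) {
            List<Integer> page = fetchPage(allRows, cursor);
            if (page.isEmpty()) break;
            // ...enqueue deferred tasks for this page here...
            cursor += page.size();  // checkpoint: save the cursor, rerun from it
            pages++;
        }
        System.out.println(pages + " pages, " + cursor + " rows");  // prints "3 pages, 25000 rows"
    }
}
```

Each pass through the loop is a fresh, short query, so no single query approaches the 60-second deadline.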
If you're talking about considerably more records, or you want to process this data in a hurry, you can use the __scatter__ property to create partitions that you can transform into tasks in parallel. The map/reduce framework works this way; I've posted some sample code recently that should give you a head start.
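Since that sample isn't reproduced here, a rough illustration of the idea: query a small sample of keys ordered by the reserved __scatter__ property, then pick evenly spaced keys from the sample as split points, so each resulting key range becomes an independent parallel task. The helper below is hypothetical and operates on plain strings standing in for keys:

```java
import java.util.ArrayList;
import java.util.List;

public class ScatterSplitSketch {
    // Picks (numShards - 1) evenly spaced split points from a sorted key sample,
    // yielding numShards key ranges that can be processed by parallel tasks.
    static List<String> splitPoints(List<String> sortedSample, int numShards) {
        List<String> splits = new ArrayList<>();
        for (int i = 1; i < numShards; i++) {
            splits.add(sortedSample.get(i * sortedSample.size() / numShards));
        }
        return splits;
    }

    public static void main(String[] args) {
        // Pretend this came from a keys-only query ordered by __scatter__.
        List<String> sample = new ArrayList<>();
        for (char c = 'a'; c <= 'p'; c++) sample.add("user-" + c);  // 16 sampled keys

        List<String> splits = splitPoints(sample, 4);
        System.out.println(splits.size() + " split points");  // 3 split points -> 4 ranges
    }
}
```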
However, start with the most naive approach and grow from there (assuming you don't already have 100k users).
Suerte,
Jeff