I'm working on a new sync method, and the remote database gives me no way to tell what has or hasn't changed. So I have to pull everything down and compare it locally to work out the adds/updates/deletes.
Doing a clear() followed by adds holds a transaction lock on the tables for too long, given how frequently we're trying to refresh the data.
Looping through the remote data to determine adds/updates works great. Running a delete with a .noneOf(returnedUniqueIDs) condition then yields the records that no longer exist in the remote source, and I can loop that collection and delete them.
This works well for a recordset of about 1,500 items. When I bump it to 10k items, it seems to lock up and never finish. So I'm curious: is there a practical limit to the number of keys you can pass into .noneOf()?
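One workaround for the 10k case, whatever the actual limit turns out to be, is to keep each write transaction short by deleting in batches instead of one giant operation. A rough sketch, assuming you've already computed the stale keys; the deleteBatch callback is a hypothetical stand-in for something like db.table.bulkDelete:

```javascript
// Hypothetical sketch: delete stale keys in small batches so no single
// transaction holds the table lock for long.
async function batchedDelete(keys, batchSize, deleteBatch) {
  for (let i = 0; i < keys.length; i += batchSize) {
    // Each call is assumed to run in its own short transaction,
    // e.g. deleteBatch = batch => db.table.bulkDelete(batch)
    await deleteBatch(keys.slice(i, i + batchSize));
  }
}
```

The trade-off is that the sync is no longer atomic: a reader can observe the table mid-delete, which may or may not matter for your use case.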
My alternative is to loop over the local database, check whether each record still exists in the remote dataset, and delete it if not. That will work, but it isn't as elegant, and honestly it might be slower for smaller datasets.
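For what it's worth, that alternative diff can be computed entirely in memory before touching the database: build a Set of the remote ids and filter the local ids against it. A minimal sketch (function and variable names are hypothetical):

```javascript
// Hypothetical helper: given the local ids and the ids returned by the
// server, return the ids that should be deleted locally.
function idsToDelete(localIds, remoteIds) {
  const remote = new Set(remoteIds); // O(1) membership checks
  return localIds.filter(id => !remote.has(id));
}
// The result could then be handed to a bulk delete in one go or in batches.
```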
Any thoughts?
Thanks,
Nick
var serverKeys = { // Retrieved from server
  "key1": true,
  "key2": true,
  "key4": true
};

// Anything whose id isn't in serverKeys gets deleted. filter() scans
// every record rather than taking a huge key array, and delete()
// returns a promise that should be awaited or .catch()ed.
db.table.filter(x => !serverKeys[x.id]).delete();