Hi,
Your best approach with such a large number of rows is to re-sequence the rows directly in the database:
1. Back up whatever needs to be preserved - in case something goes wrong
2. If possible, stop BCC, so no updates take place
3. Run a re-sequencing script
For step 3, you have to write the script yourself (or ask a DBA). Beware of versioning: you may only need to fix a single version of the data.
When deleting an organisational role in our DB, we used to re-sequence the role list per organisation with the script below, which may serve as a starting point:
-- Renumber SEQUENCE_NUM per organisation to a gapless, 0-based sequence.
MERGE INTO DPS_ROLE_REL_ORG T1
USING (SELECT RID, NEW_SEQNR
         FROM (SELECT ROWID RID,
                      SEQUENCE_NUM SEQNR,
                      ROW_NUMBER() OVER (PARTITION BY ORGANIZATION
                                         ORDER BY SEQUENCE_NUM) - 1 NEW_SEQNR
                 FROM DPS_ROLE_REL_ORG)
        -- only touch rows whose sequence number actually changes
        WHERE SEQNR != NEW_SEQNR) T2
ON (T1.ROWID = T2.RID)
WHEN MATCHED THEN UPDATE SET SEQUENCE_NUM = NEW_SEQNR;
You could rewrite this for a versioned child product table, and try it first on a small local table - manually corrupted to act as a test set.
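To rehearse the renumbering logic before touching the real database, the same idea can be tried against an in-memory SQLite table. This is only a sketch: the table and column names mirror the script above, but the ID primary key and the sample rows (with deliberate gaps, as if rows had been deleted) are made up for the test, and SQLite has no MERGE, so the update is applied in two steps:

```python
import sqlite3

# Hypothetical local copy of DPS_ROLE_REL_ORG, "corrupted" with gaps.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE DPS_ROLE_REL_ORG (
    ID           INTEGER PRIMARY KEY,
    ORGANIZATION TEXT,
    SEQUENCE_NUM INTEGER)""")
conn.executemany(
    "INSERT INTO DPS_ROLE_REL_ORG (ORGANIZATION, SEQUENCE_NUM) VALUES (?, ?)",
    [("org1", 0), ("org1", 2), ("org1", 5),   # gaps left by deleted roles
     ("org2", 1), ("org2", 3)])

# Same idea as the Oracle MERGE: renumber per organisation, 0-based.
rows = conn.execute("""
    SELECT ID,
           ROW_NUMBER() OVER (PARTITION BY ORGANIZATION
                              ORDER BY SEQUENCE_NUM) - 1 AS NEW_SEQNR
      FROM DPS_ROLE_REL_ORG""").fetchall()
conn.executemany(
    "UPDATE DPS_ROLE_REL_ORG SET SEQUENCE_NUM = ? WHERE ID = ?",
    [(new_seqnr, row_id) for row_id, new_seqnr in rows])
conn.commit()

print(conn.execute(
    "SELECT ORGANIZATION, SEQUENCE_NUM FROM DPS_ROLE_REL_ORG "
    "ORDER BY ORGANIZATION, SEQUENCE_NUM").fetchall())
# → [('org1', 0), ('org1', 1), ('org1', 2), ('org2', 0), ('org2', 1)]
```

Once the numbering comes out gapless per organisation here, the equivalent MERGE can be adapted to the real table.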
Hope this helps
Wilco