Can somebody give me a hint on the best way to do data comparison in the following setup:
- I'd like to compare two DB2 databases after the data migration
- each DB has around 80 tables
- each DB is around 70 GB (size on disk)
- most of the tables are reasonably small (tens of thousands up to around a hundred thousand records)
- 5 tables in the low millions of records
- 2 tables with 50+ million records
The problem is the following: if I do an in-memory comparison, I get an OOM (understandable). If I do an on-disk comparison, DB Solo tries to load the data from source and target and save it to disk, but that consumes about 12x the original data size, multiplied by 2 for the two databases, so 1.5 TB+, and it takes ages to process (writing and then reading 1.5 TB of data takes time, not to speak of needing that much free disk space on a server...).
Is there a better way to do it? Like having DB Solo compare big tables row by row, without having to load all of it up front, e.g. by doing ORDER BY <primary key>?
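To illustrate what I mean by row-by-row comparison: the idea is a merge-join over two cursors that both read with ORDER BY on the primary key, so only one row per side needs to be in memory at a time. A minimal sketch in Python (the in-memory lists at the bottom are just stand-ins for real DB2 cursors; `compare_sorted` is a name I made up, not anything DB Solo provides):

```python
def compare_sorted(source_rows, target_rows):
    """Merge-join two row streams that are both sorted by primary key.

    Each stream yields (key, row) tuples. Only one row per side is held
    in memory at a time, so table size doesn't matter. Yields
    (kind, key) tuples where kind is 'missing_in_target',
    'missing_in_source', or 'changed'.
    """
    src, tgt = iter(source_rows), iter(target_rows)
    s = next(src, None)
    t = next(tgt, None)
    while s is not None or t is not None:
        if t is None or (s is not None and s[0] < t[0]):
            # Key exists only on the source side.
            yield ('missing_in_target', s[0])
            s = next(src, None)
        elif s is None or t[0] < s[0]:
            # Key exists only on the target side.
            yield ('missing_in_source', t[0])
            t = next(tgt, None)
        else:
            # Same key on both sides: compare row contents.
            if s[1] != t[1]:
                yield ('changed', s[0])
            s = next(src, None)
            t = next(tgt, None)

# Toy demo with lists standing in for two ordered cursors:
source = [(1, 'a'), (2, 'b'), (4, 'd')]
target = [(1, 'a'), (2, 'B'), (3, 'c')]
diffs = list(compare_sorted(source, target))
# diffs == [('changed', 2), ('missing_in_source', 3), ('missing_in_target', 4)]
```

With real cursors you'd fetch in batches (e.g. fetchmany) on both connections instead of materializing anything, which is exactly the streaming behavior I'm hoping DB Solo (or some other tool) can do.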