When testing Archivematica with huge files (greater than 1 TB), the fixity and virus checks can take hours to complete.
For the fixity check, is there a tool that can produce a reliable checksum for huge files quickly?
I was looking at hashdeep's multithreading capabilities, but it still uses a single thread per file.
I was wondering if there's a checksum standard for huge files (and a tool implementing that standard) that breaks the file into equal-size blocks so multiple threads can hash the blocks in parallel, and then combines those block hashes into a final checksum, similar to a Merkle tree approach. A rough sketch of that idea is below.
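To make the idea concrete, here is a minimal Python sketch of the two-level, Merkle-like scheme described above. The block size, the choice of SHA-256, and the worker count are all assumptions for illustration, not a published standard, and the resulting digest is only comparable to digests computed the same way (so it is not a drop-in replacement for a plain whole-file checksum in Archivematica):

```python
# Sketch only: block size and hash algorithm are arbitrary assumptions here.
import hashlib
import os
import sys
from concurrent.futures import ThreadPoolExecutor

BLOCK_SIZE = 64 * 1024 * 1024  # 64 MiB per block (tune to your storage)

def hash_block(path, offset, length):
    """Hash one fixed-size block. hashlib releases the GIL while digesting
    large buffers, so threads can actually run in parallel."""
    h = hashlib.sha256()
    with open(path, "rb") as f:      # each thread uses its own file handle
        f.seek(offset)
        h.update(f.read(length))
    return h.digest()

def parallel_checksum(path, workers=8):
    size = os.path.getsize(path)
    offsets = range(0, size, BLOCK_SIZE)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        block_digests = pool.map(
            lambda off: hash_block(path, off, min(BLOCK_SIZE, size - off)),
            offsets,
        )
        # Final pass: hash the concatenated block digests into one root value.
        root = hashlib.sha256()
        for d in block_digests:
            root.update(d)
    return root.hexdigest()

if __name__ == "__main__":
    print(parallel_checksum(sys.argv[1]))
```

Because each block is hashed independently, throughput scales with the number of threads until the disk or network becomes the bottleneck, which is usually the limiting factor for files of this size.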