Fixity for huge terabyte files


Geoffrey Brimhall

Jan 31, 2018, 6:43:58 PM
to Archivematica Tech
When testing Archivematica with huge files (greater than 1 TB), the fixity and virus checks can take hours.

For the fixity step, is there a tool that can quickly create a reliable checksum for huge files?

I was looking at hashdeep's multithreading capabilities, but it still uses a single thread per file.

I was wondering if there is a checksum standard for huge files (and a tool implementing it) that splits the file into equal-size blocks so multiple threads can hash blocks in parallel, then computes a final checksum over the block digests, similar to a Merkle tree approach.
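For what it's worth, the hash-of-hashes idea described above is straightforward to sketch yourself, even without a standard. The following is a minimal illustration (not a standard or an Archivematica feature): each fixed-size block is hashed by a worker thread, and a final SHA-256 over the concatenated block digests gives the top-level value. Python's `hashlib` releases the GIL while hashing large buffers, so threads can genuinely overlap here. Note that the result depends on the chosen block size, so both the block size and the top digest would have to be recorded for later verification, and the value will not match a plain whole-file SHA-256. The function and constant names are made up for the example.

```python
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

# Hypothetical default block size; tune to the storage and CPU at hand.
BLOCK_SIZE = 64 * 1024 * 1024


def _hash_block(path, offset, length):
    """Hash one block of the file with SHA-256 (example choice of digest)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        f.seek(offset)
        h.update(f.read(length))
    return h.digest()


def parallel_checksum(path, block_size=BLOCK_SIZE, workers=4):
    """Hash-of-hashes: hash each block in parallel, then hash the
    concatenated block digests (in file order) into one top-level value."""
    size = os.path.getsize(path)
    offsets = range(0, size, block_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves input order, so the top hash is deterministic.
        digests = pool.map(
            lambda off: _hash_block(path, off, min(block_size, size - off)),
            offsets,
        )
        top = hashlib.sha256()
        for d in digests:
            top.update(d)
    return top.hexdigest()
```

A full Merkle tree would hash pairs of digests recursively rather than concatenating one flat level, which additionally allows verifying individual blocks, but the flat version above is enough to parallelize the fixity check itself.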
