Checksum validation of >1TB blobs

Joanna White

Feb 26, 2024, 5:44:55 AM
to Spectra Logic S3 SDKs, APIs, and RioBroker
Good morning,

I've been testing the ds3Helper for putting AV files over 1TB. We usually run a whole-file checksum comparison against the ETag returned from BlackPearl, but I understand this isn't possible for blobbed items. As a stopgap we temporarily retrieve the same file and run a whole-file checksum against it for comparison, which is far from ideal but belts and braces for the time being. A sketch of what I'd like to do instead is below.
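
What I'm imagining on our side is a single pass over the source file that produces both a whole-file digest and one digest per blob-sized span, so the retrieve-and-recheck step can go away. A minimal sketch, assuming MD5 as the algorithm and with blob_size as a placeholder for whatever blob size the job actually uses (I haven't confirmed how ds3Helper splits blobs):

    import hashlib

    def streaming_checksums(path, blob_size, read_size=8 * 1024 * 1024):
        """Single pass over the source file: returns the whole-file MD5
        plus one MD5 per blob-sized span, so nothing needs retrieving."""
        whole = hashlib.md5()
        blob_digests = []
        with open(path, "rb") as f:
            while True:
                blob = hashlib.md5()
                remaining = blob_size
                while remaining:
                    chunk = f.read(min(read_size, remaining))
                    if not chunk:
                        break
                    whole.update(chunk)
                    blob.update(chunk)
                    remaining -= len(chunk)
                if remaining == blob_size:   # EOF landed on a blob boundary
                    break
                blob_digests.append(blob.hexdigest())
                if remaining:                # short final blob, so we're done
                    break
        return whole.hexdigest(), blob_digests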

I'm looking for advice on building a Python PUT for blobbed items that lets me run a checksum validation against the source item and compare it with whatever your ds3Helper scripts are creating, possibly in the calculate_checksum_header() function. I'd appreciate your view on the most efficient method using the existing helper tools; a rough sketch of the validation step I have in mind follows.
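
On the validation side, something like this is what I'd hope to wire up. The helper_digests argument here is purely hypothetical: I don't know what form calculate_checksum_header() actually records its values in (hex vs. base64, per blob vs. per object), which is really the core of my question:

    def compare_blob_checksums(local_digests, helper_digests):
        """Flag any blob whose locally computed digest disagrees with the
        value recorded at PUT time. Returns a list of (index, local, helper)."""
        if len(local_digests) != len(helper_digests):
            raise ValueError(
                f"blob count mismatch: {len(local_digests)} local vs "
                f"{len(helper_digests)} from the helper"
            )
        return [
            (i, mine, theirs)
            for i, (mine, theirs) in enumerate(zip(local_digests, helper_digests))
            if mine != theirs
        ]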

If there's an existing question here covering the same topic, a pointer would be great too.

Many thanks,
Joanna (BFI National Archive)