Good morning,
I've been testing the ds3Helper for putting AV files over 1TB. We usually run a whole-file checksum comparison against the ETag returned from BP, but understand this isn't possible for blobbed items. For the time being we temporarily retrieve the same file and run a whole-file checksum against it for comparison, which is far from ideal but belt and braces.
I'm looking for advice on building a Python PUT for blobbed items that lets me run a checksum validation of the source item against whatever your ds3Helper scripts are creating, possibly in the calculate_checksum_header() function? I'd appreciate your thoughts on the most efficient method using the existing helper tools.
If there's an existing question here covering the same topic, a pointer would be great too.
Many thanks,
Joanna (BFI National Archive)