In article <sfp6m1$ngb$1...@dont-email.me>, Peter Köhlmann
<peter-k...@t-online.de> wrote:
> >>>> China Could Abuse Apple's Child Porn Detection Tool, Experts Say
> >>>
> >>> they can't.
> >>>
> >>> read apple's white papers for how it works. it doesn't take an expert
> >>> to figure out that it's not a concern.
> >>>
> >> Just wire in a different database
> >
> > like i said, read apple's white papers. what you describe will not work.
> >
>
> Well, it has already been proven that it DOES work.
that has not been proven.
> Files with
> collisions have been made. Files which were NOT child porn.
that was done using an older neural hash model that is not the same
as the one used in the csam detection system. in other words, it's
meaningless.
<https://www.vice.com/en/article/wx5yzq/apple-defends-its-anti-child-abuse-imagery-tech-after-claims-of-hash-collisions>
Researchers claim they have probed a particular part of Apple's
new system to detect and flag child sexual abuse material, or CSAM,
and were able to trick it into saying two images that were clearly
different shared the same cryptographic fingerprint. But Apple says
this part of its system is not supposed to be secret, that the
overall system is designed to account for this to happen in general,
and that the analyzed code is not the final implementation that will
be used with the CSAM system itself and is instead a generic version.
there's also a secondary, independent hash that checks for any
manipulation:
<https://pbs.twimg.com/media/E9L9hj_VcAA8vBM?format=png&name=large>
...First, as an additional safeguard, the visual derivatives
themselves are matched to the known CSAM database by a second,
independent perceptual hash. This independent hash is chosen to
reject the unlikely possibility that the match threshold was exceeded
due to non-CSAM images that were adversarially perturbed to cause
false NeuralHash matches against the on-device encrypted CSAM
database. If the CSAM finding is confirmed by this independent hash,
the visual derivatives are provided to Apple human reviewers for
final confirmation.
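here's a minimal sketch of why a two-hash design defeats single-hash
collisions. these are toy stand-ins, NOT apple's actual hashes:
weak_hash is a deliberately collidable fake "perceptual hash", and
strong_hash plays the role of the independent second hash quoted above.

```python
import hashlib

# Toy stand-ins for two independent hashes (NOT Apple's real ones).
def weak_hash(image: bytes) -> int:
    return len(image) % 256          # trivially easy to collide

def strong_hash(image: bytes) -> str:
    return hashlib.sha256(image).hexdigest()

known = b"known database image"      # pretend entry in the database
db_first = {weak_hash(known)}        # first-stage hash database
db_second = {strong_hash(known)}     # independent second-stage database

# Adversarial image crafted to collide on the FIRST hash only:
forged = b"X" * len(known)
assert weak_hash(forged) in db_first      # first stage "matches"

confirmed = strong_hash(forged) in db_second   # independent check
print(confirmed)                               # False: forgery rejected
```

the point: crafting a collision against one known hash is easy to
demo, but forging a *simultaneous* collision against a second,
independent hash the attacker can't inspect is a very different
problem.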
there's still a ~30-image threshold and manual review.
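rough binomial arithmetic shows why a threshold like that matters. the
rates below are ASSUMED numbers for illustration, not apple's
published figures:

```python
from math import comb

p = 1e-6          # assumed per-image false-positive rate (illustrative)
n = 100_000       # assumed photo-library size (illustrative)
threshold = 30    # roughly the match threshold mentioned above

# P(at least `threshold` accidental matches), X ~ Binomial(n, p).
# The tail decays so fast that a few dozen terms is plenty.
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(threshold, threshold + 20))
print(prob)   # astronomically small: effectively never happens
```

even with generous assumptions, a whole library of accidental (or
planted non-matching) false positives crossing the threshold is
essentially impossible, and manual review still sits behind it.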
there's also the fact that the database is not something china can
modify.
> As usual, you don't know your ass from a hole in the ground
that would be you.