Apple overview of how child pornography scan works

Alan Browne

Aug 14, 2021, 10:11:39 AM

https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf

p.5 contains the relevant piece w.r.t. false-positive handling.

Because the system is using known material as the source reference, and:
- each reference image needs to come from two sovereign jurisdictions,
- 30 instances need to be hit before alerting occurs, and
- human reviewers need to verify the derivative images before any legal
  action occurs,
I'd be quite confident in it.
(And that's simplified... the decision rules are sketched below.)
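For illustration only, here is a minimal Python sketch of those decision
rules. All names in it are hypothetical; the real system matches blinded
NeuralHashes via private set intersection and threshold secret sharing,
so this is just the logic, not Apple's implementation.

    MATCH_THRESHOLD = 30  # per Apple's threat-model review

    def usable_references(db_a, db_b):
        # Only hashes present in BOTH sovereign databases count, e.g.
        # usable_references({"a", "b"}, {"b", "c"}) == {"b"}.
        return set(db_a) & set(db_b)

    def escalate_to_review(photo_hashes, references):
        # Escalate (to human review of derivatives, not legal action) only
        # once at least MATCH_THRESHOLD distinct known images have matched.
        distinct_matches = {h for h in photo_hashes if h in references}
        return len(distinct_matches) >= MATCH_THRESHOLD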

In essence, it is very unlikely that any single random snapshot of a
child gets 'flagged' at all, and 30 different images would need to get
flagged before the system refers the derivatives to Apple's human
verification team.
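To put a rough number on that: assume a per-image false-match rate of one
in a million (a made-up round figure for illustration; Apple's headline
claim is less than one in a trillion per account per year) and a library
of 100,000 photos. The chance of 30 independent false matches is then
effectively zero:

    from math import exp, factorial

    def p_flagged(n_photos, p_false, threshold=30):
        # Poisson approximation: lam is the expected number of false
        # matches across the whole library; we sum the tail P(X >= threshold).
        lam = n_photos * p_false
        # Tail terms shrink super-exponentially, so 40 terms is plenty.
        return sum(exp(-lam) * lam**k / factorial(k)
                   for k in range(threshold, threshold + 40))

    print(f"{p_flagged(100_000, 1e-6):.0e}")  # ~3e-63 with these assumed inputs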

The overall feature also has various third-party verification components
- these as much to guard against sabotage or bad-faith sovereign acts as
anything else.

--
"...there are many humorous things in this world; among them the white
man's notion that he is less savage than the other savages."
-Samuel Clemens

Lewis

Aug 15, 2021, 7:01:01 PM

In message <0001HW.DD3ED19C...@news.astraweb.com> Ed Norton <nor...@nowhere.com> wrote:
> On Sat, 14 Aug 2021 10:11:35 -0400, Alan Browne wrote
> (in article <raQRI.930$uk4...@fx20.iad>):

>> https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
>> [...]

> These are all just arbitrary choices that Apple can change anytime it
> chooses. The fundamental fact is that Apple has no business doing this
> whether it's 30 images or 300 images.

That is your opinion, but it is not a fact. You do have a choice, though:
don't put images on iCloud.

Of course, you can't put images anywhere else without ALL of them being
scanned, indexed, used for ML, tagged with your information and location,
and used to track your movements - none of which Apple does, because they
DO NOT decrypt your data when it is sent to iCloud. The only tagging,
scanning, and "smarts" dealing with photos is all done on YOUR device.
Apple is so concerned with privacy that even that data on your device is
not shared between devices; EACH device builds its own index, which
exists ONLY on the device. No one else does this: the others give you
good indexes to quell your objections to the fact that they have a
complete and total index of every bit of data you allow them to see, and
they can do whatever the hell they want with it.

But if this all bothers you, don't put your photos on the Internet.
Nowhere on the Internet.

--
Sometimes the only thing you could do for people was to be there.
--Soul Music

Lewis

Aug 17, 2021, 4:30:01 PM

In message <0001HW.DD417CC4...@news.astraweb.com> Ed Norton <nor...@nowhere.com> wrote:
> On Sun, 15 Aug 2021 19:00:58 -0400, Lewis wrote

>> But if this all bothers you, don't put your photos on the Internet.
>> Nowhere on the Internet.

> "Everybody else does it" is not a justification.

I never said it was, did I? It is, however, a fact.

> I read all of Apple's white papers. They are all about encryption,
> algorithms, number counts, blah, blah, blah. Nowhere do they address
> why Apple should take on what is essentially a law enforcement role.

That is quite a different question. I suspect the answer will become
clear relatively soon. iOS 16, perhaps.


--
What are you, Ghouls? There are no dead students here. This week.