
WhatsApp head says Apple's child safety update is a 'surveillance system'


Jim_Higgins

Aug 7, 2021, 2:27:53 PM
WhatsApp head says Apple's child safety update is a 'surveillance system'
https://www.engadget.com/whatsapp-head-apple-child-safety-surveillance-system-201956787.html

--
Psalm 12:8 NIV 1984
The wicked freely strut about when what is vile is honored among men.

Jolly Roger

Aug 7, 2021, 5:01:45 PM
On 2021-08-07, Jim_Higgins <gordi...@hotmail.com> wrote:
>
> WhatsApp head says Apple's child safety update is a 'surveillance system'

Gullible idiots like you will fall for anything that fits your skewed
world view.

Facebook / WhatsApp has already been scanning their users' photos with
the same PhotoDNA database for over a *decade* now. Here's just one
report from an arrest made back in *2015*, for instance:

<https://www.wired.co.uk/article/whatsapp-encryption-child-abuse>
---
PhotoDNA is used on WhatsApp group images, for example, to find known
instances of child sexual abuse material. In the second half of last
year, WhatsApp also started using a Google machine learning system to
identify new instances of child sexual abuse in photos.

Group names are also scanned, using AI, for potential words related to
child sexual abuse. WhatsApp takes data from child safety groups about
the language predators use. Paedophiles often use codewords to try and
disguise their behaviour. As a result, WhatsApp has designed its systems
to try and identify if criminals are intentionally misspelling “child
porn” as “child pr0n”, for example. The company says its machine
learning systems are designed to find people highly likely to be
violating its policies. Humans then review what has been flagged to make
sure that accounts are only banned or sent to the NCMEC when there is
sufficient evidence. When WhatsApp finds concrete evidence of abuse,
such as through PhotoDNA, it automatically bans any accounts associated
with it.
---
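The codeword-normalization idea the article describes (catching "pr0n"-style misspellings) can be sketched roughly as follows. This is a toy illustration only, not WhatsApp's actual implementation; the character map, watchlist, and function names are all invented for the example:

```python
# Toy sketch of leetspeak-style codeword normalization.
# The substitution map and watchlist here are invented examples.

LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e",
                          "4": "a", "5": "s", "7": "t",
                          "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lower-case the text and undo common character substitutions."""
    return text.lower().translate(LEET_MAP)

def matches_watchlist(group_name: str, watchlist: set[str]) -> bool:
    """Flag a group name if any normalized word appears on the watchlist."""
    return any(word in watchlist for word in normalize(group_name).split())
```

A real system would of course go well beyond a static substitution table (the article mentions machine learning models trained on language supplied by child-safety groups), but the normalize-then-match shape is the basic idea.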

Say some more stupid bullshit to show the world what a complete gullible
dumbass you are, Jimbo. : )

--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR

Your Name

Aug 7, 2021, 5:53:54 PM
On 2021-08-07 18:27:52 +0000, Jim_Higgins said:

> WhatsApp head says Apple's child safety update is a 'surveillance system'
> https://www.engadget.com/whatsapp-head-apple-child-safety-surveillance-system-201956787.html
>

A. If you're not doing anything wrong, then you've got nothing to whine
about. If you are doing something wrong, then you get what you deserve.

B. You've been under surveillance nearly constantly since about the 1970s ...
unless you live in a wooden hut or a cave in the woods, using no devices
and having no contact with the civilised world.

Calum

Aug 8, 2021, 9:43:04 AM
On 07/08/2021 22:01, Jolly Roger wrote:
> On 2021-08-07, Jim_Higgins <gordi...@hotmail.com> wrote:
> Gullible idiots like you will fall for anything that fits your skewed
> world view.
>
> Facebook / WhatsApp has already been scanning their users' photos with
> the same PhotoDNA database for over a *decade* now.

They have, but in a different, "more secure" way... at least according
to them.

<https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>

Jolly Roger

Aug 8, 2021, 1:18:59 PM
Bullshit. He said no such thing. In fact, the words "more secure" aren't
even mentioned in that article or his lame tweet.

He also said the Apple software would allow access to "scan all of a
user's private photos on your phone — even photos you haven't shared
with anyone," which is a blatant lie. Only photos being transferred to
Apple servers are examined.

Nice try. No cigar. Facebook, WhatsApp, Google, and a slew of others
have been using the same PhotoDNA system to monitor their users' photos
for many years without a peep from people, and only now that Apple has
announced they are doing it too is there a major "outcry". Hypocrisy at
its finest on display.

And the fact is the way Apple is doing this actually preserves the
privacy of people who are not in violation better than Facebook,
WhatsApp, Google, and others, because:

* only photos uploaded or transferred to Apple’s iCloud servers are
examined; they are examined by generating a hash of the photo and
comparing that hash to a list of hashes of known child sexual abuse
photos

* only those hashes that match the hashes of known child sexual abuse
photos are flagged as potential violations by generating encrypted
safety vouchers containing metadata and visual derivatives of matched
photos

* Apple employees know absolutely nothing about images that are not
uploaded or transferred to Apple servers — nor do they know anything
about photos that do not match hashes of known child sexual abuse
photos

* the risk of the system incorrectly flagging a given account is
  extremely low (Apple puts it at about 1 in 1 trillion per year)

* only accounts whose safety vouchers exceed a threshold of multiple
matches can be reviewed by Apple employees - until the threshold is
exceeded, the encrypted vouchers cannot be viewed by anyone

* end users cannot access or view the database of known child sexual
abuse photos - nor can they identify which images were flagged

* only photos that were reviewed and verified to be child sexual abuse
are forwarded to authorities
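The match-and-threshold flow in the list above can be sketched in miniature. This is purely illustrative: the hash values, threshold, and function names are invented, and Apple's actual design uses NeuralHash plus cryptographic machinery (private set intersection, threshold secret sharing) that is not reproduced here:

```python
# Toy sketch of hash matching with a review threshold.
# Hashes and threshold value are invented stand-ins.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4"}  # stand-in for the real hash database
THRESHOLD = 30                        # multiple matches required before review

def scan_uploads(photo_hashes):
    """Count how many uploaded photo hashes match the known database.

    Only photos being uploaded are hashed and compared; photos that do
    not match reveal nothing about their contents.
    """
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def account_flagged(match_count):
    """Vouchers stay unreadable until the match count exceeds the threshold."""
    return match_count > THRESHOLD
```

The point of the threshold step is that a single false match cannot expose anything: nothing becomes reviewable until many independent matches accumulate on one account.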

Apple customers who are concerned by this system can opt out by
refraining from uploading photos to iCloud (by disabling iCloud Photos,
My Photo Stream, and iMessage). Since these are all optional services,
this is very easy to do.

Claims that Apple is scanning your entire device 24/7 are unfounded.

Claims that Apple is scanning every single photo on your device are also
unfounded.

Joerg Lorenz

Aug 10, 2021, 11:44:24 AM
On 07.08.21 at 20:27, Jim_Higgins wrote:
> --
> Psalm 12:8 NIV 1984
> The wicked freely strut about when what is vile is honored among men.

Are you a bigot?


--
De gustibus non est disputandum

Calum

Aug 10, 2021, 9:31:54 PM
On 08/08/2021 18:18, Jolly Roger wrote:
> On 2021-08-08, Calum <com....@nospam.scottishwildcat> wrote:
>> On 07/08/2021 22:01, Jolly Roger wrote:
>>> On 2021-08-07, Jim_Higgins <gordi...@hotmail.com> wrote:
>>> Gullible idiots like you will fall for anything that fits your skewed
>>> world view.
>>>
>>> Facebook / WhatsApp has already been scanning their users' photos with
>>> the same PhotoDNA database for over a *decade* now.
>>
>> They have, but in a different, "more secure" way... at least according
>> to them.
>>
>> <https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8?r=US&IR=T>
>
> Bullshit. He said no such thing. In fact, the words "more secure" aren't
> even mentioned in that article or his lame tweet.


You're correct, he did not use the words "more secure". He said it works
"without breaking encryption", which I took to mean that he believed it
more secure than Apple's approach. But I should have double-checked the
quote before posting.

(I also said "at least according to them" because I don't believe it any
more than you do.)

> He also said the Apple software would allow access to "scan all of a
> user's private photos on your phone — even photos you haven't shared
> with anyone." which is a blatant lie. Only photos being transferred to
> Apple servers are examined.

And that's certainly true for the CSAM detection bit that everyone's
talking about. Although Apple did also announce the new "Communication
Safety in Messages" feature at the same time, which uses "on-device
machine learning" to analyze any photo that a Family Sharing account
belonging to a child sends or receives, and warns their parents about
any that appear to contain nudity. Still short of scanning "all of a
user's private photos", but certainly some that may never leave your phone.

Jolly Roger

Aug 12, 2021, 10:34:07 AM
That scan never leaves your phone either.

Meanwhile Facebook and WhatsApp access and scan *ALL* of their users'
photos.

WhatsApp / Facebook are full of shit.