
iOS 15, iPadOS 15, watchOS 8 and macOS Monterey suck


Go deface and purge CityxGuide.com, Instagram.com, Facebook.com, Twitter.com, Backpage.com, and 1backpage.com. Because they waste of electricity

Aug 28, 2021, 7:05:02 PM
Serious Warning Issued For Millions Of Apple iPhone Users
Gordon Kelly, Senior Contributor
Consumer Tech
I write about technology's biggest companies

iPhone owners are currently facing multiple threats, but Apple’s new CSAM
detection system has generated greater controversy than all the rest
combined. And it just took another twist.

[Image caption: iPhone owners have been warned about serious implications for their privacy in Apple's upcoming ... APPLE]

In a shocking new post, Edward Snowden has delved into Apple’s CSAM (child
sexual abuse material) detection system coming to iPhones, iPads and Macs
and states: “Apple’s new system, regardless of how anyone tries to justify
it, will permanently redefine what belongs to you, and what belongs to
them.” He also shows what you can do about it — for now.


CSAM detection works by matching a user’s images to illegal material. “Under
the new design, your phone will now perform these searches on Apple’s behalf
before your photos have even reached their iCloud servers, and… if enough
"forbidden content" is discovered, law-enforcement will be notified,”
Snowden explains. “Apple plans to erase the boundary dividing which devices
work for you, and which devices work for them.”
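The matching step Snowden describes can be sketched as follows. This is a minimal, hypothetical illustration using an ordinary cryptographic hash; Apple's actual design uses a perceptual hash ("NeuralHash"), blinded matching, and a cryptographic safety-voucher scheme, and the blocklist contents, threshold value, and function names below are all assumptions:

```python
import hashlib

# Hypothetical blocklist of digests of known illegal images. In Apple's real
# system this is a blinded database of NeuralHash values, not SHA-256.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

# Illustrative threshold: matches required before anything would be reported.
THRESHOLD = 3

def scan_before_upload(photos: list[bytes]) -> bool:
    """On-device check run before photos ever reach iCloud servers."""
    matches = sum(1 for p in photos
                  if hashlib.sha256(p).hexdigest() in BLOCKLIST)
    return matches >= THRESHOLD
```

The key point of the design is that this loop runs on the phone itself, before upload, rather than on Apple's servers.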


“The day after this system goes live, it will no longer matter whether or
not Apple ever enables end-to-end encryption, because our iPhones will be
reporting their contents before our keys are even used [his emphasis],” says
Snowden. Moreover, while he cites “compelling evidence” from researchers
that Apple’s CSAM detection system is seriously flawed, he draws attention
to a much bigger point:

“Apple gets to decide whether or not their phones will monitor their owners’
infractions for the government, but it's the government that gets to decide
what constitutes an infraction... and how to handle it.”

[Image caption: Apple's CSAM detection system is coming to Apple iOS 15 and macOS Monterey next month. APPLE]
Furthermore, Snowden points out that the entire system is easily bypassed,
which undermines the stated aim behind its creation:

“If you’re an enterprising pedophile with a basement full of CSAM-tainted
iPhones, Apple welcomes you to entirely exempt yourself from these scans by
simply flipping the ‘Disable iCloud Photos’ switch, a bypass which reveals
that this system was never designed to protect children, as they would have
you believe, but rather to protect their brand. As long as you keep that
material off their servers, and so keep Apple out of the headlines, Apple
doesn’t care.”
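The bypass Snowden describes follows directly from where the scan sits: it is a step inside the iCloud Photos upload pipeline, so disabling uploads skips it entirely. A sketch, with hypothetical function and step names (not Apple APIs):

```python
def upload_pipeline(photos: list[bytes], icloud_photos_enabled: bool) -> list[str]:
    """Return the steps the device would run when syncing photos."""
    if not icloud_photos_enabled:
        return []  # toggle off: no scan runs, so nothing can be reported
    steps = ["scan_on_device"]       # hash-match before photos leave the phone
    steps.append("upload_to_icloud")
    return steps
```

Because the scan is gated on the same switch as the upload, anyone with something to hide can opt out, which is the basis of Snowden's "brand protection" argument.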

And, for those of you already thinking ahead, Snowden points out there is an
obvious next step to this process: governments compelling Apple to remove
the option to disable photo uploads to iCloud.

“If Apple demonstrates the capability and willingness to continuously,
remotely search every phone for evidence of one particular type of crime,
these are questions for which they will have no answer. And yet an answer
will come — and it will come from the worst lawmakers of the worst
governments. This is not a slippery slope. It’s a cliff.”

Researchers have already pointed out all the ways this could be exploited
and the markets Apple could be removed from if it doesn’t comply with
government requests. There is already precedent here. In May, Apple was
accused of compromising on censorship and surveillance in China after
agreeing to move the personal data of its Chinese customers to the servers
of a state-owned Chinese firm. Apple also states that it provided customer
data to the US government almost 4,000 times last year.

[Image caption: Apple's privacy statement from its official privacy homepage. APPLE]
“I can’t think of any other company that has so proudly, and so publicly,
distributed spyware to its own devices... There is no fundamental
technological limit to how far the precedent Apple is establishing can be
pushed, meaning the only restraint is Apple’s all-too-flexible company
policy, something governments understand all too well.”

Interestingly, Snowden doesn’t touch upon a further key threat: if Apple
gets hacked. Creating a backdoor into such a far-reaching detection system
means it is possible Apple would not be aware of how its devices are being
scanned and manipulated.

“[Apple is] inventing a world in which every product you purchase owes its
highest loyalty to someone other than its owner. To put it bluntly, this is
not an innovation but a tragedy, a disaster-in-the-making.”

To date, Apple has defended its CSAM detection system, conceding only that it
was poorly communicated. But last week researchers who had worked on a similar
system for two years concluded "the technology was dangerous," saying "we were
baffled to see that Apple had few answers for the hard questions we'd surfaced."

CSAM detection will launch with iOS 15, iPadOS 15, watchOS 8 and macOS
Monterey next month. I have reached out to Apple for comment and will update
this post if and when I receive a response.

In the meantime, I would advise all Apple fans to read Snowden’s full post
and make up your own mind.
