Apple is paying contractors to personally listen to millions of private recorded Siri conversations every day, which is NOT explicitly disclosed in Apple's privacy policy

Arlen G. Holder

Jul 27, 2019, 2:34:49 AM
Millions of extremely sensitive personal recordings are listened to by
Apple's hired contractors every day!
o Workers hear drug deals, medical details and people having sex, says whistleblower
<https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings>

"There have been countless instances of recordings featuring private
discussions between doctors and patients, business deals, seemingly
criminal dealings, sexual encounters and so on. These recordings are
accompanied by user data showing location, contact details, and app data."

"Apple contractors regularly hear confidential medical information, drug
deals, and recordings of couples having sex, as part of their job providing
quality control, or "grading", the company's Siri voice assistant..."

"Apple does not explicitly disclose it in its consumer-facing privacy
documentation... [and] while Amazon and Google allow customers to opt out
of some uses of their recordings, Apple doesn't offer a similar privacy
protecting option, outside of disabling Siri entirely."

"Apple says the data is used to help Siri and dictation ¡K understand you
better and recognise [sic] what you say but the company does not explicitly
state that that [sic] work is undertaken by humans who listen to the
pseudonymised [sic] recordings"

"Nowhere in Apple's privacy policy does it [Apple] mention that human
workers will be listening to and analyzing that data"

"It wasn¡¦t clear up until recently the extent to which these companies
were listening in on customers"

"Apple's system may also be more concerning for a few reasons, like the
pervasiveness of Apple products. Where Alexa is largely limited to smart
speakers, and Google Assistant to speakers and phones, Siri is also on
Apple's hugely popular Apple Watch, which is on millions of people's wrists
every waking moment. Plus, Siri on an Apple Watch activates any time a user
raises their wrist, not just when it thinks it's heard the 'Hey, Siri' wake
word phrase."

"That's a particularly bad look, given that Apple has built so much of
its reputation on selling itself as the privacy company that defends your
data in ways that Google and Amazon don't. Implicitly telling customers
that, effectively, 'the only way to have peace of mind that a random
stranger won't listen in on their accidentally triggered Siri recordings is
to stop using Siri entirely' is a bit of a mixed message from the company
that supposedly puts privacy at a premium."

Arlen G. Holder

Jul 28, 2019, 11:56:14 AM
On 27 Jul 2019 20:41:23 GMT, Jolly Roger wrote:

> <https://techcrunch.com/2015/09/11/apple-addresses-privacy-questions-about-hey-siri-and-live-photo-features/>
>
> Being able to say the phrase at any time to activate Siri is convenient,
> but raises some questions about what Apple means by 'listening' and
> whether any of that stuff is recorded.
>
> Hey Siri is an optional feature that is enabled by an opt-in step in iOS
> 9's setup. You can choose never to enable it. If you do enable it,
> nothing is ever recorded in any way before the feature is triggered.
>
> "In no case is the device recording what the user says or sending that
> information to Apple before the feature is triggered," says Apple.
>
> Instead, audio from the microphone is continuously compared against the
> model, or pattern, of your personal way of saying 'Hey Siri' that you
> recorded during setup of the feature. Hey Siri requires a match to both
> the 'general' Hey Siri model (how your iPhone thinks the words sound)
> and the 'personalized' model of how you say it. This is to prevent other
> people's voices from triggering your phone's Hey Siri feature by
> accident.
>
> Until that match happens, no audio is ever sent off of your iPhone. All
> of that listening and processing happens locally.
>
> "The 'listening' audio, which will be continuously overwritten, will be
> used to improve Siri's response time in instances where the user
> activates Siri," says Apple. The keyword there being 'activates Siri.'
> Until you activate it, the patterns are matched locally, and the buffer
> of sound being monitored (from what I understand, just a few seconds) is
> being erased, un-sent and un-used -- and unable to be retrieved at any
> point in the future.
>
> Of course, as has always been the case with Siri, once a match is made
> and a Siri command is sent off to Apple, it's associated with your
> device using a random identifier, not your Apple ID or another
> personalized piece of info. That information is then 'approved' for use
> in improving the service, because you've made an explicit choice to ask
> Apple's remote servers to answer a query.
>
> "If a user chooses to turn off Siri, Apple will delete the User Data
> associated with the user's Siri identifier, and the learning process
> will start all over again," says Apple.
>
> Meanwhile, at Amazon and others...
>
> <https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio>
>
> Amazon Workers Are Listening to What You Tell Alexa
>
> A screenshot reviewed by Bloomberg shows that the recordings sent to the
> Alexa reviewers don't provide a user's full name and address but are
> associated with an account number, as well as the user's first name and
> the device's serial number.
>
> Occasionally the listeners pick up things Echo owners likely would
> rather stay private: a woman singing badly off key in the shower, say,
> or a child screaming for help. The teams use internal chat rooms to
> share files when they need help parsing a muddled word--or come across
> an amusing recording.
>
> <https://www.dailymail.co.uk/sciencetech/article-6910791/Alexa-listening-conversations.html>
>
> Alexa IS listening to your conversations: Web giant ADMITS clips are
> analysed by Amazon workers - including your most intimate moments
>
> <https://www.dailymail.co.uk/sciencetech/article-6956531/Amazon-employees-listening-Alexa-recordings-customers-live.html>
>
> Amazon employees listening to your Alexa recordings can also easily find
> customers' home addresses, report claims


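To make the quoted mechanism concrete, here's a minimal sketch in Python of
that two-stage gating. Everything in it (class names, thresholds, buffer
size, stub models) is invented for illustration; it is NOT Apple's actual
implementation, just the shape of what the quote above describes:

import collections
import uuid

# Hypothetical sketch of the two-stage "Hey Siri" gating described in
# the quoted article: a few-second rolling buffer is scored against
# both a general wake-word model and a personalized one, and audio
# only leaves the device after BOTH match.

BUFFER_FRAMES = 200   # ~2 s of audio; "just a few seconds" per the quote
THRESHOLD = 0.9       # invented score cutoff; the real value is unknown

class StubModel:
    """Placeholder acoustic model; a real one would score the audio."""
    def score(self, frames):
        return 0.0

class WakeWordGate:
    def __init__(self, general_model, personal_model):
        self.general = general_model      # how the words sound in general
        self.personal = personal_model    # how *this user* says "Hey Siri"
        # Ring buffer: the oldest frames are continuously overwritten,
        # so un-matched audio is erased rather than retained or sent.
        self.buffer = collections.deque(maxlen=BUFFER_FRAMES)

    def feed(self, frame):
        """Handle one audio frame entirely on-device."""
        self.buffer.append(frame)
        audio = list(self.buffer)
        # Both models must agree; either alone is not enough, which is
        # what keeps other people's voices from triggering your phone.
        if (self.general.score(audio) > THRESHOLD
                and self.personal.score(audio) > THRESHOLD):
            # Only now is anything sent off-device, associated with a
            # random identifier rather than the user's Apple ID.
            return {"request_id": str(uuid.uuid4()), "audio": audio}
        return None   # no match: nothing sent; buffer keeps overwriting

Until both scores clear the threshold, feed() returns None and the deque
silently drops the oldest frames, which is the "continuously overwritten"
buffer the quote describes.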

What the apologists _always_ miss are the basic facts of the matter!
o They "claim" that the Apple ecosystem is, somehow, magically, more private.

"These recordings are accompanied by user data showing location, contact
details, and app data."

And yet, they "justify" Apple's lack of privacy by pointing to the other ecosystems!
o *Apple advertises PRIVACY that is, supposedly, better than Google/Amazon!*

And yet - the Apple ecosystem is no more private than any other ecosystem.
o It's simply the MARKETING of Apple products that's different.

o Privacy between _all_ the vendors ... is about the same.
<https://groups.google.com/d/msg/misc.phone.mobile.iphone/MiZixhidmOs/ATC1S3s4FQAJ>

Witness FACTS presented in detail here:
o What is the factual truth about PRIVACY differences or similarities between the Android & iOS mobile phone ecosystems?
<https://groups.google.com/d/msg/misc.phone.mobile.iphone/MiZixhidmOs/ATC1S3s4FQAJ>

"Apple does not explicitly disclose it in its consumer-facing privacy
documentation... [and] while Amazon and Google allow customers to opt out
of some uses of their recordings, *Apple doesn't offer a similar privacy*
*protecting option, outside of disabling Siri entirely*."

Arlen G. Holder

Aug 4, 2019, 12:16:22 AM
On Sun, 28 Jul 2019 15:56:14 -0000 (UTC), Arlen G. Holder wrote:

> o Privacy between _all_ the venders ... is about the same.
> <https://groups.google.com/d/msg/misc.phone.mobile.iphone/MiZixhidmOs/ATC1S3s4FQAJ>

Just as they were forced to admit to secret throttling, and they were
forced to admit that they ignored egregious FaceTime security holes, Apple
yet again only gives a shit about privacy when the shit hits the fan.

o Apple suspends Siri program that allows employees to listen in on users'
private conversations
<https://www.rt.com/business/465730-apple-siri-suspend-privacy/>

Cult of Mac: Siri eavesdropping controversy underlines why Apple must be
more transparent
<https://www.cultofmac.com/642830/siri-eavesdropping-controversy-apple-transparenct/>

Apple Contractors Will Stop Listening to Your Siri Recordings - for now
<https://www.wired.com/story/apple-siri-recordings-facebook-facial-recognition-roundup/>

VentureBeat: Apple and Google halt human voice-data reviews over privacy
backlash, but transparency is the real issue
<https://venturebeat.com/2019/08/02/apple-and-google-halt-human-voice-data-reviews-over-privacy-backlash-but-transparency-is-the-real-issue/>

Voice assistant companies abandon snooping practices after being found out
<https://www.rt.com/news/465704-apple-amazon-alexa-spying/>

Apple and Google Workers Stop Listening to What You Ask Your Voice
Assistant, For Now
<https://www.thedailybeast.com/apple-and-google-pause-human-voice-recording-review-over-privacy-concerns>

You Can Now Disable Human Review of Your Alexa Recordings
<https://www.iclarified.com/71905/you-can-now-disable-human-review-of-your-alexa-recordings>

Hey Apple, Opt out is useless. Let people opt in
<https://www.wired.com/story/hey-apple-opt-out-is-useless/>

MacWorld: So Apple's going to stop listening in on your Siri requests. Now
what?
<https://www.macworld.com/article/3429817/so-apples-going-to-stop-listening-in-on-your-siri-requests-now-what.html>

Apple halts contractors listening to Siri recordings, will offer opt-out
<https://www.scmagazine.com/home/security-news/privacy-compliance/apple-announced-it-will-temporarily-suspend-its-practice-of-allowing-human-contractors-to-grade-snippets-recordings-of-siri-conversations-for-accuracy/>