Apple also uses humans to listen to some Siri recordings

“Hey Siri, are you spying on me?”

Image: jhila farzaneh / mashable

By Raymond Wong

UPDATE: Aug. 2, 2019, 9:20 a.m. EDT: Apple says it’s suspending its Siri grading program. Here’s what the company told TechCrunch:

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

The original story follows below.


News alert: your Siri voice recordings may not be entirely private.

According to The Guardian, Apple hires contractors to listen to Siri recordings in order to improve the accuracy and quality of the voice assistant.

These contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading,’” the report claims.

Like Amazon and Google, both of which also employ humans to review some recordings from their respective Alexa and Assistant, Apple doesn’t disclose that real people review Siri recordings, either.

Not exactly a good look for a company that prides itself on taking privacy more seriously than other tech companies.

The exposé on Apple comes from an anonymous whistleblower who spoke with The Guardian, voicing concerns about how the undisclosed Siri data could potentially be misused.

Though Apple’s contractors are hired to review only a small portion of Siri recordings, and are told to report recordings only for technical problems such as accidental activations rather than for their content, the whistleblower said it’s uncomfortable to hear conversations in which people are engaging in sexual acts or drug deals.

“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” the whistleblower told The Guardian. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”

Apple told The Guardian that Siri recordings are “used to help Siri and dictation … understand you better and recognize what you say.”

Additionally, Apple sidestepped the question of whether any recordings could be used to identify a person.

“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID,” Apple told The Guardian. “Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

According to Apple, a random subset amounting to less than one percent of daily Siri activations is used for “grading.” We’ve reached out to Apple for clarification on its process of using contractors to listen to Siri recordings and will update this story if we get a response.

Personally, I’m with Apple blogger and pundit Jason Snell, who pulls no punches with his reaction to the news:

It doesn’t matter to me if this is Amazon or Apple. I don’t want human beings listening to the audio these devices record. In fact, I don’t want recordings made of my audio, period—I want the audio processed and immediately discarded.

Apple boasts constantly about taking user privacy seriously. There’s one right response to this report, and it’s to change its policies and communicate them clearly. A mealy-mouthed response about how the eavesdropping is done in a secure facility without an Apple ID attached is not good enough.

The news is another strike against using voice assistants. Sure, Alexa, Google Assistant, and Siri can be very convenient, but the amount of data they collect comes at the cost of privacy.

Even if Apple’s hired contractors are listening to only a small portion of recordings (mostly accidental activations), these recorded conversations could include easily identifiable information such as addresses and phone numbers. In the wrong hands, such sensitive information could be misused or sold without your permission.

Are you willing to risk that possibility for the convenience of a voice assistant?

 
