Apple Admits a Small Portion of Siri Recordings Are Heard by Humans

Apple allows Siri recordings to be heard by contractors as part of a process called "grading", which is meant to improve the efficacy of the voice assistant, a report claims. These recordings frequently include confidential information, such as medical history, sexual encounters, and even drug deals, a whistleblower working for one of the contractors is quoted as saying. Apple has responded to the report, confirming that a small portion of Siri recordings is indeed used to improve the service.

The news comes at a time when Amazon and Google, both of which also offer voice assistants, have admitted that third parties have access to some voice recordings. Unlike them, however, Apple has built and continues to enjoy a reputation for safeguarding the privacy of its users.
The report's claims
The Guardian cites a whistleblower at one of the contractors allegedly working for Apple to claim that the Cupertino-headquartered company passes a small proportion of Siri recordings on to these contractors. The contractors are expected to grade the responses on numerous factors, such as "whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri may be expected to assist with and whether or not Siri's response was acceptable."
Accidental activations of Siri, where the voice assistant mistakenly believes it has heard its wake word, often capture confidential information, the whistleblower adds.
"There are uncounted instances of recordings that include personal discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings square measure in the course of user knowledge showing location, contact details, and app data," the whistleblower is quoted to say.
While Siri is most often associated with the iPhone and the Mac, the whistleblower claims the Apple Watch and the HomePod are in fact the most common sources of accidental activations.
"The regularity of accidental triggers on the watch is unbelievably high. The watch will record some snippets which will be thirty seconds - not that long however you'll gather a decent plan of what is happening," the whistleblower adds.
Staff are encouraged to treat recordings of accidental activations as a "technical problem", but no procedure is said to be in place for dealing with sensitive information, and employees are allegedly expected to hit targets as quickly as possible. The report adds that the whistleblower's motivation for coming forward stemmed from fears of such data being misused, as there is purportedly little vetting of who works with the data, a high employee turnover rate, no proper privacy guidelines, and the possibility of identifying the users.
"It would not be tough to spot the person who you are paying attention to, especially with accidental triggers - addresses, names and so on," the whistleblower added.
Finally, the report claims Apple does not explicitly disclose that Siri recordings are made available to humans, including contractors rather than only its own employees. The recordings are said to be shared with pseudonymized identifiers. The whistleblower argues that the company should, in particular, remove Siri's "I only listen when you are talking to me" response to the query "Are you always listening?", calling it patently false.
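
For readers unfamiliar with the term, "pseudonymized identifiers" generally means that a raw device or account ID is replaced with a stand-in value before data is handed to reviewers. The minimal sketch below shows one common, generic way this can be done (a salted hash); the Python code, the salted-hash technique, and names such as device_id are illustrative assumptions, not a description of Apple's actual pipeline.

```python
import hashlib
import secrets

# Illustrative only: a generic way to pseudonymize an identifier before
# sharing data with reviewers. This does not reflect Apple's real system.

SALT = secrets.token_bytes(16)  # secret value kept by the data owner


def pseudonymize(device_id: str) -> str:
    """Replace a raw identifier with a salted SHA-256 digest.

    The same device always maps to the same pseudonym (so clips can be
    grouped), but the pseudonym cannot be traced back to the original ID
    without the secret salt.
    """
    return hashlib.sha256(SALT + device_id.encode("utf-8")).hexdigest()


if __name__ == "__main__":
    recording = {
        "device_id": pseudonymize("example-device-1234"),  # hypothetical ID
        "audio_clip": "snippet.wav",
        "duration_seconds": 4.2,
    }
    print(recording)
```

The whistleblower's point is that even with this kind of stand-in identifier, the audio itself can still reveal who is speaking.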
Apple's response
In response to The Guardian report, Apple said Siri recordings are used to "help Siri and dictation... understand you better and recognize what you say."
It adds, "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The Cupertino company is also quoted as saying that less than 1 percent of daily Siri activations, and only a random subset of those, are used for grading. These recordings are usually only a few seconds long, the company is reported to add.

For the latest tech news, follow Tech Knowledge Tunes on Twitter and Facebook, and subscribe to our YouTube channel.