
Siri recording your conversations

29 July 2019


Contractors have a good laugh about your accent

While Apple has been at the forefront of lecturing other companies about privacy, it seems to have forgotten to mention that it sends recordings of customer conversations to outside contractors.

The Guardian reported that contractors who review Siri recordings for accuracy, and to help improve the service, are hearing personal conversations.

One of the contract workers told The Guardian that Siri did sometimes record audio after mistaken activations. The wake phrase is "hey Siri", but the anonymous source said it could be triggered by similar-sounding words or by the noise of a zipper. They also said that when an Apple Watch is raised and speech is detected, Siri will automatically activate.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on”, the source said. “These recordings are accompanied by user data showing location, contact details, and app data.”

Apple has said it takes steps to ensure users cannot be connected with the recordings sent to contractors: the audio is not linked to an Apple ID, less than one percent of daily Siri activations are reviewed, and the contract workers are bound by confidentiality requirements.

Apple, Google, and Amazon all have similar policies for the contract workers they hire to review those audio snippets. But all three voice AI makers have also been the subject of similar privacy breaches, either through whistleblowers going to the press or through errors that gave users access to the wrong audio files.
