
Apple puts halt on contractors listening to your Siri conversations

An opt-out option will be made available at a later date


Apple has stopped contractors from listening to Siri recordings globally. The move comes after a report claimed workers regularly heard private and confidential recordings while working to improve Apple's voice assistant.

Called Siri grading, this work involved listening to a small sample of conversations between Apple users and the company's Siri voice assistant. By listening to these recordings and assessing how Siri responded to a question or command, Apple could improve the assistant's artificial intelligence.

Amazon and Google also perform similar grading with their own voice assistants.

The report claimed Apple contractors "regularly" heard private information, drug deals, medical details, and people having sex. Most of these recordings came from accidental triggers: an Apple device such as an iPhone or HomePod speaker thinks it has heard someone say 'Hey Siri,' records for up to 30 seconds, then sends the audio to an Apple server, where a small number of recordings were analyzed by contractors. The Apple Watch, which can be set to activate Siri when the wearer raises their arm, was reportedly particularly prone to these false triggers.

In response, Apple has halted the Siri grading globally, and will soon introduce a system where users can opt out of their data being analyzed by human contractors.

[Photo: Apple Watch Series 4. Siri on the Apple Watch is particularly bad at listening when it shouldn't. GearBrain]

Apple said in a statement: "We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

The Guardian, which first reported on the Siri grading system, said contractors working on it in Ireland were not told about the halt until they were sent home from work on Friday morning (August 2). They were told the grading system "was not working" globally. Managers were told to stay on site, but were not told what the change meant for their future employment.

One contractor speaking earlier to the newspaper about issues with the grading system said: "You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal...you can definitely hear it happening. And you'd hear, like, people engaging in sexual acts that are accidentally recorded on the [Home] pod or the [Apple] Watch."

Prior to this report, Apple had not said explicitly that humans would be listening to some Siri recordings. The company said only that Siri data might be used to help Siri "understand you better and recognize what you say."

