• Apple will temporarily suspend and review a global program that allows contractors to listen to recordings of queries made to the voice assistant Siri, following a report that contractors “regularly” hear private and confidential information.
  • Multiple reports confirm that Apple is tabling the program – called grading – which allows the company to monitor user interaction with Siri for quality control.
  • The program’s suspension follows a report published by The Guardian last week that revealed that contractors involved in the review program could “regularly hear confidential medical information, drug deals, and recordings of couples having sex” often as a result of Siri being triggered by accident.
  • Apple told The Guardian that “less than 1%” of daily Siri activations are utilized by graders, and that “all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
  • Visit Business Insider’s homepage for more stories.

Apple will temporarily suspend and review a global program that allows contractors to listen to recordings of queries made to the voice assistant Siri, following a report from The Guardian that contractors “regularly” hear private and confidential information.

TechCrunch reported Thursday night that Apple is tabling the program – called grading – which allows the company to monitor user interactions with Siri for quality control.

“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple spokesperson said in a statement to The Verge. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

According to Apple, user recordings from Siri queries are saved for a six-month period “so that the recognition system can utilize them to better understand the user’s voice.” After six months, another copy of the recording “without its identifier” is saved for up to two years by Apple in order to “improve and develop” Siri functions.

Last week, a report from The Guardian revealed that contractors involved in the review program could "regularly hear confidential medical information, drug deals, and recordings of couples having sex."

Read more: Amazon workers reportedly listen to what you tell Alexa - here's how Apple and Google handle what you say to their voice assistants

An anonymous contractor expressed concern to The Guardian about the amount of "extremely sensitive personal information" picked up by Siri when it is triggered by accident by its "wake word." Contractors responsible for grading note these interactions, along with deliberate queries.

Apple told The Guardian that "less than 1%" of daily Siri activations are utilized by graders and are typically "only a few seconds long." The company told The Guardian that requests are not associated with an Apple ID and that "all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

Apple did not immediately respond to Business Insider's request for comment.