Apple employees will stop listening to accidental Siri recordings

You know how occasionally Siri will speak up even when you didn’t call for her? It turns out Apple was recording those interactions, and its contractors were listening to them.

That’s according to a report from The Guardian, which found that Apple contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex.”

To clarify, it wasn’t just accidental recordings. Apple, like all of its competitors, knows that the best way to improve a tool like Siri is to record and review what users actually say. That way, if a specific request hasn’t been pre-programmed, it can easily be added.

Apple has reviewed its policies and announced the following changes, in its own words:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

My Privacy Tutor’s advice: Just disable Siri. The occasional time saved asking for the current temperature isn’t worth the risk to your privacy.
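
On an iPhone, the supported route is Settings > Siri & Search, where you can turn off “Listen for ‘Hey Siri’” and “Press Side Button for Siri.” For Mac users comfortable with the Terminal, here is a minimal sketch of flipping the same switch from the command line. It assumes the com.apple.assistant.support preference domain with its “Assistant Enabled” key, and the com.apple.Siri domain’s “StatusMenuVisible” key, still back the Siri toggles on your macOS version; these keys can change between releases, so the Siri pane in System Preferences remains the safe route.

    # Assumption: these preference keys back the "Enable Ask Siri" checkbox
    # and the menu bar icon; they may differ on other macOS releases.
    defaults write com.apple.assistant.support "Assistant Enabled" -bool false
    defaults write com.apple.Siri StatusMenuVisible -bool false
    # Refresh the menu bar so the Siri icon disappears immediately.
    killall SystemUIServer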