The Washington Times - Tuesday, July 30, 2019

File this under “Yet Another Reason To Keep Siri Voice Assistant Out Of Your Home.”

A whistleblower who works on Apple's Siri program said the voice assistant can accidentally activate during the most private of times: when people are discussing sensitive business or medical matters. Or cutting drug deals. Or having sex.

Can you say awkward?

In an interview with The Guardian, a worker on Apple's Siri quality-control program, requesting anonymity, said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details and app data.”

That’s due to accidental activations of the device, and to the fact that Apple doesn’t exactly make clear that some recordings are passed on to contractors who check for sound quality and the accuracy of Siri’s responses to queries. Apple, in its privacy statements to consumers, does acknowledge that “a small portion of Siri requests are analyzed to improve Siri and dictation.”

But Apple doesn’t explicitly state that this quality control means humans will be listening to conversations Siri users assume are private. Nor does Apple explicitly warn about the risks of using its watch, which comes equipped with Siri.

“The regularity of accidental triggers on the watch is incredibly high,” the whistleblower said, The Guardian reported. “[Y]ou can definitely hear a doctor and patient, talking about the medical history of the patient. … And you’d hear, like, people engaging in sexual acts.”

Come on, Apple. How about some clear product warnings?

Siri is on the company’s phones, computers, TVs and HomePod speakers. It can be disabled.

But if users aren’t properly informed of the risks of using Siri, why would they even think of disabling it?

And that’s the big problem with all this emerging technology. Too often, privacy risks are hidden from the public. Too often, companies and developers and researchers think profit first, protection of the public second.

After all, if Apple had truthfully warned users to take off their watch before meeting with their doctors or face the risk of those conversations being shared with unknown, anonymous contractors around the world — well then, who would buy that watch?

Right.

So Apple couches its language, chooses its warning words carefully. Gives just enough to keep the legal team calm; not enough to make the marketing mavens panic.

Consumers, buyers, innocent members of the public, meanwhile, only think they’re getting the coolest new tech toy. Little do they know their love lives are maybe, just maybe, being recorded and overheard. And God knows what else.

• Cheryl Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.

