Enlarge / "Siri, define the word "surprising.'" "Okay. Ask me to define the word 'mother' twice, then." (credit: Apple)

Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about privacy and accuracy. Apple's Siri is the latest to enter this gray area. This week, The Guardian reported that contractors who review Siri recordings for accuracy and to help improve the service may be hearing personal conversations.
One of the contract workers told The Guardian that Siri does sometimes record audio after mistaken activations. The wake phrase is "hey Siri," but the anonymous source said the assistant could also be triggered by similar-sounding words or by the sound of a zipper. The source added that when an Apple Watch is raised and speech is detected, Siri activates automatically.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the source said. “These recordings are accompanied by user data showing location, contact details, and app data.”

