<p>Last year, a former <a href="https://www.cnet.com/apple/" rel="noreferrer noopener" target="_blank">Apple</a> contractor made waves by raising concerns about how the tech giant handled <a href="https://www.cnet.com/tags/siri/" rel="noreferrer noopener" target="_blank">Siri</a> voice assistant recordings, ultimately leading the company to <a href="https://www.cnet.com/news/apple-will-no-longer-keep-siri-recordings-for-people-to-listen/" rel="noreferrer noopener" target="_blank">cease listening to them without your permission.</a> Now the whistleblower is back, having sent a letter to European regulators asking them to investigate and potentially punish the company.
"I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the E.U. has one of the strongest data protection laws in the world," former Apple contractor Thomas Le Bonniec wrote in a Wednesday letter to EU regulators. "Passing a law is not good enough: it needs to be enforced upon privacy offenders."
The letter is the latest example of the fine line that tech companies must walk, balancing the need to use those recordings to improve the effectiveness and smarts of voice assistants with the need to protect the privacy of their customers. Similar criticisms and concerns have been lobbed at Google and Amazon over how they deal with their voice assistants.
Le Bonniec was a contractor with Apple until he quit in 2019, when he went public with ethical concerns about the company's practices in the UK's Guardian newspaper. Le Bonniec said Apple collected and transcribed some voice recordings captured by Siri in an effort to improve the service's quality. But, he said, the recordings invaded people's privacy without their knowledge, including recordings of medical diagnoses, sexual encounters and intimate moments.
"I listened to hundreds of recordings every day, from various Apple devices (e.g. iPhones, Apple Watches, or iPads)," Le Bonniec said in his letter. "The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever."
Apple last year said the recordings were analyzed in secure facilities and didn't have any additional information attached to them, such as whose account they came from. "All reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," Apple said at the time. The company also promised to change the way it handled Siri, making the review program opt-in and explicitly asking people for consent before sharing their recordings with its teams. When reached for comment, Apple pointed to its earlier statements.