Google Assistant and Siri apparently listen in on private conversations even when they aren’t supposed to, according to multiple lawsuits that have been filed over the years. And on Thursday, a federal judge ruled that tech giant Apple will have to continue fighting a lawsuit brought by users in California, who allege that Siri, the company’s voice assistant, often inappropriately records private conversations.
Received The Green Light
Apple had asked for the lawsuit to be dismissed, but U.S. District Judge Jeffrey S. White, of the federal court in Oakland, has allowed most of the case to go forward. At the same time, he did dismiss one piece of the lawsuit, pertaining to claims of economic harm to users.
Nevertheless, the plaintiffs, who are seeking class action status, have been given the go-ahead to pursue their claims that Siri records private conversations after turning on unprompted. They have also accused the assistant of passing user data on to third parties, which, they claim, violates users’ privacy rights.
Just One Of Many Instances
This happens to be just one of many suits brought against Google, Apple, and Amazon amid rising concerns over the violation of user privacy by voice assistants like Alexa, Siri, and Google Assistant. While these technologies were developed to help users with day-to-day tasks like adding items to shopping lists or playing music, they have repeatedly been accused of ‘spying’ on users by turning on even when not prompted, and listening in on (and storing) conversations.
The companies, however, have often outright denied such allegations. Take, for example, the statement by Amazon spokesperson Faith Eischen, who said that Amazon collects and stores audio only when its devices detect the “wake word,” with only a small proportion of that data being reviewed manually.

Apple has already submitted court filings to back its claims, and Google, too, is prepared to fight its case in court.
However, Stanford associate professor Noah Goodman has said that while such technologies are designed to detect their wake words, doing so is a challenging task considering how much human voices differ from person to person. He adds that it is unlikely that these companies will be able to get rid of “false alarms” completely.
Source: The Washington Post