TL;DR
- Google has filed a preliminary class action settlement agreeing to pay $68M in an Assistant privacy suit.
- Plaintiffs alleged that they were targeted with ads based on conversations that Assistant was never supposed to be listening to.
- Despite its willingness to settle, Google denies any wrongdoing.
How much of your life have you lived with a smart speaker listening in on you? The original Google Home turns 10 later this year, and Amazon Echo’s been around even longer. For as convenient as this kind of smart home hardware can be, are we sacrificing too much of our privacy by having it around? That old question is back on our minds again this week, as we get word about some big progress in a class-action Google Assistant privacy lawsuit.
Smart speakers like those that run Google Assistant (and now Gemini) are supposed to offer a number of features to protect our privacy. Beyond things like mic-cutoff switches to mute input, we rely on wake-word detection to ensure they listen and respond only when we directly address them.
Several years back, however, Assistant users took issue with the way that their words were being used to target them with advertising, and after a lengthy path through the courts, Google has finally agreed to pay $68M to settle the suit (via Reuters).
Specifically, plaintiffs objected to how Google was using speech heard during false activations of Assistant.
When we search for a product with Google, it’s understood that we’re likely to get served relevant advertising, and that extends to things we talk about with Assistant. However, the plaintiffs in this action noticed that they were getting ads based not on anything they explicitly asked Google about, but on conversations that Assistant appeared to record after falsely detecting an “OK, Google” prompt that was never actually spoken.
The preliminary settlement still requires judicial approval, but this signals Google’s interest in putting the years-long saga to rest. Google has not admitted any wrongdoing, and earlier claimed that it “never promises that the Assistant will activate only when plaintiffs intend it to.”
