If you were holding a competition for the scummiest business model, data brokers would be near the top of the list. These companies make money by buying personal data from app and website owners and selling it to companies that want to spam us.
A US Senator has now drawn attention to the latest sketchy practice by these companies: making it harder for us to opt out by hiding that option from search results …
The dark world of data brokers
Data brokers are companies that buy personal data from a wide range of sources. Much of it is gathered from internet browsing history and app usage.
The data is supposed to be anonymized – that is, a buyer should be able to learn that you are, for example, a 30-to-40-year-old man living in California who owns an iPhone 15 Pro Max and travels regularly to Las Vegas, but it should not be possible to identify you by name.
However, countless tests and studies have shown that so much data is now collected about each of us that it is often trivial for a buyer to identify specific individuals – and even to reveal US troop movements in war zones.
Location data, sold by the developers of many mobile apps, makes this especially easy: how many people leave your home address each morning and travel to your workplace, for example?
Hiding opt-out information from search engines
Unsurprisingly for such an unsavory activity, data brokers are not exactly noted for their compliance with the law. For example, when individual states have enacted privacy legislation which would limit their activities, a large number of them have simply failed to register.
Hopes that their activities would be constrained by federal law were dashed when the Trump administration killed plans for such legislation.
A new investigation headed up by US Senator Maggie Hassan has revealed that at least 35 data brokers have been using robots.txt files to instruct search engines not to index their opt-out pages. Wired reports:
The investigation found dozens of registered brokers obscuring their opt-out tools by hiding them from Google and other search results. Consumer advocates called it a “clever work-around” that undermines privacy rights and may qualify as an illegal dark pattern—a design decision that, according to California’s privacy regulator, erodes consumer “autonomy, decision making, or choice when asserting their privacy rights or consenting.”
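To illustrate the technique being described, here is a hypothetical robots.txt file of the kind a broker might use (the path shown is invented for this example, not taken from any specific company). The wildcard rule applies to every crawler, and the Disallow line asks them not to crawl the opt-out page:

```
# Hypothetical robots.txt on a data broker's site.
# "User-agent: *" applies the rule to all crawlers (Googlebot, Bingbot, etc.).
# "Disallow" asks them not to crawl the opt-out page, which in practice
# keeps it out of search results.
User-agent: *
Disallow: /opt-out
```

Strictly speaking, robots.txt controls crawling rather than indexing – a noindex meta tag or header does the latter – but the practical effect the investigation describes is the same: the opt-out page becomes effectively unfindable through search.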
Hassan wants the firms to justify the placement of their opt‑out pages; acknowledge whether they used code to block search indexing and, if so, against how many users; pledge to remove any such code by September 3; and provide Congress with recent audit results and steps taken since the investigation, if any, to improve user access.
Photo by Marek Piwnicki on Unsplash
FTC: We use income earning auto affiliate links. More.