Data leak: why anyone can be a target

The Guardian is an English-language outlet; a translation tool could help.

Quite an eye-opener.


Let's see whether the media and politicians wake up this time, now that they themselves are the target and are being confronted with the consequences of surveillance and backdoors.

Related article, copy of this post


Inside the Industry That Unmasks People at Scale

Unique IDs linked to phones are supposed to be anonymous. But there's an entire industry that links them to real people and their addresses.

By Joseph Cox

July 14, 2021, 3:00pm




Tech companies have repeatedly reassured the public that trackers used to follow smartphone users through apps are anonymous or at least pseudonymous, not directly identifying the person using the phone. But what they don’t mention is that an entire overlooked industry exists to purposefully and explicitly shatter that anonymity.

They do this by linking mobile advertising IDs (MAIDs) collected by apps to a person’s full name, physical address, and other personal identifiable information (PII). Motherboard confirmed this by posing as a potential customer to a company that offers linking MAIDs to PII.

“If shady data brokers are selling this information, it makes a mockery of advertisers’ claims that the truckloads of data about Americans that they collect and sell is anonymous,” Senator Ron Wyden told Motherboard in a statement.

“We have one of the largest repositories of current, fresh MAIDS<>PII in the USA,” Brad Mack, CEO of data broker BIGDBM told us when we asked about the capabilities of the product while posing as a customer. “All BIGDBM USA data assets are connected to each other,” Mack added, explaining that MAIDs are linked to full name, physical address, and their phone, email address, and IP address if available. The dataset also includes other information, “too numerous to list here,” Mack wrote.

A MAID is a unique identifier a phone’s operating system gives to its users’ individual device. For Apple, that is the IDFA, which Apple has recently moved to largely phase out. For Google, that is the AAID, or Android Advertising ID. Apps often grab a user’s MAID and provide that to a host of third parties. In one leaked dataset from a location tracking firm called Predicio previously obtained by Motherboard, the data included users of a Muslim prayer app’s precise locations. That data was somewhat pseudonymized, because it didn’t contain the specific users’ name, but it did contain their MAID. Because of firms like BIGDBM, another company that buys the sort of data Predicio had could take that or similar data and attempt to unmask the people in the dataset simply by paying a fee.
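The unmasking the article describes is conceptually just a table join: once a broker holds a MAID-to-PII lookup table, any "pseudonymous" dataset keyed by MAID collapses into an identified one. A minimal sketch, using entirely made-up MAIDs, names, and addresses (no real broker data or API):

```python
# Hypothetical illustration of a MAID-to-PII join; all values are fabricated.

# "Pseudonymized" app data: precise locations keyed only by a MAID.
location_records = [
    {"maid": "38400000-8cf0-11bd-b23e-10b96e40000d", "lat": 51.92, "lon": 4.48},
    {"maid": "7f1e2d3c-9a8b-4b1a-9c8e-0123456789ab", "lat": 52.37, "lon": 4.90},
]

# A broker's MAID<->PII lookup table (example rows, not real people).
pii_lookup = {
    "38400000-8cf0-11bd-b23e-10b96e40000d": {
        "name": "J. Doe",
        "address": "123 Example St",
        "email": "jdoe@example.com",
    },
}

def unmask(records, lookup):
    """Attach PII to every record whose MAID appears in the lookup table."""
    identified = []
    for rec in records:
        pii = lookup.get(rec["maid"])
        if pii is not None:
            # The record is no longer pseudonymous: location + name + address.
            identified.append({**rec, **pii})
    return identified

identified = unmask(location_records, pii_lookup)
```

The point of the sketch is that no cleverness is required on the buyer's side; the only barrier to de-anonymization is whether someone sells the lookup table, which is exactly the product BIGDBM advertises.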

“Anyone and everyone who has a phone and has installed an app that has ads, currently is at risk of being de-anonymized via unscrupulous companies,” Zach Edwards, a researcher who has closely followed the supply chain of various sources of data, told Motherboard in an online chat. “There are significant risks for members of law enforcement, elected officials, members of the military and other high-risk individuals from foreign surveillance when data brokers are able to ingest data from the advertising bidstream,” he added, referring to the process where some third parties obtain data on smartphone users via the placement of adverts.

This de-anonymization industry uses various terms to describe its product, including "identity resolution" and "identity graph." Other companies claiming to offer a similar service to BIGDBM's include FullContact, which says it has 223 billion data points for the U.S., as well as profiles on over 275 million adults in the U.S.

“Our whole-person Identity Graph provides both personal and professional attributes of an individual, as well as online and offline identifiers,” marketing material from FullContact available online reads, adding that these can include names, addresses, social IDs, and MAIDs.

“MAIDs were built for the marketing and advertising community, and are tied to an individual mobile device, which makes them precise in identifying specific people,” the material adds.

On a listing advertising its capability to link MAIDs to personal information, BIGDBM says “The BIGDBM Mobile file was developed from online providers, publishers and a variety of data feeds we currently obtain from a multitude of sources.” That listing did not list the specific types of PII that BIGDBM offers, so Motherboard posed as a potential customer interested in sourcing such data for a stealth startup.

BIGDBM did not respond to multiple requests for comment. FullContact did not respond to a list of questions, including whether its MAIDs and PII are collected with consent, and what sort of protections FullContact has in place to stop abuse of its capability to unmask the person behind a MAID.

Edwards said that the existence of companies that explicitly link MAIDs to personal information may raise issues under privacy legislation.

“This real-world research proves that the current ad tech bid stream, which reveals mobile IDs within them, is a pseudonymous data flow, and therefore not-compliant with GDPR,” Edwards told Motherboard in an online chat.

“It’s an anonymous identifier, but has been used extensively to report on user behaviour and enable marketing techniques like remarketing,” a post on the website of the Internet Advertising Bureau, a trade group for the ad tech industry, reads, referring to MAIDs.

In April Apple launched iOS 14.5, which introduced sweeping changes to how apps can track phone users by making each app explicitly ask for permission to track them. That move has resulted in a dramatic dip in the amount of data available to third parties, with just 4 percent of U.S. users opting in. Google said it plans to implement a similar opt-in measure broadly across the Android ecosystem in early 2022.

Apple and Google acknowledged requests for comment but did not provide a statement on whether they have a policy against companies unmasking the real people behind MAIDs.

Senator Wyden’s statement added “I have serious concerns that Americans’ personal data is available to foreign governments that could use it to harm U.S. national security. That’s why I’ve proposed strong consumer privacy legislation, and a bill to prevent companies based in unfriendly foreign nations from purchasing Americans’ personal data.”

Funny how this kind of reasoning only runs one way… Only when the threat comes from outside does something need to be done… There's nothing wrong with the domestic data harvest, of course…?

In the USA, privacy rules always explicitly apply only to the protection of their own residents. American companies may do whatever they like with data about foreigners.
So the remark:

is not very surprising.
Europe has the GDPR, but Americans don't have to worry about it (in their own country).

Hence my tip:
do everything you can to keep information about yourself from ending up on that side of the ocean. Data there is fair game.


is not very surprising.
Europe has the GDPR, but Americans don't have to worry about it (in their own country).

It's a bit more nuanced than that… Even an American travelling and staying in the EU is protected by the GDPR, and EU citizens can demand of any company operating in the EU an account of how their data is handled.
The problem arises when a legal procedure follows from that. Then you depend on the goodwill of the defendant… (why does that sound odd?). Safe Harbour and its successor turned out to be empty shells without enforceable rights.
As a result, the treaty with the USA on this has become worthless, according to a ruling by the EU's highest court.

But effectively, handing data to a US company is a risk EVEN within the EU, because the US government can simply demand it as long as the company maintains an office in the US.

Thank you @Noci for your clarification.
This is useful for all of us.