You have probably never heard of Acxiom, but it probably knows you: the Arkansas firm claims to hold data on 2.5 billion people worldwide. And in the United States, if anyone is interested in that information, there is virtually no restriction on their ability to buy it and then use it.
Welcome to the data brokerage industry, the multibillion-dollar economy of selling the intimate details of consumers and citizens. Much of the privacy conversation has rightly focused on Facebook, Twitter, YouTube, and TikTok, which collect user information directly. But a far larger ecosystem of buying, licensing, selling, and sharing data exists around those platforms. Data brokerage firms are the middlemen of surveillance capitalism – buying, aggregating, and repackaging data from a range of other companies, all with the aim of selling it or distributing it further.
Data brokerage is a threat to democracy. Without strong national privacy safeguards, entire databases of citizen information are ready for purchase, whether by predatory loan companies, law enforcement agencies, or even malicious foreign actors. Federal privacy bills that do not pay sufficient attention to data brokering will therefore fail to address a huge portion of the data surveillance economy, leaving civil rights, national security, and the public-private boundary vulnerable in the process.
Large data brokers, such as Acxiom, CoreLogic, and Epsilon, tout the granularity of their data on millions, if not billions, of people. CoreLogic, for example, advertises its property and real estate information on 99.9 percent of the US population. Acxiom promotes more than 11,000 “data attributes,” from auto loan information to travel preferences, on 2.5 billion people (all to help brands connect with people in an “ethical” way, the company adds). This level of data collection and aggregation enables remarkably specific profiling.
Need to run ads targeting poor families in rural areas? Check out one data broker’s “Rural and Barely Making It” dataset. Or how about racial profiling of financial vulnerability? Purchase the “Ethnic Second-City Strugglers” dataset from another company. These are just a few of the disturbing product names captured in a 2013 Senate report on the industry’s data offerings, which have only grown since. Many other brokers advertise their ability to identify subgroups within subgroups of individuals using criteria such as race, sex, marital status, and income level – all sensitive characteristics that citizens likely never knew would end up in a database, let alone be put up for sale.
These companies often acquire their information through purchases, licenses, or other sharing agreements with third parties. Oracle, for example, “owns and works with” more than 80 data brokers, according to a 2019 Financial Times report, assembling information on everything from consumer purchases to internet behavior. Many companies, however, also scrape publicly searchable data from the internet and then aggregate it to sell or share. “People search” websites often fall into this latter category – compiling public records (property filings, court documents, voting records, and so on) on individuals, then letting anyone on the internet look up their information.
All of these unchecked practices threaten civil rights. The companies boasting of thousands of data points on millions or billions of people – all for sale to whoever will buy – are themselves aggregations of unbridled surveillance power. This is particularly dangerous for the less powerful. As centuries of surveillance in the United States have made undeniably clear, the impact of hoarding individuals’ personal information falls hardest on those who are already oppressed or marginalized: the poor, Black and brown communities, Indigenous populations, LGBTQ+ people, undocumented immigrants. People-search websites in particular may post home addresses and thereby enable domestic violence or doxing. Strong financial incentives to sell data, with virtually no limits, give these companies every reason to share their data with others, including those who would use it for harmful ends.